20 similar records found; search took 15 ms.
1.
Frédéric Ghersi 《Environmental Modeling and Assessment》2014,19(5):345-360
This paper surveys the use made of modelling expertise in the recent literature, both peer-reviewed and ‘grey’, on policy making for low-carbon societies in Europe. The first section focuses on the prominent policy instrument of carbon pricing. It starts by analysing the somewhat confusing use made of carbon-pricing modelling in policy reports emanating from the French and British governments, then reviews some modelling results on carbon pricing in a ‘second-best’ world. The second section lists the impressive collection of more focused policy instruments advocated in both governmental and non-governmental literature. It stresses the contrast between the high degree of precision of some of these policy proposals and the limited modelling of their impacts, from either an environmental or an economic point of view. The third section concludes with recommendations to the policy-modelling community inspired by this survey. Purposely avoiding the current controversies surrounding cost–benefit analysis, it advocates further applied research on the cost efficiency of carbon-pricing trajectories (‘when’ flexibility); on the terra incognita beyond first-best uniform pricing (‘where’ flexibility); on the elicitation of policy overlaps; and on the modelling of extended policy portfolios in comprehensive, consistent modelling frameworks.
2.
Manju R. Agrawal John Boland Barbara Ridley 《Environmental Modeling and Assessment》2013,18(4):481-492
In the 2011–2012 financial year, wind farms supplied 26 % of South Australia’s electricity demand according to the Australian Energy Market Operator’s report; this contribution has risen from zero in 2003. The operation of the electricity grid depends heavily on knowledge of the variability of supply, and wind farm output displays conditional volatility similar to that of financial market variables. In this paper, a new method of estimating wind farm output volatility on a 5-min time scale is developed through the use of higher-frequency wind farm output data. First, an autoregressive model for the high-frequency data is developed and used to derive a volatility measure for 5-min data. The results also hold in certain general situations where the high-frequency data follow an autoregressive moving average process or exhibit long-memory features. The methods described here are analogous to realised volatility measures used in financial series, except that wind farm output data are measured at uniform intervals, unlike the random trading times of financial transactions.
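The core idea above, a realised-volatility measure built from higher-frequency data, can be sketched in a few lines. The AR(1) series, its coefficient, and the ten-samples-per-window ratio below are illustrative assumptions, not the paper's estimator or data:

```python
import numpy as np

def realized_volatility(x, steps_per_window):
    """Realised volatility per window: sqrt of the sum of squared
    first differences of the high-frequency series inside each window."""
    d = np.diff(x)
    n = len(d) // steps_per_window
    d = d[:n * steps_per_window].reshape(n, steps_per_window)
    return np.sqrt((d ** 2).sum(axis=1))

# Synthetic high-frequency "wind farm output": AR(1) with phi = 0.9.
rng = np.random.default_rng(0)
phi, n = 0.9, 3000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

# Recover the AR(1) coefficient by least squares, then aggregate to a
# 5-min volatility assuming ten high-frequency samples per window.
phi_hat = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])
rv = realized_volatility(x, steps_per_window=10)
print(f"phi_hat = {phi_hat:.2f}, windows = {len(rv)}")
```

Unlike financial returns sampled at random trade times, the simulated series here is on a uniform grid, which is exactly the simplification the abstract notes.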
3.
Daniela Lagomarsino V. Tofani S. Segoni F. Catani N. Casagli 《Environmental Modeling and Assessment》2017,22(3):201-214
Classification and regression problems are a central issue in geosciences. In this paper, we present Classification and Regression Treebagger (ClaReT), a tool for classification and regression based on the random forest (RF) technique. ClaReT is developed in Matlab and has a simple graphical user interface (GUI) that simplifies the model implementation process, allows standardization of the method, and makes the classification and regression process reproducible. The tool performs feature selection automatically on the basis of a quantitative criterion and allows a large number of explanatory variables to be tested. First, it ranks and displays the parameter importance; then, it selects the optimal configuration of explanatory variables; finally, it performs the classification or regression for an entire dataset. It can also evaluate the results in terms of misclassification error or root mean squared error. We tested the applicability of ClaReT in two case studies. In the first, we used ClaReT in classification mode to identify the best subset of landslide conditioning variables (LCVs) and to obtain a landslide susceptibility map (LSM) of the Arno river basin (Italy). In the second, we used ClaReT in regression mode to produce a soil thickness map of the Terzona catchment, a small sub-basin of the Arno river basin. In both cases, we validated the results and compared them with other state-of-the-art techniques. We found that ClaReT produced better results with a more straightforward application, and that it can serve as a valuable tool for assessing the importance of the variables involved in the modeling.
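ClaReT itself is a Matlab GUI, but its core workflow (rank variable importance with a random forest, then classify) can be approximated with scikit-learn. The synthetic dataset and all parameter choices below are assumptions for illustration, not the ClaReT implementation:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy stand-in for a landslide dataset: 8 candidate conditioning
# variables, of which only 3 are informative.
X, y = make_classification(n_samples=500, n_features=8,
                           n_informative=3, random_state=0)

# Random forest with out-of-bag scoring, analogous to ClaReT's
# internal error evaluation.
rf = RandomForestClassifier(n_estimators=200, oob_score=True,
                            random_state=0)
rf.fit(X, y)

# Rank variables by impurity-based importance (most important first).
ranking = np.argsort(rf.feature_importances_)[::-1]
print("variable ranking:", ranking)
print("out-of-bag accuracy: %.2f" % rf.oob_score_)
```

Feature selection would then proceed by refitting on the top-ranked variables and keeping the subset that minimizes the out-of-bag error.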
4.
This paper proposes a statistical model for insurance claims arising from climatic events, such as tornadoes in the USA, that exhibit large variability in both frequency and intensity. To represent this variability and seasonality, the claims process is modelled by a Poisson process whose intensity is the product of a periodic function and a multifractal process. The size of claims is modelled in a similar way, using gamma random variables. This method is shown to enable simulation of the peak times of damage. A two-dimensional multifractal model is also investigated. The work concludes with an analysis of the impact of the model on the yield of weather bonds linked to damage caused by tornadoes.
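A minimal simulation in the spirit of this model might look as follows. The multifractal modulation is replaced here by i.i.d. lognormal noise with unit mean purely for illustration (a loud assumption: a true multifractal cascade has long-range dependence that this stand-in lacks), and all rates and gamma parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Daily claim counts: Poisson with a periodic (seasonal) intensity,
# modulated by a unit-mean random factor M standing in for the
# multifractal process.
days = np.arange(365)
base = 2.0
periodic = 1 + 0.8 * np.sin(2 * np.pi * days / 365)  # "tornado season" peak
M = rng.lognormal(mean=-0.125, sigma=0.5, size=365)  # E[M] = 1
counts = rng.poisson(base * periodic * M)

# Claim sizes: gamma-distributed, summed per day to give daily damage.
daily_damage = np.array([rng.gamma(shape=2.0, scale=10.0, size=c).sum()
                         for c in counts])
print("total claims:", counts.sum(),
      "peak damage on day:", int(daily_damage.argmax()))
```

Repeating such simulations many times is how one would estimate the distribution of peak damage times, and hence price a weather bond written on tornado losses.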
5.
S. D. Turner N. L. Rose B. Goldsmith J. M. Bearcock C. Scheib H. Yang 《Environmental monitoring and assessment》2017,189(5):241
Members of the public in England were invited in 2010 to take part in a national metals survey by collecting samples of littoral sediment from a standing water body for geochemical analysis. To our knowledge, this is the first national sediment metals survey using public participation, and it reveals a snapshot of the extent of metals contamination in ponds and lakes across England. Hg, Ni, Cu, Zn and Pb concentrations exceeding sediment quality guidelines for the health of aquatic biota are ubiquitous in ponds and lakes, not just in areas with a legacy of industrial activity. To validate the public sampling approach, a calibration exercise was conducted at ten water bodies selected to represent the range of lakes found across England. Sediment concentrations of Hg, Ni, Cu, Zn and Pb were measured in samples of soil, stream sediment, and littoral and deep-water sediment to assess inputs. Significant differences between littoral sediment metal concentrations occur due to local variability, but also to organic content, especially in upland, peat-soil catchments. Variability of metal concentrations between littoral samples is shown to be low in small (<20 ha) lowland lakes; larger and upland lakes, with more complex inputs and greater variation in the organic content of littoral samples, show higher variability. Collection of littoral sediments in small lakes and ponds, with or without voluntary participation, can provide a reliable sampling technique for the preliminary assessment of metal contamination in standing waters. However, the heterogeneity of geology, soils and the history/extent of metal contamination in the English landscape, combined with the random nature of sample collection, shows that systematic sampling is still required to evaluate the full extent of metal contamination in lakes.
6.
Fabian Kesicki 《Environmental Modeling and Assessment》2013,18(1):27-37
Marginal abatement cost (MAC) curves are a useful policy tool to communicate findings on the technological structure and the economics of CO2 emissions reduction. However, existing ways of generating MAC curves do not display consistent technological detail and do not consider system-wide interactions and uncertainty in a structured manner. This paper details a new approach to overcome the present shortcomings by using an energy system model, UK MARKAL, in combination with index decomposition analysis. In addition, this approach allows different forms of uncertainty analysis to be used in order to test the robustness of the MAC curve. For illustration purposes, a sensitivity analysis concerning fossil fuel prices is applied to the transport sector of the UK. The resulting MAC curves are found to be relatively robust to different fuel costs at higher CO2 tax levels. The new systems-based approach improves MAC curves through the avoidance of an inconsistent emissions baseline, the incorporation of system-wide interactions and the price responsiveness of demand.
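For contrast with the systems-based approach the paper advocates, the conventional bottom-up MAC curve it criticizes is simply a list of measures sorted by marginal cost with cumulative abatement, ignoring system-wide interactions. The measures and figures below are invented for illustration:

```python
# Hypothetical transport-sector abatement measures:
# (name, abatement potential in MtCO2, marginal cost in GBP/tCO2).
measures = [
    ("mode shift",        3.0,  -20.0),  # negative cost = net saving
    ("efficient engines", 5.0,   15.0),
    ("biofuels",          2.0,   60.0),
    ("electric vehicles", 4.0,  120.0),
]

# A conventional MAC curve orders measures by marginal cost and
# accumulates their abatement potential.
curve, cumulative = [], 0.0
for name, abatement, cost in sorted(measures, key=lambda m: m[2]):
    cumulative += abatement
    curve.append((name, cumulative, cost))

for name, cum, cost in curve:
    print(f"{name:17s} cumulative {cum:4.1f} MtCO2 at {cost:6.1f} GBP/tCO2")
```

In the paper's approach, each point on the curve would instead come from rerunning the UK MARKAL system model at a different CO2 tax level, so that baseline consistency and demand responses are built in.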
7.
Locating and forecasting water needs can assist the allocation of water in dry regions and improve the management of reservoirs and the canal network. Satellite, ground and agrometeorological data were combined to forecast the volume of irrigation water needed during 1993 and 1994 in an irrigation district of 327 km2 located in the Ebro basin, Spain. The main crops were rice, alfalfa plus forage, winter cereals (barley and wheat), sunflower and maize. Their extent was estimated each year by frame area sampling and a regression estimator with satellite data. Initial crop area statistics were obtained by expansion of the sample areas to the entire study area, and a regression estimator with the multitemporal supervised classification of two Landsat-5 TM images was then applied; this procedure improved the precision of the estimates by expansion. Net water requirements (m3 ha-1) of the above-mentioned crops were computed from reference evapotranspiration estimates, crop coefficients and effective precipitation. These computations were performed for an average year, i.e. using long-term averaged meteorological data. Crop hectarage and net crop water requirements were multiplied to obtain, for the entire study area, the volume (hm3 = 106 m3) of the net crop water requirements. After subtraction of water taken directly from the rivers and of non-productive sunflower, the irrigation water volumes were estimated. The comparison of these forecasts with the volumes of water invoiced by the Ebro Basin Water Authority confirmed the feasibility of forecasting the volume of water applied to an individual irrigation district. This is an objective and practical method for estimating the irrigation water volume applied in an irrigated area.
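The arithmetic chain described here (crop coefficient times reference evapotranspiration, minus effective precipitation, scaled by crop area) is easy to make concrete. The crop areas, Kc values and meteorological figures below are illustrative assumptions, not the Ebro-basin values:

```python
# Net irrigation requirement per crop: NIR = Kc * ET0 - effective rainfall
# (all in mm over the season), converted to m3/ha (1 mm depth = 10 m3/ha)
# and multiplied by crop area to give a district volume in hm3 (10^6 m3).
crops = {
    #            area_ha   Kc    ET0_mm  Peff_mm
    "rice":     (8000,    1.20,  900,    150),
    "alfalfa":  (6000,    0.95,  900,    150),
    "maize":    (5000,    1.05,  700,    120),
}

total_hm3 = 0.0
for name, (area_ha, kc, et0, peff) in crops.items():
    nir_mm = kc * et0 - peff
    nir_m3_ha = nir_mm * 10          # 1 mm depth over 1 ha = 10 m3
    volume_hm3 = nir_m3_ha * area_ha / 1e6
    total_hm3 += volume_hm3
    print(f"{name:8s} {nir_mm:6.0f} mm -> {volume_hm3:6.2f} hm3")
print(f"district total: {total_hm3:.2f} hm3")
```

The study's forecast is this total after subtracting river abstractions and non-productive sunflower, which is then compared against invoiced volumes.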
8.
Emmanuelle Cam John R. Sauer James D. Nichols James E. Hines Curtis H. Flather 《Environmental monitoring and assessment》2000,63(1):81-94
Species richness of local communities is a state variable commonly used in community ecology and conservation biology. Investigation of spatial and temporal variation in richness, and identification of the factors associated with this variation, forms a basis for specifying management plans, evaluating those plans, and testing hypotheses of theoretical interest. However, estimation of species richness is not trivial: species can be missed by investigators during sampling sessions, and such sampling artifacts can lead to erroneous conclusions about spatial and temporal variation in species richness. Here we use data from the North American Breeding Bird Survey to estimate parameters describing the state of bird communities in the Mid-Atlantic Integrated Assessment (MAIA) region: species richness, extinction probability, turnover and relative species richness. We use a recently developed approach to the estimation of species richness and related parameters that does not require the assumption that all species are detected during sampling efforts. The information presented here is intended to visualize the state of bird communities in the MAIA region. We provide information for 1975 and 1990, quantify the changes between these years, and summarize and map the community attributes at a scale of management interest (watershed units).
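The estimators used in this study account for species that were present but never detected. A simpler nonparametric estimator in the same spirit is the first-order jackknife, sketched here with toy detection histories as an illustrative stand-in, not the authors' method:

```python
import numpy as np

def jackknife1_richness(detections):
    """First-order jackknife estimate of species richness.

    `detections` is a (species x sampling occasions) 0/1 matrix of
    detection histories. Species missed on every occasion do not appear
    at all, which is why observed richness underestimates true richness.
    The correction adds a term driven by species seen exactly once.
    """
    detections = np.asarray(detections)
    k = detections.shape[1]                        # sampling occasions
    s_obs = detections.shape[0]                    # species observed
    f1 = int((detections.sum(axis=1) == 1).sum())  # seen exactly once
    return s_obs + f1 * (k - 1) / k

# Toy detection histories for 5 observed species over 4 occasions.
hist = [[1, 1, 0, 1],
        [0, 1, 0, 0],   # seen once
        [1, 0, 0, 0],   # seen once
        [1, 1, 1, 1],
        [0, 0, 1, 1]]
print(jackknife1_richness(hist))   # 5 + 2*(3/4) = 6.5
```

Many rarely detected species inflate the correction, signalling that the raw count is far below true richness.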
9.
In this study, we present the digital evaluation of Landsat TM data and field spectral measurements for retrieving chlorophyll-a (chl-a) concentration and a trophic state index in Lake Chagan, Northeast China. Chl-a concentration of the lake can be estimated from the band ratio (TM4/TM3) and from the field spectral data at 670 nm (absorption peak) and 700 nm (reflectance peak). The results show that the best determination coefficient (R²) is 0.67, obtained from the TM data, from which the chl-a distribution can be mapped. Based on chl-a determination from laboratory analysis, field spectral and TM data, the modified trophic state index (TSIM) was applied to assess the lake’s trophic state. With the available data for Lake Chagan, each algorithm gives similar results for assessing the lake’s chl-a and trophic state. Our results indicate that Landsat TM and field spectral data can be used effectively to determine chl-a concentration and evaluate the trophic state of Lake Chagan. 相似文献
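A band-ratio retrieval of the kind described reduces to a simple regression of measured chl-a on the TM4/TM3 ratio. The paired values below are synthetic, not the Lake Chagan observations, and the linear form is an assumed model:

```python
import numpy as np

# Illustrative pairs of (TM4/TM3 band ratio, measured chl-a in ug/L).
ratio = np.array([0.55, 0.62, 0.70, 0.78, 0.85, 0.93, 1.01, 1.10])
chla  = np.array([ 8.0, 11.5, 13.0, 17.2, 19.0, 24.5, 26.0, 31.0])

# Ordinary least squares: chl-a = a * (TM4/TM3) + b.
a, b = np.polyfit(ratio, chla, 1)
pred = a * ratio + b
r2 = 1 - ((chla - pred) ** 2).sum() / ((chla - chla.mean()) ** 2).sum()
print(f"chl-a = {a:.1f} * ratio + {b:.1f},  R^2 = {r2:.2f}")
```

Once calibrated against in situ samples, such a relation is applied pixel-by-pixel to the TM imagery to map the chl-a distribution across the lake.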
10.
Alexander Baklanov 《Environmental monitoring and assessment》2000,65(1-2):181-189
Many urban air pollution problems involve complex structures of air flow and turbulence, for which Computational Fluid Dynamics (CFD) methods are becoming widely used. Despite a number of advantages, however, this approach has some problems. This paper discusses experience in using CFD tools for model development, and suggestions for their application to local-scale air pollution over complex terrain under stable stratification, including: topography and complex geometry (choice of the coordinate system and computational grid); turbulence closure for air pollution modelling (a modified k-ε model for the stably stratified ABL); boundary conditions for vertical velocity profiles in a stably stratified atmosphere; effects of the radiation and thermal budget of inclined surfaces on the dispersion of pollutants; and artificial sources of air dynamics and circulation. Examples of CFD applications to air pollution modelling for flat terrain, a mountainous area, an open-cast mine and indoor ventilation are discussed. A modified k-ε model for the stably stratified ABL is suggested; owing to the isotropic character of the k-ε model, combining it in the vertical with a sub-grid turbulence closure in the horizontal can be more suitable for the ABL. An effective scheme of boundary conditions for velocity profiles, based on the developed similarity theory for the stably stratified ABL, is suggested. Alongside common studies of atmospheric dispersion, the CFD methods have also demonstrated good potential for studying anthropogenic and artificial-ventilation sources of air dynamics and circulation in local-scale air pollution processes.
11.
Mathew R. Heal Trygve Tunes Iain J. Beverland 《Environmental monitoring and assessment》2000,62(3):333-340
By extending the method of Stedman (1998), daily data of atmospheric concentrations of gravimetric PM10, black smoke (BS) and sulphate aerosol (SA) from national networks were analysed to determine trends over time in the contribution of different sources of particulate matter to total PM10 measured in central Edinburgh. Since BS is an indicator of combustion-related primary sources of particulate matter, the quantity obtained by subtracting daily BS from daily PM10 is indicative of the contribution to total PM10 from other primary sources and from secondary aerosol. This PM10−BS statistic was regressed on SA, since SA is an indicator of variation in the secondary aerosol source. For Edinburgh, SA is a considerably better indicator of PM10−BS during summer than winter (reflecting the much greater photochemical generation of secondary aerosol in summer), and there is evidence that the contribution of other secondary aerosol (presumably nitrate aerosol) increased relative to SA between 1992 and 1997. The contribution of non-combustion primary particulate material (marine aerosol, suspended dust) to PM10 in Edinburgh has not changed over this period but is about twice the calculated U.K. national average. The increasing input to PM10 from secondary aerosol sources at the regional rather than the urban scale has important implications for ensuring local air-quality compliance. The method should be generally applicable to other locations.
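The apportionment step described here is a single linear regression: the slope of (PM10 − BS) on SA tracks the secondary-aerosol scaling, and the intercept tracks the non-combustion primary contribution. The daily series below are synthetic, generated under assumed source strengths, not the Edinburgh network data:

```python
import numpy as np

# Synthetic daily concentrations (ug/m3): BS proxies primary combustion
# particles, SA proxies secondary aerosol. All parameters are invented.
rng = np.random.default_rng(2)
n = 200
bs = rng.gamma(4.0, 2.0, n)        # black smoke
sa = rng.gamma(3.0, 1.5, n)        # sulphate aerosol
other = 6.0                        # non-combustion primary (constant)
pm10 = bs + 1.8 * sa + other + rng.normal(0, 1.0, n)

# Regress (PM10 - BS) on SA: slope ~ secondary-aerosol scaling factor,
# intercept ~ non-combustion primary contribution.
slope, intercept = np.polyfit(sa, pm10 - bs, 1)
print(f"slope = {slope:.2f}, intercept = {intercept:.2f}")
```

Fitting the same regression separately for summer and winter, and for different years, is how the seasonal contrast and the 1992–1997 trend in the abstract would be quantified.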
12.
Eco-Efficiency of Electric and Electronic Appliances: A Data Envelopment Analysis (DEA)
Y. Barba-Gutiérrez B. Adenso-Díaz S. Lozano 《Environmental Modeling and Assessment》2009,14(4):439-447
Several papers have studied the eco-efficiency of manufacturing systems to address strategic socioeconomic issues in the context of sustainability analysis. Their goal has been to take into account not only environmental impact aspects throughout the whole life cycle but also the associated economic value, thus giving a comprehensive vision of both factors. This paper focuses on different commonplace household electric appliances, comparing their eco-efficiency computed using a data envelopment analysis model. We consider the retail price as a measure of the product’s economic value and the ecopoint LCA score as the assessment of its environmental impact. We conclude that cell phones and the bulky appliances analysed have the highest eco-efficiency scores, whereas the rest would require a more environmentally friendly redesign and/or an increase in their perceived value to improve their eco-efficiency.
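The paper solves a full DEA model, but in the special case of one input (ecopoint LCA score) and one output (retail price), CCR efficiency collapses to each unit's value/impact ratio scaled by the best ratio observed. That simplified special case is sketched below; the appliance figures are invented, not the paper's data:

```python
# One-input, one-output DEA: efficiency of each decision-making unit is
# its output/input ratio divided by the maximum such ratio in the set.
appliances = {
    "cell phone":      {"price": 300.0, "ecopoints": 5.0},
    "refrigerator":    {"price": 900.0, "ecopoints": 40.0},
    "vacuum cleaner":  {"price": 150.0, "ecopoints": 12.0},
    "washing machine": {"price": 600.0, "ecopoints": 35.0},
}

ratios = {k: v["price"] / v["ecopoints"] for k, v in appliances.items()}
best = max(ratios.values())
efficiency = {k: r / best for k, r in ratios.items()}

for name, e in sorted(efficiency.items(), key=lambda kv: -kv[1]):
    print(f"{name:16s} eco-efficiency = {e:.2f}")
```

Units with efficiency 1.0 lie on the efficient frontier; the rest would need a lower environmental impact or a higher perceived value to reach it, which mirrors the paper's conclusion.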
13.
Environmental Modeling & Assessment - Soil salinity and alkalinity seriously threaten crop production, soil productivity, and sustainable agriculture, especially in arid and semi-arid areas,...
14.
NALCO, the largest exporter of aluminium in India, has a 720 MW power plant in the Nandira watershed in the Angul district of Orissa. The power plant uses local coal to generate thermal power and disposes of a large amount of ash, which accumulates in slurry form in two nearby ash ponds. These ash ponds were breached on 31 December 2000, causing ash accumulation along the entire course of the Nandira river. An attempt has been made to prepare a recovery and rehabilitation plan for NALCO using temporal remote sensing data and GIS. Indian remote sensing satellite data for the pre-breach condition (12 December 2000), the breach event (31 December 2000) and the post-breach condition (4 and 6 January 2001) were digitally analysed for the Nandira watershed. The satellite data of coarse spatial resolution reveal, at regional scales, the absence and presence of fresh sediment deposition along the Nandira watershed and the Brahmani river for the pre-breach and post-breach conditions, respectively. The temporal comparison of fine-resolution data clearly highlights, at local scales, the areal extent of damage caused by the disaster across the entire watershed. The GIS helped in demarcating freshly accumulated ash at intervals of 500 m along the river length, as well as in delineating the maximum ash accumulation across the river width. The study clearly demonstrates the use of temporal remote sensing data in conjunction with GIS for disaster management, in terms of preparing a recovery and rehabilitation plan for the Nandira watershed.
15.
Nabil Semmar Maurice Jay Muhammad Farman Maurice Roux 《Environmental Modeling and Assessment》2008,13(1):17-33
The quantitative assessment of plant diversity and its monitoring over time represent a key environmental issue for the management and conservation of natural resources. Assessment of plant diversity can be based on chemical analyses of secondary metabolites (e.g. flavonoids, terpenoids), because of the substantial quantitative and qualitative between-individual variability in such compounds. At a geographical scale, plant populations become widely dispersed, and monitoring them through numerous routine individual analyses can become restrictive. To overcome this constraint, this study develops a multivariate calibration model giving the relative frequency of a particular taxon from a single high-performance liquid chromatography (HPLC) analysis of a plant mixture. The model was built from a complete set of mixtures combining different taxons according to an experimental design (Scheffé’s matrix). For each mixture, a reference HPLC pattern was simulated by averaging the individual HPLC profiles of the constitutive taxons. The calibration models, based on Bayesian discriminant analysis (BDA), gave statistical relationships between the contributions of each taxon in the mixtures and the reference HPLC patterns of those mixtures. Finally, these models were validated on new mixtures using outside plants. This new biodiversity survey approach is illustrated on four chemical taxons (four chemotypes) of Astragalus caprinus (Fabaceae). The more differentiated the taxon, the better its contributions (in mixtures) were predicted by the BDA calibration model. This new approach could be very useful for a global routine survey of plant diversity. 相似文献
16.
17.
18.
Anthony R. Olsen Blaine D. Snyder Leanne L. Stahl Jennifer L. Pitt 《Environmental monitoring and assessment》2009,150(1-4):91-100
The National Lake Fish Tissue Study (NLFTS) was the first survey of fish contamination in lakes and reservoirs in the 48 conterminous states based on a probability survey design. This study included the largest set (268) of persistent, bioaccumulative, and toxic (PBT) chemicals ever studied in predator and bottom-dwelling fish species. The U.S. Environmental Protection Agency (USEPA) implemented the study in cooperation with states, tribal nations, and other federal agencies, with field collection occurring at 500 lakes and reservoirs over a four-year period (2000–2003). The sampled lakes and reservoirs were selected using a spatially balanced unequal probability survey design from 270,761 lake objects in USEPA’s River Reach File Version 3 (RF3). The survey design selected 900 lake objects, with a reserve sample of 900, equally distributed across six lake area categories. A total of 1,001 lake objects were evaluated to identify 500 lake objects that met the study’s definition of a lake and could be accessed for sampling. Based on the 1,001 evaluated lakes, it was estimated that a target population of 147,343 (±7% with 95% confidence) lakes and reservoirs met the NLFTS definition of a lake. Of the estimated 147,343 target lakes, 47% were estimated not to be sampleable either due to landowner access denial (35%) or due to physical barriers (12%). It was estimated that a sampled population of 78,664 (±12% with 95% confidence) lakes met the NLFTS lake definition, had either predator or bottom-dwelling fish present, and could be sampled.
19.
This paper constructs a system dynamics model for simulating the impact of different strategies on urban traffic’s energy consumption and carbon emissions. Based on a case study in Beijing, the model includes three subsystems: (1) urban traffic, (2) population and economy, and (3) energy consumption and carbon emissions. First, the model is used to decompose the impact of different vehicle types on energy consumption and carbon emissions. Decomposition results show that private cars have the most significant impact on urban traffic’s energy consumption and carbon emissions, even though total vehicle kilometers traveled by private cars are the smallest among the four trip modes. The model is then used to simulate different urban traffic policies, categorized as follows: (a) driving restrictions based on vehicle registration numbers, (b) a scheme for allocating vehicle registrations via a lottery system, and (c) development of public transportation infrastructure. Scenario simulation results show that all these measures can reduce energy consumption and carbon emissions; although the last strategy (c) involves several delays, its effect is more stable and far-reaching. Finally, some recommendations for easing traffic pressure and reducing traffic emissions are given.
20.
Bechir Raggad 《Environmental Modeling and Assessment》2018,23(1):99-116
Climate change is one of the most fiercely debated scientific issues of recent decades, and changes in climate extremes are estimated to have greater negative impacts on human society and the natural environment than changes in the mean climate. Extreme value theory is a well-known tool for estimating the probability of adverse risk events. In this paper, the focus is on the statistical behaviour of extreme maximum temperatures. Within this framework, the methods of block maxima and threshold exceedances are employed. Owing to the non-stationary character of the temperature series, the generalized extreme value distribution and the generalized Pareto distribution were extended to non-stationary processes by including covariates in the parameters of the models. To obtain approximately independent threshold excesses, a declustering method was applied, and the declustered peaks were then fitted to the generalized Pareto distribution. The stationary Gumbel distribution was found to be a reasonable model for the annual block maxima; however, a non-stationary generalized extreme value distribution with a quadratic trend in the location parameter is recommended for the half-yearly period. The findings also show an improvement in modelling daily maximum temperature when the model is applied to the declustered series, and this model outperforms the non-stationary generalized Pareto distribution models. Furthermore, the retained generalized Pareto distribution model proved better than the generalized extreme value distribution. Estimates of the return levels obtained from both extreme value models show that new record maximum temperatures could appear within the next 20, 50 and 100 years.
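The block-maxima side of this analysis can be sketched with a stationary GEV fit and return-level quantiles. The synthetic "annual maximum temperature" series below is an assumption standing in for the station data; the paper's preferred models additionally include trend covariates and declustering, which this sketch omits:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)

# Synthetic annual-maximum temperatures, Gumbel-like around 38 degrees C.
annual_max = 38 + rng.gumbel(loc=0.0, scale=1.5, size=60)

# Fit a stationary GEV by maximum likelihood; shape c near 0 recovers
# the Gumbel special case the paper retains for annual block maxima.
c, loc, scale = genextreme.fit(annual_max)

# The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
for T in (20, 50, 100):
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:3d}-year return level: {level:.1f} degC")
```

Return levels that exceed every value in the observed record are how "new records could appear within the next 20, 50 and 100 years" is read off the fitted model.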