Similar Documents
20 similar documents found (search time: 62 ms)
1.
Current coastal spatial planning in Sweden uses simple methods to account for how flood risks increase owing to sea level rise. Those methods, however, fail to account for several important aspects of sea level rise, such as projection uncertainty, emission scenario uncertainty, and time dependence. Here, enhanced methods that account for these uncertainties are applied at several locations along the coast. The relative importance of mean sea level rise and extreme events for flood risk is explored for different timeframes. A general conclusion for all locations is that extreme events dominate the flood risk for planning periods lasting a few decades. For longer planning periods, lasting toward the end of the century, the flood risk is instead dominated by the risk of high sea level rise. It is argued that these findings are important for assessments of future flood risk, and that they should be reflected in coastal spatial planning.

2.
Simulating uncertainty in climate-pest models with fuzzy numbers
Inputs in climate-pest models are commonly expressed as point estimates ('crisp' numbers), which implies perfect knowledge of the system in study. In reality, however, all model inputs harbor some level of uncertainty. This is particularly true for climate change impact assessments where the inputs (i.e., climate projections) are highly uncertain. In this study, uncertainties in climate projections were expressed as 'fuzzy' numbers; these are uncertain numbers for which one knows that there is a range of possible values and that some values are 'more possible' than others. A generic pest risk model incorporating the combined effects of temperature, soil moisture, and cold stress was implemented in a fuzzy spreadsheet environment and run with three climate scenarios: (1) present climate (control run); (2) crisp climate change; and (3) fuzzy climate change. Under the crisp climate change scenario, winter and summer temperatures and precipitation were altered using best estimates (averaged predictions from the 1995 assessment report of the Intergovernmental Panel on Climate Change [IPCC]). Under the fuzzy scenario, climate changes were expressed as triangular fuzzy numbers, utilizing the extremes (lowest and highest predictions from the IPCC report) in addition to the best estimates. Under each scenario, environmental favorability was calculated for six locations in two geographical regions (Central North America and Southern Europe) with two hypothetical pest species having temperate or mediterranean climate requirements. Simulations with the crisp climate change scenario suggested only minor changes in overall environmental favorability compared with the control run. When simulations were conducted with the fuzzy climate change scenario, however, important changes in environmental favorability emerged, particularly in Southern Europe. In that region, the possibility of considerably increased winter precipitation led to increased values of environmental favorability. 
However, the simulations also showed that this result harbored a very broad range of possible outcomes. The results support the notion that uncertainty in climate change projections must be reduced before reliable impact assessments can be achieved.
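The triangular fuzzy numbers described above can be propagated through a model with alpha-cut interval arithmetic. The sketch below is illustrative only: the triangle endpoints and the toy favorability function are invented stand-ins, not the study's actual pest model or IPCC values.

```python
from itertools import product

def tri_alpha_cut(low, best, high, alpha):
    """Interval of values with membership >= alpha for a triangular fuzzy number."""
    return (low + alpha * (best - low), high - alpha * (high - best))

def propagate(f, fuzzy_inputs, n_levels=4):
    """Propagate fuzzy inputs through f by evaluating f at the corner
    combinations of each alpha-cut interval (vertex method; exact only
    for monotone f)."""
    out = []
    for i in range(n_levels + 1):
        alpha = i / n_levels
        cuts = [tri_alpha_cut(*t, alpha) for t in fuzzy_inputs]
        vals = [f(*combo) for combo in product(*cuts)]
        out.append((alpha, min(vals), max(vals)))
    return out

# Toy favorability: warmer winters (dT, deg C) and wetter summers (dP, mm)
# raise favorability; coefficients are hypothetical.
fav = lambda dT, dP: 0.5 + 0.05 * dT + 0.002 * dP
result = propagate(fav, [(1.0, 2.5, 4.5), (-40, 10, 80)])
# At alpha = 1 the interval collapses to the crisp best-estimate run.
```

At alpha = 0 the output interval spans the full range implied by the extreme projections, which is how a "very broad range of possible outcomes" emerges from fuzzy inputs.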

3.
Most simple models for the meso-scale transport of gaseous pollutants are unable to take into account the change of direction of the wind vector with height above the surface, and this can lead to errors in predicting ground level concentrations. Expressions are derived for the ground level trajectory of a diffusing cloud taking the height of emission and the depth of the mixing layer into account. These are used to define an effective wind direction dependent upon travel time which can be used to improve dispersion estimates.
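A minimal sketch of the idea above: an "effective" wind direction obtained as the vector average of the wind over the layers a diffusing cloud occupies. The equal layer weights and the example profile are assumptions for illustration; the paper derives travel-time-dependent expressions that also account for emission height and mixing depth.

```python
import math

def effective_direction(speeds, directions_deg):
    """Speed-weighted vector-average wind direction in degrees
    (measured clockwise from north)."""
    u = sum(s * math.sin(math.radians(d)) for s, d in zip(speeds, directions_deg))
    v = sum(s * math.cos(math.radians(d)) for s, d in zip(speeds, directions_deg))
    return math.degrees(math.atan2(u, v)) % 360

# Wind veering from 200 degrees near the surface to 240 degrees at the
# mixing-layer top, strengthening with height (hypothetical profile):
eff = effective_direction([3, 5, 7, 9], [200, 210, 225, 240])
```

Using a single surface direction here would misplace the ground-level trajectory by some 20 degrees, which is the kind of error the derived expressions are meant to remove.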

4.
Combustion of coal, oil, and natural gas, and to a lesser extent deforestation, land-cover change, and emissions of halocarbons and other greenhouse gases, are rapidly increasing the atmospheric concentrations of climate-warming gases. The warming of approximately 0.1-0.2 degrees C per decade that has resulted is very likely the primary cause of the increasing loss of snow cover and Arctic sea ice, of more frequent occurrence of very heavy precipitation, of rising sea level, and of shifts in the natural ranges of plants and animals. The global average temperature is already approximately 0.8 degrees C above its preindustrial level, and present atmospheric levels of greenhouse gases will contribute to further warming of 0.5-1 degrees C as equilibrium is re-established. Warming has been and will be greater in mid and high latitudes compared with low latitudes, over land compared with oceans, and at night compared with day. As emissions continue to increase, both warming and the commitment to future warming are presently increasing at a rate of approximately 0.2 degrees C per decade, with projections that the rate of warming will further increase if emission controls are not put in place. Such warming and the associated changes are likely to result in severe impacts on key societal and environmental support systems. Present estimates are that limiting the increase in global average surface temperature to no more than 2-2.5 degrees C above its 1750 value of approximately 15 degrees C will be required to avoid the most catastrophic, but certainly not all, consequences of climate change. Accomplishing this will require reducing emissions sharply by 2050 and to near zero by 2100. 
This can only be achieved if: (1) developed nations move rapidly to demonstrate that a modern society can function without reliance on technologies that release carbon dioxide (CO2) and other non-CO2 greenhouse gases to the atmosphere; and (2) developing nations act in the near-term to sharply limit their non-CO2 emissions while minimizing growth in CO2 emissions, and then in the long-term join with the developed nations to reduce all emissions as cost-effective technologies are developed.

5.
Klijn F  de Bruijn KM  Knoop J  Kwadijk J 《Ambio》2012,41(2):180-192
Climate change and sea level rise urge low-lying countries to draft adaptation policies. In this context, we assessed whether, to what extent and when the Netherlands’ current flood risk management policy may require a revision. By applying scenarios on climate change and socio-economic development and performing flood simulations, we established the past and future changes in flood probabilities, exposure and consequences until about 2050. We also questioned whether the present policy may be extended much longer, applying the concept of ‘policy tipping points’. Climate change was found to cause a significant increase of flood risk, but less than economic development does. We also established that the current flood risk management policy in the Netherlands can be continued for centuries when the sea level rise rate does not exceed 1.5 m per century. However, we also conclude that the present policy may not be the most attractive strategy, as it has some obvious flaws.

6.
A receptor model for predicting future PM10 concentrations has been developed within the framework of the UK Airborne Particles Expert Group and applied during the recently completed review of the UK National Air Quality Strategy. The model uses a combination of measured PM10, oxides of nitrogen and particulate sulphate concentrations to provide daily estimates of the contributions to total particle concentrations from primary combustion, secondary and other (generally coarse) particle sources. Projections of past and future concentrations of PM10 are estimated by applying appropriate reductions to the current concentrations of the three components based on an understanding of the likely impact of current policies on future levels. Projections have been derived from 1996, 1997 and 1998 monitoring data and compared with UK national air quality objectives and European Union limit values. One of the key uncertainties within the receptor modelling method is the assignment of the residual PM10, remaining after the assignment of primary combustion and secondary particle contributions, to the ‘other’ particle fraction. An examination of the difference between measured PM10 and PM2.5 concentrations confirms our assignment of the bulk of this residual to coarse particles. Projections based on 1996 monitoring data are the highest and those based on 1998 monitoring data are the lowest. Whilst there is considerable difference between these projections they are consistent with measured concentrations for previous years. All three projections suggest that with current agreed policies the EU annual mean limit value will be achieved. The 24-h mean limit value is projected to be achievable when projections are derived from 1997 and 1998 data, but not from 1996 data. All three projections suggest that with current agreed policies the central London site will not achieve the provisional 1997 UK National Air Quality Strategy objective.
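An illustrative sketch of the receptor-model decomposition described above: daily PM10 is split into a primary combustion component traced by NOx, a secondary component traced by particulate sulphate, and a residual "other" (mostly coarse) fraction; projections then scale each component by a policy-driven factor. The tracer coefficients A and B and the reduction factors below are hypothetical, not the values fitted in the study.

```python
def project_pm10(pm10, nox, sulfate, A=0.1, B=2.0,
                 f_primary=0.7, f_secondary=0.8, f_other=1.0):
    """Decompose measured PM10 into three source components via tracer
    species, then apply per-component reduction factors (all assumed)."""
    primary = A * nox                                 # primary combustion
    secondary = B * sulfate                           # secondary particles
    other = max(pm10 - primary - secondary, 0.0)      # residual, mostly coarse
    return f_primary * primary + f_secondary * secondary + f_other * other

# A 30 ug/m3 day splitting 10/10/10 across the components projects to 25 ug/m3:
projected = project_pm10(30.0, 100.0, 5.0)
```

Note how the coarse residual is left unreduced (f_other = 1.0): the abstract's point about the residual assignment matters precisely because that fraction does not respond to combustion-focused policies.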

7.
Abstract

Although emission inventories are the foundation of air quality management and have supported substantial improvements in North American air quality, they have a number of shortcomings that can potentially lead to ineffective air quality management strategies. Major reductions in the largest emissions sources have made accurate inventories of previously minor sources much more important to the understanding and improvement of local air quality. Changes in manufacturing processes, industry types, vehicle technologies, and metropolitan infrastructure are occurring at an increasingly rapid pace, emphasizing the importance of inventories that reflect current conditions. New technologies for measuring source emissions and ambient pollutant concentrations, both at the point of emissions and from remote platforms, are providing novel approaches to collecting data for inventory developers. Advances in information technologies are allowing data to be shared more quickly, more easily, and processed and compared in novel ways that can speed the development of emission inventories. Approaches to improving quantitative measures of inventory uncertainty allow air quality management decisions to take into account the uncertainties associated with emissions estimates, providing more accurate projections of how well alternative strategies may work. This paper discusses applications of these technologies and techniques to improve the accuracy, timeliness, and completeness of emission inventories across North America and outlines a series of eight recommendations aimed at inventory developers and air quality management decision-makers to improve emission inventories and enable them to support effective air quality management decisions for the foreseeable future.

8.
Abstract

Combustion of coal, oil, and natural gas, and to a lesser extent deforestation, land-cover change, and emissions of halocarbons and other greenhouse gases, are rapidly increasing the atmospheric concentrations of climate-warming gases. The warming of approximately 0.1–0.2 °C per decade that has resulted is very likely the primary cause of the increasing loss of snow cover and Arctic sea ice, of more frequent occurrence of very heavy precipitation, of rising sea level, and of shifts in the natural ranges of plants and animals. The global average temperature is already approximately 0.8 °C above its preindustrial level, and present atmospheric levels of greenhouse gases will contribute to further warming of 0.5–1 °C as equilibrium is re-established. Warming has been and will be greater in mid and high latitudes compared with low latitudes, over land compared with oceans, and at night compared with day. As emissions continue to increase, both warming and the commitment to future warming are presently increasing at a rate of approximately 0.2 °C per decade, with projections that the rate of warming will further increase if emission controls are not put in place. Such warming and the associated changes are likely to result in severe impacts on key societal and environmental support systems. Present estimates are that limiting the increase in global average surface temperature to no more than 2–2.5 °C above its 1750 value of approximately 15 °C will be required to avoid the most catastrophic, but certainly not all, consequences of climate change. Accomplishing this will require reducing emissions sharply by 2050 and to near zero by 2100. 
This can only be achieved if: (1) developed nations move rapidly to demonstrate that a modern society can function without reliance on technologies that release carbon dioxide (CO2) and other non-CO2 greenhouse gases to the atmosphere; and (2) developing nations act in the near-term to sharply limit their non-CO2 emissions while minimizing growth in CO2 emissions, and then in the long-term join with the developed nations to reduce all emissions as cost-effective technologies are developed.

9.
Quantitative methods for characterizing variability and uncertainty were applied to case studies of oxides of nitrogen and total organic carbon emission factors for lean-burn natural gas-fueled internal combustion engines. Parametric probability distributions were fit to represent inter-engine variability in specific emission factors. Bootstrap simulation was used to quantify uncertainty in the fitted cumulative distribution function and in the mean emission factor. Some methodological challenges were encountered in analyzing the data. For example, in one instance, five data points were available, with each data point representing a different market share. Therefore, an approach was developed in which parametric distributions were fitted to population-weighted data. The uncertainty in mean emission factors ranges from as little as approximately +/-10% to as much as -90 to +180%. The wide range of uncertainty in some emission factors emphasizes the importance of recognizing and accounting for uncertainty in emissions estimates. The skewness in some uncertainty estimates illustrates the importance of using numerical simulation approaches that do not impose restrictive symmetry assumptions on the confidence interval for the mean. In this paper, the quantitative method, the analysis results, and key findings are presented.
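A minimal bootstrap sketch of the uncertainty analysis described above: resample the emission-factor data with replacement and take percentiles of the resampled means. The five data values are invented; the percentile interval requires no symmetry assumption, which is the abstract's point about skewed uncertainty estimates.

```python
import random
import statistics

def bootstrap_mean_ci(data, n_boot=5000, lo=2.5, hi=97.5, seed=1):
    """Percentile bootstrap confidence interval for the mean."""
    rng = random.Random(seed)
    means = sorted(
        statistics.fmean(rng.choices(data, k=len(data))) for _ in range(n_boot)
    )
    return means[int(n_boot * lo / 100)], means[int(n_boot * hi / 100)]

data = [0.8, 1.1, 1.4, 2.9, 7.5]      # hypothetical emission factors, g/hp-hr
ci_lo, ci_hi = bootstrap_mean_ci(data)
mean = statistics.fmean(data)
```

For right-skewed data like this, the interval sits asymmetrically around the sample mean, exactly the behavior a symmetric normal-theory interval would miss.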

10.
A theory for the rise of a plume in a horizontal wind is proposed in which it is assumed that, for some distance downwind of a high stack, the effects of atmospheric turbulence may be ignored in comparison with the effects of turbulence generated by the plume. The theory, an extension of the local similarity ideas used by Morton, Taylor, and Turner [1], has two empirical parameters which measure the rate at which surrounding fluid is entrained into the plume. Laboratory measurements of buoyant plume motion in laminar unstratified cross flow are used to estimate the empirical parameters. Using this determination of the parameters in the theory, the trajectories of atmospheric plumes may be predicted. To make such a prediction, the observed wind velocity and temperature as functions of altitude, and flow conditions at the stack orifice, are used in numerically integrating the equations. The resulting trajectories are compared with photographs, made by Leavitt et al. [2] of TVA, of plumes from 500 to 600 ft high stacks. Within 10 stack heights downwind of the stack, the root mean square discrepancy between the observed height of the trajectory above ground level and the theoretical value is 14%, which is about the uncertainty in the observed height. The maximum plume rise within the field of observation is within 15% of that predicted by the present theory.
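For a buoyant bent-over plume in a uniform crosswind, entrainment-similarity theories of this kind lead to trajectories of the familiar "2/3 law" form. The sketch below uses Briggs' widely quoted constant 1.6 as a stand-in for the paper's two calibrated entrainment parameters; the stack conditions are hypothetical.

```python
def plume_rise(F, u, x):
    """Plume rise (m) at downwind distance x (m) for buoyancy flux
    F (m^4/s^3) and wind speed u (m/s), using the 2/3 power law that
    entrainment similarity implies for a bent-over buoyant plume."""
    return 1.6 * F ** (1 / 3) * x ** (2 / 3) / u

# A stack releasing F = 100 m^4/s^3 into a 5 m/s wind:
rise_500m = plume_rise(100.0, 5.0, 500.0)
```

The x^(2/3) growth holds only while plume-generated turbulence dominates; farther downwind, where atmospheric turbulence takes over, the similarity assumption of the theory breaks down.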

11.
Abstract

Quantitative methods for characterizing variability and uncertainty were applied to case studies of oxides of nitrogen and total organic carbon emission factors for lean-burn natural gas-fueled internal combustion engines. Parametric probability distributions were fit to represent inter-engine variability in specific emission factors. Bootstrap simulation was used to quantify uncertainty in the fitted cumulative distribution function and in the mean emission factor. Some methodological challenges were encountered in analyzing the data. For example, in one instance, five data points were available, with each data point representing a different market share. Therefore, an approach was developed in which parametric distributions were fitted to population-weighted data. The uncertainty in mean emission factors ranges from as little as approximately ±10% to as much as −90 to +180%. The wide range of uncertainty in some emission factors emphasizes the importance of recognizing and accounting for uncertainty in emissions estimates. The skewness in some uncertainty estimates illustrates the importance of using numerical simulation approaches that do not impose restrictive symmetry assumptions on the confidence interval for the mean. In this paper, the quantitative method, the analysis results, and key findings are presented.

12.
13.
Because of the considerable uncertainties associated with modeling complex ecosystem processes, it is essential that every effort be made to test model performance prior to relying on model projections for assessment of future surface water chemical response to environmental perturbation. Unfortunately, long-term chemical data with which to validate model performance are seldom available. The authors present here an evaluation of historical acidification of lake waters in the northeastern United States, and compare historical changes in a set of lakes to hindcasts from the same watershed model (MAGIC) used to estimate future changes in response to acidic deposition. The historical analyses and comparisons with MAGIC model hindcasts and forecasts of acid-base response demonstrate that the acidic and low-ANC lakes in this region are responsive to strong acid inputs. However, the model estimates suggest lakewater chemistry is more responsive to atmospheric inputs of sulfur than do the estimates based on paleolimnological historical analyses. A 'weight-of-evidence approach' that incorporates all available sources of information regarding acid-base response provides a more reasonable estimate of future change than an approach based on model projections alone. The results of these analyses have important implications for predicting future surface water chemical change in response to acidic deposition, establishing critical loads of atmospheric pollutants, and other environmental assessment activities where natural variation often exceeds the trends under investigation (high noise-to-signal ratio). Under these conditions, it is particularly important to evaluate future model projections in light of historical trends data.

14.
Uncertainty in the distribution of hydraulic parameters leads to uncertainty in flow and reactive transport. Traditional stochastic analysis of solute transport in heterogeneous media has focused on the ensemble mean of conservative-tracer concentration. Studies in the past years have shown that the mean concentration often is associated with a high variance. Because the range of possible concentration values is bounded, a high variance implies high probability weights on the extreme values. In certain cases of mixing-controlled reactive transport, concentrations of conservative tracers, denoted mixing ratios, can be mapped to those of constituents that react with each other upon mixing. This facilitates mapping entire statistical distributions from mixing ratios to reactive-constituent concentrations. In perturbative approximations, only the mean and variance of the mixing-ratio distribution are used. We demonstrate that the second-order perturbative approximation leads to erroneous or even physically impossible estimates of mean reactive-constituent concentrations when the variance of the mixing ratio is high and the relationship between the mixing ratio and the reactive-constituent concentrations strongly deviates from a quadratic function. The latter might be the case in biokinetic reactions or in equilibrium reactions with small equilibrium constant in comparison to the range of reactive-constituent concentrations. When only the mean and variance of the mixing ratio is known, we recommend assuming a distribution that meets the known bounds of the mixing ratio, such as the beta distribution, and mapping the assumed distribution of the mixing ratio to the distributions of the reactive constituents.
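The recommendation above can be sketched numerically: fit a beta distribution to the bounded mixing ratio from its mean and variance, map it through a nonlinear reaction function, and integrate, then compare with the second-order perturbative estimate. The reaction function c(x) below is a hypothetical equilibrium-type relation, chosen only to be strongly non-quadratic.

```python
import math

def beta_params(mu, var):
    """Method-of-moments beta fit on [0, 1]; requires var < mu*(1-mu)."""
    k = mu * (1 - mu) / var - 1
    return mu * k, (1 - mu) * k

def mean_mapped(c, mu, var, n=2000):
    """E[c(X)] for X ~ Beta fitted to (mu, var), by midpoint summation."""
    a, b = beta_params(mu, var)
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    total = 0.0
    for i in range(1, n):
        x = i / n
        pdf = x ** (a - 1) * (1 - x) ** (b - 1) / B
        total += c(x) * pdf / n
    return total

c = lambda x: x / (x + 0.05)       # sharply nonlinear map, hypothetical
mu, var = 0.5, 0.08                # high-variance mixing ratio
exact = mean_mapped(c, mu, var)

# Second-order perturbative estimate c(mu) + 0.5 * c''(mu) * var,
# with c'' approximated by a central difference:
h = 1e-4
c2 = (c(mu + h) - 2 * c(mu) + c(mu - h)) / h ** 2
approx = c(mu) + 0.5 * c2 * var
```

Here the perturbative estimate overshoots the beta-mapped mean by a few percent; with stronger nonlinearity or variance it can leave the physically admissible range entirely, which is the failure mode the abstract describes.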

15.
Air emission inventories in North America: a critical assessment
Although emission inventories are the foundation of air quality management and have supported substantial improvements in North American air quality, they have a number of shortcomings that can potentially lead to ineffective air quality management strategies. Major reductions in the largest emissions sources have made accurate inventories of previously minor sources much more important to the understanding and improvement of local air quality. Changes in manufacturing processes, industry types, vehicle technologies, and metropolitan infrastructure are occurring at an increasingly rapid pace, emphasizing the importance of inventories that reflect current conditions. New technologies for measuring source emissions and ambient pollutant concentrations, both at the point of emissions and from remote platforms, are providing novel approaches to collecting data for inventory developers. Advances in information technologies are allowing data to be shared more quickly, more easily, and processed and compared in novel ways that can speed the development of emission inventories. Approaches to improving quantitative measures of inventory uncertainty allow air quality management decisions to take into account the uncertainties associated with emissions estimates, providing more accurate projections of how well alternative strategies may work. This paper discusses applications of these technologies and techniques to improve the accuracy, timeliness, and completeness of emission inventories across North America and outlines a series of eight recommendations aimed at inventory developers and air quality management decision-makers to improve emission inventories and enable them to support effective air quality management decisions for the foreseeable future.

16.
Bergvall M  Grip H  Sjöström J  Laudon H 《Ambio》2007,36(6):512-519
Contaminant transport is generally considered to be a key factor when assessing and classifying the environmental risk of polluted areas. In the study presented here, a steady-state approach was applied to obtain estimates of the transit time and concentration of the pesticide metabolite BAM (2,6-dichlorobenzoamide) at a site where it is contaminating a municipal drinking water supply. A Monte Carlo simulation technique was used to quantify the uncertainty of the results and to evaluate the sensitivity of the used parameters. The adopted approach yielded an estimated median transit time of 10 y for the BAM transport from the polluted site to the water supply. Soil organic carbon content in the unsaturated zone and the hydraulic conductivity in the saturated zone explained 44% and 23% of the uncertainty in the transit time estimate, respectively. The sensitivity analysis showed that the dilution factor due to regional groundwater flow and the soil organic carbon content at the polluted site explained 53% and 31% of the uncertainty of concentration estimates, respectively. In conclusion, the adopted steady-state approach can be used to obtain reliable first estimates of transit time and concentration, but to improve concentration predictions of degrading contaminants, a dynamic model is probably required.
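A Monte Carlo sketch in the spirit of the transit-time analysis above: advective travel time through the saturated zone with a retardation factor for sorption. All parameter ranges (hydraulic conductivity K, gradient i, porosity n, retardation R) and the path length are hypothetical placeholders, not the site values used in the paper.

```python
import random

def transit_times(n_sim=10000, seed=7):
    """Sample uncertain parameters and return sorted transit times in years."""
    rng = random.Random(seed)
    L = 500.0                                  # flow-path length, m (assumed)
    out = []
    for _ in range(n_sim):
        K = 10 ** rng.uniform(-5, -3)          # hydraulic conductivity, m/s
        i = rng.uniform(0.002, 0.01)           # hydraulic gradient
        n = rng.uniform(0.25, 0.35)            # effective porosity
        R = 1 + rng.uniform(0.5, 3.0)          # retardation (sorption)
        v = K * i / n                          # seepage velocity, m/s
        out.append(R * L / v / 3.15e7)         # seconds -> years
    return sorted(out)

t = transit_times()
median = t[len(t) // 2]
```

Ranking the sampled parameters against the sampled transit times (e.g. by rank correlation) then gives the kind of variance decomposition the abstract reports for organic carbon content and hydraulic conductivity.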

17.
To analyze the effect of parameter uncertainty on landfill leakage risk assessment results, a physical process model for assessing groundwater pollution risk at landfills was constructed. On this basis, fuzzy theory and probability theory were used to characterize fuzzy-uncertain and stochastic-uncertain parameters, respectively, and a Monte Carlo method based on stochastic theory was used to simulate the fuzzy-uncertain parameters, yielding a coupled fuzzy-stochastic method for assessing groundwater pollution risk at landfills. The model was applied in a case study of a general industrial solid waste landfill in Northeast China. The results show that the measured concentrations fall within the concentration interval simulated by the model (the 10%-90% percentile concentrations), indicating that the coupled fuzzy-stochastic groundwater pollution risk assessment model can predict actual pollutant concentrations in groundwater with reasonable accuracy and can be used for landfill groundwater pollution risk assessment. The risk assessment results show that the potential groundwater pollutants at this landfill are As and Mn, with As being the main health-risk substance: the probability that its non-carcinogenic risk exceeds the acceptable level is 22%, and the probabilities that its carcinogenic risk exceeds 10^-4 and 10^-5 are 33% and 86%, respectively. Measures should therefore be taken to control the leaching of As from As-containing landfill waste and reduce its environmental risk. The probability that the non-carcinogenic risk of Mn is below the acceptable level is 100%, so Mn poses no risk.
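A minimal sketch of the exceedance-probability step described above: sample an uncertain groundwater As concentration, convert it to a cancer-risk value, and report the probability of exceeding the 10^-4 and 10^-5 thresholds. The lognormal concentration model and its parameters are invented for illustration (the study samples fuzzy and random parameters through its coupled model); the intake assumptions and slope factor are standard-style screening values, used here only as placeholders.

```python
import random

def exceedance_probs(n=20000, seed=3):
    """Monte Carlo probability that cancer risk exceeds 1e-4 and 1e-5."""
    rng = random.Random(seed)
    hits4 = hits5 = 0
    for _ in range(n):
        conc = rng.lognormvariate(-7.0, 1.5)   # As in water, mg/L (assumed)
        intake = conc * 2.0 / 70.0             # 2 L/day intake, 70 kg body weight
        risk = intake * 1.5                    # slope factor per mg/kg-day
        hits4 += risk > 1e-4
        hits5 += risk > 1e-5
    return hits4 / n, hits5 / n

p4, p5 = exceedance_probs()
```

Reporting a pair of exceedance probabilities rather than a single risk number is what lets the study state results like "33% chance of exceeding 10^-4" instead of a point estimate.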

18.
Many traditional water quality standards are based on extreme percentiles; often, there is the risk of making wrong decisions with these standards because of high estimation uncertainty. Standards expressed as fuzzy intervals in the form of [trigger, enforcement limitation] make it possible to control the risks for the discharger and the consumer simultaneously. With fuzzy interval compliance, corrective action is initiated when the trigger is exceeded; noncompliance is declared when the enforcement limitation is exceeded. Fuzzy intervals would digest the risks that are inherent when a single enforcement limitation is used to determine compliance; the risks can be further lowered when the fuzzy intervals are based on less extreme percentiles. This paper proposes several alternatives to using a single extreme percentile standard for regulating water quality or waste discharge. A case study using municipal effluent water quality data was included that suggests methods to determine compliance with fuzzy interval standards.
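The [trigger, enforcement limitation] compliance logic described above can be sketched directly; the numeric limits below are hypothetical. The point is that corrective action begins at the trigger while a violation is declared only past the enforcement limitation.

```python
def compliance_status(observed, trigger, enforcement):
    """Three-way status under a fuzzy-interval standard (limits assumed)."""
    if observed <= trigger:
        return "compliant"
    if observed <= enforcement:
        return "corrective action"     # trigger exceeded, limit not yet
    return "noncompliant"

# Example with a trigger of 25 and an enforcement limitation of 40 (mg/L):
statuses = [compliance_status(x, 25.0, 40.0) for x in (12.0, 31.0, 55.0)]
```

Compared with a single enforcement limit, the intermediate band absorbs estimation uncertainty: a marginal exceedance prompts action rather than an immediate noncompliance finding.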

19.
A weight-of-evidence approach was used by the US National Acid Precipitation Assessment Program (NAPAP) to assess the sensitivity of chemistry and biology of lakes and streams to hypothesized changes in sulfate deposition over the next 50 years. The analyses focused on projected effects in response to differences in the magnitude and the timing of changes in sulfate deposition in the north-eastern United States, the Mid-Appalachian Highlands, and the Southern Blue Ridge Province. A number of tools were used to provide the weight of evidence that is required to have confidence in an assessment that has many uncertainties because of the complexity of the systems for which the projections of future conditions were made and because of limited historical data. The MAGIC model provided the projections of chemical changes in response to alternative deposition scenarios. Projected chemical conditions were input into biological models that evaluate effects on fish populations. The sensitivity of water chemistry and brook trout resources to the hypothesized changes in deposition was found to be greatest in the Adirondacks and Mid-Atlantic Highlands. Under the hypothesized sulfur deposition reduction scenarios, chemical conditions suitable for fish were projected to improve 20-30 years sooner than with the scenario that assumed no new legislated controls. Other lines of evidence, e.g. other models, field observations, and paleolimnological findings, were used to evaluate uncertainty in the projections. Model parameter/calibration uncertainty for the chemical models and population sampling uncertainty were explicitly quantified. Model structural uncertainties were bracketed using model comparisons, recent measured changes, and paleolimnological reconstructions of historical changes in lake chemistry.

20.
Measurements of horizontal wind speed frequency distributions made within and above a sorghum and a spring barley crop are reported. Analysis showed that fast gusts of wind occurred within both canopies and that gusts much greater than the local mean value occurred within the canopies with a greater frequency than above the canopies. The shapes of the wind speed frequency distributions were similar in both crops, and the distributions for barley were shown to be fitted well by extreme value distributions and log-normal distributions for wind speeds of up to five times the mean value.
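A quick sketch of fitting a log-normal to within-canopy wind speeds, as the abstract reports for the barley data. The sample below is synthetic; the fit is simple method-of-moments on log speeds, and the tail probability illustrates how such a fit estimates the frequency of gusts far above the mean.

```python
import math
import statistics

speeds = [0.4, 0.5, 0.6, 0.8, 0.9, 1.0, 1.2, 1.5, 2.1, 3.4]  # m/s, synthetic
logs = [math.log(s) for s in speeds]
mu, sigma = statistics.fmean(logs), statistics.stdev(logs)

def lognorm_sf(x):
    """P(speed > x) under the fitted log-normal."""
    z = (math.log(x) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

mean_speed = statistics.fmean(speeds)
p_gust = lognorm_sf(3 * mean_speed)    # chance of a gust 3x the mean speed
```

Comparing such fitted tail probabilities within and above the canopy is one way to quantify the abstract's finding that large relative gusts are more frequent inside the canopy.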


Copyright©北京勤云科技发展有限公司  京ICP备09084417号