Similar Articles
20 similar articles found (search time: 515 ms)
1.
This paper presents a new concept for incorporating uncertainty management into energy and environmental planning models developed in algebraic modeling languages. SETSTOCH is a tool for linking algebraic modeling languages with specialized stochastic programming solvers. Its main role is to retrieve from the modeling language a dynamically ordered core model (baseline scenario), which is sent automatically to the stochastic solver. The case presented herein concerns such a study realized with the IEA-MARKAL model used by many research teams around the world.

2.
Water quality can be evaluated using biomarkers such as the tissular enzymatic activities of endemic species. Measuring bivalve mollusc activity at high frequency (e.g., valvometry) over a long period is another way to record animal behavior and to evaluate perturbations of water quality in real time. Because pollution affects the activity of oysters, we consider valve opening and closing velocities for monitoring water quality. We propose to model the huge volume of velocity data collected through valvometry using a new nonparametric extreme-value statistical model. The objective is to estimate the tail probabilities and the extreme quantiles of the distribution of valve-closing velocity. The tail of the distribution function of valve-closing velocity is modeled by a Pareto distribution whose parameter, beyond a threshold τ, varies with the time t of the experiment. Our modeling approach reveals the dependence between the specific activity of two enzymatic biomarkers (glutathione-S-transferase and acetylcholinesterase) and the continuous recording of oyster valve velocity, proving the suitability of this tool for water quality assessment. Thus, valvometry allows real-time in situ analysis of bivalve behavior and appears to be an effective early-warning tool in ecological risk assessment and marine environment monitoring.
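A minimal sketch of the peaks-over-threshold idea behind such a tail model, assuming a classical stationary Pareto tail rather than the paper's time-varying nonparametric estimator; the synthetic data, the threshold choice, and the use of the Hill estimator are illustrative assumptions, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic valve-closing velocities with a heavy right tail (illustrative only).
velocities = rng.pareto(3.0, 5000) + 1.0

tau = np.quantile(velocities, 0.95)          # threshold: empirical 95% quantile
exceed = velocities[velocities > tau]

# Hill estimator of the Pareto tail index alpha from exceedances over tau.
alpha = 1.0 / np.mean(np.log(exceed / tau))

p_tau = len(exceed) / len(velocities)        # empirical P(V > tau)

def tail_prob(x):
    """P(V > x) for x > tau under the fitted Pareto tail."""
    return p_tau * (x / tau) ** (-alpha)

def extreme_quantile(p):
    """Quantile of order 1 - p for small p < p_tau (beyond the threshold)."""
    return tau * (p / p_tau) ** (-1.0 / alpha)
```

Small exceedance probabilities and the corresponding extreme quantiles are then read off the fitted tail rather than from the sparse empirical upper tail.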

3.
There is great inconsistency in the use of the terms pulse and press when describing types of perturbations, due primarily to a failure to distinguish between the cause and the effect of the perturbation in question. The cause and the effect may each be either short- or long-term, and one may well be short-term while the other is long-term. Distinguishing between these two types of disturbance is crucial for management to prevent further impact, so it is important to describe the two aspects of a perturbation separately. Here, we define a protocol for sampling perturbations that enables the cause and the effect each to be classified as short- or long-term. Existing (i.e., already established) assemblages and newly established assemblages are sampled and compared among disturbed and control locations. Existing assemblages may have been affected by past (pulse) disturbances and/or ongoing (press) disturbances, whereas the establishment of new assemblages can only be influenced by ongoing disturbances. We describe procedures for assessing the impacts of estuarine marinas as an illustration of the issues to be considered in any habitat. Settlement plates and defaunated sediment are suggested for sampling the establishment of new assemblages in aquatic environments.

4.
Natural attenuation (NA) is a catch-all explanation for the overall decay and slowed movement of contaminants in the subsurface. One direct line of support for NA is to demonstrate that contaminant concentrations from monitoring wells located near the source are decreasing over time. The decrease is summarily expressed as an apparent half-life determined from the line best fitting the observed log-transformed concentration data against time. This simple (time-only) decay model assumes other factors are invariant, and so is flawed when complicating factors, such as a fluctuating water table, are present. A history of the water-table fluctuation can track changes in important NA factors like recharge, groundwater flow direction and velocity, as well as non-NA factors like the volume of water in, and purged from, the well before a sample is collected. When the trend in the concentrations is better associated with the water table rising or falling, any conclusion about the degradation rate may be premature. We develop simple regressions to predict contaminant concentration (c) with two line models: one involving time (c = c(t)) and another involving groundwater elevation (c = c(z)). We develop a third model that includes both factors (c = c(t, z)). Using an F-test to compare the fits of the models, we determine which model is statistically better at explaining the observed concentrations. We applied the test to sites where benzene degradation rates had previously been estimated. The F-test can also be used to determine the suitability of applying nonparametric statistics, like the Mann-Kendall test, to the concentration data, because the F-test result can indicate instability of the contaminant plume that may be masked when the water table fluctuates.
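The nested-model comparison described in this abstract can be sketched with ordinary least squares and a partial F-test; the data below are synthetic (a concentration driven by water-table elevation rather than time) and all variable names are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def fit_rss(X, y):
    """Least-squares fit; return residual sum of squares and parameter count."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2), X.shape[1]

rng = np.random.default_rng(1)
n = 60
t = np.arange(n, dtype=float)                 # sampling times (months)
z = 10.0 + np.sin(2 * np.pi * t / 12)         # seasonally fluctuating water table
logc = 2.0 - 0.8 * z + rng.normal(0, 0.1, n)  # log concentration tracks z, not t

ones = np.ones(n)
rss_t, p_t = fit_rss(np.column_stack([ones, t]), logc)       # c = c(t)
rss_tz, p_tz = fit_rss(np.column_stack([ones, t, z]), logc)  # c = c(t, z)

# Partial F-test: does adding z significantly improve the time-only model?
F = ((rss_t - rss_tz) / (p_tz - p_t)) / (rss_tz / (n - p_tz))
p_value = stats.f.sf(F, p_tz - p_t, n - p_tz)
```

A small p_value indicates the concentration trend is better explained once groundwater elevation is included, exactly the situation in which a time-only half-life estimate would be premature.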

5.
Climate-economic modeling often relies on macroeconomic integrated assessment models (IAMs) that, in general, try to capture how the combined system reacts to different policies. Irrespective of the specific modeling approach, IAMs suffer from two notable problems. First, although policies and emissions depend on individual and institutional behavior, the models cannot account for the heterogeneity and adaptive behavior of the relevant actors. Second, the models unanimously treat mitigation actions as costs instead of investments: an arguable classification, given that all other expenditures are classified as investments. Both shortcomings matter if the long-term development of climate change and the economy is to be analyzed. This paper therefore proposes a dynamic agent-based model, based on the battle-of-perspectives approach (Janssen [1]; Janssen and de Vries [2]; Geisendorf [3, 4]), that details the consequences of various behavioral assumptions. Furthermore, expenditures for climate protection, e.g., the transition of the energy system to renewables, are regarded as investments in future technologies with promising growth rates and the potential to incite further growth in adjoining sectors (Jaeger et al. [5]). The paper analyzes how a different understanding of climate protection expenditures changes the system's dynamics and, thus, the basis for climate policy decisions. It also demonstrates how erroneous perceptions affect economic and climate development, underlining the importance of acknowledging heterogeneous beliefs and behavior for the success of climate policy.

6.
A research strategy based upon models of intermediate complexity addressing crucial aspects of global environmental change is presented. The key idea behind the strategy is to compress system complexity either by formal techniques, such that first-order aspects are preserved, or by semi-qualitative schemes that describe and simulate the dominant dynamical patterns identified by panoramic inspection. Specific realizations of the overall heuristic philosophy are introduced as elements of a comprehensive research program on global change. Topics encompass global climate modeling, a decision-analysis framework for managing the global warming problem by balancing adaptation and mitigation efforts, a generic approach to integrated regional climate impact assessment and its implementation in specific regions, as well as a new technique to link regional and global patterns of environmental change using advanced modeling tools.

7.
Many governments use technology incentives as an important component of their greenhouse gas abatement strategies. These carrots are intended to encourage the initial diffusion of new, greenhouse-gas-emissions-reducing technologies, in contrast to carbon taxes and emissions trading, which provide a stick designed to reduce emissions by increasing the price of high-emitting technologies for all users. Technology incentives appear attractive, but their record in practice is mixed, and economic theory suggests that in the absence of market failures they are inefficient compared to taxes and trading. This study uses an agent-based model of technology diffusion and exploratory modeling, a new technique for decision-making under conditions of extreme uncertainty, to examine the conditions under which technology incentives should be a key building block of robust climate change policies. We find that a combined strategy of carbon taxes and technology incentives, as opposed to carbon taxes alone, is the best approach to greenhouse gas emissions reductions if the social benefits of early adoption sufficiently exceed the private benefits. Such social benefits can occur when economic actors have a wide variety of cost/performance preferences for new technologies and either new technologies have increasing returns to scale or potential adopters can reduce their uncertainty about the performance of new technologies by querying the experience of other adopters. We find that if decision-makers hold even modest expectations that such social benefits are significant, or that the impacts of climate change will turn out to be serious, then technology incentive programs may be a promising hedge against the threat of climate change.

8.
The numerical treatment of a regional air pollution model (such models are, as a rule, described mathematically by systems of partial differential equations) leads to very large computational problems. The chemical submodel of an air pollution model is normally the most time-consuming part of the computational work. The application of appropriate discretization and splitting procedures reduces the chemical submodel to a large number of relatively small ODE systems (one such system per grid point). In the search for efficient numerical algorithms for the chemical submodels, one can carry out experiments using only one such ODE system in order to facilitate the work. This approach has been used in connection with a particular chemical scheme, the condensed CBM-IV scheme, which is used in several large air pollution models. Six integration algorithms have been tested on a set of typical scenarios (consisting of different starting concentrations and/or different values of the emissions). The advantages and disadvantages of the algorithms tested are discussed. The final decision about the most efficient of the tested algorithms should be made after a second series of experiments. The coupling of the chemical process with the transport of air pollution (on, at least, a two-dimensional domain), together with the application of high-speed computers, is to be studied in that second series of experiments, which will be reported in a subsequent paper.

9.
The issues surrounding the anticipated impacts of the enhanced greenhouse effect are likely to form a significant part of research activities in the coming years. In this paper we adopt a control-theoretic approach to the analysis of one of the world's best-known integrated models of the enhanced greenhouse effect: the Dutch IMAGE 1.0 (Integrated Model to Assess the Greenhouse Effect) model. The paper demonstrates that optimisation methodologies can be applied to integrated models to enhance their interpretative power. This is accomplished by providing a mechanism whereby optimal emission-allocation strategies can be formulated from existing models. One result of particular interest is that the analysis confirms that the earlier the human input into the climate system is stabilised, the higher the level of CO2 emissions that can be permitted while still achieving a specific long-term environmental target.

10.
Three statistical models are used to predict the upper percentiles of the distribution of air pollutant concentrations from restricted data sets recorded over yearly time intervals. The first is an empirical quantile-quantile model. It requires, firstly, that a more complete data set be available from a base site within the same airshed and, secondly, that the base and restricted data sets be drawn from the same distributional form. A two-sided Kolmogorov-Smirnov two-sample test is applied to check the validity of the latter assumption, a test not requiring the assumption of a particular distributional form. The second model represents the a priori selection of a distributional model for the air quality data. To demonstrate this approach, the two-parameter lognormal, gamma, and Weibull models and the one-parameter exponential model were separately applied to all the restricted data sets. The third model employs a model-identification procedure on each data set and selects the best-fitting model.
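The first (quantile-quantile) model and its distributional-form check can be sketched as below; the lognormal data, the quantile grid, and the linear quantile mapping are illustrative assumptions rather than the study's exact procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
base = rng.lognormal(mean=3.0, sigma=0.5, size=2000)       # complete base-site record
restricted = rng.lognormal(mean=3.0, sigma=0.5, size=150)  # sparse record, same airshed

# Two-sided Kolmogorov-Smirnov two-sample test: same distributional form?
ks_stat, ks_p = stats.ks_2samp(base, restricted)

# Empirical quantile-quantile model: regress restricted-site quantiles on
# base-site quantiles over the observed range, then extrapolate upward.
probs = np.linspace(0.05, 0.95, 19)
qb = np.quantile(base, probs)
qr = np.quantile(restricted, probs)
slope, intercept = np.polyfit(qb, qr, 1)

# Predicted 99th percentile at the restricted site from the base-site record.
q99_pred = slope * np.quantile(base, 0.99) + intercept
```

If ks_p is small, the common-distribution assumption fails and the q-q mapping should not be trusted for tail extrapolation.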

11.
This paper briefly reviews the process of exotic-pest risk assessment and presents some examples of emerging opportunities for spatial bioclimatic modeling of exotic species in Canada. This type of analysis can support risk assessments but does not replace the need for ongoing, high-quality, field-based observations to validate and update models. Bioclimatic analyses of several exotic pests are provided to illustrate both the opportunities and the limits. A link to the National Forest Inventory is demonstrated to characterize the timber volumes at risk for one exotic species. The challenges are both scientific and administrative. More accessible and current field survey data are required to improve the models. Our experience is that, for many exotic species, historical and even current data are not always digital or quality-controlled for taxonomic identity and accurate geo-referencing. This inhibits their use in integrated spatial modeling applications.

12.
Numerical models are often used to evaluate the potential impact of human alteration of natural water bodies and to help design the alteration so as to mitigate its impacts. In the past decade, three-dimensional hydrodynamic and reactive transport modeling has matured from a research subject into a practical analysis technology. This paper presents a practical study in which a three-dimensional hydrodynamic and water quality model [the hydrodynamic eutrophication model (HEM-3D)] was applied to determine the optimal location for treated wastewater discharged from a marine outfall system in Keelung harbor and the adjacent coastal sea. First, model validation was conducted with respect to surface elevation, currents, and water quality variables measured at the Keelung harbor station and in its coastal sea. The overall performance of the model was in qualitative agreement with the available field data. The model was then used to evaluate several scenarios for the location of the marine outfall system. Based on the simulation results, a location to the northeast of Ho-Ping Island was recommended for adoption because its environmental impact is smaller than that of any other alternative.

13.
It is well known that the commonly used k-ε turbulence models yield inaccurate predictions for complex flow fields. One reason for this inaccuracy is the misrepresentation of Reynolds stress differences. Nonlinear turbulence models are capable of overcoming this weakness while not being considerably more complex. However, no comprehensive studies are known that analyze the performance of nonlinear turbulence models for three-dimensional flows around building-shaped structures. In the present study, predictions of the flow around a surface-mounted cube using three nonlinear two-equation turbulence models are discussed. The results are compared with predictions of the standard k-ε turbulence model and with wind tunnel measurements. It is shown that the use of nonlinear turbulence models can be beneficial in predicting wind flows around buildings.

14.
In this work, ozone data from the Liossion monitoring station of the Athens/PERPA network are analysed. The data cover the months May to September for the period 1987–93. Four statistical models for predicting the daily maximum 1-hour ozone concentration are developed: three multiple-regression models and one ARIMA (0,1,2) model. All models, together with a persistence forecast, are evaluated and compared against the 1993 data, which were not used in model development. Validation statistics were used to assess the relative accuracy of the models. An analysis of the models' ability to forecast real ozone episodes was also carried out. Two of the three regression models provide the most accurate forecasts. The ARIMA model had the worst performance, even lower than the persistence forecast. The forecast skill of a bivariate regression model based on wind speed and persistence for ozone episode days was found to be quite satisfactory, with detection rates of 73% and 60% for O3 > 180 μg m−3 and O3 > 200 μg m−3, respectively.
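The detection-rate skill score quoted above (73% / 60%) can be computed as the fraction of observed exceedance days that the forecast also flags; the function and the four-day example below are illustrative, not the study's data:

```python
import numpy as np

def detection_rate(observed, forecast, threshold):
    """Fraction of observed exceedance days that the forecast also flags."""
    observed = np.asarray(observed, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    exceed = observed > threshold
    if not exceed.any():
        return float("nan")          # no episodes observed: rate undefined
    return float(np.mean(forecast[exceed] > threshold))

# Illustrative values: four days, three observed episode days (> 180 ug/m3),
# of which the forecast flags two.
obs = [190.0, 205.0, 150.0, 185.0]
fc = [195.0, 160.0, 140.0, 188.0]
print(detection_rate(obs, fc, 180.0))
```

A complete evaluation would pair this with a false-alarm rate, since a forecast that always predicts an episode scores a perfect detection rate.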

15.
In this paper we present a new approach to modeling an environmental problem as a bilevel programming problem. To the authors' best knowledge, this is the first attempt to use bilevel techniques to tackle such problems. We derive a solution to help decision makers cope with environmental policy issues. The San Francisco Bay Area is used as a real-world example.

California is presently faced with a serious deficit of solid-waste treatment and disposal facilities. Federal legislation has sought to compel the states to assure the capacity to treat and dispose of their own wastes, and the California Legislature has enacted laws requiring the counties to initiate programs so that they can treat and dispose of their own wastes. Neither the federal nor the state programs have met with success in California. California continues to ship greater and greater amounts of waste out of state, and the majority of California counties have not instituted plans acceptable to the state government regarding the treatment and disposal of their own wastes.

In the few cases where siting and licensing programs have been proposed, the policy-makers charged with their evaluation have proceeded with largely intuitive, non-quantitative evaluations of policy options, often ignoring most of the financial and environmental implications of their decisions.

We have developed a strategic management decision model that can evaluate multiple solid-waste management options from both economic and environmental standpoints. Examples of problems such a quantitative model might evaluate include the economic and environmental impacts of multiple treatment or disposal facilities as opposed to only one site; the environmental impact of taxing dirty waste streams, thus encouraging waste treatment and/or minimization on-site; and the social risk resulting from transportation, assuming one or more treatment or disposal sites or the use of alternative transportation routes.

Because of the extensive information presently available for the San Francisco Bay region, we have investigated the regional waste management problem there under several different treatment and disposal scenarios. As appropriate, results from this regional model and from the authors' earlier work [1] will be applied to California as a whole.

16.
This paper describes the use of statistical regression models to characterize temporal trends in groundwater monitoring data collected between 1980 and 1990 on 15 wells and 13 parameters (195 cases in all) at the KL Avenue landfill site in Kalamazoo County, Michigan. The site was used as a municipal landfill prior to 1980 and was placed on the Superfund site list in 1982 after groundwater contamination was found.

Six temporal regression trend models were defined using linear and quadratic regressions. These trends were used to classify each of the 195 cases as improving, deteriorating, or stable over the 1980–1990 period. Using these classifications, it was determined that there were more than twice as many improving cases as deteriorating cases at the KL site during this period. The models provide a method for visualizing and interpreting trends in groundwater quality at individual well locations within the contaminant plume and for assessing the chemical trend behavior of the plume overall. The improving, deteriorating, and stable trend categories were developed for two purposes: first, to facilitate comprehension of the information contained in large amounts of water quality data; and second, to assist communication among the many different groups of people who recommend actions, including remediation responsibilities, at Superfund sites like the KL site.

A normal probability model was used in the trend classifications. This model contained provisions to accommodate non-detect data and other anomalous laboratory determinations that can influence the trend-selection process. The robustness of the classification procedure was examined using a lognormal probability model. The overall conclusions about the KL site under the lognormal model were similar to those obtained under the normal model, although some individual trend indications differed.

The Shapiro-Wilk test was used to check the adequacy of both the normal and lognormal models. The lognormal model was found to fit the KL site data somewhat better overall, but was not superior to the normal model in every case. Both models were found suitable for determining overall trend conditions at this site and are recommended for these purposes, assuming an understanding of the statistical constraints and the hydrochemical context. However, it is recommended that the search for more adequate trend models continue.
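A simplified version of this improving/deteriorating/stable classification can be sketched with a single linear trend in log concentration (the study also used quadratic models and non-detect handling, which are omitted here); the synthetic well records are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def classify_trend(years, conc, alpha=0.05):
    """Classify a well/parameter record as improving, deteriorating, or stable
    from the sign and significance of a linear trend in log concentration."""
    slope, intercept, r, p, se = stats.linregress(years, np.log(conc))
    if p >= alpha:
        return "stable"              # no statistically significant trend
    return "improving" if slope < 0 else "deteriorating"

rng = np.random.default_rng(4)
years = np.arange(1980, 1991, dtype=float)
noise = rng.normal(0, 0.05, years.size)

decaying = np.exp(4.6 - 0.3 * (years - 1980) + noise)  # first-order decay
rising = np.exp(3.0 + 0.2 * (years - 1980) + noise)    # worsening contamination

print(classify_trend(years, decaying))   # improving
print(classify_trend(years, rising))     # deteriorating
```

Working on the log scale corresponds to the lognormal probability model the study found slightly more adequate for this site.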

17.
PREDICTING CHANGE IN NON-LINEAR SYSTEMS
Complex systems are characterized by surprising switches to new behaviours. Evaluating and predicting these changes demands an understanding of the behaviour of the whole system. The combined ecosystem-climate system shows chaotic or pseudorandom behaviour, stochastic or truly random behaviour, as well as simple bifurcation and semi-stability. Semi-stability involves the sudden change from a destabilized attractor to a new stable attractor, which may occur after an apparently unpredictable time delay. We present some recent results for analyzing time series data and for using simulations of non-linear models to predict these changes.
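This is not the coupled ecosystem-climate model of the paper, but the standard logistic map gives a minimal, runnable illustration of the kind of qualitative switch described: as a parameter crosses a bifurcation point, the system's attractor changes abruptly from a fixed point to a cycle:

```python
import numpy as np

def attractor(r, x0=0.2, burn=2000, keep=64):
    """Iterate the logistic map x -> r*x*(1-x), discard transients, and
    return the distinct values (rounded) visited on the attractor."""
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        orbit.append(round(x, 6))
    return sorted(set(orbit))

fixed = attractor(2.8)   # below r = 3: a single stable fixed point
cycle = attractor(3.2)   # above r = 3: a period-2 attractor
```

The small parameter change from 2.8 to 3.2 produces a qualitatively different long-run behaviour, the simplest analogue of the sudden attractor switches discussed above.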

18.
Mathematical models are used to approximate highly complex engineering, physical, environmental, social, and economic phenomena. The model parameters exerting the most influence on model results are identified through a sensitivity analysis. A comprehensive review of more than a dozen sensitivity analysis methods is presented, intended for those not intimately familiar with statistics or with the techniques used for sensitivity analysis of computer models. The most fundamental sensitivity technique uses partial differentiation, whereas the simplest approach is to vary parameter values one at a time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly used to build response surfaces that approximate complex models.
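The one-at-a-time approach mentioned above can be sketched as a finite-difference elasticity; the toy model and parameter values are illustrative assumptions standing in for a complex simulation:

```python
import numpy as np

def model(params):
    """Toy model standing in for a complex simulation (illustrative only)."""
    a, b, c = params
    return a ** 2 + 10 * b + 0.1 * c

def oat_sensitivity(f, base, delta=0.01):
    """One-at-a-time sensitivity: relative change in output per relative
    change in each parameter, perturbing one parameter at a time."""
    base = np.asarray(base, dtype=float)
    y0 = f(base)
    sens = []
    for i in range(base.size):
        perturbed = base.copy()
        perturbed[i] *= 1.0 + delta      # nudge parameter i by delta (1%)
        sens.append((f(perturbed) - y0) / (y0 * delta))
    return np.array(sens)

s = oat_sensitivity(model, [2.0, 1.0, 5.0])
# Ranking the entries of s identifies the most influential parameter
# at this point of parameter space (here, the second one).
```

Note that one-at-a-time screening only probes local sensitivity around the base point and misses parameter interactions, which is why the review above also covers correlation- and regression-based measures.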

19.
Starting from the basic assumption of the syndrome concept, namely that essentially all present problematic civilization–nature interactions on the global scale can be subdivided into a limited number of typical patterns, the analysis of the response of these patterns (syndromes) to climate change can make a major contribution to climate impact research, surmounting the difficulties that more common sectoral ceteris paribus impact studies have with systemic integration. In this paper we investigate in particular the influence of climate on the regional proneness, or disposition, towards one of the most important syndromes with respect to famines and malnutrition: the Sahel Syndrome. It describes the closely interlinked natural and socioeconomic aspects of rural-poverty-driven degradation of soil and vegetation on marginal sites. Two strategies of global climate impact assessment on a spatial 0.5°×0.5° grid were pursued: (a) as a measure of the climate sensitivity of the regional proneness, the absolute value of the gradient of the disposition with respect to the global field of 3 × 12 monthly normals of temperature, irradiation, and precipitation is calculated; (b) the disposition is evaluated for two different climate forecasts under doubled atmospheric CO2 concentration. For both strategies, two new quantitative global models were incorporated in a fuzzy-logic-based algorithm for determining the disposition towards the Sahel Syndrome: a neural-net-based model of plant productivity and a water-balance model that calculates surface runoff considering vertical and lateral fluxes, both driven by the set of 36 monthly climatological normals and designed to allow very fast global numerical evaluation. Calculation (b) shows that the change in disposition towards the Sahel Syndrome depends crucially on the chosen climate forecast, indicating that the disagreement among climate forecasts propagates to the impact assessment of the investigated socio-economic pattern.
On the other hand, the regions with a significant increase in disposition in at least one of the climate-scenario-based model runs form a subset of the regions indicated by the local climate sensitivity study (a) as highly sensitive, illustrating that the gradient measure applied here provides a reasonable way to calculate an upper limit, or worst case, of negative climate impact. This method is particularly valuable where climate predictions are uncertain, as, e.g., for the change in precipitation patterns.

20.
In order to assess the effects of the execution of the Port of Bilbao Enlargement Project, epifauna living on hard substrata and environmental parameters were quantitatively investigated from 1994 to 1996. A programme of repeated non-destructive sampling at 8 stations was carried out during the construction of a breakwater and the filling operations on the shoreline. A correlation analysis was used to extract potential indicator species of particular environmental conditions measured in the field. We postulate that the remaining species (about 80% of the total species data set), insensitive to any of the investigated environmental factors, were unnecessary for the purpose of assessing the environmental impact caused by the port building works. Classification and ordination techniques were then applied at two contrasting levels, using the full species data set and the selected faunal-indicator data subset. All plots showed a separation of the sampling sites into 3 major groups, which were easily related to the perturbations caused by a siltation gradient from the estuary mouth. This suggests that the amount of effort required to enumerate all the organisms sampled may be dramatically reduced by identifying only faunal indicators of environmental discontinuities in the field. So far, the engineering works on the western side of the bay have not caused dramatic temporal changes in species composition, or at least not an effect larger than the variations detected among the study sites due to siltation from the estuary mouth.
