Similar documents (20 results)
1.
A number of key policy insights have emerged from the application of large-scale economic/energy models, such as integrated assessment models for climate change. These insights have been particularly powerful in those instances when they are shared by all or most of the existing models. On the other hand, some results and policy recommendations obtained from integrated assessment models vary widely from model to model, which can limit their usability for policy analysis. The differences between model results are mostly due to differing underlying assumptions about exogenous and endogenous processes and the dynamics among them, differences in value judgments, and different approaches to simplifying model structure for computational purposes. Uncertainty analyses should be performed for the dual purpose of clarifying the uncertainties inherent in model results and improving decision making under uncertainty. This paper develops a unifying framework for comparing the different types of uncertainty analyses through their objective functions, categorizes the types of uncertainty analyses that can be performed on large models, and compares different approaches to uncertainty analysis by explaining underlying assumptions, suitability for different model types, and advantages and disadvantages. The appendix presents a summary of integrated assessment models for climate change that explicitly account for uncertainty.
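As an illustration of one of the simplest analysis types discussed here (propagating parameter uncertainty through a model to a policy-relevant output), the following Python sketch runs a Monte Carlo analysis on a toy cost-benefit model. The model form, the parameter distributions, and all numerical values are illustrative assumptions, not taken from any of the integrated assessment models surveyed.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 10_000

# Uncertain inputs (illustrative distributions only).
climate_sensitivity = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n_samples)  # K per CO2 doubling
damage_coeff = rng.uniform(0.001, 0.004, size=n_samples)                           # GDP share per K^2

abatement_levels = np.linspace(0.0, 1.0, 51)   # candidate policies: fraction of emissions abated

def net_cost(mu, sens, dmg):
    """Abatement cost plus residual climate damages, both as shares of GDP."""
    warming = sens * (1.0 - mu)                # warming scales with unabated emissions
    return 0.02 * mu ** 2.5 + dmg * warming ** 2

# Propagate the input uncertainty to the output for every candidate policy.
costs = np.array([net_cost(abatement_levels, s, d)
                  for s, d in zip(climate_sensitivity, damage_coeff)])   # shape (n_samples, 51)

expected = costs.mean(axis=0)
best_idx = expected.argmin()
print(f"abatement level minimising expected cost: {abatement_levels[best_idx]:.2f}")
print("5th-95th percentile of cost at that level:", np.percentile(costs[:, best_idx], [5, 95]))
```

The same propagation loop could feed any of the other analysis types compared in the framework, since they differ mainly in the objective function applied to the sampled outputs.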

2.
Natural capital models attempt to remediate the relationship between economics and ecology either by conjoining models and theories from each discipline or by finding a type of phenomena that can be meaningfully measured by both fields. The development of a widely accepted model which integrates economics and ecology has eluded researchers since the early 1970s. This paper offers an historical and philosophical perspective on some of the conceptual problems or obstacles that hinder the development of natural capital models. In particular, the disciplinary assumptions of economic science and ecological science are examined and it is argued that these assumptions are antithetical. Hence, the development of an effective and accepted natural capital model will require that economics and ecology reconsider their self-conceptions as sciences. For the purposes of theoretical research and practical policy, the paper cautions against confusing the issue of whether or not economic models accord with ecological models with the issue of whether or not economic activities accord with ecological realities.

3.
One of the principal tools used in the integrated assessment (IA) of environmental science, technology and policy problems is integrated assessment models (IAMs). These models are often comprised of many sub-models adopted from a wide range of disciplines. A multi-disciplinary tool kit is presented, from which three decades of IA of global climatic change issues have tapped. A distinction between multi- and inter-disciplinarity is suggested, hinging on the synergistic value added for the latter. Then, a hierarchy of five generations of IAMs is proposed, roughly paralleling the development of IAMs as they incorporated more components of the coupled physical, biological and social scientific disciplines needed to address a “real world” problem like climatic change impacts and policy responses. The need for validation protocols and exploration of predictability limits is also emphasized. The critical importance of making value-laden assumptions highly transparent in both natural and social scientific components of IAMs is stressed, and it is suggested that incorporating decision-makers and other citizens into the early design of IAMs can help with this process. The latter could also help IA modelers to offer a large range of value-containing options via menu-driven designs. Examples of specific topics which are often not well understood by potential users of IAMs are briefly surveyed, and it is argued that if the assumptions and values embedded in such topics are not made explicit to users, then IAMs, rather than helping to provide us with refined insights, could well hide value-laden assumptions or conditions. In particular, issues of induced technological change, timing of carbon abatement, transients, surprises, adaptation, subjective probability assessment and the use of contemporary spatial variations as a substitute for time-evolving changes (what I label “ergodic economics”) are given as examples of problematic issues that IA modelers need to explicitly address and make transparent if IAMs are to enlighten more than they conceal. A checklist of six practices which might help to increase transparency of IAMs is offered in the conclusions. Incorporation of decision-makers into all stages of development and use of IAMs is re-emphasized as one safeguard against misunderstanding or misrepresentation of IAM results by lay audiences.

4.
A new Swiss TIMES (The Integrated MARKAL–EFOM System) electricity model with an hourly representation of inter-temporal detail and a century-long model horizon has been developed to explore the TIMES framework’s suitability as a long-term electricity dispatch model. To understand the incremental insights from this hourly model, it is compared to an aggregated model with only two diurnal timeslices, as in most MARKAL/TIMES models. Two scenarios have been analysed with both models to answer the following questions: Are there differences in model solutions? What are the benefits of having a high number of timeslices? Are there any computational limitations? The primary objective of this paper is to understand the differences between the solutions of the two models, rather than Swiss policy implications or potential uncertainties in input parameters and assumptions. The analysis reveals that the hourly model offers powerful insights into the electricity generation schedule. Nevertheless, the TIMES framework cannot substitute for a dispatch model because some features cannot be represented; however, the long model time horizon and integrated system approach of TIMES provide features not available in conventional dispatch models. The methodology of the model development and insights from the model comparison are described.

5.
The review discusses six major public domain water quality models currently available for rivers and streams. These models, which differ greatly in the processes they represent, their data requirements, assumptions, modeling capabilities, and strengths and weaknesses, can yield useful results if appropriately selected for the desired purpose. The public domain models chosen in this review, which are most suitable for simulating dissolved oxygen along rivers and streams, are simulation catchment (SIMCAT), temporal overall model for catchments (TOMCAT), QUAL2Kw, QUAL2EU, the water quality analysis simulation program (WASP7), and quality simulation along rivers (QUASAR). Each model is described against a consistent set of criteria: conceptualization, processes, input data, model capability, limitations, strengths, and applications. The review finds that SIMCAT and TOMCAT are over-simplistic but useful for quickly assessing the impact of point sources. QUAL2Kw provides for the conversion of algal death to carbonaceous biochemical oxygen demand (CBOD) and is thus more appropriate than QUAL2EU where macrophytes play an important role. The extensive data requirements of WASP7 and QUASAR make it difficult to justify the time and cost of setting up these complex models. Thus, no single model can serve the full range of functionalities required; the choice of model depends on available time, financial cost, and the specific application. This review may help in choosing an appropriate model for a particular water quality problem.

6.
Renewable energy continues to grow globally, and the number of offshore wind farms is set to increase. Whilst wind energy developments provide energy security and reduced carbon budgets, they may impact bird populations through collision mortality, habitat modification and avoidance. To date, avian collision mortality has received the most attention, and collision risk models have been developed to estimate the potential mortality caused by wind turbines. The utility of these models relies not only on their underlying assumptions but also on the data available to ensure the predictions are informative. Using a stochastic collision risk model (sCRM; based on the Band collision risk model) as an example, we explore the importance of bird flight speed and consider how the assumptions of the model influence the sensitivity to flight speed. Furthermore, we explore the consequences of using site-specific GPS-derived flight speed rather than a standard generic value, with Lesser Black-backed Gulls Larus fuscus as an example, and consider how this generic value is currently used. We found that the model was most sensitive to the parameters of bird density, non-avoidance rate and percentage of birds at collision risk height, as well as bird flight speed. Using site-specific flight speed data derived from GPS tags rather than a standard value reduced the predicted number of collisions. We highlight that within the model, both the estimation of the probability of collision (PColl) and the flux of birds are sensitive to the bird flight speed; this sensitivity acts in opposite directions but the two do not necessarily balance out. Therefore, when the sCRM is used as generally done, there is little difference in collision estimates if airspeeds (bird flight speed relative to the air through which it is moving) are used rather than groundspeeds (bird flight speed relative to the ground). Estimates of seabird collision rates in relation to offshore wind farms are affecting future offshore wind development. By using site-specific flight speed estimates and accounting for different speeds in relation to wind direction, we demonstrate that cumulative collision estimates can be affected, highlighting the need for more representative flight speed data and, where possible, site-specific data.
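To make the opposing roles of flight speed concrete, here is a deliberately simplified Python sketch of a Band-style calculation (not the sCRM itself): the flux of birds through the rotor rises with flight speed, while the per-transit collision probability falls because a faster bird spends less time in the swept volume. All parameter values and functional forms below are illustrative assumptions.

```python
import numpy as np

def collisions_per_hour(density, flight_speed, rotor_radius=60.0, rotor_depth=8.0,
                        blade_encounter_rate=0.3, avoidance=0.98):
    """Toy Band-style collision estimate; every parameter is an illustrative assumption."""
    rotor_area = np.pi * rotor_radius ** 2                       # swept area, m^2
    flux = density * flight_speed * rotor_area * 3600.0           # bird transits per hour (rises with speed)
    transit_time = rotor_depth / flight_speed                     # seconds inside the rotor (falls with speed)
    p_coll = 1.0 - np.exp(-blade_encounter_rate * transit_time)   # per-transit collision probability
    return flux * p_coll * (1.0 - avoidance)

# In the full Band model p_coll also depends on blade geometry and pitch, so the two
# opposing speed effects need not cancel; the same is true of this toy functional form.
for v in (13.1, 9.8):   # a generic airspeed vs. a lower site-specific value (hypothetical numbers)
    print(f"flight speed {v:4.1f} m/s -> {collisions_per_hour(2e-8, v):.4f} collisions per hour")
```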

7.
In this paper, we use a stochastic integrated assessment model to evaluate the effects of uncertainty about future carbon taxes and the costs of low-carbon power technologies. We assess the implications of such ambiguity on the mitigation portfolio under a variety of assumptions and evaluate the role of emission performance standards and renewable portfolios in accompanying a market-based climate policy. Results suggest that climate policy and technology uncertainties are important with varying effects on all abatement options. The effect varies with the technology, the type of uncertainty, and the level of risk. We show that carbon price uncertainty does not substantially change the level of abatement, but it does have an influence on the mitigation portfolio, reducing in particular energy R&D investments in advanced technologies. When investment costs are uncertain, investments are discouraged, especially during the early stages, but the effect is mitigated for the technologies with technological learning prospects. Overall, these insights support some level of regulation to encourage investments in coal equipped with carbon capture and storage and clean energy R&D.

8.
A newly developed instantaneous emission model is applied to predict emission factors for small vehicle fleets and to assess its prediction quality. Extensive vehicle measurements of pre-Euro-1 gasoline, Euro-3 gasoline, and Euro-2 diesel vehicles are available. The data were used to develop individual vehicle emission models for each car. The prediction quality for each vehicle category was determined by averaging the results obtained from the individual vehicle models. The results show that the prediction quality is improved in comparison with that of the individual vehicles, even with a small number of vehicles in a specific category. This indicates that the errors in the individual models are mainly random and that the prediction quality, when the model is applied to fleets of cars, is exceptionally high.
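A small numerical sketch of why fleet averaging improves prediction quality when the individual-model errors are mainly random: with zero-mean, independent errors, the fleet-average error shrinks roughly as 1/sqrt(n). The error magnitude and fleet size below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)
n_vehicles = 12
n_trials = 10_000

# Zero-mean random prediction errors of the individual vehicle models (g/km, illustrative).
errors = rng.normal(loc=0.0, scale=0.5, size=(n_trials, n_vehicles))

individual_rmse = np.sqrt(np.mean(errors[:, 0] ** 2))
fleet_rmse = np.sqrt(np.mean(errors.mean(axis=1) ** 2))

print(f"individual-model RMSE: {individual_rmse:.3f} g/km")
print(f"fleet-average RMSE:    {fleet_rmse:.3f} g/km "
      f"(~ {individual_rmse / np.sqrt(n_vehicles):.3f} expected)")
```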

9.
Using annual data from 1970 to 2014, this paper examines the effects of globalization on CO2 emissions in Japan while accounting for economic growth and energy consumption as potential determinants of carbon emissions. The structural breaks and asymmetries arising from policy shifts require attention, and hence an asymmetric threshold version of the ARDL model is utilized. The results show the presence of threshold asymmetric cointegration between the variables. Threshold-based positive and negative shocks arising from globalization increase carbon emissions, with the impact of the latter being more profound. Energy consumption (economic growth) also has a significant positive effect on carbon emissions. Globalization, economic growth, and energy consumption significantly increase carbon emissions in the short run. We suggest that policy makers in Japan consider globalization and energy consumption as policy tools in formulating policies to protect sustainable environmental quality in the long run. Otherwise, the Japanese economy may continue to face environmental consequences such as undesirable climate change and substantial warming at the micro and macro levels as a result of potential shocks arising from globalization and energy consumption.
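The asymmetry in the threshold ARDL approach comes from splitting changes in the globalization index into positive and negative partial sums and letting each enter the regression separately. Below is a minimal Python sketch of that decomposition, using synthetic stand-in data and a plain OLS long-run regression in place of the full threshold ARDL/error-correction specification; the variable names and data-generating process are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical annual series standing in for the 1970-2014 data.
rng = np.random.default_rng(0)
n = 45
df = pd.DataFrame({
    "co2": rng.normal(0.9, 0.03, n).cumsum(),
    "glob": rng.normal(0.4, 0.5, n).cumsum(),    # globalization index
    "gdp": rng.normal(1.5, 0.4, n).cumsum(),
    "energy": rng.normal(0.8, 0.3, n).cumsum(),
})

# NARDL-style decomposition: cumulative sums of the positive and negative changes
# in globalization enter as separate regressors, so positive and negative shocks
# are allowed to have different long-run effects on emissions.
dglob = df["glob"].diff().fillna(0.0)
df["glob_pos"] = dglob.clip(lower=0).cumsum()
df["glob_neg"] = dglob.clip(upper=0).cumsum()

X = sm.add_constant(df[["glob_pos", "glob_neg", "gdp", "energy"]])
res = sm.OLS(df["co2"], X).fit()
print(res.params)   # differing coefficients on glob_pos / glob_neg indicate asymmetry
```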

10.
In the context of an increasing reliance on predictive computer simulation models to calculate potential project impacts, it has become common practice in impact assessment (IA) to call on proponents to disclose uncertainties in assumptions and conclusions assembled in support of a development project. Understandably, it is assumed that such disclosures lead to greater scrutiny and better policy decisions. This paper questions this assumption. Drawing on constructivist theories of knowledge and an analysis of the role of narratives in managing uncertainty, I argue that the disclosure of uncertainty can obscure as much as it reveals about the impacts of a development project. It is proposed that the opening up of institutional spaces that can facilitate the negotiation and deliberation of foundational assumptions and parameters that feed into predictive models could engender greater legitimacy and credibility for IA outcomes.

11.
This paper presents a method for appropriate coupling of deterministic and statistical models. In the decision-support system for the Elbe river, a conceptual rainfall-runoff model is used to obtain the discharge statistics and corresponding average number of flood days, which is a key input variable for a rule-based model for floodplain vegetation. The required quality of the discharge time series cannot be determined by a sensitivity analysis because a deterministic model is linked to a statistical model. To solve the problem, artificial discharge time series are generated that mimic the hypothetical output of rainfall-runoff models of different accuracy. The results indicate that a feasible calibration of the rainfall-runoff model is sufficient to obtain consistency with the vegetation model in view of its sensitivity to changes in the number of flood days in the floodplains.
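A minimal sketch of the idea of mimicking rainfall-runoff models of different accuracy: perturb a reference discharge series with noise of increasing magnitude and see how the flood-day statistic used by the vegetation model responds. The noise model, bankfull threshold, and all values are illustrative assumptions, not the Elbe decision-support system itself.

```python
import numpy as np

def synthetic_discharge(reference, rel_error, seed=0):
    """Mimic a rainfall-runoff model of a given accuracy by perturbing a reference
    discharge series with multiplicative lognormal noise (illustrative assumption)."""
    rng = np.random.default_rng(seed)
    return reference * rng.lognormal(mean=0.0, sigma=rel_error, size=reference.size)

def flood_days(discharge, bankfull):
    """Average number of days per year on which discharge exceeds bankfull."""
    return (discharge > bankfull).sum() / (discharge.size / 365.0)

rng = np.random.default_rng(1)
reference = rng.gamma(shape=2.0, scale=300.0, size=10 * 365)   # 10 years of daily flow, m^3/s
bankfull = np.quantile(reference, 0.95)                        # hypothetical flooding threshold

for sigma in (0.05, 0.15, 0.30):                               # increasingly poor calibrations
    series = synthetic_discharge(reference, sigma)
    print(f"relative error {sigma:.2f} -> {flood_days(series, bankfull):5.1f} flood days per year")
```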

12.
A water quality monitoring network (WQMN) must be designed so as to adequately protect the water quality in a catchment. Although a simulated annealing (SA) method was previously applied to design a WQMN, the SA method cannot guarantee that the solution it obtains is the global optimum. Therefore, two new linear optimization models are proposed in this study to minimize the deviation of the cost values expected to identify possible pollution sources, based on uniform cost (UC) and coverage elimination uniform cost (CEUC) schemes. The UC model determines the expected cost values by considering which station covers each sub-catchment, while the CEUC model determines the coverage of each station by eliminating the area covered by any upstream station. The proposed models are applied to the Derchi reservoir catchment in Taiwan. Results show that the global optimal WQMN can be effectively determined by using the UC or CEUC model, and both results are better than those from the SA method, especially when the number of stations becomes large.

13.
Climate-economic modeling often relies on macroeconomic integrated assessment models (IAMs) that in general try to capture how the combined system reacts to different policies. Irrespective of the specific modeling approach, IAMs suffer from two notable problems. First, although policies and emissions depend on individual or institutional behavior, the models are not able to account for the heterogeneity and adaptive behavior of relevant actors. Second, the models unanimously consider mitigation actions as costs instead of investments: an arguable definition, given that all other expenditures are classified as investments. Both are problematic if the long-term development of climate change and the economy is to be analyzed. This paper therefore proposes a dynamic agent-based model, based on the battle of perspectives approach (Janssen [1]; Janssen and de Vries [2]; Geisendorf [3, 4]), that details the consequences of various behavioral assumptions. Furthermore, expenditures for climate protection, e.g., the transition of the energy system to renewables, are regarded as investments in future technologies with promising growth rates and the potential to incite further growth in adjoining sectors (Jaeger et al. [5]). The paper analyzes how a different understanding of climate protection expenditures changes the system’s dynamics and, thus, the basis for climate policy decisions. The paper also demonstrates how erroneous perceptions affect economic and climate development, underlining the importance of acknowledging heterogeneous beliefs and behavior for the success of climate policy.

14.
Acid mine drainage (AMD) is a global problem that may have serious human health and environmental implications. Laboratory and field tests are commonly used for predicting AMD; however, this is challenging because its formation varies from site to site for a number of reasons. Furthermore, these tests are often conducted at small scale over a short period of time. Consequently, extrapolating these results to the large-scale setting of mine sites introduces considerable uncertainty for decision-makers. This study presents machine learning techniques to develop models that predict AMD quality using historical monitoring data from a mine site. The machine learning techniques explored in this study include artificial neural networks (ANN), support vector machines with polynomial (SVM-Poly) and radial basis function (SVM-RBF) kernels, the model tree (M5P), and K-nearest neighbors (K-NN). Input variables (physico-chemical parameters) that influence drainage dynamics are identified and used to develop models to predict copper concentrations. For these selected techniques, the predictive accuracy and uncertainty were evaluated based on different statistical measures. The results showed that SVM-Poly performed best, followed by the SVM-RBF, ANN, M5P, and K-NN techniques. Overall, this study demonstrates that machine learning techniques are promising tools for predicting AMD quality.
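A compact scikit-learn sketch of the kind of model comparison described, using synthetic stand-in data. scikit-learn has no M5P implementation, so a plain decision tree is used in its place; the predictor names and the data-generating process are assumptions for illustration only, not the study's monitoring data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error, r2_score

# Hypothetical monitoring data: physico-chemical predictors and copper concentration.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))                      # e.g. pH, EC, sulphate, flow (synthetic)
y = 2.0 - 1.5 * X[:, 0] + 0.8 * X[:, 1] ** 2 + rng.normal(0, 0.3, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "SVM-Poly": SVR(kernel="poly", degree=2, C=10.0),
    "SVM-RBF": SVR(kernel="rbf", C=10.0),
    "ANN": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
    "tree (M5P stand-in)": DecisionTreeRegressor(max_depth=6, random_state=0),
    "K-NN": KNeighborsRegressor(n_neighbors=5),
}
for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model).fit(X_tr, y_tr)
    pred = pipe.predict(X_te)
    rmse = np.sqrt(mean_squared_error(y_te, pred))
    print(f"{name:20s} RMSE={rmse:.3f}  R2={r2_score(y_te, pred):.3f}")
```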

15.
Integrated assessment (IA) can be defined as a structured process of dealing with complex issues, using knowledge from various scientific disciplines and/or stakeholders, such that integrated insights are made available to decision makers (J. Rotmans, Environmental Modelling and Assessment 3 (1998) 155). There is a growing recognition that the participation of stakeholders is a vital element of IA. However, little is known about the methodological requirements for such participatory IA and the insights to be gained from these approaches. This paper summarizes some of the experiences gathered in the ULYSSES project, which aims at developing procedures that are able to bridge the gap between environmental science and democratic policy making for the issue of climate change. The discussion is based on a total of 52 IA focus groups with citizens, run in six European cities and one US city. In these groups, different computer models were used, ranging from complex and dynamic global models to simple accounting tools. The analysis in this paper focuses on the role of the computer models. The findings suggest that the computer models were successful at conveying to participants the temporal and spatial scale of climate change, the complexity of the system and the uncertainties in our understanding of it. However, most participants felt that the computer models were less instrumental for the exploration of policy options. Furthermore, both research teams and participants agreed that despite considerable efforts, most models were not sufficiently user-friendly and transparent to be used in an IA focus group. Against that background, some methodological conclusions are drawn about the inclusion of computer models in the deliberation process. Furthermore, some suggestions are made about how existing models should be adapted and new ones developed in order to be helpful for participatory IA.

16.
In this study, an algorithm combining a multi-objective genetic algorithm (GA)-based optimization model and a water quality simulation model is developed for determining a trade-off curve between objectives related to the allocated water quantity and quality. To reduce the run-time of the GA-based optimization model, the main problem is decomposed into long-term and annual optimization models. The reliability of water supply is considered to be the objective function in the long-term stochastic optimization model, whereas the objective functions of the annual models are related to both the allocated water quantity and quality. The operating policies obtained using this long-term model provide the time series of the optimum reservoir water storages at the beginning and the end of each water year. In the next step, these optimal reservoir storage values are considered as constraints for water storage in the annual reservoir operation optimization models. The epsilon-constraint method is then used to develop a trade-off curve between the reliability of water supply and the average allocated water quality. The Young conflict resolution theory, which incorporates the existing conflicts among decision-makers and stakeholders, is used for selecting the best solution on the trade-off curve. The monthly reservoir operating rules are then calculated using an Adaptive Neuro-Fuzzy Inference System, which is trained using the optimal operating policies. The proposed model is applied to the 15-Khordad Reservoir in the central part of Iran. The results show that this simplified procedure does not reduce the accuracy of the reservoir operating policies and can effectively reduce the computational burden of the previously developed models.
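The epsilon-constraint step can be illustrated in isolation: keep one objective (water quality) above a bound epsilon, maximize the other (supply reliability), and sweep epsilon to trace the trade-off curve. The two toy objective functions in the Python sketch below are illustrative assumptions that stand in for the GA and simulation models of the study.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in objectives for a single release decision x in [0, 1]:
# supplying more water raises supply reliability but degrades allocated water quality.
reliability = lambda x: 1.0 - np.exp(-3.0 * x)     # to be maximized
quality = lambda x: 1.0 - 0.8 * x ** 1.5           # to be kept above the bound epsilon

pareto = []
for eps in np.linspace(0.3, 0.95, 8):              # sweep the epsilon bound on quality
    res = minimize(lambda x: -reliability(x[0]),    # maximize reliability
                   x0=[0.5], bounds=[(0.0, 1.0)],
                   constraints=[{"type": "ineq", "fun": lambda x, e=eps: quality(x[0]) - e}])
    pareto.append((eps, reliability(res.x[0]), quality(res.x[0])))

for eps, rel, qual in pareto:                       # points on the trade-off curve
    print(f"eps={eps:.2f}  reliability={rel:.3f}  quality={qual:.3f}")
```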

17.
A review of numerical ozone forecasting models
Photochemical air quality models play a central role in the study of ozone (O3) pollution and in O3 forecasting, and are a powerful tool for decision-makers concerned with O3 pollution control. Drawing on current research and applications of regional-scale photochemical air quality forecasting models in China and internationally, this review focuses on the mathematical representation and computational treatment of O3-related atmospheric chemistry in numerical forecasting models, describes how atmospheric physical and chemical processes are implemented in the mainstream numerical air quality forecasting models together with the strengths and weaknesses of these implementations, and introduces recent advances in the parameterization schemes for atmospheric physical processes and turbulence used in such models. The main input data for current O3 numerical simulation are discussed, with emphasis on factors that are easily overlooked yet significantly affect model forecasting capability and performance, and on the importance of evaluating model performance. In view of the relationship between O3 and complex multi-pollutant air pollution, the development trends and directions of regional numerical air quality forecasting models and their significance and role in atmospheric environmental management are highlighted.

18.
The possibility of acquiring real-time concentration data is leading many indoor air quality and health researchers to use particle measuring instruments instead of the classic filtration approach. This paper summarizes a checklist of characteristics that have to be considered in the selection of such instruments and checks three air monitoring devices suitable for environmental exposure research against it. An evaluation table of desirable technical, economic, and logistics characteristics was summarized in a checklist, and the spec sheets of the three devices were checked against it. Technical, economic, and logistics aspects all have to be considered, and suitability, measurement range, accuracy, resolution, and robustness are indispensable metrological characteristics. Only one instrument complied with the checklist, and a popular air monitoring device among environmental exposure researchers failed the accuracy check. When selecting an instrument for a study, a lack of information on the quality of its results is a strong indication that it should not be considered, as the study's findings may be compromised.

19.
20.
OSPM - A Parameterised Street Pollution Model
For many practical applications, for example in support of air pollution management, numerical models based on solution of the basic flow and dispersion equations are still too complex. An alternative is parameterised, semi-empirical models that make use of a priori assumptions about the flow and dispersion conditions. However, such models must be thoroughly tested and their performance and limitations carefully documented. The Danish Operational Street Pollution Model (OSPM) belongs to this category of parameterised models. In the OSPM, concentrations of exhaust gases are calculated using a combination of a plume model for the direct contribution and a box model for the recirculating part of the pollutants in the street. The parameterisation of flow and dispersion conditions in street canyons was deduced from extensive analysis of experimental data and model tests. The results of these tests were used to further improve the model performance, especially with regard to different street configurations and a variety of meteorological conditions.
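As an illustration of the two-part structure described (a plume term for the direct contribution plus a box term for the recirculating pollution), here is a heavily simplified Python sketch. The parameterisations and constants are illustrative stand-ins, not the published OSPM formulations.

```python
def street_concentration(emission, roof_wind, street_width, building_height, background=0.0):
    """Simplified OSPM-like street-canyon estimate (illustrative only).

    emission        line-source strength from traffic, g/(m s)
    roof_wind       wind speed above roof level, m/s
    returns         street-level concentration, g/m^3
    """
    u_street = max(0.5 * roof_wind, 0.2)          # reduced in-canyon wind speed (assumed scaling)
    sigma_w = 0.1 * u_street + 0.4                # traffic-produced turbulence (assumed scaling)
    # Plume term: direct dilution of fresh exhaust along the street width.
    direct = emission / (street_width * (u_street + sigma_w))
    # Box term: recirculating pollution ventilated through the canyon roof,
    # with deeper canyons assumed to trap a larger share of the emissions.
    recirc = emission * min(1.0, building_height / street_width) / (street_width * sigma_w)
    return background + direct + recirc

print(f"{street_concentration(0.002, roof_wind=3.0, street_width=20.0, building_height=18.0):.2e} g/m^3")
```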
