Similar Literature (20 results found)
1.
Abstract: While training a Neural Network to model a rainfall‐runoff process, two aspects are generally considered: its capacity to describe the complex nature of the processes being modeled, and its ability to generalize so that novel samples are mapped correctly. The general conclusion is that the smallest network capable of representing the sample distribution is the best choice as far as generalization is concerned. Oftentimes input variables are selected a priori, in what is called an exploratory data analysis stage, and are not part of the actual network training and testing procedures. When they are, the final model will have only a “fixed” type of inputs, lag‐space, and/or network structure. If one of these constituents were to change, one would obtain another equally “optimal” Neural Network. Following the generalized likelihood uncertainty estimation approach of Beven and others, a methodology is introduced here that accounts for uncertainties in network structure, types of inputs, and their lag‐space relationships by looking at a population of Neural Networks rather than targeting a single “optimal” network. It is shown that there is a wide array of networks that provide “similar” results, as judged by a likelihood measure, for different combinations of input type, lag‐space, and network size. These equally optimal networks expose the range of uncertainty in streamflow predictions, and their expected value yields better performance than any single network's predictions.
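The core of the approach above is likelihood-weighted averaging over a population of "behavioural" models. Below is a minimal sketch of that GLUE-style idea, using the Nash–Sutcliffe efficiency as an informal likelihood measure and a toy three-member ensemble; the thresholds, data, and choice of likelihood are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def glue_weights(sim, obs, threshold=0.0):
    """Likelihood weights for an ensemble of model runs (GLUE-style).

    sim: (n_models, n_times) simulated flows; obs: (n_times,) observed flows.
    Uses Nash-Sutcliffe efficiency as an informal likelihood; runs at or
    below the behavioural threshold get zero weight.
    """
    obs = np.asarray(obs, dtype=float)
    nse = 1.0 - np.sum((sim - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
    L = np.where(nse > threshold, nse, 0.0)   # keep behavioural runs only
    if L.sum() == 0:
        raise ValueError("no behavioural runs")
    return L / L.sum()

def ensemble_prediction(sim, weights):
    """Likelihood-weighted expected value across equally plausible models."""
    return weights @ sim

# toy ensemble: three "networks", the third one non-behavioural
obs = np.array([1.0, 2.0, 3.0, 2.0])
sim = np.array([[1.1, 2.0, 2.9, 2.1],
                [0.9, 2.1, 3.1, 1.9],
                [3.0, 0.5, 1.0, 4.0]])
w = glue_weights(sim, obs)
pred = ensemble_prediction(sim, w)
```

The spread of the behavioural simulations around `pred` is what exposes the prediction uncertainty the abstract refers to.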

2.
ABSTRACT: The effects of changing nutrient inputs through land use management, waste water treatment, or effluent diversion are not clear, and managers are discovering that decisions which were effective in reversing eutrophication for one lake are often unsuccessful when applied to another. Simple empirical relationships are often used to predict the impact of management decisions. Errors in estimation could result in either substantial costs for overdesign or failure to meet desired eutrophication levels. This paper presents and illustrates a methodology to evaluate the impact of land use and water resource management decisions on lake eutrophication. The problems of the worth of additional information and the uncertainty of estimates were handled within a cost-effectiveness framework. The probability of exceeding a critical level of eutrophication was considered as a measure of effectiveness. The cost criterion is the expected value of opportunity costs, the costs of analysis, and the costs of additional information. Uncertainty analysis techniques were used to estimate the effectiveness of various management alternatives. Bayesian methods can be utilized to determine the worth of additional information. The methodology was applied to Beseck Lake, Connecticut, and the cost and effectiveness measures estimated for a number of land management alternatives. Worth of additional information was not determined in this initial effort in uncertainty analysis for lake eutrophication management.
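The effectiveness measure above, the probability of exceeding a critical eutrophication level, can be estimated by Monte Carlo simulation. The sketch below assumes a hypothetical critical total-phosphorus level, lognormal uncertainty around each alternative's predicted in-lake concentration, and made-up costs; none of these numbers come from the Beseck Lake study.

```python
import numpy as np

rng = np.random.default_rng(42)
CRITICAL_TP = 0.025   # mg/L, hypothetical critical total-phosphorus level

def exceedance_probability(mean_tp, cv, n=100_000):
    """P(in-lake TP exceeds the critical level), with lognormal uncertainty
    around the model-predicted mean (coefficient of variation cv)."""
    sigma = np.sqrt(np.log(1.0 + cv ** 2))
    mu = np.log(mean_tp) - 0.5 * sigma ** 2   # so that E[TP] = mean_tp
    samples = rng.lognormal(mu, sigma, n)
    return float(np.mean(samples > CRITICAL_TP))

# hypothetical management alternatives: (predicted TP mg/L, annualised cost $)
alternatives = {
    "no action":         (0.040, 0.0),
    "land-use controls": (0.028, 50_000.0),
    "sewer diversion":   (0.018, 220_000.0),
}
for name, (tp, cost) in alternatives.items():
    p = exceedance_probability(tp, cv=0.3)
    print(f"{name:18s} P(exceed) = {p:.2f}  cost = ${cost:,.0f}")
```

Pairing each alternative's exceedance probability with its expected cost gives exactly the cost-effectiveness trade-off the methodology formalises.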

3.
Narrowing the decision space is crucial in water quality management at the meso-scale for developing countries, where a lack of data and financial budgets prevent the development of appropriate management plans and result in serious water quality degradation in many rivers. In this study, a framework for handling this task is proposed, comprising a lumped water quality model, with sensitivity and uncertainty analyses, and a management domain, including loss estimation and value of information analysis. Through a case study with linear alkylbenzene sulfonate (LAS) in the Yodo River, it is found that non-point sources and flow rate are factors that influence LAS concentration at the hot spot location. By considering the entire process of water quality management planning, we identify that the definition of the cost function of LAS treatment determines the appropriate estimation for the expected loss in reducing LAS under uncertain water quality conditions. The value of information analysis with “expected value of including uncertainty” and “expected value of perfect information” further helps estimate the benefit of including uncertainty in decision-making and the financial cost for obtaining more information regarding inputs that have been previously prioritized.
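The two quantities named in the abstract, "expected value of including uncertainty" (EVIU) and "expected value of perfect information" (EVPI), can be shown on a deliberately tiny decision problem. The two-state, two-action loss matrix below is entirely hypothetical, standing in for a treat/don't-treat LAS decision.

```python
import numpy as np

# Hypothetical two-state, two-action problem:
# states = LAS load (low/high), actions = treatment level.
# loss[a, s] = loss of action a when the true state is s (made-up numbers).
p = np.array([0.6, 0.4])                # prior over states
loss = np.array([[0.0, 100.0],          # no treatment: fails if load is high
                 [30.0, 30.0]])         # treat: fixed cost either way

# Decision ignoring uncertainty: act as if the most likely state were certain.
a_ignore = int(np.argmin(loss[:, np.argmax(p)]))
exp_loss_ignore = float(loss[a_ignore] @ p)

# Decision including uncertainty: minimise expected loss over the prior.
exp_loss = loss @ p
a_bayes = int(np.argmin(exp_loss))
exp_loss_bayes = float(exp_loss[a_bayes])

# EVIU: expected gain from including uncertainty in the decision.
eviu = exp_loss_ignore - exp_loss_bayes
# EVPI: expected gain from learning the true state before acting.
evpi = exp_loss_bayes - float(loss.min(axis=0) @ p)
```

EVPI bounds what it is worth paying for better monitoring data; EVIU quantifies the benefit of the uncertainty analysis itself.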

4.
As a proactive step towards understanding future waste management challenges, this paper presents a future oriented material flow analysis (MFA) used to estimate the volume of lithium-ion battery (LIB) wastes to be potentially generated in the United States due to electric vehicle (EV) deployment in the near and long term future. Because future adoption of LIB and EV technology is uncertain, a set of scenarios was developed to bound the parameters most influential to the MFA model and to forecast “low,” “baseline,” and “high” projections of future end-of-life battery outflows from years 2015 to 2040. These models were implemented using technology forecasts, technical literature, and bench-scale data characterizing battery material composition. Considering the range from the most conservative to most extreme estimates, a cumulative outflow between 0.33 million metric tons and 4 million metric tons of lithium-ion cells could be generated between 2015 and 2040. Of this waste stream, only 42% of the expected materials (by weight) is currently recycled in the U.S., including metals such as aluminum, cobalt, copper, nickel, and steel. Another 10% of the projected EV battery waste stream (by weight) includes two high value materials that are currently not recycled at a significant rate: lithium and manganese. The remaining fraction of this waste stream will include materials with low recycling potential, for which safe disposal routes must be identified. Results also indicate that because of the potential “lifespan mismatch” between battery packs and the vehicles in which they are used, batteries with high reuse potential may also be entering the waste stream. As such, a robust end-of-life battery management system must include an increase in reuse avenues, expanded recycling capacity, and ultimate disposal routes that minimize risk to human and environmental health.
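The mechanical core of a future-oriented MFA of this kind is delaying an inflow time series by a lifespan distribution to obtain end-of-life outflows. The sketch below uses an invented linear sales curve and a uniform 8-to-12-year retirement window; the real study's scenario inputs are far richer.

```python
import numpy as np

def battery_outflow(sales, lifespan_pmf):
    """Dynamic MFA sketch: delay annual inflows (packs entering use) by a
    discrete lifespan distribution to get end-of-life outflows.

    sales[t]: packs sold in year t; lifespan_pmf[k]: P(pack retires k years
    after sale). Outflows past the modelling horizon are simply dropped.
    """
    years = len(sales)
    out = np.zeros(years)
    for t, s in enumerate(sales):
        for k, pk in enumerate(lifespan_pmf):
            if t + k < years:
                out[t + k] += s * pk
    return out

# hypothetical scenario: sales grow linearly over 2015..2040 (26 years)
sales = np.linspace(0.1, 2.0, 26)        # million packs per year
life = np.zeros(16)
life[8:13] = 0.2                         # retire 8-12 years after sale
outflow = battery_outflow(sales, life)
cumulative = outflow.sum()               # million packs retired by 2040
```

Running the same delay model under "low", "baseline", and "high" sales and lifespan assumptions brackets the cumulative outflow, which is how the scenario range in the abstract arises.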

5.
ABSTRACT

This paper presents a qualitative case study of community participation in local air quality management in Nottingham (UK). We analyse Nottingham’s response to a “clean air zone” mandate: despite national government and local community support of this congestion charging policy, the City Council rejected the measure. We focus on the policy framing, with data from policy documents, interviews with government and non-government actors, and observation in local activities. We found that community groups build links with local government in two ways: (1) as a coalition against the national government and austerity measures, and (2) as “neutral”, non-expert communicators of air pollution as an “invisible” policy problem. We show how this invisibility plays a significant role in factors such as trust, risk, responsibility, and policy communication. This research has theoretical implications for the communication of air pollution and practical implications for cities looking to implement similar transport-oriented strategies.

6.
This paper discusses the sensitivity of the value of information to the risk aversion in two-action decision problems when the initial wealth is uncertain. We demonstrate that there is no general monotonicity between information value and the Arrow–Pratt risk aversion in this setting. We then show that monotonicity exists in the sense of Rubinstein’s measure of risk aversion when the lottery is independent of the initial wealth. Finally, we show that if the lottery is dependent on the initial wealth, then Ross’s measure of risk aversion is needed to characterize this monotonic relation. Our results explain the shape of the sensitivity analysis curve of the value of information to risk aversion and interpret various measures of risk aversion based on their monotonicity with information value.
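The sensitivity-analysis curve discussed above can be traced numerically for the simplest case: a two-action problem (take a lottery or take 0) under CARA utility with known initial wealth. This is a sketch of how such a curve is computed, not the paper's setting of uncertain initial wealth; the lottery and risk-aversion values are arbitrary.

```python
import numpy as np

def certainty_equivalent(payoffs, probs, a):
    """CE under CARA utility u(x) = -exp(-a x), risk aversion a > 0."""
    eu = np.sum(probs * -np.exp(-a * np.asarray(payoffs, float)))
    return -np.log(-eu) / a

def value_of_information(lottery, probs, a):
    """Two actions: take the lottery or take 0. Perfect information reveals
    the outcome before the choice, so the informed payoff is max(x, 0)."""
    ce_without = max(certainty_equivalent(lottery, probs, a), 0.0)
    informed = np.maximum(lottery, 0.0)
    ce_with = certainty_equivalent(informed, probs, a)
    return ce_with - ce_without

lottery = np.array([-50.0, 100.0])
probs = np.array([0.5, 0.5])
voi_low = value_of_information(lottery, probs, a=0.001)
voi_high = value_of_information(lottery, probs, a=0.05)
```

Evaluating `value_of_information` over a grid of `a` values produces the VOI-versus-risk-aversion curve whose shape the paper characterises.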

7.
This article describes and proposes the “environmental subsidiarity principle” as a guiding ethical value in forestry governance. Different trends in environmental management such as local participation, decentralization or global governance have emerged in the last two decades at the global, national and local level. This article suggests that the conscious or unconscious application of subsidiarity has been the ruling principle that has allocated the level at which tasks have been assigned to different agents. Based on this hypothesis this paper describes the principle of subsidiarity and its application to environmental policies within forest governance and proposes the “environmental subsidiarity” principle as a critical conceptual tool for sustainable resource management. The paper explains, as an example, how “environmental subsidiarity” is the key principle that can link payment for ecosystem services (PES) with environmental public policies, and applies this principle with all its political consequences to the architecture of reducing emissions from deforestation and forest degradation, and enhancing forest carbon stocks in developing countries (REDD+). It concludes by showing how the adoption of “environmental subsidiarity” as an ethical principle could help to maximize benefits to all stakeholders involved in PES schemes such as REDD+.

8.
This work presents a method to aid in the prioritization of research within a scientific domain. The domain is encoded into a directed network in which nodes represent factors in the domain, and directed links between nodes represent known or hypothesized causal relationships between the factors. Each link is associated with a numeric weight that indicates the degree of understanding of that hypothesis. Increased understanding of hypotheses is represented by higher weights on links in the network. Research is prioritized by calculating optimal allocations of limited research resources across all links in the network that maximize the degree of overall knowledge of the research domain. We quantify the level of knowledge of individual nodes (factors) in the map by a network centrality measure that reflects dependencies between the knowledge level of nodes and the knowledge level of their parent nodes in the map. We analyzed a funded research proposal concerning the fate and transport of nanomaterials in the environment to illustrate the method.

9.
Risk management of chemicals requires information about their adverse effects, such as toxicity and persistence. Testing of chemicals allows for improving the information base for regulatory decision‐making on chemicals' production and use. Testing a large number of chemicals with limited time and resources forces a prioritization of chemicals. This paper proposes a decision model that provides a ranking of chemicals according to “urgency to test”. The model adopts a value‐of‐information approach describing the expected welfare gains from regulatory actions that respond to test information. We determine the value‐of‐information of tests revealing chemicals' levels of toxicity and persistence. We compare our findings to the prioritization of chemicals in the new European Chemicals Regulation “REACH”, where several tens of thousands of chemicals are to be tested in order to fill existing information gaps and to implement more effective risk management. We find that the main lines of chemicals' prioritization under REACH receive backing from our decision model. However, prioritization for testing can be further improved by accounting for testing costs and the sensitivity of regulatory action with respect to the test information.

10.
Electric utility companies in the United States and Canada are participating in a unique environmental benchmarking program (EBP) designed to comparatively assess environmental performance. Each company annually provides data on emissions/wastes, compliance, and other environmental aspects to an independent consultant, Research Triangle Institute (RTI), whose staff compiles the data and computes approximately 80 discrete environmental performance metrics for each company. RTI provides a confidential report for each participant presenting the results of the assessment. In addition, RTI conducts a follow-up assessment to determine “best-practices” of each of the top performers for each metric. The EBP provides each participant with useful information on its strengths and weaknesses relative to the other companies in the program as well as ideas on how to improve its environmental performance. Annual participation in the program allows a company to measure improvement in performance on an annual basis. This article summarizes the evolution of the EBP and describes the study methods and reporting approach. Our goal in sharing this information is to demonstrate the usefulness of the program and encourage further participation by all major North American electric utilities.

11.
Europe is severely affected by alien invasions, which impact biodiversity, ecosystem services, economy, and human health. A large number of national, regional, and global online databases provide information on the distribution, pathways of introduction, and impacts of alien species. The sufficiency and efficiency of the current online information systems to assist the European policy on alien species was investigated by a comparative analysis of occurrence data across 43 online databases. Large differences among databases were found, which are partly explained by variations in their taxonomic, environmental, and geographical scopes, but also by variable efforts at continuous updating and by inconsistencies in the definition of “alien” or “invasive” species. No single database covered all European environments, countries, and taxonomic groups. In many European countries national databases do not exist, which greatly affects the quality of reported information. To be operational and useful to scientists, managers, and policy makers, online information systems need to be regularly updated through continuous monitoring on a country or regional level. We propose the creation of a network of online interoperable web services through which information in distributed resources can be accessed, aggregated and then used for reporting and further analysis at different geographical and political scales, as an efficient approach to increase the accessibility of information. Harmonization, standardization, conformity with international standards for nomenclature, and agreement on common definitions of alien and invasive species are among the necessary prerequisites.

12.
The International Energy Agency (IEA), together with the International Atomic Energy Agency (IAEA), UN‐DESA, Eurostat and the European Environmental Agency, has recently published a comprehensive joint‐agency overview of energy indicators for sustainable development. The IEA's contribution to this publication is based on the IEA energy indicator approach, which has been developed and used by the IEA over a number of years. The indicators advocated by the IEA are relatively disaggregated to allow for meaningful analysis of sustainability issues in the energy sector. Using a decomposition approach helps reveal the causal links between human/economic driving forces, energy use and emissions. This article presents examples of the IEA's work with indicators and an overview of the methodology used, including an explanation of the link to sustainable development. It also provides an example of a simplified indicator analysis of India, to illustrate the importance of improved data systems in developing indicators that can provide meaningful policy analysis.
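A standard illustration of the decomposition approach mentioned above is an additive LMDI-I decomposition through the Kaya identity, CO2 = POP × (GDP/POP) × (E/GDP) × (CO2/E), which splits an emissions change exactly into population, affluence, energy-intensity, and carbon-intensity effects. The figures below are invented index numbers, not IEA data for India.

```python
import math

def logmean(a, b):
    """Logarithmic mean, the weighting function used by LMDI-I."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

def lmdi_kaya(pop0, gdp0, energy0, co2_0, pop1, gdp1, energy1, co2_1):
    """Additive LMDI-I decomposition of a CO2 change via the Kaya identity
    CO2 = POP * (GDP/POP) * (E/GDP) * (CO2/E). Effects sum exactly to the
    total change co2_1 - co2_0."""
    f0 = [pop0, gdp0 / pop0, energy0 / gdp0, co2_0 / energy0]
    f1 = [pop1, gdp1 / pop1, energy1 / gdp1, co2_1 / energy1]
    L = logmean(co2_1, co2_0)
    effects = [L * math.log(a / b) for a, b in zip(f1, f0)]
    return dict(zip(["population", "affluence", "intensity", "carbon"], effects))

# hypothetical two-year comparison in index units
eff = lmdi_kaya(pop0=1000, gdp0=500, energy0=400, co2_0=300,
                pop1=1200, gdp1=800, energy1=550, co2_1=380)
total = sum(eff.values())   # equals 380 - 300 by construction
```

Because the four factor ratios multiply back to the CO2 ratio, the four effects sum exactly to the observed emissions change, which is what makes the causal attribution meaningful.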

13.
Geological surveys worldwide are involved with research in support of sustainable mineral resource development. The socio-economic benefits to be derived from these activities, however, continue to raise organisational and government sector questions. Fundamental questions include whether or not the resources committed are appropriate and in economic balance with the total benefits to be derived. Another question concerns the degree to which such services should be funded by the community at large. These questions in turn raise important issues regarding the role and cost of geological surveys, the impact of their services, and how they should maximise community benefit from their activities and expertise. To assess the value of geoscientific information, standard valuation processes need to be modified. This paper reports on a methodology designed to quantify the 'worth' of programmes upgrading regional geoscientific infrastructure. An interdisciplinary approach is used to measure the impact of geoscientific information using quantitative resource assessment, computer-based mineral potential modelling, statistical analysis and risk quantification to model decision-processes and assess the impact of additional data. These modelling stages are used to address problems of complexity, uncertainty and credibility in the valuation of geoscientific data. A case study demonstrates the application of the methodology to generate a dollar value for current regional data upgrade programmes in the Geological Survey of Queensland. The results obtained are used for strategic planning of future data acquisition programmes aimed at supporting mineral resource management and stimulating effective exploration activity.

14.
In light of the rapid and continuous growth of the built environment worldwide and its attendant ecological impact, increasing the size and distribution of open space in urban areas is recognised as one effective way to reconcile the social and ecological objectives of society. However, there is no simple and objective indicator to measure open space that can be used for creating and maintaining sustainable landscapes. The paper introduces a metric, open space index (OSI), that measures the amount of space unpenetrated by the built environment. The metric is calculated by measuring the shortest distance between any location and the nearest built environment using a Geographic Information System. The metric is illustrated using two counties of the greater Twin Cities Metropolitan Region of Minnesota. The sensitivity of OSI to the size, shape, and spatial configuration of the built environment suggests that the metric can serve as an important planning tool for reconciling conservation and development in a wide range of contexts.
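The distance computation behind such an index is easy to sketch on a raster grid: for every cell, find the distance to the nearest built cell, then summarise. The brute-force nearest-neighbour search and the 9×9 toy grids below are illustrative stand-ins for the paper's GIS-based computation.

```python
import numpy as np

def open_space_index(built, cell_size=1.0):
    """Mean distance from each cell to the nearest built cell; a simple
    stand-in for the paper's OSI. built: 2D boolean array, True where the
    built environment is present. Brute force, fine for small grids."""
    built = np.asarray(built, bool)
    rows, cols = np.nonzero(built)
    if rows.size == 0:
        raise ValueError("no built cells")
    yy, xx = np.indices(built.shape)
    # distance from every cell to every built cell, then take the minimum
    d = np.sqrt((yy[..., None] - rows) ** 2 + (xx[..., None] - cols) ** 2)
    nearest = d.min(axis=-1) * cell_size
    return float(nearest.mean())

# compact vs dispersed development with the same built area (9 cells each):
# dispersal leaves no deep open space, so the index drops
compact = np.zeros((9, 9), bool)
compact[:3, :3] = True                   # one 3x3 block in a corner
dispersed = np.zeros((9, 9), bool)
dispersed[::4, ::4] = True               # 9 cells scattered evenly
osi_compact = open_space_index(compact)
osi_dispersed = open_space_index(dispersed)
```

The comparison captures the sensitivity to spatial configuration noted in the abstract: identical built areas yield very different index values depending on how the development is arranged.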

15.
ABSTRACT The problem of water resources management can be viewed as one requiring the existence and application of some type of “collective decision” mechanism. Currently, the general water resource decision problem is solved using an “individual decision” format without explicit consideration of the dominant social decision system. This paper demonstrates the need for blending technical planning activities with organized societal processes and then proposes a specific public decision framework to satisfy this requirement. The key element in this planning framework is a generalized “bargaining arena” which serves to link technical activities with the social system. Using this bargaining device we can (1) specify policy at a local level, (2) incorporate “social decision” rules into the planning process, and (3) provide local access to the decision process. A simple case of regional water quality management is used to describe the application of this planning procedure and to offer encouragement for successful use in more complex real-world cases.

16.
Global climate change will influence environmental conditions including temperature, surface radiation, soil moisture, and sea level, and it will also significantly impact regional-scale hydrologic processes such as evapotranspiration (ET), precipitation, runoff, and snowmelt. The quantity and quality of water available for drinking and other domestic use are also likely to be affected by changes in these processes. Consequently, it is necessary to assess and reflect upon the challenges ahead for water infrastructure and the general public in metropolitan regions. One approach to the problem is to use index-based assessment, forecasting and planning. Previously developed drought indices were not designed for domestic water supplies, and are thus insufficient for such an assessment. This paper aims to propose and develop a “Metropolitan Water Availability Index (MWAI)” to assess the status of both the quantity and quality of available potable water sources diverted from the hydrologic cycle in a metropolitan region. In this approach, the accessible water may be expressed as volume per month or week (i.e., m3/month or m3/week) relative to a prescribed historical record, and such a trend analysis may result in final MWAI values ranging from −1 to +1 for regional water management decision making. The MWAI computation uses data and information from both historical point measurements and spatial remote-sensing-based monitoring. Variables such as precipitation, river discharge, and water quality changes at drinking water plant intakes at specific locations are past “point” measurements in MWAI calculations. Remote sensing, on the other hand, provides information on both the spatial and temporal distributions of key variables; relevant sensing and sensor-network technologies include in-situ sensor networks, ground-based radar, airborne aircraft, and even space-borne satellites.
A case study in Tampa Bay, Florida, is described to demonstrate the short-term assessment of the MWAI concept at a practical level. It is anticipated that such a forecasting methodology may be extended for middle-term and long-term water supply assessment.
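One simple way to map a current volume onto a score in [−1, +1] relative to a historical record is a range-scaled anomaly around the historical median. The formula and the record below are purely illustrative; the paper's MWAI combines several quantity and quality variables.

```python
import numpy as np

def mwai(current, historical):
    """Illustrative availability score in [-1, +1]: the current monthly
    volume (m^3/month) compared with the historical record, scaled by the
    historical range. +1 ~ record high, 0 ~ historical median, -1 ~ at or
    below the record low. Not the paper's actual MWAI formula."""
    historical = np.asarray(historical, float)
    lo, mid, hi = np.min(historical), np.median(historical), np.max(historical)
    if current >= mid:
        score = (current - mid) / (hi - mid)
    else:
        score = (current - mid) / (mid - lo)
    return float(np.clip(score, -1.0, 1.0))

record = [80e6, 95e6, 100e6, 110e6, 120e6]   # m^3/month, hypothetical
```

Tracking this score through time, with the record built from both point measurements and remote-sensing estimates, is the kind of trend analysis the abstract describes.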

17.
The management of the plastic fraction is one of the most debated issues in the discussion on integrated municipal solid waste systems. Both material and energy recovery can be performed on such a waste stream, and different separate collection schemes can be implemented. The aim of the paper is to contribute to the debate, based on the analysis of different plastic waste recovery routes. Five scenarios were defined and modelled with a life cycle assessment approach using the EASEWASTE model. In the baseline scenario (P0) the plastic is treated as residual waste and routed partly to incineration with energy recovery and partly to mechanical biological treatment. A range of potential improvements in plastic management is introduced in the other four scenarios (P1–P4). P1 includes source separation of clean plastic fractions for material recycling, whereas P2 includes source separation of a mixed plastic fraction for mechanical upgrading and separation into specific polymer types, with the residual plastic fraction being down-cycled and used for “wood items”. In P3 a mixed plastic fraction is source separated together with metals in a “dry bin”. In P4 plastic is mechanically separated from residual waste prior to incineration. A sensitivity analysis on the marginal energy was carried out: as a first step, scenarios were modelled assuming that marginal electricity and heat were based on coal and on a mix of fuels; in the sensitivity analysis, the marginal energy was instead based on natural gas. The study confirmed the difficulty of clearly identifying an optimal strategy for plastic waste management: none of the examined scenarios emerged unequivocally as the best option for all impact categories. When moving from the P0 treatment strategy to the other scenarios, substantial improvements can be obtained for “Global Warming”. For the other impact categories, results are affected by the assumption about the substituted marginal energy.
Nevertheless, irrespective of the assumptions on marginal energy, scenario P4, which implies the highest quantities of specific polymer types sent to recycling, proved to be the best option in most impact categories.

18.
ABSTRACT: This paper presents a simple methodology, using the entropy concept, to estimate regional hydrologic uncertainty and information at both gaged and ungaged grids in a basin. The methodology described in this paper is applicable for (a) the selection of the optimum station from a dense network, using maximization of information transmission criteria, and (b) expansion of a network using data from an existing sparse network by means of the information interpolation concept and identification of the zones of minimum hydrologic information. The computation of single and joint entropy terms used in the above two cases depends upon single and multivariable probability density functions. In this paper, these terms are derived for the gamma distribution. The derived formulation for optimum hydrologic network design was tested using the data from a network of 29 rain gages on Sleeper River Experimental Watershed. For the purpose of network reduction, the watershed was divided into three subregions, and the optimum stations and their locations in each subregion were identified. To apply the network expansion methodology, only the network consisting of 13 stations was used, and feasible triangular elements were formed by joining the stations. Hydrologic information was calculated at various points on the line segments, and critical information zones were identified by plotting information contours. The entropy concept used in this paper, although derived for the single and bivariate gamma distributions, is general in type and can easily be modified for other distributions by a simple variable transformation criterion.
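The single-variable entropy terms above come from the differential entropy of the gamma distribution. A quick numeric sketch can check the textbook closed form, H = k + ln(θ) + ln(Γ(k)) + (1 − k)ψ(k), against direct integration of −f ln f; the grid limits and the k = 1 (exponential) sanity case are choices of this sketch, not the paper's.

```python
import math
import numpy as np

def gamma_entropy_numeric(k, theta, upper=200.0, n=200_000):
    """Differential entropy of a gamma(k, theta) variable by trapezoidal
    integration of -f(x) ln f(x), valid for k >= 1 (the integrand is
    singular at 0 for k < 1). Closed form: k + ln(theta) + ln(Gamma(k))
    + (1 - k) * psi(k), where psi is the digamma function."""
    x = np.linspace(1e-9, upper, n)
    logf = ((k - 1.0) * np.log(x) - x / theta
            - k * math.log(theta) - math.lgamma(k))
    y = -np.exp(logf) * logf
    dx = x[1] - x[0]
    return float(dx * (y.sum() - 0.5 * (y[0] + y[-1])))   # trapezoid rule

# sanity check: for k = 1 the gamma is exponential and H = 1 + ln(theta)
h = gamma_entropy_numeric(k=1.0, theta=2.0)
```

Joint entropies for the bivariate gamma case follow the same pattern with a double integral over the joint density.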

19.
A study was made to analyze and modify procedures used for stream assimilation capacity and point source wasteload allocation calculations. This paper describes the sources and types of information collected and the analysis of alternative computation methods developed during the study. The calculation of stream assimilation capacity, or Total Maximum Daily Load (TMDL), will depend upon assumed stream flows, quality standards, reaction rates, and modeling procedures. The “critical conditions” selected for TMDL calculations usually are low flows and warm temperatures. The complexity of water quality models used for TMDL and allocation calculations can range from simple, complete mixing to calibrated and verified mathematical models. A list of 20 wasteload allocation (WLA) methods was developed. Five of these WLA methods were applied to an example stream to permit comparisons based on cost, equity, efficient use of stream assimilation capacity, and sensitivity to fundamental stream quality data. Based on insensitivity to data errors and current use by several states, the WLA method of “equal percent treatment” was preferable in the example stream.
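The "equal percent treatment" rule named above is simple to state: every point source removes the same fraction of its raw load so that the total discharge fits the stream's assimilation capacity. The loads and TMDL below are hypothetical numbers for illustration.

```python
def equal_percent_treatment(raw_loads, tmdl, background=0.0):
    """'Equal percent treatment' wasteload allocation: all dischargers
    remove the same fraction of their raw load so the total discharged
    load fits the allocatable assimilation capacity (TMDL minus any
    background load)."""
    allocatable = tmdl - background
    total_raw = sum(raw_loads)
    if allocatable >= total_raw:
        removal = 0.0                    # capacity already sufficient
    else:
        removal = 1.0 - allocatable / total_raw
    return removal, [l * (1.0 - removal) for l in raw_loads]

# hypothetical BOD loads (kg/day) for three dischargers on one reach
removal, allocations = equal_percent_treatment([400.0, 250.0, 150.0],
                                               tmdl=320.0)
```

Because the rule depends only on the ratio of allocatable capacity to total raw load, errors in individual discharger data do not change anyone's required removal fraction, which is the insensitivity the study found attractive.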

20.
Softening drinking water before distribution yields environmental advantages, such as lower household product consumption, less scaling in piping and machines, and the avoidance of decentralized, domestic softeners. Central softening is under consideration in Flanders by the largest water supplier, VMW (Dutch acronym for “Flemish Company for Water Supply”), to deliver soft (15 °F) water to their customers. A case study is presented for a region with hard water (47 °F). The chosen technique is the pellet reactor, based on precipitation of CaCO3 by NaOH addition. This softening operation potentially has a large impact on the environment and the water consumption pattern. A cost-benefit analysis has been made to estimate the added value of central softening, by investigating the impact on the drinking water company, on their customers, on employment, on environment, on health, etc. The analysis for the region of study revealed benefits for customers which were higher than the costs for the drinking water company. However, pricing of drinking water remains an important problem. A sensitivity analysis of these results has also been made, to evaluate the impact of important hypotheses and to allow extension of this study to other regions. The conclusions for this part show that softening is beneficial if water hardness is to be decreased by at least 5 °F.
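A back-of-the-envelope reagent estimate for the pellet reactor can be sketched from the stoichiometry, under the strong simplifying assumption that only bicarbonate hardness is removed, as Ca(HCO3)2 + NaOH → CaCO3↓ + NaHCO3 + H2O (one mole of NaOH per mole of calcium). The 1:1 stoichiometry and the resulting dose are illustrative, not figures from the case study.

```python
# French hardness degrees: 1 degree = 10 mg/L expressed as CaCO3.
MG_CACO3_PER_FRENCH_DEGREE = 10.0
MW_CACO3 = 100.09   # g/mol
MW_NAOH = 40.0      # g/mol

def naoh_dose_mg_per_l(hardness_in_f, hardness_out_f):
    """NaOH dose (mg/L) to lower hardness by the given number of French
    degrees, assuming pure bicarbonate hardness and 1 mol NaOH per mol Ca
    removed (a simplification; real dosing also depends on CO2 and pH)."""
    removed_mmol_ca = ((hardness_in_f - hardness_out_f)
                       * MG_CACO3_PER_FRENCH_DEGREE / MW_CACO3)
    return removed_mmol_ca * MW_NAOH

dose = naoh_dose_mg_per_l(47.0, 15.0)   # the case-study targets: 47 -> 15 degrees
```

Multiplying such a dose by the annual softened volume gives the reagent line of the cost side in the cost-benefit analysis.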


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)