Similar Documents
20 similar documents found (search time: 15 ms)
1.
This paper presents a new concept for incorporating uncertainty management into energy and environmental planning models developed in algebraic modeling languages. SETSTOCH is a tool for linking algebraic modeling languages with specialized stochastic programming solvers. Its main role is to retrieve from the modeling language a dynamically ordered core model (baseline scenario) that is sent automatically to the stochastic solver. The case study presented here was carried out with the IEA-MARKAL model used by many research teams around the world.

2.
Previous studies revealed that the triple bottom line cannot entirely cover the concept of corporate sustainability. This study therefore uses sustainable resource management (SRM) to improve corporate sustainability (CS) performance, considering the socio-environmental, socio-economic, and eco-efficiency aspects. In this study, vague set (VS) theory and the technique for order preference by similarity to ideal solution (TOPSIS) are integrated as a hybrid decision-making tool by which social media data can be transformed into entropy weights. The results indicate that eco-efficiency and society should be prioritized to improve corporate sustainability performance. Specifically, these aspects should be promoted by encouraging environmental innovation, redesigning the offer to consumers, raising the support of institutions and policy measures, and organizing synergetic involvement. The contributions of this study are three-fold: (i) establishing a comprehensive framework for guiding firms to make effective improvements; (ii) developing a hybrid VS-TOPSIS method to process the assessment data and social media information and address the interrelationships; and (iii) identifying the decisive SRM criteria to precisely guide the Chinese automobile industry towards CS under severe resource constraints.
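The entropy-weight and TOPSIS steps described above can be sketched as follows. This is plain entropy weighting plus TOPSIS only; the paper's vague-set extension is not reproduced, and the decision matrix and criteria below are hypothetical.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights from a decision matrix X (alternatives x criteria).

    Criteria whose scores are more dispersed receive larger weights.
    """
    P = X / X.sum(axis=0)                          # column-normalised proportions
    m = X.shape[0]
    e = -(P * np.log(P)).sum(axis=0) / np.log(m)   # entropy per criterion, in [0, 1]
    d = 1.0 - e                                    # degree of diversification
    return d / d.sum()

def topsis(X, weights, benefit):
    """TOPSIS closeness coefficients; benefit[j] marks criteria to maximise."""
    V = (X / np.linalg.norm(X, axis=0)) * weights  # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)      # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)       # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                 # closeness, in (0, 1)

# Hypothetical scores of three firms on three SRM criteria (all benefit-type).
X = np.array([[7.0, 9.0, 9.0],
              [8.0, 7.0, 8.0],
              [9.0, 6.0, 7.0]])
w = entropy_weights(X)
scores = topsis(X, w, benefit=np.array([True, True, True]))
```

Higher closeness scores indicate alternatives nearer the ideal solution; in the paper's setting the entropy weights would come from processed social media data rather than a raw score matrix.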

3.
The slacks-based measure (SBM) model based on constant returns to scale has achieved good results in addressing undesirable outputs, such as wastewater and waste gas, when measuring environmental efficiency. However, the traditional SBM model cannot deal with the scenario in which desirable outputs are constant. Based on the axiomatic theory of productivity, this paper carries out a systematic study of the SBM model considering undesirable outputs, and further extends the SBM model from the perspective of network analysis. The new model can not only perform efficiency evaluation considering undesirable outputs, but also treat desirable and undesirable outputs separately. The latter advantage resolves the "dependence" problem of outputs, namely that desirable outputs cannot be increased without producing undesirable outputs. An illustration shows that the efficiency values obtained by the two-stage approach are smaller than those obtained by the traditional SBM model. Our approach provides a deeper analysis of how to improve the environmental efficiency of decision-making units.

4.
Climate change has become one of the most significant environmental issues, and the building sector accounts for about 40% of the associated impacts. In particular, complex building projects combining various functions have increased, and these should be managed from a program-level perspective. Therefore, this study aimed to develop a program-level management system for the life-cycle environmental and economic assessment of complex building projects. The developed system consists of three parts: (i) input part: database server and input data; (ii) analysis part: life cycle assessment and life cycle cost; and (iii) result part: microscopic analysis and macroscopic analysis. To analyze the applicability of the developed system, this study selected ‘U’ University, a complex building project consisting of a research facility and a residential facility. Through value engineering with experts, a total of 137 design alternatives were established. Based on these alternatives, the macroscopic analysis results were as follows: (i) at the program level, the life-cycle environmental and economic costs of ‘U’ University were reduced by 6.22% and 2.11%, respectively; (ii) at the project level, the life-cycle environmental and economic costs of the research facility were reduced by 6.01% and 1.87%, respectively, and those of the residential facility by 12.01% and 3.83%, respectively; and (iii) for the mechanical work at the work-type level, the initial cost increased by 2.9%, but the operation and maintenance cost was reduced by 20.0%. As a result, the developed system can allow facility managers to establish operation and maintenance strategies for the environmental and economic aspects from a program-level perspective.

5.
This paper reports an approach to the assessment of the validity of environmental monitoring data--a 'data filter'. The strategy has been developed through the UK National Marine Analytical Quality Control (AQC) Scheme for application to data collected for the UK National Marine Monitoring Plan, although the principles described are applicable more widely. The proposed data filter is divided into three components: Part A, 'QA/QC'--an assessment of the laboratory's practices in Quality Assurance/Quality Control; Part B, 'fitness for purpose'--an evaluation of the standard of accuracy that can be demonstrated by activities in (A), in relation to the intended application of the data; and Part C, the overall assessment on which data will be accepted as usable or rejected as being of suspect quality. A pilot application of the proposed approach is reported. The approach described in this paper is intended to formalise the assessment of environmental monitoring data for fitness for a chosen purpose. The issues important to fitness for purpose are discussed and assigned a relative priority order on which to judge the reliability/usefulness of monitoring data.

6.
7.
The need to better address uncertainties in environmental assessment (EA) is well known, but less known is how those involved in, or affected by, EA processes understand and perceive uncertainties and how uncertainties are considered and disclosed. Based on a survey of 77 Canadian EA practitioners, regulators, and interest groups, this paper explores uncertainties in the EA process, uncertainty consideration and disclosure in EA practice and decision-making, and opportunities for improved disclosure. Nearly 80% of participants indicated that all EAs contain uncertainty; however, uncertainty disclosure was described as poor. Only 15% indicated that uncertainties are sufficiently acknowledged in practice and, when disclosed, considered by decision makers. Perceptions about uncertainty differed significantly between those who conducted EAs compared to those potentially affected by development, suggesting that either communication about uncertainty is poor, or participants' understandings about what is considered ‘good’ practice are very different. Almost half of the participants believe that there is overconfidence in impact predictions and mitigation measures, and the majority indicated that if uncertainties were more openly reported then EA would be a better tool for informing decisions. Most participants did not believe that EAs that openly disclose uncertainties lack credibility; and contrary to proponents' tendencies to limit disclosure, participants perceived limited risk of disclosure in terms of project approval. The majority of participants did not believe that there was sufficient guidance available on how to report uncertainties, or on how to use that information in decision-making. Results indicate a substantial need to better understand how uncertainties are viewed and dealt with in EA; the importance of uncertainty disclosure and consideration in EA; and the risks and benefits of uncertainty disclosure to proponents, decision makers, and the public. 
We identify several opportunities for improving the practice of uncertainty consideration and disclosure.

8.
The main problem with traditional methods of environmental impact assessment (EIA) is that in most existing algorithms and methods, such as Leopold, Folchi, and RIAM, attention is paid mainly to the destructive effects of the proposed plan, while the advantages of the industrial project are less visible. This has led to an ongoing conflict between environmental organizations and industrial stakeholders. Data envelopment analysis (DEA) offers a different approach to assessing industrial units: it also considers the positive economic and social impacts of the project and thus provides a comprehensive assessment of the industrial unit. In this approach, the environmental impacts of an industrial unit are treated as the "inputs" and its positive economic and social impacts as the "outputs" of the DEA models, so the impact assessment problem is recast as a DEA model. In the present study, the Alborz Sharghi coal-washing plant in northern Iran is considered as a case study for implementing the DEA-EIA approach, with 19 plant activities and 11 environmental components used to evaluate the environmental effects of the plant. To solve the EIA problem, two commonly used DEA formulations, CRS (constant returns to scale) and VRS (variable returns to scale), were applied. The DEA results identified the critical environmental components of the plant that should be taken seriously. Drawing the "potential improvement" diagram in the DEA method is also an effective tool for identifying the high-risk activities of the plant and addressing them in development plans. In addition, the output-maximizing VRS model showed that some plant activities differed most from the optimal mode, and these components should be considered in future development plans. Finally, it can be concluded that assessing the environmental impacts of mineral industries with the output-maximizing VRS approach is closer to the concepts of sustainable development and cost-benefit analysis.
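The CRS model used in such DEA studies can be sketched as the standard input-oriented CCR envelopment linear program. The data below are hypothetical, with two environmental impacts playing the role of inputs and one socio-economic benefit as the single output; this is the generic formulation, not the paper's exact DEA-EIA model.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR (constant returns to scale) efficiency per DMU.

    X: (m inputs x n DMUs), Y: (s outputs x n DMUs).
    Solves: min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    scores = []
    for o in range(n):
        # decision vector: [theta, lambda_1 .. lambda_n]
        c = np.r_[1.0, np.zeros(n)]
        A_ub = np.vstack([
            np.c_[-X[:, [o]], X],           # X @ lam - theta * x_o <= 0
            np.c_[np.zeros((s, 1)), -Y],    # -Y @ lam <= -y_o
        ])
        b_ub = np.r_[np.zeros(m), -Y[:, o]]
        bounds = [(None, None)] + [(0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Hypothetical data: 2 environmental "inputs", 1 economic "output", 4 units.
X = np.array([[2.0, 4.0, 4.0, 6.0],
              [3.0, 1.0, 3.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
theta = ccr_efficiency(X, Y)
```

A score of 1 marks a unit on the efficient frontier; scores below 1 give the proportional input (impact) reduction needed to reach it. The VRS variant adds the convexity constraint that the lambdas sum to one.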

9.
In this paper we show the possibility of using expert system tools for environmental data management. We describe the domain-independent expert system shell SAK and Knowledge EXplorer, a system that learns rules from data. We demonstrate the functionality of Knowledge EXplorer on an example of water quality evaluation.

10.
Quantitative inference from environmental contaminant data is almost exclusively made within the classic Neyman/Pearson (N/P) hypothesis-testing model, in which the mean serves as the fundamental quantitative measure but which is constrained by random sampling and the assumption of normality in the data. Permutation/randomization-based inference, originally put forward by R. A. Fisher, derives probability directly from the proportion of the occurrences of interest and does not depend on the distribution of the data or on random sampling. The underlying logic and the interpretation of significance differ between the two models, but inference using either model can often be applied successfully. However, data examples from airborne environmental fungi (mold), asbestos in settled dust, and 1,2,3,4-tetrachlorobenzene (TeCB) in soil demonstrate potentially misleading inference when traditional N/P hypothesis testing based on means and variances is compared with permutation/randomization inference using differences in frequency of detection (Δf_d). Bootstrapping and permutation testing, which are extensions of permutation/randomization, confirm the p values calculated via Δf_d and should be used to verify the appropriateness of a given data analysis under either model.
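A minimal permutation test on a difference in frequency of detection (Δf_d) along these lines might look like this; the detect/non-detect records are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def perm_test_detection(a, b, n_perm=10_000):
    """Permutation test on the difference in frequency of detection (delta f_d).

    a, b: boolean detect/non-detect arrays for two groups. The p-value is the
    share of random relabellings whose |delta f_d| is at least the observed one.
    """
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    n_a = len(a)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                  # random relabelling of all samples
        delta = abs(pooled[:n_a].mean() - pooled[n_a:].mean())
        if delta >= observed - 1e-12:
            count += 1
    return count / n_perm

# Hypothetical detect/non-detect records for two sampling areas.
area1 = np.array([True] * 9 + [False])       # 90% detection frequency
area2 = np.array([True] * 2 + [False] * 8)   # 20% detection frequency
p_value = perm_test_detection(area1, area2)
```

No distributional assumption is made: the p-value is just the proportion of relabellings at least as extreme as the observed split, which is the core of the Fisher-style inference the abstract contrasts with N/P testing on means.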

11.
Uncertainty is virtually unavoidable in environmental impact assessments (EIAs). From the literature related to treating and managing uncertainty, we have identified specific techniques for coping with uncertainty in EIAs. Here, we have focused on basic steps in the decision-making process that take place within an EIA setting. More specifically, we have identified uncertainties involved in each decision-making step and discussed the extent to which these can be treated and managed in the context of an activity or project that may have environmental impacts. To further demonstrate the relevance of the techniques identified, we have examined the extent to which the EIA guidelines currently used in Colombia consider and provide guidance on managing the uncertainty involved in these assessments. Some points that should be considered in order to provide greater robustness in impact assessments in Colombia have been identified. These include the management of stakeholder values, the systematic generation of project options, and their associated impacts as well as the associated management actions, and the evaluation of uncertainties and assumptions. We believe that the relevant and specific techniques reported here can be a reference for future evaluations of other EIA guidelines in different countries.

12.
Environmental practices in knowledge management capability (EKMC) is a complex and uncertain concept that is difficult to determine from a firm's actual situation, because measuring EKMC requires a set of qualitative and quantitative measurements. The objective of this study is to develop a cause-and-effect model under uncertainty using fuzzy set theory and the Decision Making Trial and Evaluation Laboratory (DEMATEL) method. A framework for evaluating EKMC is proposed, together with a fuzzy linguistic approach for evaluating a firm's EKMC. The evaluation results obtained through the proposed approach are objective and unbiased for two reasons. First, the results are generated by a group of experts across multiple attributes. Second, the fuzzy linguistic approach reduces the distortion and loss of information. By evaluating the EKMC results, managers can judge whether the EKMC needs improvement and determine which criteria point to the needed directions for improvement. Managerial implications and conclusions are discussed.
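The crisp core of the DEMATEL step can be sketched as follows; a fuzzy variant would first defuzzify expert judgements (e.g. triangular fuzzy numbers) into the direct-influence matrix, and the ratings below are hypothetical.

```python
import numpy as np

def dematel(A):
    """Crisp DEMATEL on a direct-influence matrix A.

    Returns prominence (r + c) and net effect (r - c) for each criterion:
    positive net effect marks a cause, negative marks an effect.
    """
    s = max(A.sum(axis=1).max(), A.sum(axis=0).max())
    D = A / s                                   # normalised direct influence
    T = D @ np.linalg.inv(np.eye(len(A)) - D)   # total-relation matrix
    r = T.sum(axis=1)                           # influence given by criterion i
    c = T.sum(axis=0)                           # influence received by criterion i
    return r + c, r - c

# Hypothetical expert ratings (0-4) of mutual influence among four EKMC criteria.
A = np.array([[0, 3, 2, 1],
              [1, 0, 3, 1],
              [2, 1, 0, 3],
              [1, 2, 1, 0]], dtype=float)
prominence, net = dematel(A)
```

Criteria with high prominence and positive net effect are the drivers a manager would target first, which is how the cause-and-effect reading of DEMATEL supports the prioritisation described in the abstract.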

13.
Environmental flows (Eflow, hereafter) are the flows to be maintained in a river for its healthy functioning and for the sustenance and protection of aquatic ecosystems. Estimating Eflow in any river stretch demands consideration of various factors such as the flow regime, the ecosystem, and the health of the river. However, most Eflow estimation studies have neglected water quality. This study argues for considering a water quality criterion in the estimation of Eflow and proposes a framework for estimating Eflow that incorporates water quality variations under present and hypothetical future scenarios of climate change and pollution load. The proposed framework is applied to the polluted stretch of the Yamuna River passing through Delhi, India. The required Eflow at various locations along the stretch is determined by considering possible variations in future water quantity and quality. Eflow values satisfying the minimum quality requirements for different river water usage classes (classes A, B, C, and D, as specified by the Central Pollution Control Board, India) are found to be between 700 and 800 m3/s. The estimated Eflow values may aid policymakers in deriving upstream storage-release policies or effluent restrictions. The generalized nature of the framework supports its implementation on any river system.

14.
The identification of pollution levels by numerical classification-ordination, and the statistical confirmation of the detected trends, were attempted in a eutrophication assessment study. Special emphasis was placed on the importance of data scaling and on the selection of a distance coefficient that would accentuate discrete states within the system. Among metric, binary, and ordinal variable scalings, ordinal numbers showed the greatest sensitivity in discriminating pollution levels; the observed trends were further enhanced by using the absolute distance coefficient as a resemblance measure. The eutrophic patterns identified were statistically confirmed by a non-parametric permutation test. Finally, a step-by-step multivariate procedure is proposed for assessing environmental quality in aquatic ecosystems.

15.
The combination of lognormally distributed quantities of interest with normally distributed random measurement error produces data that follow a compound normal-lognormal (NLN) distribution. When the measurement error is large enough, such data do not approximate normality, even after a logarithmic transformation. This paper reports the results of a search for a transformation method for NLN data that is not only technically appropriate, but easy to implement as well. Three transformation families were found to work relatively well. These families are compared in terms of success in achieving normality and robustness, using simulated NLN data and actual environmental data believed to follow a NLN distribution. The exponential family of transformations was found to give the best overall results. This work was supported by the U.S. Department of Energy, Office of Environmental Restoration and Waste Management, under DOE Idaho Field Office Contract DE-AC07-76ID01570.
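A small simulation illustrates the NLN situation: a log transform normalises the error-free lognormal signal exactly, but once large normal measurement error is added the data remain strongly skewed and some observations are negative, so a plain log transform is not even applicable. The distribution parameters are illustrative, and the exponential transformation family favoured by the paper is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

def skewness(x):
    """Sample skewness; approximately 0 for normally distributed data."""
    x = np.asarray(x, dtype=float)
    return ((x - x.mean()) ** 3).mean() / x.std() ** 3

# Compound normal-lognormal (NLN) data: a lognormal quantity of interest
# observed with additive normal measurement error.
n = 5000
signal = rng.lognormal(mean=0.0, sigma=1.0, size=n)
error = rng.normal(0.0, 1.5, size=n)      # large measurement error
y = signal + error

# Without error, the log transform normalises the data exactly.
log_clean = np.log(signal)

# With large error the observed data stay heavily right-skewed, and a plain
# log transform fails outright because some observations are negative.
has_negatives = bool((y < 0).any())
```

This is the motivation for transformation families that, unlike the logarithm, remain defined for negative values arising from measurement error.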

16.
The presence of contaminants in environmental media as well as the desire to maintain a high level of economic activity has led to an important and difficult decision‐making problem for both public policy decision makers and the general public. In one sense, all of the interested parties are likely to be concerned about the potential health risks posed by the presence of contaminants in environmental media and the need to design/implement policies for their mitigation and/or removal. At the same time, however, there also appear to be concerns about the cost of these policies. These costs could be measured in terms of the potential losses in economic activity that are likely to occur when a policy is adopted. The policy can be selected from a range of alternatives, with the choice being driven in part by the stakeholders represented in the policy decision problem. In this case, two general goals might be considered: minimizing environmental risk and minimizing the economic impact of the policy considered. These two objectives are likely to be viewed as conflicting goals, and the nature of the tradeoffs between them must be taken into account in the policy selection process. This paper presents the development of a zero–one weighted goal programming model that can be used to select a preferred policy that minimizes surface and groundwater contamination as well as the economic costs of environmental policy selection. A safety rule model is developed first and then extended to the zero–one weighted goal programming formulation. The stochastic aspects of these structures are emphasized throughout. The paper also addresses a number of issues related to implementation of the model.
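A toy zero-one weighted goal programming model of this kind can be solved by brute force when the number of binary policy options is small; the candidate measures, goals, and weights below are hypothetical.

```python
import numpy as np
from itertools import product

# Hypothetical data for four candidate mitigation measures.
risk_reduction = np.array([30.0, 20.0, 25.0, 15.0])   # contamination reduced
cost = np.array([40.0, 25.0, 35.0, 10.0])             # economic cost
risk_goal, budget_goal = 60.0, 70.0                   # target levels (goals)
w_risk, w_cost = 0.6, 0.4                             # goal weights

best, best_x = None, None
for x in product([0, 1], repeat=4):                   # all zero-one policies
    x = np.array(x)
    # One-sided deviations: shortfall below the risk goal, overrun above budget.
    d_risk = max(0.0, risk_goal - risk_reduction @ x)
    d_cost = max(0.0, cost @ x - budget_goal)
    val = w_risk * d_risk + w_cost * d_cost           # weighted goal deviation
    if best is None or val < best:
        best, best_x = val, x
```

For realistically sized problems the same model would go to a mixed-integer solver rather than enumeration; the weighted sum of one-sided deviations is what makes it a weighted goal program rather than a single-objective optimisation.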

17.
Long-term water quality monitoring is of high value for environmental management as well as for research. Artificial level shifts in time series due to method improvements, flaws in laboratory practices or changes in laboratory are a common limitation for analysis, which, however, are often ignored. Statistical estimation of such artefacts is complicated by the simultaneous existence of trends, seasonal variation and effects of other influencing factors, such as weather conditions. Here, we investigate the performance of generalised additive mixed models (GAMM) to simultaneously identify one or more artefacts associated with artificial level shifts, longitudinal effects related to temporal trends and seasonal variation, as well as to model the serial correlation structure of the data. In the same model, it is possible to estimate separate residual variances for different periods so as to identify if artefacts not only influence the mean level but also the dispersion of a series. Even with an appropriate statistical methodology, it is difficult to quantify artificial level shifts and make appropriate adjustments to the time series. The underlying temporal structure of the series is especially important. As long as there is no prominent underlying trend in the series, the shift estimates are rather stable and show less variation. If an artificial shift occurs during a slower downward or upward tendency, it is difficult to separate these two effects and shift estimates can be both biased and have large variation. In the case of a change in method or laboratory, we show that conducting the analyses with both methods in parallel strongly improves estimates of artefact effects on the time series, even if certain problems remain. Due to the difficulties of estimating artificial level shifts, posterior adjustment is problematic and can lead to time series that no longer can be used for trend analysis or other analysis based on the longitudinal structure of the series. 
Before carrying out a change in analytical method or laboratory, it should be considered whether the change is absolutely necessary. If changes cannot be avoided, the two methods considered, or the two laboratories contracted, should be run in parallel for a considerable period of time so as to enable a good assessment of the changes introduced to the data series.
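The level-shift estimation problem can be illustrated with ordinary least squares on a simulated series; a GAMM additionally smooths the trend and models serial correlation, which plain OLS does not, and all series parameters here are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated monthly water-quality series: trend, seasonality, noise, and an
# artificial level shift of +2.0 when the laboratory changes at t = 120.
n = 240
t = np.arange(n)
step = (t >= 120).astype(float)
y = (10.0 + 0.01 * t
     + 1.5 * np.sin(2 * np.pi * t / 12)
     + 2.0 * step
     + rng.normal(0.0, 0.4, n))

# OLS design with the shift modelled jointly with trend and seasonal terms,
# so the step estimate is adjusted for both.
X = np.column_stack([np.ones(n), t,
                     np.sin(2 * np.pi * t / 12),
                     np.cos(2 * np.pi * t / 12),
                     step])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
shift_est = beta[4]
```

With a clear underlying trend the step and trend columns become strongly collinear, which is exactly the difficulty the abstract describes: the shift estimate then inflates in variance and can absorb part of the trend.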

18.
Climate change impact assessment is subject to a range of uncertainties due to both incomplete and unknowable knowledge. This paper presents an approach to quantifying some of these uncertainties within a probabilistic framework. A hierarchical impact model is developed that addresses uncertainty about future greenhouse gas emissions, the climate sensitivity, and limitations and unpredictability in general circulation models. The hierarchical model is used in Bayesian Monte-Carlo simulations to define posterior probability distributions for changes in seasonal-mean temperature and precipitation over the United Kingdom that are conditional on prior distributions for the model parameters. The application of this approach to an impact model is demonstrated using a hydrological example. This revised version was published online in July 2006 with corrections to the Cover Date.
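The Monte-Carlo propagation idea can be sketched as follows; the priors and the one-line energy-balance relation are illustrative stand-ins, not the paper's hierarchical model.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 100_000
# Prior on climate sensitivity (warming per CO2 doubling, deg C); the
# lognormal form and its parameters are illustrative assumptions.
sensitivity = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n)
# Prior on the CO2 concentration ratio reached at the scenario horizon.
ratio = rng.uniform(1.5, 2.5, size=n)
# Toy energy-balance relation: warming scales with log2 of the CO2 ratio.
dT = sensitivity * np.log2(ratio)
# The sampled dT values approximate the distribution of the impact variable
# conditional on the priors; summarise it with a central interval.
lo, med, hi = np.percentile(dT, [5, 50, 95])
```

Propagating prior uncertainty through the chain of model components in this way is what yields probability distributions for the impact variable, rather than a single deterministic projection.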

19.
In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the estimator that uses the probability sample can be increased by interpolating the values at the nonprobability sample points to the probability sample points, and using these interpolated values as an auxiliary variable in the difference or regression estimator. These estimators are (approximately) unbiased, even when the nonprobability sample is severely biased, such as in preferential samples. The gain in precision compared to the estimator in combination with Simple Random Sampling is controlled by the correlation between the target variable and interpolated variable. This correlation is determined by the size (density) and spatial coverage of the nonprobability sample, and the spatial continuity of the target variable. In a case study the average ratio of the variances of the simple regression estimator and estimator was 0.68 for preferential samples of size 150 with moderate spatial clustering, and 0.80 for preferential samples of similar size with strong spatial clustering. In the latter case the simple regression estimator was substantially more precise than the simple difference estimator.
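The design can be mimicked with a small simulation: a preferential sample covering only part of the area is interpolated to the whole grid, and the interpolated surface serves as the auxiliary variable in a difference estimator (the paper also uses a regression estimator). The synthetic field, sample sizes, and nearest-neighbour interpolation are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spatially continuous field on a 50 x 50 grid.
g = 50
xx, yy = np.meshgrid(np.arange(g), np.arange(g))
field = np.sin(xx / 8.0) + np.cos(yy / 10.0) + rng.normal(0.0, 0.2, (g, g))
true_mean = field.mean()

# Preferential (nonprobability) sample: 150 points, western half only.
pref = rng.choice(np.flatnonzero(xx.ravel() < g // 2), size=150, replace=False)
px, py, pz = xx.ravel()[pref], yy.ravel()[pref], field.ravel()[pref]

# Nearest-neighbour interpolation of the preferential sample to every cell;
# the interpolated surface is the auxiliary, so its spatial mean is known.
d2 = (xx.ravel()[:, None] - px) ** 2 + (yy.ravel()[:, None] - py) ** 2
aux = pz[d2.argmin(axis=1)]

# Repeatedly draw a small probability sample (SRS) and compare the plain
# sample mean with the difference estimator using the auxiliary surface.
n, reps = 50, 500
err_srs, err_diff = [], []
for _ in range(reps):
    s = rng.choice(g * g, size=n, replace=False)
    y_s, x_s = field.ravel()[s], aux[s]
    err_srs.append(y_s.mean() - true_mean)
    err_diff.append(aux.mean() + (y_s - x_s).mean() - true_mean)
mse_srs = float(np.mean(np.square(err_srs)))
mse_diff = float(np.mean(np.square(err_diff)))
```

The difference estimator stays (approximately) unbiased despite the severely biased preferential sample, and its variance shrinks with the correlation between the target field and the interpolated auxiliary, as the abstract states.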

20.
Strategic environmental assessment (SEA) inherently needs to address greater levels of uncertainty in the formulation and implementation of strategic decisions than project-level environmental impact assessment. The range of uncertainties includes internal and external factors of the complex system addressed by the strategy. Scenario analysis is increasingly being used to cope with uncertainty in SEA. Following a brief introduction to scenarios and scenario analysis, this paper examines the rationale for scenario analysis in SEA in the context of China. The state of the art of scenario analysis applied to SEA in China was reviewed through four SEA case analyses. Lessons learned from these cases indicate that the word "scenario" is often applied loosely and scenario-based methods are often misused, owing to a limited understanding of the uncertain future and of scenario analysis. However, good experience was also drawn upon regarding how to integrate scenario analysis into the SEA process in China, how to cope with driving forces including uncertainties, how to combine qualitative scenario storylines with quantitative impact predictions, and how to conduct assessments and propose recommendations based on scenarios. Ways to improve the application of this tool in SEA are also suggested. We conclude by calling for further methodological research on this issue and more practice.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号