Similar Articles
1.
Abstract: The National Research Council recommended Adaptive Total Maximum Daily Load implementation with the recognition that the predictive uncertainty of water quality models can be high. Quantifying predictive uncertainty provides important information for model selection and decision-making. We review five methods that have been used with water quality models to evaluate model parameter and predictive uncertainty: (1) Regionalized Sensitivity Analysis, (2) Generalized Likelihood Uncertainty Estimation, (3) Bayesian Monte Carlo, (4) Importance Sampling, and (5) Markov Chain Monte Carlo (MCMC). These methods are based on similar concepts; their development over time was facilitated by the increasing availability of fast, cheap computers. Using a Streeter-Phelps model as an example, we show that, applied consistently, these methods give compatible results. Thus, all of these methods can, in principle, provide useful sets of parameter values for evaluating model predictive uncertainty, though, in practice, some are quickly limited by the "curse of dimensionality" or may have difficulty evaluating irregularly shaped parameter spaces. Adaptive implementation invites model updating as new data become available reflecting water-body responses to pollutant load reductions, and a Bayesian approach using MCMC is particularly well suited to that task.
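As a rough illustration of the MCMC updating the abstract favors, the sketch below calibrates the two rate constants of a Streeter-Phelps deficit model with a random-walk Metropolis sampler. The data, flat prior bounds, step size, and error standard deviation are all invented for demonstration; they are not taken from the paper.

```python
# Hypothetical random-walk Metropolis sketch for Streeter-Phelps parameters
# (kd, ka), assuming synthetic DO-deficit data and a Gaussian error model.
import numpy as np

rng = np.random.default_rng(42)

def deficit(t, kd, ka, L0=10.0, D0=1.0):
    """Streeter-Phelps DO deficit (mg/L) at travel time t (days)."""
    return kd * L0 / (ka - kd) * (np.exp(-kd * t) - np.exp(-ka * t)) + D0 * np.exp(-ka * t)

t_obs = np.linspace(0.2, 5.0, 12)
d_obs = deficit(t_obs, kd=0.35, ka=0.70) + rng.normal(0, 0.2, t_obs.size)  # synthetic data

def log_post(theta, sigma=0.2):
    kd, ka = theta
    if not (0.01 < kd < ka < 2.0):            # flat prior on an ordered box
        return -np.inf
    resid = d_obs - deficit(t_obs, kd, ka)
    return -0.5 * np.sum((resid / sigma) ** 2)

theta = np.array([0.5, 1.0])                  # arbitrary starting point
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.03, 2)     # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

samples = np.array(samples[5000:])            # discard burn-in
print("posterior mean kd, ka:", samples.mean(axis=0))
```

The retained posterior sample can be pushed through the model at future times to produce predictive uncertainty bands, which is what makes this approach convenient for adaptive updating.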

2.
Dual-permeability models have been developed to account for the significant effects of macropore flow on contaminant transport, but their use is hampered by difficulties in estimating the additional parameters required. Therefore, our objective was to evaluate data requirements for parameter identification for predictive modeling with the dual-permeability model MACRO. Two different approaches were compared: sequential uncertainty fitting (SUFI) and generalized likelihood uncertainty estimation (GLUE). We investigated six parameters controlling macropore flow and pesticide sorption and degradation, applying MACRO to a comprehensive field data set of bromide and bentazone [3-isopropyl-1H-2,1,3-benzothiadiazin-4(3H)-one-2,2-dioxide] transport in a structured soil. The GLUE analyses of parameter conditioning for different combinations of observations showed that both resident and flux concentrations were needed to obtain highly conditioned and unbiased parameters and that observations of tracer transport generally improved the conditioning of macropore flow parameters. The GLUE "behavioral" parameter sets covered wider parameter ranges than the SUFI posterior uncertainty domains. Nevertheless, estimation uncertainty ranges defined by the 5th and 95th percentiles were similar, and many simulations randomly sampled from the SUFI posterior uncertainty domains had negative model efficiencies (minimum of -3.2), because parameter correlations are neglected in SUFI and the posterior uncertainty domains were not always determined correctly. For the same reasons, uncertainty ranges for predictions of bentazone losses through drainflow under good agricultural practice in southern Sweden were 27% larger for SUFI than for GLUE. Although SUFI proved to be an efficient parameter estimation tool, GLUE seems better suited as a method of uncertainty estimation for predictions.
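GLUE itself is straightforward to sketch, although MACRO cannot be reproduced here; a toy two-parameter leaching model stands in for it. The uniform priors, Nash-Sutcliffe likelihood measure, and behavioral threshold of 0.7 below are illustrative choices, not the paper's.

```python
# Minimal GLUE sketch with a toy leaching model standing in for MACRO.
import numpy as np

rng = np.random.default_rng(0)

def toy_model(k, f, t):
    # stand-in solute signal: a fraction f leaches fast, the rest decays at rate k
    return f * np.exp(-5.0 * t) + (1.0 - f) * np.exp(-k * t)

t = np.linspace(0.0, 3.0, 30)
obs = toy_model(0.8, 0.3, t) + rng.normal(0, 0.02, t.size)   # synthetic observations

# 1) Monte Carlo sampling from uniform priors
k_s = rng.uniform(0.1, 2.0, 50000)
f_s = rng.uniform(0.0, 1.0, 50000)
sims = f_s[:, None] * np.exp(-5.0 * t) + (1 - f_s[:, None]) * np.exp(-k_s[:, None] * t)

# 2) Nash-Sutcliffe efficiency as the informal GLUE likelihood measure
ns = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

# 3) keep "behavioral" parameter sets above a subjective threshold
behavioral = ns > 0.7
w = ns[behavioral] / ns[behavioral].sum()     # likelihood weights

# 4) prediction bounds from the behavioral ensemble
lo, hi = np.percentile(sims[behavioral], [5, 95], axis=0)
print("behavioral sets:", behavioral.sum())
print("weighted mean k:", np.sum(w * k_s[behavioral]))
```

Because the behavioral sets are kept jointly, any parameter correlation induced by the data is preserved automatically, which is the property the abstract contrasts with SUFI's independent posterior ranges.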

3.
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in quantifying input uncertainty even with little information. Uncertainties in forest carbon budget projections were examined with Monte Carlo analyses of the model FORCARB. We identified model sensitivity to range, shape, and covariability among model probability density functions, even under conditions of limited initial information. Distributional forms of probabilities were not as important as covariability or ranges of values. Covariability among FORCARB model parameters emerged as a very influential component of uncertainty, especially for estimates of average annual carbon flux.
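The influence of covariability that the abstract highlights can be demonstrated with a few lines of Monte Carlo: sampling the same toy two-parameter "flux" model with and without correlation between the inputs. The means, standard deviations, and correlation value are invented; FORCARB itself is not modeled.

```python
# Illustrative sketch of why covariability matters in Monte Carlo analysis.
import numpy as np

rng = np.random.default_rng(1)
mean = np.array([2.0, 3.0])                   # toy parameter means (invented)
sd = np.array([0.5, 0.8])

def flux(p):
    return p[:, 0] * p[:, 1]                  # toy model: flux = p1 * p2

for rho in (0.0, 0.8):                        # independent vs. positively correlated
    cov = np.array([[sd[0]**2,          rho * sd[0] * sd[1]],
                    [rho * sd[0] * sd[1], sd[1]**2]])
    draws = rng.multivariate_normal(mean, cov, size=100000)
    f = flux(draws)
    print(f"rho={rho}: mean flux {f.mean():.2f}, sd {f.std():.2f}")
```

For a product of positively correlated inputs, the output spread grows noticeably, while the distributional form of each marginal matters far less, consistent with the finding above.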

4.
Abstract: A mix of causative mechanisms may be responsible for floods at a site. Floods may be caused by extreme rainfall or by rain falling on earlier rainfall events, and the statistical attributes of these events differ according to the cause and the watershed characteristics. Traditional methods of flood frequency analysis are adequate only for specific situations. Also, to address the uncertainty of flood frequency estimates for hydraulic structures, a series of probabilistic analyses of rainfall-runoff and flow routing models, and their associated inputs, are used. This is a complex problem in that the probability distributions of multiple independent and derived random variables need to be estimated to evaluate the probability of floods. Therefore, the objectives of this study were to develop a flood frequency curve derivation method driven by multiple random variables and to develop a tool that can consider the uncertainties of design floods. This study focuses on developing a flood frequency curve based on nonparametric statistical methods for the estimation of probabilities of rare floods that are more appropriate in Korea. To derive the frequency curve, rainfall generation using the nonparametric kernel density estimation approach is proposed. Many flood events are simulated by nonparametric Monte Carlo simulation coupled with centered Latin hypercube sampling to estimate the associated uncertainty. The methods are applied to a Korean watershed; the results show greater physical plausibility and provide reasonable design flood estimates.
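A minimal sketch of the proposed rainfall generator, assuming a synthetic rainfall record: Latin hypercube stratification over the empirical distribution, combined with Gaussian-kernel smoothing (a smoothed bootstrap, equivalent to sampling from a Gaussian KDE). The record and the Silverman bandwidth rule are illustrative assumptions.

```python
# Nonparametric rainfall generation: LHS-stratified smoothed bootstrap.
import numpy as np

rng = np.random.default_rng(7)
rain = rng.gamma(2.0, 40.0, 200)              # stand-in annual-max rainfall record (mm)

n = 1000
# Latin hypercube on [0,1): one shuffled draw per stratum
u = (rng.permutation(n) + rng.uniform(size=n)) / n

# map the stratified uniforms onto the empirical distribution of the record
rain_sorted = np.sort(rain)
idx = np.minimum((u * rain.size).astype(int), rain.size - 1)

h = 1.06 * rain.std() * rain.size ** (-0.2)   # Silverman's rule-of-thumb bandwidth
synthetic = rain_sorted[idx] + rng.normal(0.0, h, n)  # KDE sample = resample + kernel noise
synthetic = np.clip(synthetic, 0.0, None)     # rainfall cannot be negative

print("observed mean/sd:", rain.mean().round(1), rain.std().round(1))
print("synthetic mean/sd:", synthetic.mean().round(1), synthetic.std().round(1))
```

Each synthetic rainfall realization would then drive a rainfall-runoff model, and the ensemble of simulated peaks defines the derived frequency curve and its uncertainty.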

5.
ABSTRACT: Recent advances in water quality modeling have pointed out the need for stochastic models to simulate the probabilistic nature of water quality. Often, however, all that is needed is an estimate of the uncertainty in predicting water quality variables. First order analysis is a simple method of estimating the uncertainty in a deterministic model due to uncertain parameters. The method is applied to the simplified Streeter-Phelps equations for DO and BOD; a more complete Monte Carlo simulation is used to check the accuracy of the results. The first order analysis is found to give accurate estimates of the means and variances of DO and BOD up to travel times exceeding the critical time. Uncertainty in travel time and the BOD decay constant is found to be most important for small travel times; uncertainty in the reaeration coefficient dominates near the critical time. Uncertainty in temperature was found to be a negligible source of uncertainty in DO for all travel times.
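The first order analysis itself reduces to propagating input variances through local derivatives. The sketch below applies it to the Streeter-Phelps deficit at one travel time and checks it against Monte Carlo, as the paper does; the parameter means and the 10% coefficients of variation are invented.

```python
# First-order (FOSM) uncertainty sketch for the Streeter-Phelps DO deficit,
# checked against Monte Carlo. All parameter values are illustrative.
import numpy as np

def deficit(t, kd, ka, L0, D0):
    return kd * L0 / (ka - kd) * (np.exp(-kd * t) - np.exp(-ka * t)) + D0 * np.exp(-ka * t)

t = 2.0                                        # travel time (days)
mu = np.array([0.35, 0.70, 10.0, 1.0])         # means of kd, ka, L0, D0 (invented)
sd = 0.10 * mu                                 # assume 10% CV on each input

# first-order variance: sum of (dD/dx_i)^2 * var(x_i), via numeric derivatives
grad = np.empty(4)
for i in range(4):
    dx = np.zeros(4); dx[i] = 1e-6 * mu[i]
    grad[i] = (deficit(t, *(mu + dx)) - deficit(t, *(mu - dx))) / (2 * dx[i])
var_fo = np.sum(grad**2 * sd**2)

# Monte Carlo check with independent normal inputs
rng = np.random.default_rng(3)
x = rng.normal(mu, sd, size=(100000, 4))
mc = deficit(t, x[:, 0], x[:, 1], x[:, 2], x[:, 3])

print("first-order sd:", var_fo**0.5, " Monte Carlo sd:", mc.std())
```

The squared gradient-times-variance terms also rank the inputs, which is how statements like "the reaeration coefficient dominates near the critical time" are obtained.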

6.
ABSTRACT: The ability to predict extreme floods is an important part of the planning process for any water project for which failure will be very costly. The length of a gage record available for use in estimating extreme flows is generally much shorter than the recurrence interval of the desired flows, resulting in estimates having a high degree of uncertainty. Maximum likelihood estimators of the parameters of the three-parameter lognormal (3PLN) distribution, which make use of historical data, are presented. A Monte Carlo study of extreme flows estimated from samples drawn from three hypothetical 3PLN populations showed that inclusion of historical flows with the gage record reduced the bias and variance of extreme flow estimates. Asymptotic theory approximations of parameter variances and covariances, calculated using the second and mixed partial derivatives of the log likelihood function, agreed well with Monte Carlo results. First order approximations of the standard deviations of the extreme flow estimates did not agree with the Monte Carlo results. An alternative method for calculating those standard deviations, the "asymptotic simulation" method, is described. The standard deviations calculated by asymptotic simulation agree well with the Monte Carlo results.
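One common way to let historical information enter the likelihood is as a censored term: historical years in which no flood exceeded a perception threshold contribute through the distribution function rather than the density. The sketch below fits a 3PLN this way on synthetic data; the record length, threshold, and censoring scheme are assumptions and not necessarily the estimators used in the paper.

```python
# Hedged sketch: 3PLN maximum likelihood with historical (censored) information.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

rng = np.random.default_rng(5)
true = dict(s=0.5, loc=100.0, scale=300.0)     # sigma, tau, exp(mu) (invented)
gage = lognorm.rvs(**true, size=40, random_state=rng)   # systematic record
h_years, h_thresh = 60, 1500.0                 # 60 historical years, none above h_thresh

def nll(p):
    s, loc, scale = p
    if s <= 0 or scale <= 0 or loc >= gage.min():
        return np.inf
    ll = lognorm.logpdf(gage, s, loc, scale).sum()
    # historical info: h_years of independent annual maxima all below h_thresh
    ll += h_years * lognorm.logcdf(h_thresh, s, loc, scale)
    return -ll

fit = minimize(nll, x0=[0.4, 50.0, 250.0], method="Nelder-Mead")
s, loc, scale = fit.x
print("MLE (sigma, tau, exp(mu)):", fit.x)
print("estimated 100-yr flood:", lognorm.ppf(0.99, s, loc, scale))
```

Refitting on many synthetic records generated from the fitted parameters would give the kind of "asymptotic simulation" spread of quantile estimates the abstract describes.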

7.
ABSTRACT: Modeling error can be divided into two basic components: use of an incorrect model and input parameter uncertainty. Incorrect model usage can be further subdivided into inappropriate model selection and inherent modeling error due to process aggregation. Total modeling error is the culmination of these various error components, and overall optimization requires reductions in all of them. A technique utilizing Monte Carlo analysis is employed to investigate the relative importance of input parameter uncertainty versus process aggregation error. An expanded form of the Streeter-Phelps dissolved oxygen equation is used to demonstrate the application of this technique. A variety of scenarios are analyzed to illustrate the relative contribution of each modeling error component. Under certain circumstances an aggregated model performs better than a more complex model that perfectly simulates the real system; in other situations, process aggregation error dominates total modeling error. The ability to differentiate the impact of each modeling error component is a function of the desired or imposed model performance level (accuracy tolerance).
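The decomposition can be illustrated with a toy experiment: a two-pool "real system" and a lumped one-pool model, with Monte Carlo over the lumped rate constant. All rates and uncertainty levels are invented; the point is only that aggregation bias stays fixed while parameter uncertainty scales the total error.

```python
# Toy Monte Carlo separation of parameter uncertainty vs. aggregation error.
import numpy as np

rng = np.random.default_rng(11)
t = 2.0

def real_system(t, k1=0.8, k2=0.2):            # two-pool "truth" (invented rates)
    return 0.5 * np.exp(-k1 * t) + 0.5 * np.exp(-k2 * t)

def aggregated(t, k):                          # one lumped pool
    return np.exp(-k * t)

truth = real_system(t)
for cv in (0.05, 0.3):                         # low vs. high parameter uncertainty
    k = rng.normal(0.5, 0.5 * cv, 100000)      # uncertain lumped rate constant
    err = aggregated(t, k) - truth
    print(f"CV={cv}: aggregation bias {err.mean():+.3f}, "
          f"total RMSE {np.sqrt((err**2).mean()):.3f}")
```

At low parameter uncertainty the bias (aggregation error) dominates the RMSE; at high uncertainty the parameter component swamps it, mirroring the scenario-dependence described above.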

8.
Water quality modelling of the river Yamuna (India) using QUAL2E-UNCAS
This paper describes the utility of QUAL2E as a modelling package in the evaluation of a water quality improvement programme. In this study, QUAL2E was applied to determine the pollution loads in the river Yamuna during its course through the national capital territory of Delhi, India. The study aimed at examining the influence of different scenarios on river water quality. Four pollution scenarios were analysed besides the 'business as usual' situation. The study revealed that it was necessary to treat the discharge from drains to the river Yamuna and that diversion of a substantial load to the Agra canal for further treatment was also essential. It was also established that maintaining a flow rate of more than 10 m³/s in the river could help preserve the river's water quality. To clearly display the model outcomes and demarcate polluted zones in the river stretch, model results were interfaced with a Geographical Information System (GIS) to produce cartographic outputs. In addition, uncertainty analysis in the form of first-order error analysis and Monte Carlo analysis was performed to assess the effect of each model parameter on DO and BOD predictions. The uncertainty analysis gave satisfactory results on the simulated data.

9.
Sewage discharge from an ocean outfall is subject to water quality standards, which are often stated in probabilistic terms. Monte Carlo simulation (MCS) has been used in the past to evaluate the ability of a designed outfall to meet water quality standards or compliance guidelines associated with sewage discharges. In this study, simpler and less computer-intensive probabilistic methods are considered: the popular mean first-order second-moment (MFOSM) method and the advanced first-order second-moment (AFOSM) method. Available data from the Spaniard's Bay Outfall, located on the east coast of Newfoundland, Canada, were used as inputs for a case study, and both methods were compared with results given by MCS. It was found that AFOSM gave a good approximation of the failure probability for total coliform concentration at points remote from the outfall, whereas MFOSM was better when considering only the initial dilutions between the discharge point and the surface. The difference may lie in the differing complexity of the performance function in the two cases. This study does not recommend AFOSM for failure analysis in ocean outfall design because the analysis requires computational effort similar to MCS. Given the advancement of computer technology, simulation techniques, and available software, and its flexibility in handling complex situations, MCS remains the best choice for failure analysis of ocean outfalls when data or estimates on the parameters involved are available or can be assumed.
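A compressed MFOSM sketch, checked against Monte Carlo: linearize a log-transformed performance function at the input means and convert the reliability index to a failure probability. The toy dilution model, distributions, and compliance standard below are invented and far simpler than the outfall model in the study.

```python
# MFOSM failure-probability sketch for a toy outfall compliance check.
import numpy as np
from scipy.stats import norm

mu = np.array([1000.0, 50.0])                 # effluent conc., initial dilution (invented)
sd = np.array([200.0, 15.0])
C_std = 30.0                                   # compliance standard (invented)

def g(x):
    conc = x[..., 0] / x[..., 1]               # diluted concentration at the surface
    return np.log(C_std) - np.log(conc)        # g < 0 means non-compliance

# MFOSM: linearize g at the means using a numeric gradient
grad = np.empty(2)
for i in range(2):
    dx = np.zeros(2); dx[i] = 1e-5 * mu[i]
    grad[i] = (g(mu + dx) - g(mu - dx)) / (2 * dx[i])
beta = g(mu) / np.sqrt(np.sum(grad**2 * sd**2))
print("MFOSM failure probability:", norm.cdf(-beta))

# Monte Carlo check
rng = np.random.default_rng(9)
x = rng.normal(mu, sd, size=(200000, 2))
x = x[(x > 0).all(axis=1)]                     # discard non-physical draws
print("MCS failure probability:", (g(x) < 0).mean())
```

The agreement degrades as the performance function becomes more nonlinear away from the means, which is the complexity effect the abstract points to.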

10.
11.
ABSTRACT: A common framework for the analysis of water resources systems is the input-parameter-output representation: the system, described by its parameters, is driven by inputs and responds with outputs. Calibrating (estimating the parameters of) models of these systems requires data on both inputs and outputs, both of which are subject to random errors. When one is uncertain as to whether the predominant source of error is associated with inputs or outputs, uncertainty also exists as to the correct specification of a calibration criterion. This paper develops and analyzes two alternative least squares criteria for calibrating a numerical water quality model. The first criterion assumes that errors are associated with inputs, while the second assumes output errors. Statistical properties of the resulting estimators are examined under conditions of pure input error, pure output error, and mixed errors, first from a theoretical perspective and then using simulated results from a series of Monte Carlo experiments.
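The two criteria can be written down for a toy linear model y = a·x: attributing residuals to the output gives one least squares estimator, attributing them to the input gives another. The sketch below shows how each behaves when the error is actually in the output versus in the input; the true parameter and noise levels are invented.

```python
# Two least squares calibration criteria for a toy model y = a * x.
import numpy as np

rng = np.random.default_rng(21)
a_true, n = 2.0, 200
x_true = rng.uniform(1, 10, n)

for case in ("output error", "input error"):
    if case == "output error":
        x_obs = x_true
        y_obs = a_true * x_true + rng.normal(0, 1.0, n)
    else:
        x_obs = x_true + rng.normal(0, 0.5, n)
        y_obs = a_true * x_true

    a_out = np.sum(x_obs * y_obs) / np.sum(x_obs**2)   # minimizes sum (y - a*x)^2
    a_in = np.sum(y_obs**2) / np.sum(x_obs * y_obs)    # minimizes sum (x - y/a)^2
    print(f"{case}: output-criterion a = {a_out:.3f}, input-criterion a = {a_in:.3f}")
```

Each estimator is roughly unbiased when its error assumption matches reality and biased when it does not (the familiar attenuation effect when input noise is ignored), which is the mismatch the Monte Carlo experiments above quantify.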

12.
After Hurricane Katrina passed through the US Gulf Coast in August 2005, floodwaters covering New Orleans were pumped into Lake Pontchartrain as part of the rehabilitation process in order to make the city habitable again. The long-term consequences of this environmentally critical decision were difficult to assess at the time and were left to observation. In the aftermath of such natural disasters, and in cases of emergency, the proactive use of screening-level models may prove to be an important factor in making appropriate decisions and in identifying cost-effective and environmentally sound mitigation solutions. In this paper, we propose such a model and demonstrate its use through several hypothetical scenarios examining the likely response of Lake Pontchartrain to the contaminant loads that may have been present in the New Orleans floodwaters. For this purpose, an unsteady-state fugacity model was developed to examine the environmental effects of contaminants with different physicochemical characteristics on Lake Pontchartrain. The three representative contaminants selected were benzene, atrazine, and polychlorinated biphenyls (PCBs). The proposed approach yields continuous fugacity values, which are analogous to concentrations, for contaminants in the water, air, and sediment compartments of the lake system. Since contaminant data for the floodwaters are limited, an uncertainty analysis was also performed: the effects of uncertainty in the model parameters were investigated through Monte Carlo analysis. Results indicate that acceptable recovery of Lake Pontchartrain will require a long period of time; the computed time for the three contaminants considered to decrease to maximum contaminant levels (MCLs) ranges from about 1 year to 68 years. The model can be implemented to assess the possible extent of damage inflicted by any storm event on the natural water resources of southern Louisiana or similar environments elsewhere. Furthermore, it can serve as a decision-making tool for planning and remediation in similar emergencies by examining various potential contamination scenarios and their consequences.
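A drastically simplified, single-compartment version of such a fugacity calculation is sketched below: with a first-order total loss D-value, the fugacity decays exponentially and the recovery time to an MCL-equivalent fugacity follows directly. All values (volume, fugacity capacity, loss rate, initial pulse) are invented; the actual model tracks air, water, and sediment together.

```python
# One-compartment unsteady fugacity sketch with Monte Carlo on the loss rate.
import numpy as np

rng = np.random.default_rng(13)
V, Z = 1.75e9, 2.0e-2          # lake volume (m3), fugacity capacity (mol/(m3*Pa)); invented
f0, f_mcl = 1e-4, 1e-6         # initial fugacity from the flood pulse and MCL equivalent (Pa)

# total loss D-value (advection + reaction + volatilization), lognormally uncertain
D = rng.lognormal(np.log(1.0e4), 0.5, 10000)   # mol/(Pa*h)
k = D / (V * Z)                                # first-order decay rate (1/h)
t_rec = np.log(f0 / f_mcl) / k / 8760.0        # years until fugacity reaches the MCL

print("median recovery time (yr):", round(float(np.median(t_rec)), 1))
print("5th-95th percentile (yr):", np.percentile(t_rec, [5, 95]).round(1))
```

Even this caricature reproduces the qualitative conclusion above: with plausible loss-rate uncertainty, recovery-time estimates spread over a wide range of years.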

13.
Environmental integrated assessments are often carried out by aggregating a set of environmental indicators. Aggregated indices derived from the same data set can differ substantially depending upon how the indicators are weighted and aggregated, which is often a subjective matter. This article presents a method of generating aggregated environmental indices in an objective manner via Monte Carlo simulation. Rankings derived from the aggregated indices within and between three Monte Carlo simulations were used to evaluate the overall environmental condition of the study area. The study also yielded other insights, such as the distribution of good or poor indicator values within a watershed or subregion.
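One objective variant of such an aggregation is to draw the weights themselves at random and summarize the resulting distribution of rankings, as in the hypothetical sketch below; the indicator matrix and the Dirichlet weight sampling are illustrative assumptions.

```python
# Monte Carlo aggregation of environmental indices under random weights.
import numpy as np

rng = np.random.default_rng(17)
n_units, n_ind = 8, 5
X = rng.uniform(0, 1, (n_units, n_ind))       # normalized indicators (higher = better)

ranks = np.zeros((10000, n_units), dtype=int)
for i in range(10000):
    w = rng.dirichlet(np.ones(n_ind))         # random weight vector summing to 1
    score = X @ w                             # weighted-sum aggregation
    ranks[i] = (-score).argsort().argsort()   # rank 0 = best

print("median rank per unit:", np.median(ranks, axis=0))
print("P(unit 0 ranked best):", (ranks[:, 0] == 0).mean())
```

Units whose rank is stable across thousands of random weightings can be judged genuinely good or poor, independent of any one subjective weighting choice.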

14.
Nitrogen flows impacted by human activities in the Day-Nhue River Basin in northern Vietnam were modeled using adapted material flow analysis (MFA). This study introduces a modified uncertainty analysis procedure and demonstrates its importance in MFA. We generated a probability distribution using Monte Carlo simulation, calculated the nitrogen budget for each process, and then evaluated plausibility under three different criterion sets. The third criterion, with one standard deviation of the budget value as the confidence interval and 68% as the confidence level, could be applied to effectively identify hidden uncertainties in the MFA system. Sensitivity analysis was conducted to revise parameters, followed by reassessment of the model structure, revising equations or the flow regime where necessary. The number of processes that passed the plausibility test increased from five to nine after reassessment of model uncertainty, with greater model quality. Applying this uncertainty analysis to the case study revealed that revising the equations in the aquaculture process substantially changed the estimated nitrogen flows to the environment: nitrogen loads to the atmosphere and to soil/groundwater increased by 17% and 41%, respectively, and the nitrogen load to surface water decreased by 58%. Modified uncertainty analysis is thus an important screening step for ensuring the quality of MFA modeling.
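A minimal version of the plausibility test described above: propagate flow uncertainties through a process budget by Monte Carlo and check whether zero lies within one standard deviation of the mean budget (the ~68% criterion). The flows and their uncertainties are invented.

```python
# Monte Carlo plausibility check of a process nitrogen budget.
import numpy as np

rng = np.random.default_rng(23)
n = 100000
inflow1 = rng.normal(120.0, 12.0, n)           # kg N/yr, mean and sd (invented)
inflow2 = rng.normal(45.0, 9.0, n)
outflow = rng.normal(150.0, 20.0, n)

budget = inflow1 + inflow2 - outflow           # should be ~0 if the process balances
mu, sd = budget.mean(), budget.std()
print(f"budget = {mu:.1f} +/- {sd:.1f} kg N/yr")
print("plausible at the 68% level:", abs(mu) < sd)  # the third criterion above
```

Processes that fail this check flag either underestimated flow uncertainties or a structural problem in the equations, which is exactly what prompted the aquaculture revision above.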

15.
ABSTRACT: A simple simulation-type approach and a statistical method are proposed for determining the confidence interval of T-year frequency rainfall percentiles (precipitation extremes) for generalized extreme value (GEV) distributions. The former method is based on a Monte Carlo testing procedure. To generate realizations, the covariance structure of the three GEV parameters is investigated using the observed information matrix of the likelihood function. For distributions with realistic parameters, the correlation between the location and scale parameters is practically constant when the shape parameter varies around values close to its optimum. The latter method is based on likelihood ratio statistics. When the joint confidence surface for the shape parameter and the percentile estimate is plotted together with the line of best estimates, the region in which the best percentile estimate can be chosen as a possible value forms part of the joint confidence surface; the projection of this bounded region onto the percentile axis is defined here as the effective confidence interval. Use of this effective interval as the confidence interval of the T-year frequency rainfall percentile is particularly recommended because it is stable across T and appropriately reflects variations in all three GEV parameters.
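The Monte Carlo variant can be sketched as a parametric bootstrap: fit a GEV, regenerate many samples from the fit, and refit each to obtain a percentile distribution. The data are synthetic, and note that scipy's genextreme uses the opposite sign convention for the shape parameter from the usual hydrological one.

```python
# Parametric-bootstrap confidence interval for a T-year GEV rainfall percentile.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(29)
data = genextreme.rvs(-0.1, loc=50, scale=15, size=60, random_state=rng)  # synthetic record

T = 100                                        # return period (years)
p = 1 - 1 / T
c, loc, scale = genextreme.fit(data)
best = genextreme.ppf(p, c, loc, scale)

reps = []
for _ in range(1000):                          # refit on samples from the fitted GEV
    synth = genextreme.rvs(c, loc, scale, size=data.size, random_state=rng)
    reps.append(genextreme.ppf(p, *genextreme.fit(synth)))

lo, hi = np.percentile(reps, [2.5, 97.5])
print(f"100-yr estimate {best:.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
```

Because each replicate refits all three parameters jointly, the resulting interval automatically carries their covariance structure, which is the property the paper's likelihood-based interval is designed to capture.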

16.
ABSTRACT: The effectiveness of urban Best Management Practices (BMPs) in achieving the No-Net-Increase Policy (NNIP), a policy designed to limit nonpoint nitrogen loading to Long Island Sound (LIS), is analyzed. A unit loading model is used to simulate annual nitrogen export from the Norwalk River watershed (Connecticut) under current and future conditions. A probabilistic uncertainty analysis is used to incorporate uncertainty in nitrogen export coefficients and in BMP nitrogen removal effectiveness. Including these uncertainties implies that additional BMPs, or BMPs with greater nitrogen removal effectiveness, will be required to achieve the NNIP. Although including uncertainty raises the required BMP implementation rates or effectiveness, this type of analysis provides the decision maker with a more realistic assessment of the likelihood that a BMP-based management strategy will succeed. Monte Carlo simulation results indicate that applying BMPs to new urban developments alone will not be sufficient to achieve the NNIP, since BMPs are not 100 percent effective in removing the increase in nitrogen caused by urbanization; BMPs must also be applied to selected existing urban areas. BMPs with a nitrogen removal effectiveness of 40-60 percent, probably the highest level of removal that can be expected over an entire watershed, must be applied to at least 75 percent of the existing urban area to achieve the NNIP. This high rate of application is not likely to be achieved in urbanized watersheds within the LIS watershed; therefore, additional point source control will be necessary to achieve the NNIP.
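The core probabilistic calculation reduces to a few lines of Monte Carlo: uncertain export coefficients and BMP effectiveness propagated to the probability that the post-development load does not exceed the baseline. Areas, coefficient ranges, and the 40-60 percent effectiveness band below are illustrative stand-ins.

```python
# Monte Carlo test of a no-net-increase nitrogen criterion under BMPs.
import numpy as np

rng = np.random.default_rng(31)
n = 100000
area_forest, area_new_urban = 80.0, 20.0       # ha remaining forest, ha converted (invented)

exp_forest = rng.triangular(1.0, 2.0, 3.0, n)  # export coefficients, kg N/ha/yr
exp_urban = rng.triangular(5.0, 9.0, 14.0, n)
bmp_eff = rng.uniform(0.40, 0.60, n)           # fractional N removal by BMPs

baseline = (area_forest + area_new_urban) * exp_forest
future = area_forest * exp_forest + area_new_urban * exp_urban * (1 - bmp_eff)

print("P(no net increase met):", (future <= baseline).mean())
```

Raising BMP coverage or effectiveness shifts this probability toward one; the study's conclusion is that realistic values leave it well short of certainty without treating existing urban areas and point sources too.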

17.
This research analyses the application of spatially explicit sensitivity and uncertainty analysis for GIS (Geographic Information System) multicriteria decision analysis (MCDA) within a multi-dimensional vulnerability assessment of flooding in the Salzach river catchment in Austria. The methodology is based on a spatially explicit sensitivity and uncertainty analysis of GIS-MCDA for assessing the social, economic, and environmental dimensions of vulnerability. The main objective is to demonstrate how a unified approach to uncertainty and sensitivity analysis can be applied to minimise the uncertainty associated with each dimension of the vulnerability assessment. The proposed methodology comprises four main steps. First, criteria weights are computed using the analytic hierarchy process (AHP). Second, Monte Carlo simulation is applied to calculate the uncertainties associated with the AHP weights. Third, global sensitivity analysis (GSA) is employed as a model-independent method of output variance decomposition, in which the variability of the vulnerability assessments is apportioned to each criterion weight, generating one first-order (S) and one total-effect (ST) sensitivity index map per criterion weight. Finally, an ordered weighted averaging method is applied to model the final vulnerability maps. The results demonstrate the robustness of spatially explicit GSA for minimising the uncertainty associated with GIS-MCDA models. We conclude that variance-based GSA enables assessment of the importance of each input factor for the results of the GIS-MCDA method, both spatially and statistically, and we therefore recommend GIS-based GSA as a useful methodology for minimising the uncertainty of GIS-MCDA.
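For a single raster cell, the variance-based step can be sketched with the pick-freeze estimator of first-order Sobol indices applied to a weighted-sum score; a real application repeats this per cell to produce the S maps. The criterion values and weight ranges below are invented.

```python
# First-order Sobol indices for criterion weights at one cell (pick-freeze).
import numpy as np

rng = np.random.default_rng(37)
crit = np.array([0.8, 0.3, 0.6])              # criterion values at this cell (invented)
n, k = 50000, 3

def score(W):
    W = W / W.sum(axis=1, keepdims=True)      # weights must sum to 1
    return W @ crit

A = rng.uniform(0.1, 1.0, (n, k))             # two independent weight samples
B = rng.uniform(0.1, 1.0, (n, k))
yA, yB = score(A), score(B)
var = yA.var()

for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                       # swap in column i from B
    Si = np.mean(yB * (score(ABi) - yA)) / var  # Saltelli (2010) first-order estimator
    print(f"weight {i}: first-order index S = {Si:.2f}")
```

Mapping S_i cell by cell shows where each criterion weight drives the vulnerability result, which is what makes the analysis "spatially explicit".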

18.
Legislation on the protection of biodiversity (e.g., the European Union Habitat and Bird Directives) increasingly requires ecological impact assessment of human activities. However, knowledge and understanding of relevant ecological processes and species responses to different types of impact are often incomplete. In this paper we demonstrate with a case study how impact assessment can be carried out for situations where data are scarce but some expert knowledge is available. The case study involves two amphibian species, the great crested newt (Triturus cristatus) and the natterjack toad (Bufo calamita), in the Meinweg nature reserve in the Netherlands, where plans are being developed to reopen an old railway track called the Iron Rhine. We assess the effects of this railway track and its proposed alternatives (scenarios) on the metapopulation extinction time and the patch occupancy times for both species using a discrete-time stochastic metapopulation model. We quantify the model parameters using expert knowledge and extrapolated data. Because of our uncertainty about these parameter values, we perform a Monte Carlo uncertainty analysis. This yields an estimate of the probability distribution of the model predictions and insight into the contribution of each distinguished source of uncertainty to that distribution. We show that with a simple metapopulation model and an extensive uncertainty analysis it is possible to detect the least harmful scenario, and the ranking of the different scenarios is consistent. Thus, uncertainty analysis can enhance the role of ecological impact assessment in decision making by making explicit to what extent incomplete knowledge affects predictions.
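A hypothetical skeleton of such an analysis: a discrete-time stochastic patch-occupancy model simulated under Monte Carlo draws of the extinction and colonization parameters, yielding a distribution of metapopulation extinction times. Patch count, rates, and ranges are invented stand-ins for the expert-elicited values.

```python
# Stochastic patch-occupancy model with Monte Carlo parameter uncertainty.
import numpy as np

rng = np.random.default_rng(41)

def extinction_time(p_ext, p_col, n_patches=10, horizon=500):
    occ = np.ones(n_patches, dtype=bool)       # all patches occupied at t = 0
    for t in range(1, horizon + 1):
        # colonization pressure scales with the fraction of occupied patches
        col = rng.uniform(size=n_patches) < p_col * occ.mean()
        ext = rng.uniform(size=n_patches) < p_ext
        occ = (occ & ~ext) | (~occ & col)      # occupied survive, empty may be colonized
        if not occ.any():
            return t                           # metapopulation extinct
    return horizon                             # censored at the simulation horizon

# parameter uncertainty: draw rates from elicited ranges, one simulation per draw
times = np.array([extinction_time(rng.uniform(0.05, 0.20), rng.uniform(0.1, 0.5))
                  for _ in range(2000)])
print("median extinction time:", np.median(times))
print("P(persists 500 steps):", (times == 500).mean())
```

Running the same ensemble under each scenario's patch configuration and comparing the extinction-time distributions is what allows the least harmful scenario to be identified despite the parameter uncertainty.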

19.
Uncertainty Analysis in Dissolved Oxygen Modeling in Streams
Uncertainty analysis in surface water quality modeling is an important issue. This paper presents a method based on the first-order reliability method (FORM) to assess the probability of exceeding a target dissolved oxygen concentration in a stream, using a Streeter-Phelps prototype model. Basic uncertainty in the input parameters is considered by representing them as random variables with prescribed probability distributions. Results obtained from the FORM analysis compared well with those of the Monte Carlo simulation method. The analysis also presents the stochastic sensitivity of the probabilistic outcome in the form of uncertainty importance factors and shows how they change with simulation time. Furthermore, a parametric sensitivity analysis was conducted to show the effect of selecting different probability distribution functions for the three most important parameters on the design point, exceedance probability, and importance factors.
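FORM boils down to locating the design point in standard normal space and reading the failure probability from the reliability index. The sketch below runs the Hasofer-Lind/Rackwitz-Fiessler iteration for a two-parameter Streeter-Phelps limit state; the means, standard deviations, and DO target are invented, and the paper's analysis involves more variables and non-normal distributions.

```python
# Minimal FORM (HL-RF iteration) for P(DO below target) with Streeter-Phelps.
import numpy as np
from scipy.stats import norm

mu = np.array([0.35, 0.70])                    # means of kd, ka (1/day), invented
sd = np.array([0.05, 0.10])
DO_sat, DO_target, t, L0, D0 = 9.0, 5.0, 2.5, 12.0, 1.0

def g(u):                                      # limit state in standard normal space
    kd, ka = mu + sd * u
    D = kd*L0/(ka - kd)*(np.exp(-kd*t) - np.exp(-ka*t)) + D0*np.exp(-ka*t)
    return (DO_sat - D) - DO_target            # g < 0 means DO below target

u, h = np.zeros(2), 1e-6
for _ in range(50):                            # HL-RF iteration toward the design point
    g0 = g(u)
    grad = np.array([(g(u + h*e) - g(u - h*e)) / (2*h) for e in np.eye(2)])
    u = (grad @ u - g0) * grad / (grad @ grad)

beta = np.linalg.norm(u)
print(f"reliability index beta = {beta:.2f}, P(DO < target) = {norm.cdf(-beta):.4f}")
```

The components of the design-point vector u, squared and normalized by beta², are the uncertainty importance factors the abstract refers to.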

20.
ABSTRACT: A model for estimating the probability of exceeding groundwater quality standards at environmental receptors, based on a simple contaminant transport model, is described. The model is intended for locations where knowledge about site-specific hydrogeologic conditions is limited. An efficient implementation methodology using numerical Monte Carlo simulation is presented. The uncertainty in the contaminant transport system due to uncertainty in the hydraulic conductivity is directly calculated in the Monte Carlo simulations. Numerous variations of the deterministic parameters of the model provide an indication of the change in exceedance probability with change in parameter value. The results of these variations for a generic example are presented in a concise graphical form which provides insight into the topology of the exceedance probability surface. This surface can be used to assess the impact of the various parameters on exceedance probability.
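A generic sketch of this kind of exceedance model: lognormal hydraulic conductivity, a plug-flow first-order-decay transport model, and Monte Carlo estimation of the probability of exceeding a standard, traced as one deterministic parameter (receptor distance) varies. All values are invented.

```python
# Monte Carlo exceedance probability for a simple groundwater transport model.
import numpy as np

rng = np.random.default_rng(43)
n = 200000
K = rng.lognormal(np.log(1e-5), 1.0, n)        # hydraulic conductivity (m/s), uncertain
i_grad, poros = 0.01, 0.3                      # hydraulic gradient, effective porosity
C0, lam, C_std = 10.0, 1e-8, 0.5               # source conc., decay rate (1/s), standard

for L in (50.0, 100.0, 200.0):                 # deterministic receptor distance (m)
    v = K * i_grad / poros                     # seepage velocity (m/s)
    C = C0 * np.exp(-lam * L / v)              # first-order decay over the travel time
    print(f"L = {L:>5.0f} m: P(C > standard) = {(C > C_std).mean():.3f}")
```

Sweeping the deterministic parameters (distance, gradient, decay rate) over grids and plotting the resulting probabilities is what traces out the exceedance probability surface described above.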

