Similar Articles
20 similar articles found (search time: 15 ms)
1.
ABSTRACT: A model for estimating the probability of exceeding groundwater quality standards at environmental receptors, based on a simple contaminant transport model, is described. The model is intended for locations where knowledge of site-specific hydrogeologic conditions is limited. An efficient implementation using numerical Monte Carlo simulation is presented. The uncertainty in the contaminant transport system due to uncertainty in the hydraulic conductivity is calculated directly in the Monte Carlo simulations. Systematic variation of the model's deterministic parameters indicates how the exceedance probability changes with each parameter value. The results of these variations for a generic example are presented in a concise graphical form that provides insight into the topology of the exceedance probability surface. This surface can be used to assess the impact of the various parameters on exceedance probability.
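The Monte Carlo scheme the abstract describes can be sketched in a few lines: sample the uncertain hydraulic conductivity, run a simple advective transport model with first-order decay, and count the fraction of realisations that exceed the standard. All parameter values below are hypothetical placeholders, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical site parameters (placeholders, not from the paper).
mu_lnK, sigma_lnK = -12.0, 1.0   # ln hydraulic conductivity [m/s]
gradient = 0.01                  # hydraulic gradient [-]
porosity = 0.3                   # effective porosity [-]
distance = 100.0                 # source-to-receptor distance [m]
half_life = 5.0 * 365 * 86400    # first-order decay half-life [s]
c0, standard = 10.0, 1.0         # source and standard concentration [mg/L]

# The only stochastic input: hydraulic conductivity, sampled lognormally.
K = rng.lognormal(mu_lnK, sigma_lnK, n)
velocity = K * gradient / porosity        # seepage velocity [m/s]
travel_time = distance / velocity         # advective travel time [s]
decay = np.log(2.0) / half_life
c_receptor = c0 * np.exp(-decay * travel_time)

# Exceedance probability = fraction of realisations above the standard.
p_exceed = np.mean(c_receptor > standard)
print(f"P(exceed standard at receptor) = {p_exceed:.3f}")
```

Repeating this calculation over a grid of deterministic parameter values (distance, gradient, half-life) traces out the exceedance probability surface the abstract refers to.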

2.
The material flow analysis method can be used to assess the impact of environmental sanitation systems on resource consumption and environmental pollution. However, given limited access to reliable data, applying this data-intensive method in developing countries may be difficult. This paper presents an approach for developing material flow models despite limited data availability. An iterative procedure is of key importance: model parameter values are first assessed on the basis of a literature review and elicited expert judgement; if model outputs are not plausible, sensitive input parameters are reassessed more accurately. Moreover, model parameters can be expressed as probability distributions and their uncertainty estimated using Monte Carlo simulation. The impact of environmental sanitation systems on the phosphorus load discharged into surface water in Hanoi, Vietnam, is simulated by applying the proposed approach.

3.
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in quantifying input uncertainty even with little information. Uncertainties in forest carbon budget projections were examined with Monte Carlo analyses of the model FORCARB. We identified model sensitivity to range, shape, and covariability among model probability density functions, even under conditions of limited initial information. Distributional forms of probabilities were not as important as covariability or ranges of values. Covariability among FORCARB model parameters emerged as a very influential component of uncertainty, especially for estimates of average annual carbon flux.
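The paper's central finding, that covariability among parameters can dominate the uncertainty of a flux estimate, is easy to reproduce with a toy two-parameter Monte Carlo experiment (the parameters and values are invented for illustration; FORCARB itself is far more detailed):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Two invented carbon-budget parameters: a gain rate g and a loss rate d.
mean = np.array([1.0, 0.6])
sd = np.array([0.20, 0.15])

def sample_flux(corr):
    """Sample net annual flux = g - d with a given parameter correlation."""
    cov = np.array([[sd[0] ** 2, corr * sd[0] * sd[1]],
                    [corr * sd[0] * sd[1], sd[1] ** 2]])
    g, d = rng.multivariate_normal(mean, cov, n).T
    return g - d

flux_indep = sample_flux(0.0)   # parameters treated as independent
flux_corr = sample_flux(0.8)    # strongly covarying parameters

# Positive covariance between gain and loss cancels in their difference,
# so ignoring covariability here would overstate the flux uncertainty.
print(f"std, independent: {flux_indep.std():.3f}")
print(f"std, correlated:  {flux_corr.std():.3f}")
```

The sign of the effect depends on how the parameters enter the model: covariance can shrink the output variance (as here) or inflate it, which is why the paper flags it as influential.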

4.
Nitrogen flows impacted by human activities in the Day-Nhue River Basin in northern Vietnam have been modeled using adapted material flow analysis (MFA). This study introduces a modified uncertainty analysis procedure and demonstrates its importance in MFA. We generated probability distributions using Monte Carlo simulation, calculated the nitrogen budget for each process, and then evaluated plausibility under three different criterion sets. The third criterion, which uses one standard deviation of the budget value as the confidence interval and 68% as the confidence level, could be applied to effectively identify hidden uncertainties in the MFA system. Sensitivity analysis was conducted to revise parameters, followed by reassessment of the model structure, revising equations or the flow regime where necessary. The number of processes that passed the plausibility test increased from five to nine after reassessment of model uncertainty, indicating improved model quality. Applying the uncertainty analysis to this case study revealed that reassessing the equations in the aquaculture process substantially changed the computed nitrogen flows to the environment: nitrogen loads to the atmosphere and to soil/groundwater increased (by 17% and 41%, respectively), while the nitrogen load to surface water decreased by 58%. The modified uncertainty analysis is therefore an important screening step for ensuring the quality of MFA modeling.
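The third plausibility criterion can be sketched as a simple Monte Carlo budget check on one process (the flows and uncertainties below are hypothetical, not the Day-Nhue data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# One invented MFA process with uncertain flows [t N / yr].
inflow   = rng.normal(100.0, 8.0, n)
outflow1 = rng.normal(60.0, 6.0, n)
outflow2 = rng.normal(38.0, 5.0, n)

budget = inflow - outflow1 - outflow2   # should scatter around zero

def plausible(b):
    """Third criterion: the process passes if zero lies within the
    budget's mean +/- one standard deviation (a ~68% interval)."""
    return abs(b.mean()) <= b.std()

print(budget.mean(), budget.std(), plausible(budget))
```

A process failing this check points either to a hidden flow or to an equation that needs revising, which is exactly the screening role the study assigns to the criterion.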

5.
This research analyses the application of spatially explicit sensitivity and uncertainty analysis for GIS (Geographic Information System) multicriteria decision analysis (MCDA) within a multi-dimensional vulnerability assessment regarding flooding in the Salzach river catchment in Austria. The research methodology is based on a spatially explicit sensitivity and uncertainty analysis of GIS-MCDA for an assessment of the social, economic, and environmental dimensions of vulnerability. The main objective of this research is to demonstrate how a unified approach of uncertainty and sensitivity analysis can be applied to minimise the uncertainty associated with each dimension of the vulnerability assessment. The proposed methodology comprises four main steps. The first step is computing criteria weights using the analytic hierarchy process (AHP). In the second step, Monte Carlo simulation is applied to calculate the uncertainties associated with the AHP weights. In the third step, global sensitivity analysis (GSA) is employed as a model-independent method of output variance decomposition, in which the variability of the different vulnerability assessments is apportioned to every criterion weight, generating one first-order (S) and one total-effect (ST) sensitivity index map per criterion weight. Finally, in the fourth step, an ordered weighted averaging method is applied to model the final vulnerability maps. The results demonstrate the robustness of spatially explicit GSA for minimising the uncertainty associated with GIS-MCDA models. Based on these results, we conclude that applying variance-based GSA enables assessment of the importance of each input factor for the results of the GIS-MCDA method, both spatially and statistically, and we therefore recommend GIS-based GSA as a useful methodology for minimising the uncertainty of GIS-MCDA.
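Steps one and two, AHP weighting followed by Monte Carlo perturbation of the pairwise judgements, can be sketched as follows (the comparison matrix and perturbation magnitude are illustrative assumptions, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 3x3 pairwise comparison matrix on the Saaty scale.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

def ahp_weights(M):
    """AHP weights: principal right eigenvector, normalised to sum to 1."""
    vals, vecs = np.linalg.eig(M)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

w0 = ahp_weights(A)

# Monte Carlo step: perturb the upper-triangle judgements multiplicatively,
# restore reciprocity, and collect the resulting weight vectors.
samples = []
for _ in range(2000):
    P = A.copy()
    for i in range(3):
        for j in range(i + 1, 3):
            P[i, j] = A[i, j] * rng.lognormal(0.0, 0.1)
            P[j, i] = 1.0 / P[i, j]
    samples.append(ahp_weights(P))
samples = np.asarray(samples)

print("nominal weights:", w0.round(3))
print("weight std devs:", samples.std(axis=0).round(3))
```

The sampled weight vectors are what a variance-based GSA would then decompose, criterion by criterion, into the first-order and total-effect index maps the abstract describes.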

6.
ABSTRACT: A simulation analysis of contaminated sediment transport involves model selection, data collection, model calibration and verification, and evaluation of uncertainty in the results. Sensitivity analyses provide information to address these issues at several stages of the investigation. A sensitivity analysis of simulated contaminated sediment transport is used to identify the most sensitive output variables and the parameters most responsible for the output variable sensitivity. The output variables included are streamflow and the fluxes of sediment and ¹³⁷Cs. The sensitivities of these variables are measured at the field and intermediate scales, for flood and normal flow conditions, using the HSPF computer model. A sensitivity index was used to summarize and compare the results for a large number of output variables and parameters. An extensive database was developed to calibrate the model and conduct the sensitivity analysis on a 6.2 mi² catchment in eastern Tennessee. The fluxes of sediment and ¹³⁷Cs were more sensitive than streamflow to changes in parameters for both flood and normal flow conditions. The relative significance of specific parameters on output variable sensitivity varied with the flow condition and the location in the catchment. An implications section illustrates how sensitivity analysis results can help with model selection, planning data collection, calibration, and uncertainty analysis.
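A normalised sensitivity index of the kind used to rank parameters across many outputs can be sketched generically (both the index definition and the toy rating-curve model below are illustrative; the paper's exact formulation may differ):

```python
def sensitivity_index(model, params, name, delta=0.1):
    """Normalised sensitivity: fractional change in output per fractional
    change in one parameter (a common index; HSPF studies may define it
    differently)."""
    base = model(params)
    bumped = dict(params, **{name: params[name] * (1.0 + delta)})
    return ((model(bumped) - base) / base) / delta

# Toy sediment-flux model, flux = a * Q**b (rating-curve form, illustrative).
model = lambda p: p["a"] * p["Q"] ** p["b"]
p = {"a": 0.05, "Q": 20.0, "b": 1.8}

si_Q = sensitivity_index(model, p, "Q")   # close to b for a power law
si_a = sensitivity_index(model, p, "a")   # exactly 1 for a linear factor
print(si_Q, si_a)
```

Because the index is dimensionless, it allows the kind of comparison the abstract describes: ranking many parameters against many output variables on one scale.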

7.
Two radioactive elements of potential concern in New Hampshire (NH) groundwater, uranium (U) and radon (Rn), are investigated. Exceedance probability maps are tools to highlight locations where the concentrations of undesirable substances in groundwater may be elevated. Two forms of statistical analysis are used to create exceedance probability maps for U and Rn in NH groundwater. The first, Boosted Regression Tree (BRT), was selected for estimating U exceedance values; it computes exceedance values directly using the Bernoulli distribution function. The second, used to determine exceedance probabilities for Rn, is ordinary least squares (OLS) regression. In determining exceedance probabilities for U and Rn, the utility of a new predictor dataset, the Multi-Order Hydrologic Position (MOHP) dataset, is investigated. MOHP raster datasets have been produced nationally for the conterminous United States at 30-m resolution. The concept behind MOHP is that, for any given point on the earth's surface, there is the potential for a longer groundwater flow path as one goes deeper beneath the land surface. MOHP predictors were tested in both models: three were found useful in the BRT model and two in the OLS model. MOHP data were thus useful as predictors, along with other site characteristics, in predicting U and Rn exceedance probabilities in New Hampshire groundwater.

8.
ABSTRACT: This work presents a flexible system called the GIS-based Flood Information System (GFIS) for floodplain modeling, flood damage calculation, and flood information support. It includes two major components: floodplain modeling and custom-designed modules. Model parameters and input data are gathered, reviewed, and compiled using the custom-designed modules. Through these modules, GFIS can control the process of floodplain modeling, the presentation of simulation results, and the calculation of flood damages. Empirical stage-damage curves are used to calculate the flood damages. These curves were generated from stage-damage surveys of anthropogenic structures, crops, etc., in the coastal region of a frequently flooded area in Chia-I County, Taiwan. The average annual flood damage is calculated from the exceedance probabilities and flood damages for design rainfalls of 2-, 5-, 10-, 25-, 50-, 100-, and 200-year recurrence intervals with a duration of 24 hours. The average annual flood depth in the study area can also be calculated using the same method. The primary advantages of GFIS are its ability to accurately predict the location, depth, and duration of flooding; to calculate flood damages in the floodplain; and to compare the reduction of flood damages among flood mitigation plans.
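The average annual damage calculation, integrating event damages over their annual exceedance probabilities, can be sketched as follows (the damage figures are invented placeholders; only the seven recurrence intervals come from the abstract):

```python
import numpy as np

# Invented event damages for the seven design storms [millions, any currency].
return_periods = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0, 200.0])
damages = np.array([0.0, 1.2, 3.5, 8.0, 14.0, 22.0, 35.0])

p = 1.0 / return_periods        # annual exceedance probability of each event

# Expected (average) annual damage = area under the damage vs. exceedance
# probability curve, here by the trapezoidal rule.
order = np.argsort(p)
pe, d = p[order], damages[order]
ead = float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(pe)))
print(f"expected annual damage = {ead:.4f}")
```

Replacing damages with simulated flood depths gives the average annual flood depth by the same integration, as the abstract notes.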

9.
Nutrient concentration targets are an important component of managing river eutrophication. Relationships between periphyton biomass and site characteristics for 78 gravel-bed rivers in New Zealand were represented by regression models. The regression models had large uncertainties but identified broad-scale drivers of periphyton biomass. The models were used to derive concentration targets for the nutrients total nitrogen (TN) and dissolved reactive phosphorus (DRP) for 21 river classes, to achieve periphyton biomass thresholds of 50, 120, and 200 mg chlorophyll a m⁻². The targets incorporated a temporal exceedance criterion requiring that the specified biomass threshold not be exceeded in more than 8% of samples. The targets also incorporated a spatial exceedance criterion requiring that the biomass thresholds not be exceeded at more than a fixed proportion (10%, 20%, or 50%) of locations. The spatial exceedance criterion implies that, rather than requiring specific conditions at individual sites, the objective is to restrict biomass to acceptable levels at a majority of locations within a domain of interest. A Monte Carlo analysis was used to derive the uncertainty of the nutrient concentration targets for TN and DRP; the uncertainties reduce with increasing size of the spatial domain. Tests indicated the nutrient concentration targets were reasonably consistent with independent periphyton biomass data, despite differences in the protocols used to measure biomass at the training and test sites.
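The two exceedance criteria are simple to express in code. In this minimal sketch, the 8% temporal criterion and the 120 mg m⁻² threshold come from the abstract; the biomass series and the 80% spatial pass rate are invented for illustration:

```python
import numpy as np

def meets_target(biomass, threshold=120.0, allowed=0.08):
    """Temporal criterion: the threshold may be exceeded in at most 8%
    of samples at a site."""
    b = np.asarray(biomass, dtype=float)
    return np.mean(b > threshold) <= allowed

# Invented monthly chlorophyll-a series [mg chl-a m^-2] at three sites.
sites = [
    [15, 40, 90, 130, 60, 25, 10, 35, 80, 110, 45, 20],   # 1/12 exceed
    [10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 110, 115],  # 0/12 exceed
    [200, 180, 150, 130, 90, 60, 30, 20, 10, 5, 5, 5],    # 4/12 exceed
]

# Spatial criterion: the temporal criterion must hold at most locations,
# e.g. at no fewer than 80% of sites.
frac_passing = sum(bool(meets_target(s)) for s in sites) / len(sites)
print(frac_passing, frac_passing >= 0.8)
```

Note that the first site fails despite a single exceedance: 1/12 ≈ 8.3% is just above the 8% allowance, which shows how strict a fractional criterion can be with short records.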

10.
This paper examines the performance of a semi-distributed hydrology model, the Soil and Water Assessment Tool (SWAT), using Sequential Uncertainty Fitting (SUFI-2), generalized likelihood uncertainty estimation (GLUE), parameter solution (ParaSol), and particle swarm optimization (PSO). We applied SWAT to the Waccamaw watershed, a shallow-aquifer-dominated Coastal Plain watershed in the Southeastern United States (U.S.). The model was calibrated (2003-2005) and validated (2006-2007) at two U.S. Geological Survey gaging stations, using significant parameters related to surface hydrology, hydrogeology, hydraulics, and physical properties. SWAT performed best during intervals with wet and normal antecedent conditions, with varying sensitivity to effluent channel shape and characteristics. In addition, the calibration of all algorithms depended mostly on Manning's n-value for the tributary channels, the surface friction resistance factor used in generating runoff. SUFI-2 and PSO simulated the same relative probability distribution tails as those observed at an upstream outlet, while all methods except ParaSol exhibited longer tails at a downstream outlet. The ParaSol results exhibited large skewness, suggesting a global search algorithm was less capable of characterizing parameter uncertainty. Our findings provide insights regarding parameter sensitivity and uncertainty, as well as modeling diagnostic analysis, that can improve hydrologic theory and prediction in complex watersheds. Editor's note: This paper is part of the featured series on SWAT Applications for Emerging Hydrologic and Water Quality Challenges. See the February 2017 issue for the introduction and background to the series.

11.
The main objective of this research is to model the uncertainty associated with GIS-based multi-criteria decision analysis (MCDA) for crop suitability assessment. To this end, an integrated approach combining GIS-MCDA with Monte Carlo simulation (MCS) and global sensitivity analysis (GSA) was applied to saffron suitability mapping in East Azerbaijan Province, Iran. The results indicate that integrating MCDA with MCS and GSA can improve modeling precision by reducing variance in the output. Applying the MCS method with local training data makes it possible to compute the spatial correlation between criteria weights and the characteristics of the study area. The GSA results also allow the criteria to be prioritised, identifying the most important criteria and the variability of the outputs under uncertainty in the model inputs. The findings show that commonly used zoning methods, which neglect the interaction effects of variables, carry significant errors and uncertainty in the output of MCDA-based suitability models; this uncertainty should be minimized by complementary sensitivity and uncertainty analysis.

12.
ABSTRACT. Recent advances in water quality modeling have pointed out the need for stochastic models to simulate the probabilistic nature of water quality. However, often all that is needed is an estimate of the uncertainty in predicting water quality variables. First-order analysis is a simple method of estimating the uncertainty in a deterministic model due to uncertain parameters. The method is applied to the simplified Streeter-Phelps equations for DO and BOD; a more complete Monte Carlo simulation is used to check the accuracy of the results. The first-order analysis is found to give accurate estimates of the means and variances of DO and BOD up to travel times exceeding the critical time. Uncertainty in travel time and in the BOD decay constant is found to be most important for small travel times; uncertainty in the reaeration coefficient dominates near the critical time. Uncertainty in temperature was found to be a negligible source of uncertainty in DO for all travel times.
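First-order analysis propagates parameter variances through the model via local derivatives. A sketch for the Streeter-Phelps oxygen deficit, with a Monte Carlo check as in the paper, follows; all parameter values and uncertainties are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)

def deficit(L0, kd, ka, t, D0=0.0):
    """Streeter-Phelps dissolved-oxygen deficit [mg/L] at travel time t."""
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
        + D0 * np.exp(-ka * t)

# Nominal values and standard deviations (illustrative only).
x0 = np.array([20.0, 0.30, 0.60])   # L0 [mg/L], kd [1/day], ka [1/day]
sx = np.array([2.0, 0.03, 0.05])
t = 2.0                             # travel time [days]

# First-order variance: sum of (dD/dx_i * sigma_i)^2, with derivatives
# taken by central finite differences.
grad = np.empty(3)
for i in range(3):
    h = 1e-5 * x0[i]
    up, dn = x0.copy(), x0.copy()
    up[i] += h
    dn[i] -= h
    grad[i] = (deficit(*up, t) - deficit(*dn, t)) / (2.0 * h)
sd_first_order = float(np.sqrt(np.sum((grad * sx) ** 2)))

# Monte Carlo check, as in the paper.
X = rng.normal(x0, sx, size=(100_000, 3))
sd_mc = float(deficit(X[:, 0], X[:, 1], X[:, 2], t).std())
print(sd_first_order, sd_mc)
```

For mild parameter uncertainty the two estimates agree closely, which is the paper's point: the cheap first-order method is adequate until nonlinearity becomes strong (e.g. beyond the critical time).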

13.
This article presents the design of a fuzzy decision support system (DSS) for the assessment of alternative strategies proposed for the restoration of Lake Koronia, Greece. Fuzzy estimates for the critical characteristics of the possible strategies, such as feasibility, environmental impact, implementation time, and costs are evaluated and supplied to the fuzzy DSS. Different weighting factors are assigned to the critical characteristics and the proposed strategies are ordered with respect to the system responses. The best strategies are selected and their expected impact on the ecosystem is evaluated with the aid of a fuzzy model of the lake. Sensitivity analysis and simulation results have shown that the proposed fuzzy DSS can serve as a valuable tool for the selection and evaluation of appropriate management actions. Note: This version was published online in June 2005 with the cover date of August 2004.

14.
ABSTRACT: The probability distributions of annual peak flows used in flood risk analysis quantify the risk that a design flood will be exceeded. But the parameters of these distributions are themselves uncertain to a degree, and this uncertainty increases the risk that the flood protection provided will in fact prove inadequate. The increase in flood risk due to parameter uncertainty is small when a fairly long record of data is available and the annual flood peaks are serially independent, which is the standard assumption in flood frequency analysis. But standard tests for serial independence are insensitive to the grouping of high and low values in a time series, which is measured by the Hurst coefficient, and this grouping increases the parameter uncertainty considerably. A study of 49 annual peak flow series for Canadian rivers shows that many have a high Hurst coefficient. The corresponding increase in flood risk due to parameter uncertainty is shown to be substantial even for rivers with a long record, and therefore should not be neglected. The paper presents a method of rationally combining parameter uncertainty due to serial correlation and the stochastic variability of peak flows in a single risk assessment. In addition, a relatively simple time series model capable of reproducing the observed serial correlation of flood peaks is presented.
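The Hurst coefficient that measures this grouping can be estimated with the classical rescaled-range statistic. The single-block sketch below is a minimal illustration; practical estimators fit log(R/S) against log(n) over many block sizes:

```python
import numpy as np

def hurst_rs(x):
    """Single-block rescaled-range (R/S) estimate of the Hurst coefficient
    (a minimal sketch, biased for short series)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    y = np.cumsum(x - x.mean())    # cumulative departures from the mean
    r = y.max() - y.min()          # range of the cumulative departures
    s = x.std(ddof=1)              # sample standard deviation
    return float(np.log(r / s) / np.log(n))

rng = np.random.default_rng(5)
white = rng.normal(size=4096)      # serially independent "annual peaks"
print(hurst_rs(white))             # near 0.5 for independent data
```

Values well above 0.5, as found for many of the 49 Canadian series, signal persistent grouping of high and low peaks and hence larger parameter uncertainty than the independence assumption implies.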

15.
Surveillance for invasive non-indigenous species (NIS) is an integral part of a quarantine system. Estimating the efficiency of a surveillance strategy relies on many uncertain parameters estimated by experts, such as the efficiency of its components in the face of the specific NIS, the ability of the NIS to inhabit different environments, and so on. Because it is important to detect an invasive NIS within a critical period of time, it is crucial that these uncertainties be accounted for in the design of the surveillance system. We formulate a detection model that takes into account, in addition to structured sampling for incursive NIS, incidental detection by untrained workers. We use info-gap theory for satisficing (not maximizing) the probability of detection, while at the same time maximizing the robustness to uncertainty. We demonstrate the trade-off between robustness to uncertainty and an increase in the required probability of detection. An empirical example based on the detection of Pheidole megacephala on Barrow Island demonstrates the use of info-gap analysis to select a surveillance strategy.
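The robustness trade-off can be illustrated with the simplest possible info-gap model: a fractional-error uncertainty envelope around an expert's nominal detection probability. This toy model is ours, not the paper's:

```python
def robustness(p_nominal, p_required):
    """Info-gap robustness under the uncertainty model
    |p - p_nominal| <= alpha * p_nominal: the largest horizon alpha for
    which even the worst-case detection probability, p_nominal * (1 - alpha),
    still satisfies the requirement."""
    return max(0.0, 1.0 - p_required / p_nominal)

# Trade-off: demanding a higher detection probability leaves less
# robustness to uncertainty in the expert-estimated nominal value.
for req in (0.5, 0.7, 0.9):
    print(f"required Pd = {req:.1f} -> robustness = {robustness(0.8, req):.3f}")
```

Robustness falls to zero once the requirement reaches the nominal estimate itself, which is the satisficing logic in miniature: modest performance demands buy immunity to estimation error.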

16.
Salinity in the Upper Colorado River Basin (UCRB) is due to both natural sources and processes and anthropogenic activities. Given economic damage due to salinity of $295 million in 2010, understanding salinity sources, production, and transport is of great importance. SPAtially Referenced Regression On Watershed attributes (SPARROW) is a nonlinear regression water quality model that simulates sources and transport of contaminants such as dissolved solids. However, SPARROW simulations of dissolved solids in the UCRB only represent conditions through 1998 due to limited data availability. More importantly, prior simulations focused on a single-year calibration and its transferability to other years; the validity of this approach is questionable given changing hydrologic and climatic conditions. This study presents different calibration approaches to assess the best approach for reducing model uncertainty. Simulations were conducted from 1999 to 2011, and the results showed good model accuracy. However, the number of monitoring stations decreased significantly in recent years, resulting in higher model uncertainty. The uncertainty analysis was conducted using SPARROW results and bootstrapping. The results suggest that watershed rankings based on salinity yields changed under the uncertainty analysis, and therefore uncertainty consideration should be an important part of the management strategy.

17.
Parametric methods (error propagation assuming normal estimates) and nonparametric methods (bootstrap and enumeration of combinations) for assessing the uncertainty in calculated rates of nitrogen loading were compared, based on the propagation of uncertainty observed in the variables used in the calculation. In addition, since such calculations are often based on literature surveys rather than random replicate measurements for the site in question, error propagation was also compared using the uncertainty of the sampled population (e.g., the standard deviation) as well as the uncertainty of the mean (e.g., the standard error of the mean). Calculations of the predicted nitrogen loading to a shallow estuary (Waquoit Bay, MA) were used as an example. The previously estimated mean loading from the watershed (5,400 ha) to Waquoit Bay (600 ha) was 23,000 kg N yr⁻¹. The mode of a nonparametric estimate of the probability distribution differed dramatically, equaling only 70% of this mean. Repeated observations were available for only 8 of the 16 variables used in our calculation; we estimated uncertainty in model predictions by treating these as sample replicates. Parametric and nonparametric estimates of the standard error of the mean loading rate were 12-14%. However, since the available data include site-to-site variability, as is often the case, the standard error may be an inappropriate measure of confidence. The standard deviations were around 38% of the loading rate. Further, the 95% confidence intervals differed between the nonparametric and parametric methods, with those of the nonparametric method arranged asymmetrically around the predicted loading rate. The disparity in magnitude and symmetry of the calculated confidence limits argues for careful consideration of the nature of the uncertainty of variables used in chained calculations.
This analysis also suggests that a nonparametric method of calculating loading rates, using the most frequently observed values of the variables in the loading calculation, may be more appropriate than using mean values. These findings reinforce the importance of assessing uncertainty when evaluating nutrient loading rates in research and planning. Risk assessment, which may need to consider the relative probability of extreme events in worst-case scenarios, will be in serious error using normal estimates, or even the nonparametric bootstrap; a method such as our enumeration of combinations produces a more reliable distribution of risk.
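The contrast between parametric propagation of standard errors and a nonparametric bootstrap can be sketched for a minimal two-variable chained calculation (the replicate data are invented; the paper's calculation used 16 variables):

```python
import numpy as np

rng = np.random.default_rng(11)

# Invented replicate measurements for a chained calculation:
# load = mean concentration * mean discharge.
conc = np.array([1.8, 2.1, 2.6, 1.5, 2.9, 2.2, 1.9, 2.4])              # mg N/L
flow = np.array([95.0, 110.0, 88.0, 130.0, 105.0, 99.0, 120.0, 92.0])  # L/s

load = conc.mean() * flow.mean()

# Parametric: first-order propagation of the standard errors of the means.
se = lambda a: a.std(ddof=1) / np.sqrt(a.size)
se_load = load * np.sqrt((se(conc) / conc.mean()) ** 2
                         + (se(flow) / flow.mean()) ** 2)

# Nonparametric: bootstrap each sample independently and recompute.
boot = np.array([rng.choice(conc, size=conc.size, replace=True).mean()
                 * rng.choice(flow, size=flow.size, replace=True).mean()
                 for _ in range(5000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"load = {load:.1f}, parametric SE = {se_load:.1f}, "
      f"bootstrap 95% CI = ({lo:.1f}, {hi:.1f})")
```

With more variables and skewed inputs, the bootstrap interval becomes asymmetric around the point estimate, which is the disparity the abstract highlights; substituting the standard deviation for the standard error in `se` reproduces the population-uncertainty variant the authors also examine.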

18.
Legislation on the protection of biodiversity (e.g., the European Union Habitats and Birds Directives) increasingly requires ecological impact assessment of human activities. However, knowledge and understanding of the relevant ecological processes and species responses to different types of impact are often incomplete. In this paper we demonstrate with a case study how impact assessment can be carried out when data are scarce but some expert knowledge is available. The case study involves two amphibian species, the great crested newt (Triturus cristatus) and the natterjack toad (Bufo calamita), in the Meinweg nature reserve in the Netherlands, where plans are being developed to reopen an old railway track called the Iron Rhine. We assess the effects of this railway track and its proposed alternatives (scenarios) on the metapopulation extinction time and the patch occupancy times for both species using a discrete-time stochastic metapopulation model. We quantify the model parameters using expert knowledge and extrapolated data. Because of our uncertainty about these parameter values, we perform a Monte Carlo uncertainty analysis. This yields an estimate of the probability distribution of the model predictions and insight into the contribution of each distinguished source of uncertainty to this distribution. We show that with a simple metapopulation model and an extensive uncertainty analysis it is possible to detect the least harmful scenario, and the ranking of the different scenarios is consistent. Thus, uncertainty analysis can enhance the role of ecological impact assessment in decision making by making explicit the extent to which incomplete knowledge affects predictions.
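A discrete-time stochastic patch-occupancy model with Monte Carlo parameter uncertainty can be sketched in a few lines. The patch count, parameter ranges, and dynamics below are invented and far simpler than the study's model:

```python
import numpy as np

rng = np.random.default_rng(8)

def extinction_time(c, e, n_patches=10, t_max=200):
    """One run of a toy discrete-time stochastic patch-occupancy model.
    Each occupied patch goes extinct with probability e per step; each
    empty patch is colonised with probability 1 - (1 - c)**k, where k is
    the number of currently occupied patches."""
    occupied = np.ones(n_patches, dtype=bool)
    for t in range(1, t_max + 1):
        k = int(occupied.sum())
        if k == 0:
            return t                       # metapopulation went extinct
        col = 1.0 - (1.0 - c) ** k
        survive = occupied & (rng.random(n_patches) > e)
        colonise = (~occupied) & (rng.random(n_patches) < col)
        occupied = survive | colonise
    return t_max                           # censored: still extant at t_max

# Parameter uncertainty: draw c and e from expert-style ranges and look
# at the spread of predicted extinction times.
times = [extinction_time(rng.uniform(0.05, 0.2), rng.uniform(0.1, 0.3))
         for _ in range(500)]
print(f"median extinction time: {np.median(times):.0f} steps")
```

Running this for each scenario (each scenario altering c or e for affected patches) and comparing the resulting extinction-time distributions is the kind of ranking exercise the study performs.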

19.
Abstract: Systematic consideration of uncertainty in data, model structure, and other factors is generally unaddressed in most Total Maximum Daily Load (TMDL) calculations. Our previous studies developed the Management Objectives Constrained Analysis of Uncertainty (MOCAU) approach as an uncertainty analysis technique specifically for watershed water quality models, based on a synthetic case. In this study, we applied MOCAU to analyze diazinon loading in the Newport Bay watershed (Southern California). The study objectives were (1) to demonstrate the value of performing stochastic simulation and uncertainty analysis for TMDL development, using MOCAU as the technique, and (2) to evaluate the existing diazinon TMDL and generate insights for the development of scientifically sound TMDLs that consider uncertainty. The Watershed Analysis Risk Management Framework model was used as an example of a complex watershed model. The study revealed the importance and feasibility of conducting stochastic watershed water quality simulation for TMDL development, and clearly demonstrated the critical role of management objectives in a systematic uncertainty assessment. The results offer insights for TMDL calculation, model structure improvement, and sampling strategy design.

20.
This article presents the methods and results of an interlaboratory comparison of electromagnetic field (EMF) survey measurements performed by the Electromagnetic Environment Protection Laboratory. Based on the results, the author analyzes the factors affecting the precision of EMF measurements, in particular the difficult-to-estimate "human factor". In practice, the human factor has never been taken into account in EMF measurement uncertainty budgets; the author estimates that this factor may account for up to half of the total measurement uncertainty. Taking into account all factors degrading the accuracy of the measurements, the uncertainty of survey EMF measurements was estimated at the level of 2-4 dB.
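How a single dominant component such as the human factor can account for half of a root-sum-square uncertainty budget is easy to see numerically. The component names and values below are hypothetical, not the laboratory's budget:

```python
import math

# Hypothetical uncertainty budget for a survey field-strength measurement,
# components expressed in dB and combined root-sum-square, as is common
# in EMC practice.
components = {
    "probe calibration": 1.0,
    "frequency response": 0.8,
    "linearity": 0.5,
    "positioning / human factor": 1.5,   # the hard-to-estimate term
}
u_c = math.sqrt(sum(u ** 2 for u in components.values()))
share_human = components["positioning / human factor"] ** 2 / u_c ** 2
print(f"combined uncertainty = {u_c:.2f} dB, "
      f"human-factor share = {share_human:.0%}")
```

Because components combine in quadrature, the largest term dominates the variance; here one 1.5 dB term contributes over half of the combined budget, consistent with the scale of the article's claim.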


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号