Similar Literature
20 similar documents found (search time: 15 ms)
1.
ABSTRACT: A framework for sensitivity and error analysis in mathematical modeling is described and demonstrated. The Lake Eutrophication Analysis Procedure (LEAP) consists of a series of linked models which predict lake water quality conditions as a function of watershed land use, hydrologic variables, and morphometric variables. Specification of input variables as distributions (means and standard errors) and use of first-order error analysis techniques permits estimation of output variable means, standard errors, and confidence ranges. Predicted distributions compare favorably with those estimated using Monte Carlo simulation. The framework is demonstrated by applying it to data from Lake Morey, Vermont. While possible biases exist in the models calibrated for this application, prediction variances, attributed chiefly to model error, are comparable to the observed year-to-year variance in water quality, as measured by spring phosphorus concentration, hypolimnetic oxygen depletion rate, summer chlorophyll-a, and summer transparency in this lake. Use of the framework provides insight into important controlling factors and relationships and identifies the major sources of uncertainty in a given model application.
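As an illustration of first-order error analysis of this kind, the sketch below propagates input standard errors through a generic model via a first-order Taylor approximation: Var(f) is approximated by the sum of squared partial derivatives times the input variances. This is a minimal Python sketch, not LEAP itself; the model function, input means, and standard errors are hypothetical placeholders.

```python
import numpy as np

def first_order_error(f, means, ses, h=1e-5):
    """First-order (Taylor) error propagation: approximate the output
    variance as sum_i (df/dx_i)^2 * se_i^2, with gradients obtained by
    central finite differences around the input means."""
    means = np.asarray(means, float)
    grad = np.empty_like(means)
    for i in range(means.size):
        x_up, x_dn = means.copy(), means.copy()
        step = h * max(1.0, abs(means[i]))
        x_up[i] += step
        x_dn[i] -= step
        grad[i] = (f(x_up) - f(x_dn)) / (2.0 * step)
    var = np.sum((grad * np.asarray(ses, float)) ** 2)
    return f(means), np.sqrt(var)

# Hypothetical loading model: a response computed from an areal load,
# mean depth, and overflow rate (illustrative functional form only).
model = lambda x: x[0] / (x[1] * (1.0 + np.sqrt(x[2])))
mean_out, se_out = first_order_error(model, means=[1000.0, 8.0, 2.0],
                                     ses=[200.0, 0.5, 0.4])
print(f"prediction {mean_out:.1f} +/- {se_out:.1f} (1 s.e.)")
```

The same inputs can be pushed through a Monte Carlo loop (sampling each input from its distribution) to check the first-order approximation, mirroring the comparison made in the abstract.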

2.
A topic of interest in the finance world is measuring systematic risk. Accurately measuring the systematic risk component—or Beta—of an asset or portfolio is important in many financial applications. In this work, we consider the efficiency of a range of Beta estimation methods commonly used in practice from a reference-day risk perspective. We show that, when using the industry-standard data sample of 5 years of monthly returns, the choice of reference day used to calculate underlying returns has a significant impact on all of the Beta estimation methods considered. Driven by this finding, we propose and test an alternative nonparametric bootstrap approach for calculating Beta estimates that is unaffected by reference-day risk. Our primary goal is to determine a point estimate of Beta that is independent of the reference day.
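A minimal sketch of a nonparametric bootstrap Beta estimate, assuming the standard slope definition Beta = cov(r_a, r_m) / var(r_m). Resampling paired monthly returns is a simplification for illustration; the paper's procedure works from underlying prices across different reference days, which is not reproduced here, and all data below are synthetic.

```python
import numpy as np

def bootstrap_beta(asset_ret, market_ret, n_boot=10_000, seed=0):
    """Nonparametric bootstrap point estimate of Beta: resample paired
    (asset, market) returns with replacement and average the slope
    cov(r_a, r_m) / var(r_m) over replicates."""
    rng = np.random.default_rng(seed)
    a, m = np.asarray(asset_ret), np.asarray(market_ret)
    n = a.size
    betas = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)        # resample pairs jointly
        ab, mb = a[idx], m[idx]
        betas[b] = np.cov(ab, mb)[0, 1] / np.var(mb, ddof=1)
    return betas.mean(), betas.std(ddof=1)

# Synthetic illustration: 60 "monthly" return pairs with true Beta ~ 1.2.
rng = np.random.default_rng(1)
rm = rng.normal(0.005, 0.04, 60)
ra = 1.2 * rm + rng.normal(0, 0.02, 60)
print(bootstrap_beta(ra, rm))   # point estimate and bootstrap s.e.
```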

3.
This paper presents a Data Envelopment Analysis (DEA) model combined with bootstrapping to assess performance in mining operations. Since DEA-type indicators based on nonparametric production analysis are simply point estimates without any standard error, we provide a methodology to assess the performance of strip mining operations by means of a DEA bootstrapping approach. This methodology is applied to a sample of fifteen Illinois strip coal mines using publicly available data (Thompson et al., 1995). The applied approach uses a mixed mine environmental performance indicator (MMEPI) that is derived by means of a VRS DEA environmental technology treating overburden as an undesirable output under the weak disposability assumption, and we compare this measure with a traditional output-oriented mine performance indicator (MPI) omitting overburden. Although omitting the undesirable output biases the performance estimates, these sample-specific results indicate the bias is not statistically significant. The confidence intervals derived by bootstrapping the proposed MMEPI point estimates indicate that significant inefficiency has taken place in the analyzed sample of Illinois strip mines.
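For readers unfamiliar with DEA, the sketch below computes an output-oriented VRS efficiency score with scipy's linear programming solver. It omits the paper's weak-disposability treatment of overburden as an undesirable output, and the data are toy numbers, not the Illinois mine sample.

```python
import numpy as np
from scipy.optimize import linprog

def dea_output_vrs(X, Y, o):
    """Output-oriented VRS DEA score for DMU `o`.
    X: (n, p) inputs; Y: (n, q) outputs. Solve
        max phi  s.t.  sum_j lam_j x_j <= x_o,
                       sum_j lam_j y_j >= phi * y_o,
                       sum_j lam_j = 1,  lam >= 0.
    Decision vector: [phi, lam_1..lam_n]."""
    n, p = X.shape
    q = Y.shape[1]
    c = np.zeros(n + 1); c[0] = -1.0              # maximize phi
    A_in = np.hstack([np.zeros((p, 1)), X.T])     # lam @ x <= x_o
    A_out = np.hstack([Y[o][:, None], -Y.T])      # phi*y_o - lam @ y <= 0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([X[o], np.zeros(q)])
    A_eq = np.hstack([[0.0], np.ones(n)])[None, :]  # VRS: sum(lam) = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]   # phi >= 1; a DMU with phi == 1 is on the frontier

# Toy data: 6 mines, 2 inputs (labor, capital), 1 output (coal).
X = np.array([[4, 3], [6, 2], [5, 5], [8, 4], [3, 6], [7, 7]], float)
Y = np.array([[10], [12], [11], [14], [9], [12]], float)
print(np.round([dea_output_vrs(X, Y, o) for o in range(len(X))], 3))
```

Note that a naive bootstrap that simply resamples DMUs is known to be inconsistent for DEA frontiers; published applications typically use the smoothed bootstrap of Simar and Wilson, which this sketch does not implement.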

4.
Abstract: Systematic consideration of uncertainty in data, model structure, and other factors is generally unaddressed in most Total Maximum Daily Load (TMDL) calculations. Our previous studies developed the Management Objectives Constrained Analysis of Uncertainty (MOCAU) approach as an uncertainty analysis technique specifically for watershed water quality models, based on a synthetic case. In this study, we applied MOCAU to analyze diazinon loading in the Newport Bay watershed (Southern California). The study objectives included (1) demonstrating the value of performing stochastic simulation and uncertainty analysis for TMDL development, using MOCAU as the technique, and (2) evaluating the existing diazinon TMDL and generating insights for the development of scientifically sound TMDLs that consider uncertainty. The Watershed Analysis Risk Management Framework model was used as an example of a complex watershed model. The study revealed the importance and feasibility of conducting stochastic watershed water quality simulation for TMDL development. The critical role of management objectives in a systematic uncertainty assessment was well demonstrated. The results of this study offer practical insights for TMDL calculation, model structure improvement, and sampling strategy design.

5.
Abstract: A mix of causative mechanisms may be responsible for flooding at a site. Floods may be caused by extreme rainfall or by rain falling on previous rainfall events. The statistical attributes of these events differ according to the watershed characteristics and the causes. Traditional methods of flood frequency analysis are only adequate for specific situations. Also, to address the uncertainty of flood frequency estimates for hydraulic structures, a series of probabilistic analyses of rainfall-runoff and flow routing models, and their associated inputs, are used. This is a complex problem in that the probability distributions of multiple independent and derived random variables need to be estimated to evaluate the probability of floods. Therefore, the objectives of this study were to develop a flood frequency curve derivation method driven by multiple random variables and to develop a tool that can consider the uncertainties of design floods. This study focuses on developing a flood frequency curve based on nonparametric statistical methods for the estimation of probabilities of rare floods that are more appropriate in Korea. To derive the frequency curve, rainfall generation using the nonparametric kernel density estimation approach is proposed. Many flood events are simulated by nonparametric Monte Carlo simulations coupled with the centered Latin hypercube sampling method to estimate the associated uncertainty. This study applies the methods described to a Korean watershed. The results show greater physical appropriateness and provide reasonable estimates of design floods.
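A stripped-down Python sketch of the simulation chain described above: a Gaussian kernel density estimate generates synthetic rainfall, a placeholder runoff transform stands in for the calibrated rainfall-runoff model, and plain Monte Carlo sampling replaces the centered Latin hypercube design. All data and parameters are invented for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
# Observed annual maximum daily rainfall (mm) -- synthetic stand-in record.
obs_rain = rng.gamma(shape=4.0, scale=30.0, size=40)

# Nonparametric rainfall generator: Gaussian KDE fitted to the record,
# resampled many times in a plain Monte Carlo loop.
kde = gaussian_kde(obs_rain)
rain_sim = kde.resample(10_000).ravel()
rain_sim = rain_sim[rain_sim > 0]        # discard nonphysical negative draws

# Placeholder rainfall-runoff transform; a real study would route each
# event through a hydrologic model. The runoff coefficient is itself
# sampled to carry input uncertainty forward into the flood estimates.
runoff_coeff = rng.uniform(0.4, 0.7, rain_sim.size)
flood_peaks = runoff_coeff * rain_sim

# Empirical flood frequency curve via Weibull plotting positions.
peaks = np.sort(flood_peaks)[::-1]
p_exc = np.arange(1, peaks.size + 1) / (peaks.size + 1)
print("~100-yr event (exceedance 0.01):", np.interp(0.01, p_exc, peaks))
```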

6.
This study analyzed the scope effects of respondent uncertainty in contingent valuation (CV) by evaluating whether willingness to pay (WTP) estimates were sensitive to changes in the magnitudes of motorized emission reductions in the city of Nairobi, Kenya. The WTP estimates were elicited through the conventional payment card (PC), stochastic payment card (SPC), and polychotomous payment card (PPC) formats. While the SPC and PPC formats were used to capture respondent uncertainty, the PC format captured respondent certainty regarding the amounts individuals were willing to pay for emission reductions. Based on parametric and nonparametric analysis, results show that certain (PC) respondents stated significantly larger WTP amounts for larger emission reductions than for smaller reductions. Conversely, uncertain (SPC and PPC) respondents stated smaller amounts for larger emission reductions than certain (PC) respondents. The implication is that though respondents were sensitive to the scope of motorized emission reductions, respondent uncertainty lowered their sensitivity to scope.

7.
ABSTRACT: Despite the fact that lake phosphorus loading criteria have proven to be valuable tools in lake management, they are generally subjective in nature or incomplete in form. In order to address these shortcomings, the oxic-anoxic transition point was selected as an objective quality criterion and discriminant analysis was used to construct a lake classification function. This function is dependent upon lake phosphorus loading, mean depth, and overflow rate. The value of the function may be expressed as a probability of classification (as either oxic or anoxic). When used in prediction, inclusion of the input error permits the estimation of the change in classification probability as input uncertainty is reduced. Further, the form of the discriminant function suggests that the annual volumetric loading is a more informative term for the expression of phosphorus loading than is the annual areal loading.
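The sketch below shows the general pattern: fit a linear discriminant function to labeled lakes and report the classification probability for a new lake. It uses scikit-learn and synthetic training data; the discriminant function in the paper is calibrated to real lake data, and the toy labeling rule here is purely illustrative.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical training data: [log volumetric P loading, mean depth (m),
# overflow rate (m/yr)] with oxic (0) / anoxic (1) labels.
rng = np.random.default_rng(4)
n = 60
X = np.column_stack([rng.normal(2.0, 0.6, n),
                     rng.uniform(3, 30, n),
                     rng.uniform(1, 20, n)])
# Toy rule: shallow, fast-loading lakes tend to go anoxic (plus noise).
y = (X[:, 0] - 0.05 * X[:, 1] - 0.02 * X[:, 2]
     + rng.normal(0, 0.3, n) > 1.5).astype(int)

lda = LinearDiscriminantAnalysis().fit(X, y)
new_lake = [[2.4, 8.0, 5.0]]                 # loading, depth, overflow rate
print("P(anoxic) =", lda.predict_proba(new_lake)[0, 1])
```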

8.
As demand for water in the southwestern United States increases and climate change potentially decreases the natural flows in the Colorado River system, there will be increased need to optimize the water supply. Lake Powell is a large reservoir with potentially high loss rates to bank storage and evaporation. Bank storage is estimated as a residual in the reservoir water balance. Estimates of local inflow contribute uncertainty to estimates of bank storage. Regression analyses of local inflow with gaged tributaries have improved the estimate of local inflow. Using a stochastic estimate of local inflow based on the standard error of the regression estimator and of gross evaporation based on observed variability at Lake Mead, a reservoir water balance was used to estimate that more than 14.8 billion cubic meters (Gm³) has been stored in the banks, with a 90% probability that the value is actually between 11.8 and 18.5 Gm³. Groundwater models developed by others, observed groundwater levels, and simple transmissivity calculations confirm these bank storage estimates. Assuming a constant bank storage fraction for simulations of the future may cause managers to underestimate the actual losses from the reservoir. Updated management regimes which account more accurately for bank storage and evaporation could save water that will otherwise be lost to the banks or evaporation.
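A minimal Monte Carlo version of the residual water-balance calculation described above. All magnitudes are invented placeholders, not Lake Powell data; the point is the pattern of sampling the uncertain terms (local inflow, evaporation) and reading the bank-storage residual off as a distribution with a 90% interval.

```python
import numpy as np

rng = np.random.default_rng(5)
n_sim, years = 50_000, 40

# Annual water-balance terms for a hypothetical reservoir (Gm^3/yr).
inflow_gaged = 12.0                                    # known gaged inflow
local_inflow = rng.normal(1.0, 0.25, (n_sim, years))   # regression est. +/- s.e.
evaporation  = rng.normal(0.6, 0.10, (n_sim, years))   # variability from a proxy lake
outflow      = 12.2                                    # releases, treated as known
storage_chg  = 0.05                                    # mean change in contents

# Bank storage is the residual that closes the balance each year:
#   banks = inflow + local - evaporation - outflow - d(storage)
bank_annual = inflow_gaged + local_inflow - evaporation - outflow - storage_chg
bank_total = bank_annual.sum(axis=1)

lo, mid, hi = np.percentile(bank_total, [5, 50, 95])
print(f"cumulative bank storage: {mid:.1f} Gm^3 "
      f"(90% interval {lo:.1f}-{hi:.1f} Gm^3)")
```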

9.
Hydrologic characterization at ungauged locations is one of the quintessential challenges of hydrology. Beyond simulation of historical streamflows, it is similarly important to characterize the level of uncertainty in hydrologic estimates. In tandem with updates to the Massachusetts Sustainable Yield Estimator, this work explores the application of global uncertainty estimates to daily streamflow simulations. Expanding on a method developed for deterministic modeling, this approach produces confidence intervals on daily streamflow developed through nonlinear spatial interpolation of daily streamflow using flow duration curves; the 95% confidence interval is examined. Archived cross-validations of daily streamflows from 66 watersheds in and around Massachusetts are used to evaluate an approach to uncertainty characterization. Neighboring sites are treated as ungauged, producing relative errors that can be resampled and applied to target sites. The method, with some modification, is found to provide appropriately narrow confidence intervals that contain 95% of the observed streamflows in cross-validation. Further characterizing uncertainty, multiday means of daily streamflow are evaluated. Working through cross-validation in Massachusetts, two- to three-month averages of daily streamflow show the best performance. These two approaches to uncertainty characterization inform how streamflow simulations produced for prediction in ungauged basins can be used for water resources management.
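The resampling idea can be sketched as follows: relative errors archived from cross-validation at neighboring gauges, treated as ungauged, are resampled and applied to the simulated flow at the target site to form an empirical interval. The function and data below are illustrative placeholders, not the Massachusetts implementation.

```python
import numpy as np

def empirical_interval(sim_flow, cv_rel_errors, coverage=0.95,
                       n_boot=5000, seed=6):
    """Attach an empirical confidence interval to a simulated daily flow
    by resampling cross-validation relative errors, defined here as
    (simulated - observed) / observed at gauges treated as ungauged."""
    rng = np.random.default_rng(seed)
    draws = sim_flow * (1.0 + rng.choice(cv_rel_errors, n_boot, replace=True))
    alpha = (1.0 - coverage) / 2.0
    return np.quantile(draws, [alpha, 1.0 - alpha])

# Synthetic archive of cross-validation relative errors.
rng = np.random.default_rng(7)
rel_err = rng.normal(0.0, 0.35, 2000)
# Hypothetical simulated daily flow of 3.2 m^3/s at the target site.
print(empirical_interval(sim_flow=3.2, cv_rel_errors=rel_err))
```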

10.
Tree bole volumes of 89 Scots pine (Pinus sylvestris L.), 96 Brutian pine (Pinus brutia Ten.), 107 Cilicica fir (Abies cilicica Carr.) and 67 Cedar of Lebanon (Cedrus libani A. Rich.) trees were estimated using Artificial Neural Network (ANN) models. Neural networks offer a number of advantages including the ability to implicitly detect complex nonlinear relationships between input and output variables, which is very helpful in tree volume modeling. Two different neural network architectures were used, producing back-propagation (BPANN) and cascade-correlation (CCANN) artificial neural network models. In addition, tree bole volume estimates were compared to other established tree bole volume estimation techniques including the centroid method, taper equations, and existing standard volume tables. An overview of the features of ANNs and traditional methods is presented and the advantages and limitations of each one of them are discussed. For validation purposes, actual volumes were determined by aggregating the volumes of measured short sections (averaging 1 meter) of the tree bole using Smalian's formula. The results reported in this research suggest that the selected cascade-correlation artificial neural network (CCANN) models are reliable for estimating the tree bole volume of the four examined tree species, since they gave unbiased results and were superior to almost all methods in terms of error (%), expressed as the mean of the percentage errors.
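For reference, the validation volumes mentioned above come from Smalian's formula applied to short measured sections: each section's volume is its length times the mean of its two end cross-sectional areas. A small sketch, with a hypothetical set of boundary diameters:

```python
import numpy as np

def bole_volume_smalian(diameters_cm, section_len_m=1.0):
    """Actual tree bole volume by aggregating short sections with
    Smalian's formula: V = L * (A_lower + A_upper) / 2, where the
    cross-sectional area is A = pi * d^2 / 4. Diameters are measured
    at each section boundary from stump to tip."""
    d = np.asarray(diameters_cm, float) / 100.0        # cm -> m
    area = np.pi * d ** 2 / 4.0                        # m^2
    return float(np.sum(section_len_m * (area[:-1] + area[1:]) / 2.0))

# Hypothetical taper: boundary diameters every 1 m along a 10 m bole.
diams = [32, 30, 28, 27, 25, 22, 19, 15, 11, 7, 3]
print(f"bole volume: {bole_volume_smalian(diams):.4f} m^3")
```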

11.
ABSTRACT: Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothetical distributions (Log-Pearson III and Weibull) had lower mean square errors than did the Box-Cox transformation method or the Log-Boughton method, which is based on a fit of plotting positions.
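A simplified version of such a bootstrap comparison: resample the annual 7-day minima, refit each candidate distribution, and compare the spread of the resulting 7Q10 estimates. Unlike the study, the "reference" value here is just the full-record estimate (so the MSE captures sampling variability only), and the record is synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
# Synthetic record of annual 7-day minimum flows (m^3/s).
q7 = rng.lognormal(mean=0.5, sigma=0.4, size=35)

def q7_10_weibull(x):
    # 7Q10: the 10-year low flow, i.e. the 0.1 nonexceedance quantile.
    c, loc, scale = stats.weibull_min.fit(x, floc=0)
    return stats.weibull_min.ppf(0.1, c, loc, scale)

def q7_10_lp3(x):
    # Log-Pearson III: a Pearson III distribution fitted to log-flows.
    skew, loc, scale = stats.pearson3.fit(np.log(x))
    return np.exp(stats.pearson3.ppf(0.1, skew, loc, scale))

ref = {f.__name__: f(q7) for f in (q7_10_weibull, q7_10_lp3)}
boot = {k: [] for k in ref}
for _ in range(1000):
    xb = rng.choice(q7, size=q7.size, replace=True)
    boot["q7_10_weibull"].append(q7_10_weibull(xb))
    boot["q7_10_lp3"].append(q7_10_lp3(xb))

for k in ref:
    mse = np.mean((np.array(boot[k]) - ref[k]) ** 2)
    print(f"{k}: estimate {ref[k]:.3f}, bootstrap MSE {mse:.4f}")
```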

12.
ABSTRACT: Cumulative distribution functions (c.d.f.'s) for water quality random variables may be estimated using data from a routine grab sampling program. The c.d.f. may then be used to estimate the probability that a single grab sample will violate a given stream standard and to determine the anticipated number of violations in a given number of samples. Confidence limits about a particular point on the c.d.f. may be used to reflect the accuracy with which the sample estimate represents the true c.d.f. Methods are presented here for calculating such confidence limits using both a normal model and a nonparametric model. Examples are presented to illustrate the usefulness of an estimated c.d.f. and associated confidence limits in assessing whether an observed number of standard violations is the result of natural variability or represents real degradation in water quality.
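The two models can be sketched side by side: a lognormal fit gives a parametric violation probability, while the empirical violation fraction carries an exact binomial (Clopper-Pearson) confidence interval, one common nonparametric choice (not necessarily the paper's exact method). The data and standard below are synthetic.

```python
import numpy as np
from scipy import stats

# Grab-sample concentrations (mg/L) -- synthetic stand-in for a routine
# monitoring record -- and a hypothetical stream standard.
rng = np.random.default_rng(9)
conc = rng.lognormal(mean=1.0, sigma=0.5, size=48)
standard = 5.0

n = conc.size
k = int(np.sum(conc > standard))            # observed violations

# Parametric (lognormal) model: P(violation) from the fitted log-moments.
mu, sd = np.log(conc).mean(), np.log(conc).std(ddof=1)
p_model = 1.0 - stats.norm.cdf((np.log(standard) - mu) / sd)

# Nonparametric estimate with an exact binomial (Clopper-Pearson) 95% CI.
p_hat = k / n
lo = stats.beta.ppf(0.025, k, n - k + 1) if k > 0 else 0.0
hi = stats.beta.ppf(0.975, k + 1, n - k) if k < n else 1.0

print(f"P(violation): lognormal {p_model:.3f}, "
      f"empirical {p_hat:.3f} (95% CI {lo:.3f}-{hi:.3f})")
print("expected violations in the next 12 samples:", 12 * p_hat)
```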

13.
The ability to detect and to develop a precise and accurate estimate of the entrainment mortality fraction is an important step in projecting power plant impacts on future fish population levels. Recent work indicates that these mortalities may be considerably less than 100% for some fish species in the early life stages. Point estimates of the entrainment mortality fraction have been developed based on probabilistic arguments, but the precision of these estimates has not been studied beyond the simple statistical test of the null hypothesis that no entrainment mortality exists.

The ability to detect entrainment mortality is explored as a function of the sample sizes (numbers of organisms collected) at the intake and discharge sampling stations of a power plant and of the proportion of organisms found alive in the intake samples (intake survival). Minimum detectable entrainment mortality, confidence interval width, and type II error (the probability of accepting the null hypothesis of no entrainment mortality when there is mortality) are considered. Increasing sample size and/or decreasing sampling mortality will decrease the minimum detectable entrainment mortality, confidence interval width, and type II error for a given level of type I error.

The results of this study are considered in the context of designing useful monitoring programs for determining the entrainment mortality fraction. Preliminary estimates of intake survival and the entrainment mortality fraction can be used to obtain estimates of the sample size needed for a specified level of confidence interval width or type II error. Final estimates of the intake survival and the entrainment mortality fraction can be used to determine the minimum detectable entrainment mortality and the type II error.
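One generic way to compute a minimum detectable entrainment mortality of the kind discussed above is a one-sided two-proportion z-test under the normal approximation, scanning mortality fractions until the power condition is met. This is a textbook power calculation, not the paper's exact procedure, and the survival and sample-size values are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def min_detectable_mortality(p_intake, n_intake, n_discharge,
                             alpha=0.05, power=0.8):
    """Smallest entrainment mortality fraction m detectable when testing
    H0: discharge survival equals intake survival, one-sided two-proportion
    z-test. Under the alternative, discharge survival is p_intake*(1 - m)."""
    z_a, z_b = norm.ppf(1 - alpha), norm.ppf(power)
    for m in np.arange(0.005, 1.0, 0.005):
        p2 = p_intake * (1 - m)
        pbar = (n_intake * p_intake + n_discharge * p2) / (n_intake + n_discharge)
        se0 = np.sqrt(pbar * (1 - pbar) * (1/n_intake + 1/n_discharge))
        se1 = np.sqrt(p_intake * (1 - p_intake) / n_intake
                      + p2 * (1 - p2) / n_discharge)
        # Detectable when the effect exceeds the alpha and beta margins.
        if p_intake - p2 >= z_a * se0 + z_b * se1:
            return m
    return 1.0

# Example: 70% intake survival, 500 organisms sampled at each station.
print(min_detectable_mortality(p_intake=0.7, n_intake=500, n_discharge=500))
```

Consistent with the abstract, increasing either sample size (or raising intake survival) shrinks the returned minimum detectable mortality.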

14.
Abstract: We propose a step-by-step approach to quantify the sensitivity of ground-water discharge by evapotranspiration (ET) to three categories of independent input variables. To illustrate the approach, we adopt a basic ground-water discharge estimation model, in which the volume of ground water lost to ET was computed as the product of the ground-water discharge rate and the associated area. The ground-water discharge rate was assumed to equal the ET rate minus local precipitation. The objective of this study is to outline a step-by-step procedure to quantify the contributions from individual independent variable uncertainties to the uncertainty of total ground-water discharge estimates; the independent variables include ET rates of individual ET units, areas associated with the ET units, and precipitation in each subbasin. The specific goal is to guide future characterization efforts by better targeting data collection for those variables most responsible for uncertainty in ground-water discharge estimates. The influential independent variables to be included in the sensitivity analysis are first selected based on the physical characteristics and model structure. Both regression coefficients and standardized regression coefficients for the selected independent variables are calculated using the results from sampling-based Monte Carlo simulations. Results illustrate that, while as many as 630 independent variables potentially contribute to the calculation of the total annual ground-water discharge for the case study area, a selection of seven independent variables could be used to develop an accurate regression model, accounting for more than 96% of the total variance in ground-water discharge. Results indicate that the variability of ET rate for moderately dense desert shrubland contributes to about 75% of the variance in the total ground-water discharge estimates. These results point to a need to better quantify ET rates for moderately dense shrubland to reduce overall uncertainty in estimates of ground-water discharge. While the approach proposed here uses a basic ground-water discharge model taken from an earlier study, the procedure of quantifying uncertainty and sensitivity can be generalized to handle other types of environmental models involving large numbers of independent variables.
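The core computation, sampling-based Monte Carlo followed by standardized regression coefficients (SRCs), can be sketched as below. The discharge model and all input distributions are hypothetical stand-ins for the paper's ET units and subbasins; squared SRCs approximate each input's variance share only when the model is close to linear.

```python
import numpy as np

rng = np.random.default_rng(10)
n_mc = 5000

# Hypothetical ET-unit inputs: discharge = sum_k area_k * (ET_k - precip).
et_rates = rng.normal([0.9, 0.5, 0.3], [0.20, 0.05, 0.03], (n_mc, 3))  # m/yr
areas    = rng.normal([40., 25., 60.], [2.0, 2.0, 3.0], (n_mc, 3))     # km^2
precip   = rng.normal(0.15, 0.02, (n_mc, 1))                           # m/yr

discharge = np.sum(areas * (et_rates - precip), axis=1)  # 10^6 m^3/yr

# Standardized regression coefficients: regress the MC output on the MC
# inputs after scaling all columns to zero mean, unit variance.
Xmat = np.column_stack([et_rates, areas, precip])
Xs = (Xmat - Xmat.mean(0)) / Xmat.std(0)
ys = (discharge - discharge.mean()) / discharge.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, b in zip(["ET1", "ET2", "ET3", "A1", "A2", "A3", "P"], src):
    print(f"{name}: SRC = {b:+.3f}, approx. variance share {b**2:.1%}")
```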

15.
16.
ABSTRACT: This study tests the hypothesis that climatic data can be used to develop a watershed model so that stream flow changes following forest harvest can be determined. Measured independent variables were precipitation, daily maximum and minimum temperature, and concurrent relative humidity. Computed variables were humidity deficit, saturated vapor pressure, and ambient vapor pressure. These climatic variables were combined to compute a monthly evaporation index. Finally, the evaporation index and monthly precipitation were regressed with measured monthly stream flow and the monthly estimates of stream flow were combined for the hydrologic year. A regression of predicted versus measured annual stream flow had a standard error of 1.5 inches (within 6.1 percent of the measured value). When 10, 15, and 20 years of data were used to develop the regression equations, predicted minus measured stream flow for the last 7 years of record (1972–1978) were within 16.8, 11.5, and 9.7 percent of the measured mean, respectively. Although single watershed calibration can be used in special conditions, the paired watershed approach is expected to remain the preferred method for determining the effects of forest management on the water resource.

17.
We investigate the sensitivity of phosphorus loading (mass/time) in an urban stream to variations in climate using nondimensional sensitivity, known as elasticity, methods commonly used by economists and hydrologists. Previous analyses have used bivariate elasticity methods to represent the general relationship between nutrient loading and a variable of interest, but such bivariate relations cannot reflect the complex multivariate nonlinear relationships inherent among nutrients, precipitation, temperature, and streamflow. Using fixed-effect multivariate regression methods, we obtain two phosphorus models (nonparametric and parametric) for an urban stream with high explanatory power that can both estimate phosphorus loads and the elasticity of phosphorus loading to changes in precipitation, temperature, and streamflow. A case study demonstrates that total phosphorus loading depends significantly on season, rainfall, combined sewer overflow events, and flow rate, yet the elasticity of total phosphorus to all these factors remains relatively constant throughout the year. The elasticity estimates reported here can be used to examine how nutrient loads may change under future climate conditions.
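In a log-log regression the fitted coefficients are elasticities directly: dlog(load)/dlog(x) is the percent change in load per one percent change in x. The sketch below recovers known elasticities from synthetic data with statsmodels; it illustrates only the parametric flavor of the approach, and all series are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 500

# Synthetic daily records for a hypothetical urban stream.
flow   = rng.lognormal(1.0, 0.6, n)            # streamflow
precip = rng.gamma(2.0, 3.0, n) + 0.1          # rainfall depth (kept > 0)
temp   = rng.normal(12.0, 8.0, n) + 30.0       # shifted so logs are defined

# Toy generating process with known elasticities: 1.3 to flow, 0.4 to precip.
load = np.exp(0.2) * flow**1.3 * precip**0.4 * np.exp(rng.normal(0, 0.3, n))

# Log-log regression: each slope is an elasticity.
X = sm.add_constant(np.column_stack([np.log(flow), np.log(precip),
                                     np.log(temp)]))
fit = sm.OLS(np.log(load), X).fit()
print(fit.params[1:])   # expect roughly [1.3, 0.4, ~0]
```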

18.
ABSTRACT: Components contributing to uncertainty in the location of the flood plain fringe of a mapped flood plain are identified and examined to determine their relative importance. First-order uncertainty analysis is used to provide a procedure for quantifying the magnitude of uncertainty in the location of the flood plain fringe. Application of the procedure indicated that one standard deviation of uncertainty in flood plain inundation width was about one third of the mean computed inundation width for several flood population-flood geometry combinations. Suggested mapping criteria, which directly incorporate uncertainty estimates, are given. While these criteria are more suitable for use in developing areas than in flood plains that have had extensive development, the analysis procedure can be used to accommodate property owners who challenge the validity of estimated flood fringe boundaries. Use of uncertainty analysis in flood plain mapping should enhance the credibility of the final plan.

19.
The quality of scientific information in policy-relevant fields of research is difficult to assess, and quality control in these important areas is correspondingly difficult to maintain. Frequently there are insufficient high-quality measurements for the presentation of the statistical uncertainty in the numerical estimates that are crucial to policy decisions. We propose and develop a grading system for numerical estimates that can deal with the full range of data quality—from statistically valid estimates to informed guesses. By analyzing the underlying quality of numerical estimates, summarized as spread and grade, we are able to provide simple rules whereby input data can be coded for quality, and these codings carried through arithmetical calculations for assessing the quality of model results. For this we use the NUSAP (numeral, unit, spread, assessment, pedigree) notational system. It allows the more quantitative and the more qualitative aspects of data uncertainty to be managed separately. By way of example, we apply the system to an ecosystem valuation study that used several different models and data of widely varying quality to arrive at a single estimate of the economic value of wetlands. The NUSAP approach illustrates the major sources of uncertainty in this study and can guide new research aimed at the improvement of the quality of outputs and the efficiency of the procedures.

20.
Abstract: A parametric regression model was developed for assessing the variability and long-term trends in pesticide concentrations in streams. The dependent variable is the logarithm of pesticide concentration and the explanatory variables are a seasonal wave, which represents the seasonal variability of concentration in response to seasonal application rates; a streamflow anomaly, which is the deviation of concurrent daily streamflow from average conditions for the previous 30 days; and a trend, which represents long-term (inter-annual) changes in concentration. Application of the model to selected herbicides and insecticides in four diverse streams indicated the model is robust with respect to pesticide type, stream location, and the degree of censoring (proportion of nondetections). An automatic model fitting and selection procedure for the seasonal wave and trend components was found to perform well for the datasets analyzed. Artificial censoring scenarios were used in a Monte Carlo simulation analysis to show that the fitted trends were unbiased and the approximate p-values were accurate for as few as 10 uncensored concentrations during a three-year period, assuming a sampling frequency of 15 samples per year. Trend estimates for the full model were compared with a model without the streamflow anomaly and a model in which the seasonality was modeled using standard trigonometric functions, rather than seasonal application rates. Exclusion of the streamflow anomaly resulted in substantial increases in the mean-squared error and decreases in power for detecting trends. Incorrectly modeling the seasonal structure of the concentration data resulted in substantial estimation bias and moderate increases in mean-squared error and decreases in power.
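A compressed sketch of this model family: log concentration regressed on seasonal terms, a streamflow anomaly, and a linear trend. It deliberately simplifies the paper's formulation: the seasonal wave is replaced by sine/cosine terms (the weaker alternative the paper tests against), the anomaly uses a centered rather than trailing 30-day window, and censoring is ignored. All data are synthetic.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(12)
days = np.arange(3 * 365)
t_yr = days / 365.25

# Synthetic daily flow and log-concentration with seasonality, a flow
# effect, and a true downward trend of -0.15 per year (log units).
flow = np.exp(0.5 * np.sin(2*np.pi*t_yr + 1.0)
              + rng.normal(0, 0.3, days.size))
anomaly = np.log(flow) - np.convolve(np.log(flow),
                                     np.ones(30) / 30, mode="same")
logc = (-1.0 + 0.8*np.sin(2*np.pi*t_yr) + 0.3*np.cos(2*np.pi*t_yr)
        + 0.5*anomaly - 0.15*t_yr + rng.normal(0, 0.4, days.size))

# 15 samples per year for three years, as in the article's scenario.
idx = np.sort(rng.choice(days.size, 45, replace=False))

X = sm.add_constant(np.column_stack([
    np.sin(2*np.pi*t_yr[idx]), np.cos(2*np.pi*t_yr[idx]),  # seasonal terms
    anomaly[idx],                                          # flow anomaly
    t_yr[idx],                                             # long-term trend
]))
fit = sm.OLS(logc[idx], X).fit()
print("trend (log units per year):", fit.params[-1],
      " p =", fit.pvalues[-1])
```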

