Similar Articles
20 similar articles found
1.
This discussion addresses some aspects of a recent paper in this journal that investigates cost-effective coastal water management under different assumptions about the probability distribution (normal or log-normal) of pollutant transport. We also suggest an alternative approach to overcome the technical problems of using the theoretically correct distribution for characterising environmental data (the log-normal) within a probabilistic programming framework.
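For illustration, a minimal sketch of the deterministic equivalent that arises when the log-normal is used inside a chance-constrained (probabilistic programming) model, the setting this discussion addresses; the capacity parameters below are hypothetical, not values from the paper:

```python
import numpy as np
from scipy import stats

# Chance constraint P(load <= capacity) >= alpha with lognormal capacity:
# the deterministic equivalent caps the load at the (1 - alpha) quantile
mu, sigma, alpha = np.log(100.0), 0.4, 0.95   # hypothetical capacity parameters
cap_alpha = stats.lognorm.ppf(1 - alpha, sigma, scale=np.exp(mu))
print(cap_alpha)   # any plan with load <= this value satisfies the constraint
```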

2.
Kampas A, Adamidis K. Journal of Environmental Management 2005;74(4):383-388; discussion 389-392.

3.
The frequency distributions of both the grade and the size of metal deposits may be well approximated by lognormal distribution functions. Using data on presently viable deposits and a simplified function linking production cost to deposit grade and size, a bivariate lognormal grade/size distribution may be calibrated for a given geological environment. Exploration is introduced by assuming that the discovered proportion of the potential uranium reserve available at or below a given production cost can be represented as a function of the average deposit size and the unit exploration expenditure. As output, the model derives estimates of total reserves linked to maximum production costs and to exploration expenditure, where the latter may be expressed either as expenditure per lb of mineral discovered or as a given percentage of operating profit. The model is offered as a basis for discussion, and the conclusions are tentative.
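A toy Monte Carlo sketch of the kind of model described, sampling deposits from a bivariate lognormal grade/size distribution and screening them with a simplified cost function; every number below is invented for illustration, not a calibrated value:

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical bivariate lognormal deposit grade (%) and size (t of ore)
mean = [np.log(0.15), np.log(2e5)]            # log-scale means
cov = [[0.30, -0.10], [-0.10, 1.20]]          # log-scale (co)variances
grade, size = np.exp(rng.multivariate_normal(mean, cov, 50_000)).T

cost = 8.0 / grade + 40.0 / size**0.25        # toy unit-cost model, $/lb
metal = grade / 100 * size * 2204.6           # lb of metal per deposit
for c in (30, 50, 80):                        # maximum production costs
    print(c, metal[cost <= c].sum() / 50_000) # expected reserve per deposit
```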

4.
ABSTRACT: The Mississippi Department of Environmental Quality uses the Steady Riverine Environmental Assessment Model (STREAM) to establish effluent limitations. While the U.S. Environmental Protection Agency has approved its use, questions arise regarding the model's simplicity. The objective of this research was to compare STREAM with the more commonly used Enhanced Stream Water Quality Model (QUAL2E). The comparison involved a statistical evaluation procedure based on sensitivity analyses, input probability distribution functions, and Monte Carlo simulation, with site-specific data from a 46-mile (74-km) reach of the Big Black River in central Mississippi. Site-specific probability distribution functions were derived from measured rates of reaeration, sediment oxygen demand, photosynthesis, and respiration. Both STREAM and QUAL2E reasonably predicted daily average dissolved oxygen (DO), based on a comparison of output probability distributions with observed DO. Observed DO was consistently within the 90 percent confidence intervals of model predictions; STREAM generally overpredicted observed DO, while QUAL2E generally matched it. Using the more commonly assumed lognormal distribution rather than a Weibull distribution for two of the sensitive input parameters made minimal difference to the statistical evaluations. The QUAL2E approach had distinct advantages over STREAM in simulating the growth cycle of algae.
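A minimal sketch of the Monte Carlo step described, with a toy steady-state DO balance standing in for STREAM/QUAL2E; the lognormal input parameters are hypothetical, not the site-specific distributions fitted in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical input probability distributions for the sensitive rates
k2  = rng.lognormal(np.log(0.6), 0.3, n)   # reaeration, 1/day
sod = rng.lognormal(np.log(1.5), 0.4, n)   # sediment oxygen demand, g/m2/day
p   = rng.lognormal(np.log(2.0), 0.3, n)   # photosynthesis, mg/L/day
r   = rng.lognormal(np.log(1.8), 0.3, n)   # respiration, mg/L/day
do_sat, depth = 8.0, 2.5                   # mg/L, m

# Toy steady-state DO balance in place of the full models
do = do_sat - (r + sod / depth - p) / k2
print(np.percentile(do, [5, 50, 95]))      # 90 percent band on predicted DO
```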

5.
ABSTRACT: The Pearson type 3 (P3) and log Pearson type 3 (LP3) distributions are very frequently used in flood frequency analysis. Existing methods for constructing confidence intervals for quantiles (Xp) of these two distributions are very crude. Most are based on the idea of adjusting confidence intervals for quantiles Yp of the normal distribution to obtain approximate confidence intervals for quantiles Xp of the P3/LP3 distribution. Since there is no theoretical reason why this “base” distribution, Y, should be normal, we search in the present study for the best possible base distribution for producing confidence intervals for P3/LP3 quantiles. We consider a group of base distributions: the normal, lognormal, Weibull, Gumbel, and exponential. We first assume the skew coefficient γ of X to be known and develop a method for adjusting confidence intervals for Yp to produce approximate confidence intervals for Xp. We then compare this method (Method A) with another method (Method B) introduced by Stedinger. Simulation shows that the performance of each method depends on the base distribution Y being used, but on the whole the normal distribution appears to be the best base distribution for producing confidence intervals for P3/LP3 quantiles when γ is assumed known. We then extend Method A to the more important case of an unknown coefficient of skewness. It is shown that by taking Y to be Weibull, fairly accurate confidence intervals for P3/LP3 quantiles can be obtained for quite a wide range of sample sizes and coefficients of skewness commonly found in hydrology. The case of the P3 distribution with negative skewness needs further research.
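For context, a sketch of the point estimate that such confidence intervals are built around: the LP3 quantile from the Wilson-Hilferty frequency-factor approximation. The record is synthetic, and this is not the authors' Method A or B:

```python
import numpy as np
from scipy import stats

def lp3_quantile(flows, p, gamma=None):
    """p-quantile of a log Pearson type 3 fit, via the
    Wilson-Hilferty frequency-factor approximation (needs gamma != 0)."""
    y = np.log10(flows)
    m, s = y.mean(), y.std(ddof=1)
    g = stats.skew(y, bias=False) if gamma is None else gamma
    z = stats.norm.ppf(p)
    k = (2 / g) * ((1 + g * z / 6 - g**2 / 36) ** 3 - 1)  # frequency factor
    return 10 ** (m + k * s)

# e.g., the 100-year flood from a synthetic 40-year record
flows = np.random.default_rng(4).lognormal(6.0, 0.6, 40)
print(lp3_quantile(flows, p=0.99))
```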

6.
ABSTRACT: A statistical approach for making Total Maximum Daily Load (TMDL) impairment decisions is developed as an alternative to a simple tally of the number of measurements that happen to exceed the standard. The method ensures, with a high level of confidence (e.g., 95 percent), that no more than a small percentage (e.g., 10 percent) of water quality samples will exceed a regulatory standard. It is based on the 100(1-α) percent lower confidence limit on an upper percentile of the concentration distribution. Advantages of the method include: (1) it provides a direct test of the hypothesis that a prespecified percentage of the true concentration distribution exceeds a regulatory standard; (2) it is applicable to a wide variety of statistical concentration distributions; (3) it directly incorporates the magnitudes of the measured concentrations, unlike traditional approaches; and (4) it has explicit statistical power characteristics (i.e., the probability of missing an environmental impact). Detailed study of the simple tally approach reveals that it achieves high statistical power at the expense of unacceptably high false positive rates (30 to 40 percent). By contrast, the statistical approach achieves similar statistical power while maintaining a nominal false positive rate of 5 percent.
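A minimal sketch of the proposed test under a lognormal concentration model, using the noncentral-t construction for a confidence bound on a normal quantile; the data and standard are illustrative:

```python
import numpy as np
from scipy import stats

def lcl_upper_percentile(conc, q=0.90, conf=0.95):
    """100*conf percent lower confidence limit on the q-th percentile,
    assuming log concentrations are normal (noncentral-t construction)."""
    y = np.log(conc)
    n = y.size
    delta = stats.norm.ppf(q) * np.sqrt(n)            # noncentrality parameter
    t_lo = stats.nct.ppf(1 - conf, df=n - 1, nc=delta)
    return np.exp(y.mean() + t_lo * y.std(ddof=1) / np.sqrt(n))

conc = np.random.default_rng(11).lognormal(1.0, 0.5, 24)  # synthetic samples
standard = 5.0                                            # regulatory standard
print(lcl_upper_percentile(conc) > standard)              # True -> impaired
```

If the returned limit exceeds the standard, one concludes with 95 percent confidence that more than 10 percent of the true concentration distribution exceeds it.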

7.
Parametric methods (error propagation for normal estimates) and nonparametric methods (bootstrap and enumeration of combinations) for assessing the uncertainty in calculated rates of nitrogen loading were compared, based on the propagation of the uncertainty observed in the variables used in the calculation. In addition, since such calculations are often based on literature surveys rather than random replicate measurements at the site in question, error propagation was also compared using the uncertainty of the sampled population (e.g., the standard deviation) as well as the uncertainty of the mean (e.g., the standard error of the mean). Calculations of the predicted nitrogen loading to a shallow estuary (Waquoit Bay, MA) were used as an example. The previously estimated mean loading from the watershed (5,400 ha) to Waquoit Bay (600 ha) was 23,000 kg N yr−1. The mode of a nonparametric estimate of the probability distribution differed dramatically, equaling only 70% of this mean. Repeated observations were available for only 8 of the 16 variables used in the calculation; we estimated uncertainty in model predictions by treating these as sample replicates. Parametric and nonparametric estimates of the standard error of the mean loading rate were 12–14%. However, since the available data include site-to-site variability, as is often the case, the standard error may be an inappropriate measure of confidence; the standard deviations were around 38% of the loading rate. Further, the 95% confidence intervals differed between the nonparametric and parametric methods, with those of the nonparametric method arranged asymmetrically around the predicted loading rate. The disparity in magnitude and symmetry of the calculated confidence limits argues for careful consideration of the nature of the uncertainty of the variables used in chained calculations. The analysis also suggests that a nonparametric method using the most frequently observed values of the variables may be more appropriate than using mean values. These findings reinforce the importance of assessing uncertainty when evaluating nutrient loading rates in research and planning. Risk assessment, which may need to consider the relative probability of extreme events in worst-case scenarios, will be in serious error using normal estimates, or even the nonparametric bootstrap; a method such as our enumeration of combinations produces a more reliable distribution of risk.
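A sketch of the bootstrap branch of such a comparison, with two hypothetical replicated inputs standing in for the eight available in the paper; note that the percentile interval can come out asymmetric, as the authors observed:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical replicate measurements for two inputs to the loading model
deposition = np.array([11.2, 9.8, 13.5, 10.4, 12.1])  # kg N/ha/yr
transmission = np.array([0.32, 0.41, 0.38, 0.35])     # fraction reaching the bay
area_ha = 5400.0                                      # watershed area (given)

boot = np.empty(10_000)
for i in range(boot.size):                            # resample, then recompute
    d = rng.choice(deposition, deposition.size, replace=True).mean()
    t = rng.choice(transmission, transmission.size, replace=True).mean()
    boot[i] = area_ha * d * t                         # kg N/yr

print(np.percentile(boot, [2.5, 50, 97.5]))           # possibly asymmetric CI
```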

8.
ABSTRACT: The ability to predict extreme floods is an important part of the planning process for any water project for which failure would be very costly. The gage record available for estimating extreme flows is generally much shorter than the recurrence interval of the desired flows, so the estimates carry a high degree of uncertainty. Maximum likelihood estimators of the parameters of the three-parameter lognormal (3PLN) distribution, which make use of historical data, are presented. A Monte Carlo study of extreme flows estimated from samples drawn from three hypothetical 3PLN populations showed that including historical flows with the gage record reduced the bias and variance of extreme flow estimates. Asymptotic theory approximations of parameter variances and covariances, calculated using the second and mixed partial derivatives of the log likelihood function, agreed well with the Monte Carlo results. First-order approximations of the standard deviations of the extreme flow estimates did not; an alternative method for calculating those standard deviations, the “asymptotic simulation” method, is described. The standard deviations calculated by asymptotic simulation agree well with the Monte Carlo results.
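A sketch of the fitting step without the historical-data augmentation: scipy can compute numerical maximum likelihood estimates of all three 3PLN parameters (threshold, log-mean, log-standard-deviation), here on a synthetic record:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
a, mu, sigma = 200.0, 6.0, 0.8                  # hypothetical 3PLN population
flows = a + rng.lognormal(mu, sigma, 60)        # 60 years of annual peaks

# Numerical MLE: shape = sigma, loc = threshold a, scale = exp(mu);
# the keyword loc here is only a starting guess for the optimizer
s_hat, loc_hat, scale_hat = stats.lognorm.fit(flows, loc=flows.min() / 2)
q100 = stats.lognorm.ppf(0.99, s_hat, loc_hat, scale_hat)   # 100-yr flood
print(loc_hat, np.log(scale_hat), s_hat, q100)
```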

9.
10.
Abstract: For a number of years, best management practices (BMPs) have been implemented within the Town Brook watershed as part of a watershed-wide effort to reduce phosphorus losses to the New York City water supply reservoirs. Currently, there are no quantitative indications of the effectiveness of these practices at the watershed scale, and work is needed to evaluate management practice solutions for cost in relation to effectiveness. In this study we develop a methodology for evaluating management solutions to determine how best to select and place management practices so that pollutant removal targets are met at minimum cost. The study combines phosphorus losses as simulated by the Soil and Water Assessment Tool (SWAT), management practice effectiveness estimates from a predeveloped characterization tool, and practice costs in optimizations using a genetic algorithm. For a user-defined phosphorus removal target (60 percent in this study), the optimization favors nutrient management plans, crop rotations, contour strip cropping, and riparian forest buffers; the most cost-effective scenario achieves $24/kg of phosphorus removed per year, compared with $34/kg for the current basic implementation scheme. The study suggests the need to evaluate potential solutions prior to implementation and offers a means of generating and evaluating such solutions.
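A compact genetic-algorithm sketch of the selection-and-placement problem: minimize cost subject to a 60 percent removal target, enforced by a penalty. The per-field costs and removal rates are random stand-ins for the SWAT and characterization-tool outputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical BMP catalogue per field; option 0 = "do nothing"
n_fields, n_bmps, pop_size = 30, 4, 80
cost = rng.uniform(200, 2000, (n_fields, n_bmps)); cost[:, 0] = 0.0
removal = rng.uniform(1, 15, (n_fields, n_bmps)); removal[:, 0] = 0.0
target = 0.60 * removal.max(axis=1).sum()          # 60% of attainable removal

def penalized_cost(pop):
    idx = np.arange(n_fields)
    c = cost[idx, pop].sum(axis=1)                 # $/yr of each candidate plan
    r = removal[idx, pop].sum(axis=1)              # kg P/yr removed
    return c + 1e5 * np.maximum(target - r, 0.0)   # heavy penalty for shortfall

pop = rng.integers(0, n_bmps, (pop_size, n_fields))
for _ in range(300):
    f = penalized_cost(pop)
    a, b = rng.integers(0, pop_size, (2, pop_size))
    parents = pop[np.where(f[a] < f[b], a, b)]         # tournament selection
    cut = rng.integers(1, n_fields, pop_size)
    mask = np.arange(n_fields) < cut[:, None]
    children = np.where(mask, parents, parents[::-1])  # one-point crossover
    flip = rng.random(children.shape) < 0.02
    children[flip] = rng.integers(0, n_bmps, flip.sum())  # mutation
    pop = children

best = pop[np.argmin(penalized_cost(pop))]
print(penalized_cost(pop).min(), best)
```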

11.
ABSTRACT: A convenient method for the statistical analysis of hydrologic extremes is to use probability papers to fit selected theoretical distributions to extremal observations. Three commonly accepted statistical distributions of extreme hydrologic events are the double exponential distribution, the bounded exponential distribution, and the Log Pearson Type III distribution. In most cases, probability papers are distribution-specific; for the Log Pearson Type III distribution, however, the probability paper depends on a population-specific parameter, the coefficient of skewness, and it is not practicable to procure papers for all possible values of this parameter. Therefore, a computer program is developed to generate population-specific probability papers and to perform statistical analysis of the data using computer graphics. Probability papers covering return periods up to 1000 years or more are generated for the three distributions mentioned above. Using a plot routine, available extremal observations are plotted on selected probability papers, and a linear regression analysis is used to fit a straight line to the data. Predictions of hydrologic extremes for higher recurrence intervals can be made by extrapolating the fitted straight lines.
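A sketch of the same workflow for the double exponential (Gumbel) case: plot sorted maxima against the reduced variate, fit a straight line by least squares, and extrapolate to a 1000-year return period; the series is synthetic and Gringorten plotting positions are assumed:

```python
import numpy as np

def gumbel_paper_fit(annual_maxima):
    """Fit a line on Gumbel probability paper: sorted maxima versus
    the reduced variate -ln(-ln F), with Gringorten plotting positions."""
    x = np.sort(annual_maxima)
    n = x.size
    F = (np.arange(1, n + 1) - 0.44) / (n + 0.12)
    y = -np.log(-np.log(F))
    slope, intercept = np.polyfit(y, x, 1)   # scale alpha, location u
    return slope, intercept

alpha, u = gumbel_paper_fit(np.random.default_rng(2).gumbel(500, 120, 40))
y1000 = -np.log(-np.log(1 - 1 / 1000))      # reduced variate at T = 1000 yr
print(u + alpha * y1000)                    # extrapolated 1000-yr event
```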

12.
A method of predicting probability distributions of annual floods is presented and applied to the Fraser River catchment of British Columbia. The Gumbel distribution is found to describe the observed flood frequency data adequately. Using the estimated Gumbel parameters, discriminant analysis is performed to separate basins into flood regions. Within each region, regression analysis is used to relate physiographic and climatic variables to the means and standard deviations of the annual flood series. The regression equations are applied to four test basins, and the results indicate that the method is suitable for estimating annual floods.
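A sketch of the regionalization step: regress the moments of the annual flood series on basin characteristics, then convert predicted moments to Gumbel parameters by the method of moments; all basin data below are invented:

```python
import numpy as np

rng = np.random.default_rng(9)
# Hypothetical region: mean annual flood vs. area (km2) and rainfall (mm)
area = rng.uniform(100, 5000, 25)
rain = rng.uniform(600, 2500, 25)
qmean = 0.05 * area**0.8 * (rain / 1000) ** 1.4 * rng.lognormal(0, 0.2, 25)

# Log-linear regression of the mean annual flood on basin characteristics
X = np.column_stack([np.ones(25), np.log(area), np.log(rain)])
beta, *_ = np.linalg.lstsq(X, np.log(qmean), rcond=None)

m = np.exp(X[0] @ beta)            # predicted mean for one test basin
s = 0.4 * m                        # its std, from a second regression (stand-in)
alpha = np.sqrt(6) * s / np.pi     # Gumbel scale, method of moments
u = m - 0.5772 * alpha             # Gumbel location
print(u + alpha * (-np.log(-np.log(1 - 1 / 100))))   # 100-yr flood estimate
```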

13.
In water-quality management problems, uncertainties may exist in a number of impact factors and pollution-related processes (e.g., the volume and strength of industrial wastewater and their variations can be represented as random events by identifying a statistical distribution for each source); moreover, nonlinear relationships may exist among many system components (e.g., cost parameters may be functions of wastewater-discharge levels). In this study, an inexact two-stage stochastic quadratic programming (ITQP) method is developed for water-quality management under uncertainty. It is a hybrid of the inexact quadratic programming (IQP) and two-stage stochastic programming (TSP) methods. The ITQP can handle not only uncertainties expressed as probability distributions and interval values but also nonlinearities in the objective function, and it can be used for analyzing scenarios associated with different levels of economic penalties or opportunity losses caused by improper policies. The ITQP is applied to a water-quality management case to deal with uncertainties presented in terms of probabilities and intervals and to reflect dynamic interactions between pollutant loading and water quality. Interactive and derivative algorithms are employed to solve the ITQP model. The solutions are presented as combinations of deterministic, interval, and distributional information, and can thus facilitate communication about different forms of uncertainty. They help managers not only in making decisions on wastewater discharge but also in gaining insight into the trade-off between system benefit and environmental requirements.
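The full ITQP model is beyond a short example, but the two-stage recourse structure at its core can be sketched as a deterministic-equivalent linear program (dropping the quadratic and interval parts); all coefficients are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

# First stage: choose a discharge target x now; second stage: pay a
# penalty on the excess y_s over the stream's capacity in scenario s
p = np.array([0.3, 0.5, 0.2])        # scenario probabilities
cap = np.array([60.0, 45.0, 30.0])   # assimilative capacity per scenario
benefit, penalty = 10.0, 25.0        # $/unit discharged, $/unit excess

# Variables [x, y1, y2, y3]; minimize -benefit*x + sum_s p_s*penalty*y_s
c = np.concatenate([[-benefit], p * penalty])
A = np.hstack([np.ones((3, 1)), -np.eye(3)])     # x - y_s <= cap_s
res = linprog(c, A_ub=A, b_ub=cap, bounds=[(0, 80)] + [(0, None)] * 3)
print(res.x)   # optimal first-stage discharge and per-scenario excesses
```

The optimum stops raising the discharge once the expected marginal penalty of exceeding another scenario's capacity outweighs the marginal benefit.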

14.
Uncertainty Assessment for Management of Soil Contaminants with Sparse Data
For soil resources to be sustainably managed, reliable, valid data on the spatial distribution of environmental impacts are necessary. In practice, however, one often has to cope with spatial interpolation from few data that show a skewed distribution, and with uncertain information about soil contamination. We present a case study with 76 soil samples taken from a 15 km2 site in order to assess the usability of information gleaned from sparse data. The soil was contaminated with cadmium, predominantly as a result of airborne emissions from a metal smelter. The spatial interpolation applies lognormal anisotropic kriging and conditional simulation to the log-transformed data. The uncertainty of the cadmium concentration arising from data sampling, sample preparation, analytical measurement, and interpolation is a factor of 2 at 68.3% confidence, and it results predominantly from the spatial interpolation necessitated by low sampling density and spatial heterogeneity. The interpolated data are shown in maps presenting the likelihood of exceeding threshold values under a lognormal probability distribution. Although the results are not deterministic, the procedure yields a quantified and transparent estimate of the contamination, which can be used to delineate areas for soil improvement, remediation, or restricted use, based on the decision-maker's required probability of safety.
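Once kriging supplies a mean and standard deviation of log(Cd) at a grid node, the exceedance probabilities shown in the maps follow directly from the lognormal assumption; the sketch uses the paper's factor-2 (68.3 percent) uncertainty and an invented threshold:

```python
import numpy as np
from scipy import stats

def exceedance_prob(log_mean, log_sd, threshold):
    """Probability that the true concentration exceeds a threshold,
    given the kriged mean and standard deviation of the log value."""
    return 1 - stats.norm.cdf((np.log(threshold) - log_mean) / log_sd)

# e.g., kriged Cd of 1.2 mg/kg with factor-2 uncertainty (68.3% confidence)
print(exceedance_prob(np.log(1.2), np.log(2.0), threshold=2.0))
```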

15.
When mineral wastes are reused in construction materials, current practice is to evaluate their environmental impact using standard leaching tests. However, because of measurement uncertainty, it is usually quite difficult to compare the pollution potential against other materials or against threshold limits. The aim of this paper is to give a quantitative evaluation of the uncertainty of leachate concentrations of cement-based materials as a function of the number of tests performed. The relative standard deviations and relative confidence intervals are determined from experimental data to give a global evaluation of the uncertainty of leachate concentrations (the total relative standard deviation). Various combinations were examined to identify the origin of the large dispersion in the results (separating the relative standard deviation attributable to analytical measurement from that attributable to the leaching procedure); a generalisation was suggested and the results were compared with the literature. A practical example is given in which a residue (meat and bone meal bottom ash, MBM-BA) is introduced into mortar; leaching tests were carried out on various samples with and without the residue. In conclusion, large dispersion was observed, mainly due to the heterogeneity of the materials. Heightened attention is therefore needed when analysing leaching results for cement-based materials, and further tests (e.g., ecotoxicology) should be performed to evaluate their environmental effects.
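A sketch of how the relative confidence-interval half-width narrows with the number of replicate leaching tests, given a total relative standard deviation; the 30 percent RSD is illustrative, not a value from the paper:

```python
import numpy as np
from scipy import stats

def rel_ci_halfwidth(rsd, n, conf=0.95):
    """Relative half-width of the CI on the mean leachate concentration
    after n replicate tests, for a given total relative std deviation."""
    t = stats.t.ppf(0.5 + conf / 2, df=n - 1)
    return t * rsd / np.sqrt(n)

for n in (2, 3, 5, 10):
    print(n, round(rel_ci_halfwidth(0.30, n), 3))   # e.g., 30% total RSD
```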

16.
ABSTRACT: Specific annual suspended sediment yields and their standard deviations are presented for 47 basins of North Island, New Zealand. Most of the variance in yields is explained by catchment mean rainfall. Rivers with a similar flow range have similar suspended sediment concentration ratings, independent of differing watershed lithology and regolith, except for six basins with an abundance of soft fine sediments. Prediction equations for yield and its standard deviation are derived for four essentially arbitrary regions; all feature rainfall as the independent variable. Differences between regions may be due to variations in the intensity, frequency, and duration patterns of storms and, in one area, to bed material size as well. The temporal distribution of annual yields from a basin may be modeled by a two-parameter lognormal function; the prediction equations above may be used to evaluate this function at a site for which suspended sediment data are unavailable.
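A sketch of the final modeling step: match a two-parameter lognormal to a predicted mean and standard deviation of annual yield, then read off exceedance quantiles; the yield figures are invented:

```python
import numpy as np
from scipy import stats

def lognormal_from_moments(mean, sd):
    """Two-parameter lognormal matched to a predicted mean and SD."""
    sigma2 = np.log(1 + (sd / mean) ** 2)
    return np.log(mean) - sigma2 / 2, np.sqrt(sigma2)   # mu, sigma

mu, sigma = lognormal_from_moments(420.0, 160.0)  # t/km2/yr (hypothetical)
print(stats.lognorm.ppf([0.5, 0.9, 0.99], sigma, scale=np.exp(mu)))
```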

17.
ABSTRACT: Construction of a “peaking storage tank” may reduce the operational cost of municipal water where a time-of-use energy rate is available. A peaking storage tank stores water pumped from wells or other sources of supply during off-peak periods, when energy costs are lower, for use during periods of on-peak water demand. The optimal size of a peaking storage tank is that which minimizes total cost, including both the storage construction cost and the cost of operating the pumps. The operational cost for a given time-of-use rate is determined with the help of a pipe network simulation model, solved by the Newton-Raphson technique, and a dynamic programming optimization model; a more simplified method is also introduced. Analyses show that low off-peak energy costs make the construction of peaking storage tanks economically attractive and reduce on-peak energy use, which results in electrical load leveling.
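The paper couples network simulation with dynamic programming, but the underlying trade-off can be sketched as a small linear program: pump when energy is cheap, subject to tank storage limits. The tariff, demand pattern, and capacities below are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

T = 24
hours = np.arange(T)
price = np.where((hours >= 7) & (hours < 22), 0.18, 0.07)    # toy on/off-peak tariff
demand = 50 + 30 * np.sin(np.pi * (hours - 5) / 12).clip(0)  # m3/h, toy pattern
cap_tank, cap_pump, s0 = 300.0, 120.0, 150.0

# Storage s_t = s0 + cumsum(pump - demand) must stay within [0, cap_tank]
L = np.tril(np.ones((T, T)))                 # cumulative-sum operator
A_ub = np.vstack([L, -L])
b_ub = np.concatenate([cap_tank - s0 + L @ demand, s0 - L @ demand])
res = linprog(price, A_ub=A_ub, b_ub=b_ub, bounds=[(0, cap_pump)] * T)
print(res.fun, res.x.round(1))               # minimum cost, hourly pumping
```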

18.
ABSTRACT: A frequency analysis approach for predicting flow characteristics at ungaged locations is applied to a region of high annual precipitation and low topography in north and central Florida. Stationary time series of annual flows are fitted with the lognormal distribution, and the estimated parameters of the distribution are fitted by third-order trend surfaces. These explain 65 and 74 percent of the observed variance in the mean and standard deviation, respectively. Predictions of the parameters are then made for several locations previously unused in the study and are used to estimate the return periods of various flows from the lognormal distribution. Application of the Kolmogorov-Smirnov goodness-of-fit test suggests that only one of the five test stations can be considered significantly different from the observed data, confirming the applicability of the technique.
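A sketch of the verification step: fit a lognormal to an annual flow series and apply the Kolmogorov-Smirnov test; the series is synthetic, and note that the KS p-value is optimistic when the parameters are estimated from the same sample:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
flows = rng.lognormal(4.0, 0.5, 35)        # hypothetical annual flow series

mu, sigma = np.log(flows).mean(), np.log(flows).std(ddof=1)
d, pval = stats.kstest(flows, stats.lognorm(sigma, scale=np.exp(mu)).cdf)
print(d, pval)   # large p-value -> no evidence against the lognormal fit
```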

19.
This paper describes the treatment of high-strength fruit juice wastewater by a process combining solid-liquid separation, a mesophilic UASB reactor, coagulation-air flotation, and SBR biochemical treatment. Operating results show that the process is highly adaptable, runs stably, achieves a high organic removal rate, and has low operating costs; the effluent meets the Class I limits of the Integrated Wastewater Discharge Standard.

20.
ABSTRACT: The statistical analysis of data containing trace-level measurements has traditionally been a two-step process in which the data are first censored using criteria based on measurement precision and then analyzed with statistical methods for censored data. The process might be more informative if the data were left uncensored. In this paper, the information loss attributable to censoring and measurement noise is assessed by comparing the sample mean and median of uncensored measurements with a log regression mean and median based on censored data. Measurements are derived from lognormal parent distributions that have random variability characteristic of trace-level measurement. The relative performance of estimators used with error-free samples and with samples having measurement noise can be explained by differences between the probability distributions of the parents and the measurements: measurement introduces bias and dispersion and transforms lognormal parent distributions toward greater symmetry. Estimates using uncensored data are less biased and more accurate than the log regression mean and median when censoring exceeds about 50 percent, and are not much worse at any fraction censored. For data with many (80 percent) results below the limit of detection, bias may be quite severe.
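A sketch contrasting the uncensored sample mean with a simple “log regression” (regression-on-order-statistics) mean at 50 percent censoring; this single-detection-limit version is a simplification of the usual robust procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
x = rng.lognormal(0.0, 1.0, 50)               # lognormal parent sample
dl = np.quantile(x, 0.5)                      # censor half at a detection limit

# Regress log of detects on normal scores, impute nondetects from the line
pp = (np.arange(1, x.size + 1) - 0.375) / (x.size + 0.25)   # Blom positions
z = stats.norm.ppf(pp)
xs = np.sort(x)
det = xs > dl
slope, intercept = np.polyfit(z[det], np.log(xs[det]), 1)
imputed = np.where(det, xs, np.exp(intercept + slope * z))

print(x.mean(), imputed.mean())   # uncensored mean vs. log regression mean
```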
