Similar Articles
20 similar articles found.
1.
ABSTRACT: In order to promote a uniform and consistent approach for floodflow frequency studies, the U.S. Water Resources Council has recommended the use of the log-Pearson type III distribution with a generalized skew coefficient. This paper investigates various methods of determining generalized skew coefficients. A new method is introduced that determines generalized skew coefficients using a weighting procedure based upon the variance of regional (map) skew coefficients and the variance of sample skew coefficients. The variance of skew derived from sample data is determined using either of two non-parametric methods called the jackknife or bootstrap. Applications of the new weighting procedure are presented along with an experimental study to test various weighting procedures to derive generalized skew coefficients.
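A minimal sketch of the variance-based weighting idea described above, assuming hypothetical annual-peak data: the bootstrap estimates the variance of the sample skew, and the generalized skew is a weighted average of the map skew and the sample skew with weights inversely proportional to their variances. Function names, the map-skew variance, and all data values are illustrative, not the paper's.

import numpy as np

rng = np.random.default_rng(0)

def sample_skew(x):
    """Coefficient of skewness of log-transformed annual peaks."""
    y = np.log10(x)
    n = len(y)
    m = y.mean()
    s = y.std(ddof=1)
    return n * np.sum((y - m) ** 3) / ((n - 1) * (n - 2) * s ** 3)

def bootstrap_skew_variance(x, n_boot=2000):
    """Bootstrap estimate of the variance of the sample skew coefficient."""
    skews = [sample_skew(rng.choice(x, size=len(x), replace=True))
             for _ in range(n_boot)]
    return np.var(skews, ddof=1)

def weighted_generalized_skew(g_sample, var_sample, g_map, var_map):
    """Variance-weighted combination of sample and regional (map) skew."""
    w_sample = 1.0 / var_sample
    w_map = 1.0 / var_map
    return (w_sample * g_sample + w_map * g_map) / (w_sample + w_map)

# Illustrative use with synthetic annual peak flows (units arbitrary);
# the map skew and its variance are placeholders, not published values.
peaks = rng.lognormal(mean=6.0, sigma=0.5, size=40)
g_s = sample_skew(peaks)
var_s = bootstrap_skew_variance(peaks)
g_w = weighted_generalized_skew(g_s, var_s, g_map=-0.1, var_map=0.303)
print(f"sample skew={g_s:.3f}, bootstrap var={var_s:.3f}, weighted skew={g_w:.3f}")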

2.
Parametric (propagation of normal error estimates) and nonparametric methods (bootstrap and enumeration of combinations) to assess the uncertainty in calculated rates of nitrogen loading were compared, based on the propagation of uncertainty observed in the variables used in the calculation. In addition, since such calculations are often based on literature surveys rather than random replicate measurements for the site in question, error propagation was also compared using the uncertainty of the sampled population (e.g., standard deviation) as well as the uncertainty of the mean (e.g., standard error of the mean). Calculations for the predicted nitrogen loading to a shallow estuary (Waquoit Bay, MA) were used as an example. The previously estimated mean loading from the watershed (5,400 ha) to Waquoit Bay (600 ha) was 23,000 kg N yr⁻¹. The mode of a nonparametric estimate of the probability distribution differed dramatically, equaling only 70% of this mean. Repeated observations were available for only 8 of the 16 variables used in our calculation. We estimated uncertainty in model predictions by treating these as sample replicates. Parametric and nonparametric estimates of the standard error of the mean loading rate were 12–14%. However, since the available data include site-to-site variability, as is often the case, standard error may be an inappropriate measure of confidence. The standard deviations were around 38% of the loading rate. Further, 95% confidence intervals differed between the nonparametric and parametric methods, with those of the nonparametric method arranged asymmetrically around the predicted loading rate. The disparity in magnitude and symmetry of calculated confidence limits argues for careful consideration of the nature of the uncertainty of variables used in chained calculations. This analysis also suggests that a nonparametric method of calculating loading rates using most frequently observed values for variables used in loading calculations may be more appropriate than using mean values. These findings reinforce the importance of including assessment of uncertainty when evaluating nutrient loading rates in research and planning. Risk assessment, which may need to consider the relative probability of extreme events in worst-case scenarios, will be in serious error using normal estimates, or even the nonparametric bootstrap. A method such as our enumeration of combinations produces a more reliable distribution of risk.
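A compact illustration, not the paper's model, of the two propagation strategies compared above: first-order (parametric) propagation of normal errors through a simple product loading formula versus a nonparametric bootstrap over replicate observations. All variable names, replicate values, and the toy loading formula are invented for the sketch.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical replicate observations of three variables used in a toy
# loading calculation: load = area * export_coefficient * delivery_fraction.
area = np.array([5300., 5400., 5500.])              # ha
export = np.array([3.8, 4.3, 4.0, 4.5])             # kg N/ha/yr
delivery = np.array([0.92, 0.95, 0.97])             # fraction reaching estuary

def loading(a, e, d):
    return a * e * d                                  # kg N/yr

# Parametric: first-order propagation using standard errors of the means.
means = [x.mean() for x in (area, export, delivery)]
sems = [x.std(ddof=1) / np.sqrt(len(x)) for x in (area, export, delivery)]
L = loading(*means)
rel_var = sum((s / m) ** 2 for s, m in zip(sems, means))
print(f"parametric: {L:.0f} +/- {L * np.sqrt(rel_var):.0f} kg N/yr")

# Nonparametric: bootstrap resampling of each variable's replicates.
boot = np.array([
    loading(rng.choice(area, len(area)).mean(),
            rng.choice(export, len(export)).mean(),
            rng.choice(delivery, len(delivery)).mean())
    for _ in range(5000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrap: median {np.median(boot):.0f}, 95% CI [{lo:.0f}, {hi:.0f}] kg N/yr")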

3.
ABSTRACT: An investigation of treated municipal wastewaters discharged into Texas streams was conducted to determine the probable effect of concentrations of ammonia in receiving waters, based on existing data on ammonia levels which are lethal to various species of fish. Recorded data for most Texas cities were analyzed. Based on an existing toxicity criterion for ammonia of 1/10 TLm = 0.31 mg/L NH3-N, employing known discharge flow rates and 7-day, 5-year or 7-day, 10-year low flows in Texas streams, appreciable numbers of sites were found to pose a threat to various species of fish. Using the bluegill (Lepomis macrochirus) as a median tolerance limit species, data from 65 cities which met the aforementioned requirements were analyzed. These included a total of 92 wastewater effluents. Sixty-nine percent of those cities and 70% of their effluents exceeded the 0.31 mg/L NH3-N limit in the stream below the discharge point. Thirty-seven percent of the cities equaled or exceeded the 96-hour TLm concentration limit of 3.1 mg/L ammonia. Based on the 10 mg/L NO3-N standard for intake water for potable supplies, 32% of the effluents resulted in a stream concentration which exceeded 10 mg/L, assuming a straight conversion of NH3-N to NO3-N.
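The screening comparison described above rests on a simple mass-balance dilution of effluent ammonia into the receiving stream at low flow. A hedged sketch of that calculation follows; the flows and concentrations are illustrative, not values from the survey.

def downstream_concentration(q_effluent, c_effluent, q_stream, c_stream=0.0):
    """Fully mixed concentration below an outfall (simple mass balance)."""
    return (q_effluent * c_effluent + q_stream * c_stream) / (q_effluent + q_stream)

# Illustrative values: effluent of 1.2 cfs at 15 mg/L NH3-N mixing into a
# 7-day, 10-year low flow of 4.0 cfs with negligible background ammonia.
c_mix = downstream_concentration(1.2, 15.0, 4.0)
print(f"NH3-N below outfall: {c_mix:.2f} mg/L")
print("exceeds 1/10 TLm (0.31 mg/L):", c_mix > 0.31)
print("exceeds 96-hour TLm (3.1 mg/L):", c_mix > 3.1)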

4.
Field surveys of biological responses can provide valuable information about environmental status and anthropogenic stress. However, it is quite usual for biological variables to differ between sites or change between two periods of time even in the absence of an impact. This means that there is an obvious risk that natural variation will be interpreted as environmental impact, or that relevant effects will be missed due to insufficient statistical power. Furthermore, statistical methods tend to focus on the risk of Type I errors, i.e. false positives. For environmental management, the risk of false negatives is (at least) equally important. The aim of the present study was to investigate how the probabilities of false positives and false negatives are affected by experimental setup (number of reference sites and samples per site), decision criteria (statistical method and α-level) and effect size. A model was constructed to simulate data from multiple reference sites, a negative control and a positive control. The negative control was taken from the same distribution as the reference sites and the positive control was just outside the normal range. Using the model, the probabilities of getting false positives and false negatives were calculated when a conventional statistical test, based on a null hypothesis of no difference, was used along with alternative tests that were based on the normal range of natural variation. Here, it is tested whether an investigated site is significantly inside (equivalence test) and significantly outside (interval test) the normal range. Furthermore, it was tested how the risks of false positives and false negatives are affected by changes in α-level and effect size. The results of the present study show that the strategy that best balances the risks between false positives and false negatives is to use the equivalence test. Besides tests with tabulated p-values, estimates generated using a bootstrap routine were included in the present study. The simulations showed that the probability of management errors was smaller for the bootstrap compared to the traditional test and the interval test.
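A stripped-down sketch of the kind of simulation described above, assuming normally distributed reference sites: a negative control drawn from the reference distribution and a positive control placed just outside the normal range are each judged by a conventional t-test against the pooled reference data and by a simple normal-range (interval-style) criterion, and repeating this gives rough false-positive and false-negative rates. This simplifies the paper's design considerably (it omits the bootstrap and the formal equivalence test); all settings are illustrative.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_sites, n_samples, alpha, n_iter = 10, 5, 0.05, 2000
fp_t = fn_t = fp_rng = fn_rng = 0

for _ in range(n_iter):
    ref = rng.normal(0.0, 1.0, size=(n_sites, n_samples))          # reference sites
    site_means = ref.mean(axis=1)
    normal_range = (site_means.mean() - 2 * site_means.std(ddof=1),
                    site_means.mean() + 2 * site_means.std(ddof=1))

    neg = rng.normal(0.0, 1.0, size=n_samples)                      # negative control
    pos = rng.normal(normal_range[1] + 0.5, 1.0, size=n_samples)    # positive control

    for sample, impacted in ((neg, False), (pos, True)):
        flagged_t = stats.ttest_ind(sample, ref.ravel()).pvalue < alpha
        flagged_rng = not (normal_range[0] <= sample.mean() <= normal_range[1])
        if impacted:
            fn_t += not flagged_t
            fn_rng += not flagged_rng
        else:
            fp_t += flagged_t
            fp_rng += flagged_rng

print(f"t-test:        FP={fp_t/n_iter:.3f}  FN={fn_t/n_iter:.3f}")
print(f"normal range:  FP={fp_rng/n_iter:.3f}  FN={fn_rng/n_iter:.3f}")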

5.
Predictive models of wildlife-habitat relationships often have been developed without being tested. The apparent classification accuracy of such models can be optimistically biased and misleading. Data resampling methods exist that yield a more realistic estimate of model classification accuracy. These methods are simple and require no new sample data. We illustrate these methods (cross-validation, jackknife resampling, and bootstrap resampling) with computer simulation to demonstrate the increase in precision of the estimate. The bootstrap method is then applied to field data as a technique for model comparison. We recommend that biologists use some resampling procedure to evaluate wildlife habitat models prior to field evaluation.
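A small sketch of the resampling ideas recommended above, using a hypothetical habitat classifier: leave-one-out cross-validation and a simple bootstrap of accuracy are shown, with a nearest-centroid rule standing in for whatever habitat model is actually being evaluated. The data and classifier are invented for illustration.

import numpy as np

rng = np.random.default_rng(3)

def nearest_centroid_predict(X_train, y_train, X_test):
    """Toy habitat model: classify each sample to the nearest class centroid."""
    centroids = {c: X_train[y_train == c].mean(axis=0) for c in np.unique(y_train)}
    labels = np.array(list(centroids))
    dists = np.array([[np.linalg.norm(x - centroids[c]) for c in labels] for x in X_test])
    return labels[dists.argmin(axis=1)]

# Hypothetical habitat variables (2 features) and presence/absence labels.
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(1.5, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)

# Leave-one-out cross-validation of classification accuracy.
hits = [nearest_centroid_predict(np.delete(X, i, 0), np.delete(y, i), X[i:i+1])[0] == y[i]
        for i in range(len(y))]
print(f"cross-validated accuracy: {np.mean(hits):.2f}")

# Bootstrap: refit on resampled data, score on the full data set.
acc = []
for _ in range(500):
    idx = rng.integers(0, len(y), len(y))
    acc.append(np.mean(nearest_centroid_predict(X[idx], y[idx], X) == y))
print(f"bootstrap accuracy: mean {np.mean(acc):.2f}, SD {np.std(acc, ddof=1):.2f}")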

6.
ABSTRACT: An application of the receiving water block of the EPA Storm Water Management Model (SWMM) is presented to quantify water quality impacts and evaluate control alternatives for a 208 areawide wastewater management plan in Volusia County, Florida. The water quality impact analyses were conducted for dry- and wet-weather conditions to simulate dissolved oxygen (DO), chlorides, total nitrogen (TN), and total phosphorus (TP) in the Halifax River, Florida, a 40-kilometer-long tidal estuary located on the Atlantic coast of Florida near Daytona Beach. Dry-weather analysis was performed using conventional 7-day, 10-year low flow conditions to determine a set of unit transfer coefficients which estimate the pollutant concentration transferred to any point in the estuary from a constant unit discharge of pollutants at the existing wastewater treatment plant outfall locations. Wet-weather analysis was performed by continuous simulation of a typical three-month summer wet season in Florida. Three-month cumulative duration curves of DO, TN and TP concentrations were constructed to estimate the relative value of controlling urban runoff or wastewater treatment plant effluent on the Halifax River. The three-month continuous simulation indicated that the greatest change in DO, TN, and TP duration curves is possible by abatement of wastewater treatment plant pollution.

7.
The basic theories and fundamental assumptions usually employed in the solution of unsteady groundwater flow problems are reviewed critically. The best known method of analysis for such problems is based on the Dupuit-Forchheimer approximation and leads to a nonlinear parabolic differential equation which is generally solved by linearization or numerical methods. The accuracy of the solution to this equation can be improved by use of a different approach which does not employ the Dupuit-Forchheimer assumption, but rather is based on a semi-numerical solution of the Laplace equation for quasi-steady conditions. The actual unsteady process is replaced by a sequence of steady-state conditions, and it is assumed that the actual unsteady flow characteristics during a short time interval can be approximated by those associated with “average” steady state flow. The Laplace equation is solved by a semi-discretization method according to which the horizontal coordinate is divided into subintervals, while the vertical coordinate is maintained continuous. The proposed method is applied to a typical tile drainage problem, and, based on a comparison of calculated results with experimental data, the method is evaluated and practical conclusions regarding its applicability are advanced.
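For reference, the two governing equations contrasted above can be written in a common form (generic notation, not necessarily the paper's own symbols): the Dupuit-Forchheimer approach leads to the nonlinear parabolic Boussinesq equation, whereas the alternative approach works with the Laplace equation for quasi-steady potential flow.

% Dupuit-Forchheimer (Boussinesq) equation in one horizontal dimension,
% with water-table height h, specific yield S_y, hydraulic conductivity K,
% and recharge R:
\[
  S_y \frac{\partial h}{\partial t}
  = \frac{\partial}{\partial x}\!\left( K\, h \,\frac{\partial h}{\partial x} \right) + R
\]
% versus the Laplace equation for the hydraulic head \phi under
% quasi-steady conditions:
\[
  \frac{\partial^2 \phi}{\partial x^2} + \frac{\partial^2 \phi}{\partial z^2} = 0
\]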

8.
ABSTRACT: Ground-water pumpage withdrew 57 cubic feet per second from aquifers beneath the Yahara River Basin in 1970. Forty-six cubic feet per second were exported by the diversion of treated wastewater from the drainage basin. The low-flow hydrology of the upper Yahara River has been impacted by this diversion. Prior to 1959, the wastewater was discharged into the river, augmenting the baseflow during low-flow periods. As much as 85% of streamflow was due to effluent discharge. In 1959 the wastewater was transferred from the river basin. The result was a decrease of about one-third in mean annual streamflow, and a decrease of more than 50% in the 7Q2 and 7Q10. Regression analysis showed the annual 7-day low-flow and 60-day low-flow have a statistically significant correlation with mean annual flow. Using predictions of future mean annual discharge of the river with increasing interbasin transfers, it is shown that by 1990 there is a significant probability that in some years the 60-day low-flow in the river will be zero.
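For readers unfamiliar with the low-flow statistics mentioned above, here is a hedged sketch of how a 7Q10 can be computed from a daily-flow series, together with the kind of regression of annual low flow on mean annual flow described in the abstract. The data are synthetic, and a simple lognormal fit is used where practice often fits other distributions.

import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(4)

# Synthetic daily flows for 30 years (cfs), purely illustrative.
dates = pd.date_range("1961-01-01", "1990-12-31", freq="D")
flow = pd.Series(np.exp(rng.normal(3.0, 0.6, len(dates))), index=dates)

# Annual minima of the 7-day moving average flow.
seven_day = flow.rolling(7).mean()
annual_7day_min = seven_day.groupby(seven_day.index.year).min().dropna()

# 7Q10: the 7-day low flow with a 10-year recurrence interval, i.e. the
# quantile with non-exceedance probability 0.1 of a distribution fitted
# to the annual minima (lognormal assumed here).
logs = np.log(annual_7day_min)
q7_10 = np.exp(stats.norm.ppf(0.1, loc=logs.mean(), scale=logs.std(ddof=1)))
print(f"7Q10 (lognormal fit): {q7_10:.1f} cfs")

# Regression of annual 7-day low flow on mean annual flow, as in the study.
mean_annual = flow.groupby(flow.index.year).mean()
slope, intercept, r, p, se = stats.linregress(mean_annual, annual_7day_min)
print(f"7-day low flow vs mean annual flow: r={r:.2f}, p={p:.3g}")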

9.
This paper is concerned with regional frequency analysis of hydrologic multiyear droughts. A drought event is defined by three parameters: severity, duration, and magnitude. A method is proposed here to standardize drought severities with a duration adjustment to enable comparison among drought events. For purposes of a regional study, the index drought method is selected and applied to standardized droughts to give a regional frequency curve. However, the recurrence intervals of the drought events obtained from the index drought method are limited to the historic period of record. Therefore, by taking advantage of random variations of droughts in both time and space, a multivariate simulation model is used to estimate exceedance probabilities associated with regional drought maxima. This method, named the regional extreme drought method, is capable of generating a series of drought events which, although they have not occurred historically, are more severe than historic events. By combining the results of the index drought method and regional extreme drought analysis, a regional drought probability graph is constructed which ranges from severe droughts to more frequent droughts. This procedure is applied to the mean annual flow records of streams located in the San Joaquin Valley of California, and drought-severity-frequency plots are prepared for 1-year, 2-year, and 3-year durations.
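A minimal sketch of how drought events might be extracted from an annual flow series using a truncation level, yielding the duration, severity, and magnitude parameters defined above. The threshold and data are illustrative, and the paper's standardization and duration adjustment are not reproduced.

import numpy as np

rng = np.random.default_rng(5)

def drought_events(annual_flow, threshold):
    """Runs of consecutive years below the threshold.

    Returns a list of (duration, severity, magnitude) tuples where
    severity = cumulative deficit and magnitude = severity / duration.
    """
    deficit = threshold - annual_flow
    events, run = [], []
    for d in deficit:
        if d > 0:
            run.append(d)
        elif run:
            events.append((len(run), sum(run), sum(run) / len(run)))
            run = []
    if run:
        events.append((len(run), sum(run), sum(run) / len(run)))
    return events

# Illustrative mean annual flows (arbitrary units) and a median truncation level.
flows = rng.lognormal(mean=5.0, sigma=0.4, size=60)
threshold = np.median(flows)
for duration, severity, magnitude in drought_events(flows, threshold):
    print(f"duration={duration} yr  severity={severity:7.1f}  magnitude={magnitude:6.1f}")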

10.
ABSTRACT: A regional adjustment relationship was developed to estimate long-term (30-year) monthly median discharges from short term (three-year) records. This method differs from traditional approaches in that it is based on site-specific discharge data but does not require correlation of these data with discharges from a single hydrologically similar long-term gage. The method is shown to be statistically robust, and applicable to statistics other than the median.

11.
ABSTRACT: The problem of real-time quality control of streamflow data is addressed. Five methods are investigated via a Monte Carlo simulation experiment based on streamflow data from Bird Creek basin in Oklahoma. The five methods include three deterministic approaches and two statistical approaches. The relative performance of the investigated methods is evaluated under a hypothesized random mechanism generating isolated outliers. The deterministic method based on streamflow gradient analysis and the statistical method based on forecast residual analysis perform best in detecting such outliers.
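A hedged sketch of the gradient-analysis idea: flag a streamflow value as a suspect isolated outlier when the jumps into and out of it are both large relative to typical step-to-step changes and of opposite sign. This is only a schematic of the class of deterministic checks compared above, not the paper's exact algorithm; the threshold factor and data are invented.

import numpy as np

rng = np.random.default_rng(6)

def flag_isolated_outliers(q, k=5.0):
    """Flag points whose inbound and outbound gradients are both larger than
    k times the typical absolute gradient and of opposite sign."""
    grad = np.diff(q)
    typical = np.median(np.abs(grad)) + 1e-12
    flags = np.zeros(len(q), dtype=bool)
    for i in range(1, len(q) - 1):
        up, down = q[i] - q[i - 1], q[i + 1] - q[i]
        if abs(up) > k * typical and abs(down) > k * typical and up * down < 0:
            flags[i] = True
    return flags

# Synthetic hourly streamflow with two injected isolated spikes.
q = 100 + 10 * np.sin(np.linspace(0, 6 * np.pi, 300)) + rng.normal(0, 1, 300)
q[[75, 210]] += [60, -55]
print("flagged indices:", np.where(flag_isolated_outliers(q))[0])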

12.
ABSTRACT: The parameters of the extreme value type 1 distribution were estimated for 55 annual flood data sets by seven methods. These are the methods of (1) moments, (2) probability weighted moments, (3) mixed moments, (4) maximum likelihood estimation, (5) incomplete means, (6) principle of maximum entropy, and (7) least squares. The method of maximum likelihood estimation was found to be the best and the method of incomplete means the worst. The differences between the methods of principle of maximum entropy, probability weighted moments, moments, and least squares were only minor. The difference between these methods and the method of maximum likelihood was not pronounced.
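Two of the seven estimation methods compared above are easy to sketch for the extreme value type 1 (Gumbel) distribution: the method of moments and maximum likelihood (here via scipy). The data are synthetic; the paper's flood data sets are not reproduced.

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic annual flood peaks drawn from a Gumbel (EV1) distribution.
true_loc, true_scale = 500.0, 120.0
peaks = stats.gumbel_r.rvs(loc=true_loc, scale=true_scale, size=55, random_state=rng)

# Method of moments: scale = sqrt(6)*s/pi, location = mean - 0.5772*scale.
s = peaks.std(ddof=1)
scale_mom = np.sqrt(6) * s / np.pi
loc_mom = peaks.mean() - 0.5772 * scale_mom

# Maximum likelihood estimation via scipy.
loc_mle, scale_mle = stats.gumbel_r.fit(peaks)

print(f"true:    loc={true_loc:.1f} scale={true_scale:.1f}")
print(f"moments: loc={loc_mom:.1f} scale={scale_mom:.1f}")
print(f"MLE:     loc={loc_mle:.1f} scale={scale_mle:.1f}")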

13.
ABSTRACT: The pebble count procedure (Wolman, 1954) is the measurement of 100 randomly selected stones from a homogeneous population on a river bed or bar, which yields reproducible size distribution curves for surficial deposits of gravel and cobbles. The pebble count is widely used in geomorphology (and increasingly in river engineering) to characterize surficial grain size distributions in lieu of bulk samples, for which adequate sample sizes become enormous for gravels. Variants on the original method have been proposed, one of which, the so-called ‘zig-zag’ method (Bevenger and King, 1995), involves sampling along a diagonal line and drawing data points from many different geomorphic units. The method is not reproducible, probably because it incorporates stones from many different populations, and because an inadequate number of grains is sampled from any given population. Sampling of coarse bed material should be geomorphically stratified based on the natural sorting of grain sizes into distinct channel features. If a composite grain size is desired, the areas of the bed occupied by different populations can be mapped, pebble counts conducted on each, and a weighted average distribution computed.
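A small sketch of the composite procedure suggested in the last sentence above: pebble counts from geomorphically distinct units are combined into a single distribution by weighting each unit's count by the mapped bed area it occupies. The units, areas, and size data here are invented for illustration.

import numpy as np

rng = np.random.default_rng(8)

# Hypothetical pebble counts (intermediate-axis diameters, mm) from two
# geomorphic units, each sampled separately, with mapped bed areas (m^2).
units = {
    "riffle": {"area": 350.0, "sizes": rng.lognormal(np.log(64), 0.5, 100)},
    "pool":   {"area": 150.0, "sizes": rng.lognormal(np.log(22), 0.5, 100)},
}

def composite_percentile(units, p):
    """Area-weighted percentile (e.g. D50, D84) of the composite distribution."""
    sizes = np.concatenate([u["sizes"] for u in units.values()])
    weights = np.concatenate([np.full(len(u["sizes"]), u["area"] / len(u["sizes"]))
                              for u in units.values()])
    order = np.argsort(sizes)
    cdf = np.cumsum(weights[order]) / weights.sum()
    return np.interp(p / 100.0, cdf, sizes[order])

for name, u in units.items():
    print(f"{name:7s} D50 = {np.percentile(u['sizes'], 50):5.1f} mm")
print(f"composite D50 = {composite_percentile(units, 50):5.1f} mm, "
      f"D84 = {composite_percentile(units, 84):5.1f} mm")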

14.
15.
ABSTRACT: A common problem arises in testing for trends in water quality when observations are reported as “less than detection limit.” If a single detection limit is used for the entire study, existing non-parametric statistical methods, modified for ties, are applicable. If, however, the detection limit varies during the course of the study, resulting in multiple detection limits, then the commonly used trend detection methods are not appropriate. A statistic similar to Kendall's tau, but based on expected ranks, is proposed. Monte Carlo simulations show that the normal approximation to the distribution of this statistic is quite good, even for small samples and a large proportion of censored observations. The statistic is also shown to have greater power than the ad-hoc method of treating all observations less than the target censored observation as tied.
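For the single-detection-limit case that the abstract notes can be handled by existing tie-corrected methods, here is a brief sketch: censored values are set to the detection limit (creating ties) and a tie-corrected Kendall's tau is computed. The expected-rank statistic proposed for multiple detection limits is not reproduced; the data and trend are synthetic.

import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Synthetic monthly concentrations with a mild downward trend, some values
# reported as "less than" a single detection limit of 0.5.
t = np.arange(120)
conc = np.exp(rng.normal(0.0, 0.6, 120)) * (1 - 0.003 * t)
detection_limit = 0.5
censored = conc < detection_limit
conc_reported = np.where(censored, detection_limit, conc)  # ties at the DL

tau, p = stats.kendalltau(t, conc_reported)  # tau-b, tie-corrected by default
print(f"{censored.sum()} of {len(conc)} values censored at {detection_limit}")
print(f"Kendall tau = {tau:.3f}, p = {p:.4f}")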

16.
ABSTRACT: In recent years, several approaches to hydrologic frequency analysis have been proposed that enable one to direct attention to that portion of an overall probability distribution that is of greatest interest. The majority of the studies have focused on the upper tail of a distribution for flood analyses, though the same ideas can be applied to low flows. This paper presents an evaluation of the performances of five different estimation methods that place an emphasis on fitting the lower tail of the lognormal distribution for estimation of the ten-year low-flow quantile. The methods compared include distributional truncation, MLE treatment of censored data, partial probability weighted moments, LL-moments, and expected moments. It is concluded that while there are some differences among the alternative methods in terms of their biases and root mean square errors, no one method consistently performs better than the others, particularly with recognition that the underlying population distribution is unknown. Therefore, it seems perfectly legitimate to make a selection of a method on the basis of other criteria, such as ease of use. It is also shown in this paper that the five alternative methods can perform about as well as, if not better than, an estimation strategy involving fitting the complete lognormal distribution using L-moments.
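A hedged sketch of the target quantity above: the ten-year low-flow quantile of a lognormal distribution, here estimated simply by fitting the full distribution with sample moments of the logs. The paper's five lower-tail-focused alternatives (truncation, censored MLE, partial probability weighted moments, LL-moments, expected moments) are not reproduced; the data are synthetic.

import numpy as np
from scipy import stats

rng = np.random.default_rng(10)

# Synthetic annual 7-day low flows assumed lognormal (cfs).
low_flows = rng.lognormal(mean=2.5, sigma=0.45, size=40)

# Ten-year low-flow quantile: non-exceedance probability 0.1.
logs = np.log(low_flows)
q10_full_fit = np.exp(stats.norm.ppf(0.1, loc=logs.mean(), scale=logs.std(ddof=1)))

# An empirical (sample quantile) estimate for comparison.
q10_empirical = np.quantile(low_flows, 0.1)

print(f"10-year low-flow quantile, lognormal fit: {q10_full_fit:.2f} cfs")
print(f"10-year low-flow quantile, empirical:     {q10_empirical:.2f} cfs")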

17.
ABSTRACT: The detection of change in a hydrologic variable, particularly water quality, is a current problem. A method is presented for testing whether there has been a shift in the mean of a hydrologic variable based on the well established bivariate normal distribution theory. In this technique, the dependent, or target, and the independent, or control, variables are formed as weighted linear combinations of the mean values at a number of locations in a selected target and control area. The weighting factors are determined based on a mathematical programming technique which minimizes the conditional coefficient of variation, thereby minimizing the number of observations required to detect a change of a preselected magnitude in the mean of the target area. The result is a situation where a savings in the number of observations required to detect a change is a consequence of adding more stations: the space-time tradeoff. Two applications of the technique are presented, the first using electrical conductivity (EC) data from two sets of river basins and the second using EC data from a set of basins as the target variable and annual discharge as the control. The results indicate that a significant savings in time can be achieved by using this method.

18.
ABSTRACT: Evaluation of the Great Lakes Environmental Research Laboratory's (GLERL's) physically-based monthly net basin supply forecast method reveals component errors and the effects of model improvements for use on the Laurentian Great Lakes. While designed for probabilistic outlooks, it is assessed for giving deterministic outlooks along with other net basin supply forecast methods of the U.S. Army Corps of Engineers and Environment Canada, and with a stochastic approach commissioned by the Corps. The methods are compared to a simple climatological forecast and to actual time series of net basin supplies. Actual net basin supplies are currently determined by estimating all components directly, instead of as water-balance residuals. This is judged more accurate and appropriate for both forecasting and simulation. GLERL's physically-based method forecasts component supplies while the other methods are based on residual supplies. These other methods should be rederived to be based on component supplies. For each of these other methods, differences between their outlooks and residual supplies are used as error estimates for the rederived methods and component supplies. The evaluations are made over a recent period of record high levels followed by a record drought. Net basin supply outlooks are better than climatology, and GLERL's physically-based method performs best with regard to either component or residual net basin supplies. Until advances are made in long-range climate outlooks, deterministic supply outlooks cannot be improved significantly.
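For context, the two net basin supply conventions contrasted above are commonly written as follows; the notation is generic and not necessarily the exact formulation used by GLERL or the Corps.

% Component net basin supply: overlake precipitation plus basin runoff
% minus overlake evaporation.
\[
  \mathrm{NBS}_{\text{component}} = P + R - E
\]
% Residual net basin supply: lake outflow minus inflow plus change in storage,
% so that measurement errors in any term accumulate in the residual.
\[
  \mathrm{NBS}_{\text{residual}} = O - I + \Delta S
\]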

19.
ABSTRACT: Many automatic calibration processes have been proposed to efficiently calibrate the 16 parameters involved in the four-layered tank model. The Multistart Powell and Shuffled Complex Evolution (SCE) methods are considered the best two procedures. Two rainfall events were designed to compare the performance and efficiency of these two methods. The first rainfall event is short term and the second is designed for long term rainfall data collection. Both rainfall events include a lengthy no-rainfall period. Two sets of upper and lower values for the search range were selected for the numerical tests. The results show that the Multistart Powell and SCE methods are able to obtain the true values for the 16 parameters with a sufficiently long no-rainfall period after a rainfall event. In addition, by using two selected objective functions, one based on root mean square error and one based on root mean square relative error criteria, it is found that the no-rainfall period lengths necessary to obtain the converged true values for the 16 parameters are roughly the same. The SCE method provides a more efficient search based on an appropriate preliminary search range. The Multistart Powell method, on the other hand, leads to more accurate search results when there is no suitable search range selected based on the parameter calibration experience.
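The two objective functions mentioned above are simple to state; a short generic sketch follows, with an arbitrary simulated/observed discharge pair rather than the tank-model code itself.

import numpy as np

def rmse(obs, sim):
    """Root mean square error criterion."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((sim - obs) ** 2))

def rms_relative_error(obs, sim):
    """Root mean square relative error criterion (errors scaled by observations)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean(((sim - obs) / obs) ** 2))

# Illustrative observed and simulated discharges (arbitrary units).
observed = [12.0, 35.0, 80.0, 22.0, 5.0]
simulated = [10.5, 38.0, 74.0, 25.0, 6.0]
print(f"RMSE = {rmse(observed, simulated):.2f}")
print(f"RMS relative error = {rms_relative_error(observed, simulated):.3f}")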

20.
The principle of maximum entropy (POME) was used to derive the two-parameter gamma distribution used frequently in synthesis of instantaneous or finite-period unit hydrographs. The POME yielded the minimally prejudiced gamma distribution by maximizing the entropy subject to two appropriate constraints which were the mean of real values and the mean of the logarithms of real values of the variable. It provided a unique method for parameter estimation. Experimental data were used to compare this method with the methods of moments, cumulants, maximum likelihood estimation, and least squares.
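The entropy-maximization step summarized above has a standard form: maximizing the Shannon entropy subject to fixed means of x and of ln x (plus normalization) yields a density of gamma form. The notation below is generic; the paper's own parameterization may differ.

% Maximize H[f] = -\int f(x)\,\ln f(x)\,dx subject to
%   \int f\,dx = 1,\quad \int x\,f\,dx = \bar{x},\quad \int (\ln x)\,f\,dx = \overline{\ln x}.
% The Lagrangian solution has the exponential-family form
\[
  f(x) = \exp\!\left(-\lambda_0 - \lambda_1 x - \lambda_2 \ln x\right)
       \;\propto\; x^{-\lambda_2}\, e^{-\lambda_1 x},
\]
% which is the two-parameter gamma density
\[
  f(x) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x},
  \qquad \alpha - 1 = -\lambda_2,\quad \beta = \lambda_1 .
\]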
