Similar Documents
Retrieved 20 similar documents (search time: 34 ms)
1.
ABSTRACT: Bankfull hydraulic geometry relationships relate stream channel geometry to watershed size for specific physiographic regions. This paper presents bankfull hydraulic geometry relationships and recurrence intervals for the Southeastern Plains ecoregion and the flatwoods subtype of the Middle Atlantic Coastal Plain ecoregion found within North Carolina's Coastal Plain physiographic province. Cross-sectional and longitudinal survey data from gaged and ungaged streams were used to compute channel dimension and profile information. Power functions were developed relating drainage area to bankfull discharge, cross-sectional area, width, and mean depth. Recurrence intervals of bankfull events were estimated from gaged streams using both a Log-Pearson Type III distribution of peak annual discharge and a partial-duration series of average daily discharge. Results from both methods indicate that average bankfull recurrence intervals for the study area are below one year. Determinations of recurrence intervals by the Log-Pearson Type III distribution were for the most part inconclusive (less than one year), while the partial-duration series estimated a 0.19-year average, ranging from 0.11 to 0.31 years.
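The power-function form used for these regional curves can be sketched as a log-log least-squares fit; the drainage areas and discharges below are hypothetical values for illustration only, not data from the study.

```python
import numpy as np

def fit_power_function(drainage_area, response):
    """Fit response = a * DA^b by least squares in log-log space,
    the standard form for regional bankfull hydraulic geometry curves."""
    slope, intercept = np.polyfit(np.log10(drainage_area), np.log10(response), 1)
    return 10.0 ** intercept, slope  # (a, b)

# Hypothetical drainage areas (mi^2) and bankfull discharges (cfs).
da = np.array([1.0, 5.0, 20.0, 80.0])
q_bf = np.array([28.0, 95.0, 250.0, 640.0])
a, b = fit_power_function(da, q_bf)
```

The same fit applies to cross-sectional area, width, and mean depth by swapping the response variable.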

2.
ABSTRACT: Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothetical distributions (Log-Pearson III and Weibull) had lower mean square errors than did the Box-Cox transformation method or the Log-Boughton method, which is based on a fit of plotting positions.
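A minimal sketch of the bootstrap-comparison idea: resample the observed low-flow series with replacement, re-estimate, and accumulate squared error. The plotting-position estimator and the synthetic flow data here are stand-ins for the four methods the paper actually compares.

```python
import numpy as np

rng = np.random.default_rng(42)

def q10_weibull_plotting(low_flows):
    """Nonparametric 10-year (p = 0.1) low-flow estimate using
    Weibull plotting positions with linear interpolation."""
    x = np.sort(low_flows)
    n = len(x)
    p = np.arange(1, n + 1) / (n + 1)  # Weibull plotting positions
    return np.interp(0.1, p, x)

def bootstrap_mse(estimator, sample, reference, n_boot=2000):
    """Bootstrap MSE of an estimator: resample with replacement,
    re-estimate, and average squared error against a reference value."""
    n = len(sample)
    est = np.array([estimator(rng.choice(sample, size=n, replace=True))
                    for _ in range(n_boot)])
    return np.mean((est - reference) ** 2)

# Hypothetical annual 7-day minimum flows (cfs).
flows = rng.lognormal(mean=2.0, sigma=0.5, size=40)
mse = bootstrap_mse(q10_weibull_plotting, flows, q10_weibull_plotting(flows))
```

Comparing methods then amounts to computing `bootstrap_mse` for each candidate estimator on the same series and ranking the results.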

3.
4.
Regression models of mean and mean annual maximum (MAM) cover were derived for two categories of periphyton cover (filaments and mats) using 22 years of monthly monitoring data from 78 river sites across New Zealand. Explanatory variables were derived from observations of water quality variables, hydrology, shade, bed sediment grain size, temperature, and solar radiation. The root mean square errors of these models were large (75-95% of the mean of the estimated values). The at-site frequency distributions of periphyton cover were approximated by the exponential distribution, which has the mean cover as its single parameter. Independent predictions of cover distributions at all sites were calculated using the mean predicted by the regression model and the theoretical exponential distribution. The probability that cover exceeds specified thresholds and estimates of MAM cover, based on the predicted distributions, had large uncertainties (~80-100%) at the site scale. However, predictions aggregated by classes of an environmental classification accurately predicted the proportion of sites for which cover exceeded nominated criteria in the classes. The models are useful for assessing broad-scale patterns in periphyton cover and for estimating changes in cover with changes in nutrients, hydrological regime, and light.
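Because the exponential distribution has the mean as its single parameter, the probability that cover exceeds a threshold has a closed form; a sketch (the cover values below are illustrative, not from the study):

```python
import math

def exceedance_probability(mean_cover, threshold):
    """P(cover > threshold) for an exponential distribution whose
    single parameter is the mean cover, as in the at-site model."""
    return math.exp(-threshold / mean_cover)

# e.g. probability that filament cover exceeds 30% when mean cover is 12%
p_exceed = exceedance_probability(12.0, 30.0)
```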

5.
ABSTRACT: The Palmer Drought Severity Index (PDSI) has been calculated for about 30 years as a means of providing a single measure of meteorological drought severity. It was intended to retrospectively look at wet and dry conditions using water balance techniques. The Standardized Precipitation Index (SPI) is a probability index that was developed to give a better representation of abnormal wetness and dryness than the Palmer indices. Before the user community will accept the SPI as an alternative to the Palmer indices, a standard method must be developed for computing the index. Standardization is necessary so that all users of the index will have a common basis for both spatial and temporal comparison of index values. If different probability distributions and models are used to describe an observed series of precipitation, then different SPI values may be obtained. This article describes the effect on the SPI values computed from different probability models as well as the effects on dry event characteristics. It is concluded that the Pearson Type III distribution is the "best" universal model, and that the reliability of the SPI is sample size dependent. It is also concluded that because of data limitations, SPIs with time scales longer than 24 months may be unreliable. An internet link is provided that will allow users to access Fortran 77 source code for calculating the SPI.
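The core of any SPI computation is mapping a precipitation total's cumulative probability to a standard normal score. As a distribution-free illustration of that mapping (the paper fits parametric models such as the Pearson Type III rather than using ranks), here is a rank-based sketch:

```python
import numpy as np
from statistics import NormalDist

def spi_empirical(precip_series):
    """Rank-based SPI sketch: convert each value's empirical
    non-exceedance probability (Weibull plotting position) to a
    standard normal score. Assumes distinct values; ties would need
    average ranks."""
    x = np.asarray(precip_series, dtype=float)
    n = len(x)
    ranks = x.argsort().argsort() + 1       # 1-based ranks
    p = ranks / (n + 1)                     # plotting positions in (0, 1)
    nd = NormalDist()
    return np.array([nd.inv_cdf(pi) for pi in p])
```

Swapping the plotting positions for probabilities from a fitted Pearson Type III (or gamma) model recovers the parametric SPI the article standardizes on.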

6.
Probability distributions that model the return periods of flood characteristics derived from partial duration series are proposed and tested in the Fraser River catchment of British Columbia. Theoretical distributions describing the magnitude, duration, frequency and timing of floods are found to provide a good fit to the observed data. The five estimated parameters summarizing the flood characteristics of each basin are entered into a discriminant analysis procedure to establish flood regions. Three regions were identified, each displaying flood behavior closely related to the physical conditions of the catchment. Within each region, regression equations are obtained between parameter values and basin climatic and physiographic variables. These equations provide a satisfactory prediction of flood parameters, and this allows the estimation of a comprehensive set of flood characteristics for areas with sparse hydrologic information.

7.
ABSTRACT: The Mississippi Department of Environmental Quality uses the Steady Riverine Environmental Assessment Model (STREAM) to establish effluent limitations. While the U.S. Environmental Protection Agency has approved of its use, questions arise regarding the model's simplicity. The objective of this research was to compare STREAM with the more commonly utilized Enhanced Stream Water Quality Model (QUAL2E). The comparison involved a statistical evaluation procedure based on sensitivity analyses, input probability distribution functions, and Monte Carlo simulation with site-specific data from a 46-mile (74-km) reach of the Big Black River in central Mississippi. Site-specific probability distribution functions were derived from measured rates of reaeration, sediment oxygen demand, photosynthesis, and respiration. Both STREAM and QUAL2E reasonably predicted daily average dissolved oxygen (DO) based on a comparison of output probability distributions with observed DO. Observed DO was consistently within 90 percent confidence intervals of model predictions. STREAM generally overpredicted observed DO, while QUAL2E generally matched it. Using the more commonly assumed lognormal distribution as opposed to a Weibull distribution for two of the sensitive input parameters resulted in minimal differences in the statistical evaluations. The QUAL2E approach had distinct advantages over STREAM in simulating the growth cycle of algae.

8.
ABSTRACT: A generalized skew map for Louisiana streams was developed using data from Louisiana, Mississippi, Arkansas, and Texas with 20 or more years of annual flood records. A comparison between the newly developed Louisiana Generalized Skew Map (LGSM) and the generalized skew map recommended by the U.S. Water Resources Council (WRCGSM) was performed. The mean square error for the LGSM was 16 percent less than that of WRCGSM in direct application of the two maps. Performance of the new map was compared with the WRCGSM and with a regional analysis procedure through its application to the Log Pearson Type 3 (LP3) distribution. Two-thirds of the stations tested had lower standardized root mean square deviations (SRMSD) by a narrow margin using the skew coefficients obtained from LGSM instead of WRCGSM. The regional analysis also performed as well as the LGSM in terms of SRMSD. Thus, it was concluded that both LGSM and the regional analysis provide a more reliable tool for flood frequency analysis for Louisiana streams with short annual flood records.
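A sketch of how a map-based (regional) skew enters an LP3 flood-frequency estimate: the station skew from the log flows is blended with the generalized skew, and a frequency factor gives the quantile. The simple linear weighting and the Wilson-Hilferty frequency-factor approximation used below are illustrative, not the exact procedure of the study.

```python
import numpy as np
from statistics import NormalDist

def lp3_quantile(annual_peaks, regional_skew, station_weight, nonexceed_prob):
    """Log-Pearson Type 3 quantile via method of moments on log10 peaks,
    with the skew coefficient taken as a weighted blend of station skew
    and a regional (map) skew. Frequency factor: Wilson-Hilferty
    approximation to the Pearson III deviate."""
    y = np.log10(np.asarray(annual_peaks, dtype=float))
    n = len(y)
    mean, std = y.mean(), y.std(ddof=1)
    station_skew = n * np.sum(((y - mean) / std) ** 3) / ((n - 1) * (n - 2))
    g = station_weight * station_skew + (1.0 - station_weight) * regional_skew
    z = NormalDist().inv_cdf(nonexceed_prob)
    if abs(g) < 1e-6:
        k = z  # Pearson III collapses to normal as skew -> 0
    else:
        k = (2.0 / g) * (((z - g / 6.0) * g / 6.0 + 1.0) ** 3 - 1.0)
    return 10.0 ** (mean + k * std)
```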

9.
ABSTRACT: The use of a fitted parameter watershed model to address water quantity and quality management issues requires that it be calibrated under a wide range of hydrologic conditions. However, rarely does model calibration result in a unique parameter set. Parameter nonuniqueness can lead to predictive nonuniqueness. The extent of model predictive uncertainty should be investigated if management decisions are to be based on model projections. Using models built for four neighboring watersheds in the Neuse River Basin of North Carolina, the application of the automated parameter optimization software PEST in conjunction with the Hydrologic Simulation Program Fortran (HSPF) is demonstrated. Parameter nonuniqueness is illustrated, and a method is presented for calculating many different sets of parameters, all of which acceptably calibrate a watershed model. A regularization methodology is discussed in which models for similar watersheds can be calibrated simultaneously. Using this method, parameter differences between watershed models can be minimized while maintaining fit between model outputs and field observations. In recognition of the fact that parameter nonuniqueness and predictive uncertainty are inherent to the modeling process, PEST's nonlinear predictive analysis functionality is then used to explore the extent of model predictive uncertainty.

10.
ABSTRACT: A stochastic dynamic programming model is applied to a small hydroelectric system. The variation in number of stage iterations and the computer time required to reach steady state conditions with changes in the number of storage states is investigated. The increase in computer time required to develop the storage probability distributions with increase in the number of storage states is reviewed. It is found that for an average of seven inflow states, the largest number of storage states for which it is computationally feasible to develop the storage probability distributions is nine. It is shown that use of the dynamic program results based on a small number of storage states results in unrealistically skewed storage probability distributions. These skewed distributions are attributed to "trapping" states at the low end of the storage range.

11.
The primary advantage of the Dynamically Dimensioned Search (DDS) algorithm is that it outperforms other optimization techniques in both convergence speed and the ability to find parameter sets that satisfy statistical guidelines, while requiring only one algorithm parameter (the perturbation factor) in the optimization process. Conventionally, a default value of 0.2 is used as the perturbation factor, with perturbations sampled from a normal distribution with mean zero and variance one. However, the sensitivity of DDS performance to the perturbation factor in watershed modeling is still unknown. The fixed-form sampling distribution may result in finding parameters at the local rather than global scale in the sampling space. In this study, the efficiency of DDS was evaluated by altering the perturbation factor (from 0.05 to 1.00) and the choice of sampling distribution (normal and uniform) on hydrologic and water quality predictions in a lowland agricultural watershed in Texas, United States. Results show that altering the perturbation factor may cause variations in convergence speed or the ability to find better solutions. In addition, DDS results were found to be very sensitive to the choice of sampling distribution, with DDS-N (normal distribution) outperforming DDS-U (uniform distribution) in all case scenarios. The choice of sampling distribution may thus be a major factor in the performance of auto-calibration techniques for watershed simulation models.
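A compact sketch of the DDS search loop (minimization form), with the perturbation factor `r` exposed as the single algorithm parameter and the default normal sampling distribution; the test objective and bounds are illustrative stand-ins for a watershed model calibration.

```python
import numpy as np

def dds(objective, bounds, max_iter=500, r=0.2, seed=0):
    """Minimal Dynamically Dimensioned Search sketch (minimization).
    Each iteration perturbs a randomly chosen subset of dimensions,
    whose expected size shrinks as iterations progress; perturbations
    are N(0, 1) draws scaled by r and the variable's range."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x_best = lo + rng.random(len(lo)) * (hi - lo)
    f_best = objective(x_best)
    for i in range(1, max_iter + 1):
        # Probability of perturbing each dimension decays with iteration count.
        p = 1.0 - np.log(i) / np.log(max_iter)
        mask = rng.random(len(lo)) < p
        if not mask.any():
            mask[rng.integers(len(lo))] = True  # always perturb at least one
        x_new = x_best.copy()
        x_new[mask] += r * (hi[mask] - lo[mask]) * rng.standard_normal(mask.sum())
        # Reflect values that left the feasible bounds, then clip as a guard.
        x_new = np.where(x_new < lo, 2 * lo - x_new, x_new)
        x_new = np.where(x_new > hi, 2 * hi - x_new, x_new)
        x_new = np.clip(x_new, lo, hi)
        f_new = objective(x_new)
        if f_new < f_best:  # greedy acceptance
            x_best, f_best = x_new, f_new
    return x_best, f_best
```

Swapping `rng.standard_normal` for a uniform draw gives the DDS-U variant the study compares against.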

12.
ABSTRACT: We analyzed the type of hydrologic adjustments resulting from flow regulation across a range of dam types, distributed throughout the Connecticut River watershed, using two approaches: (1) the Index of Hydrologic Alteration (IHA) and (2) log-Pearson Type III flood frequency analysis. We applied these analyses to seven rivers that have extensive pre- and post-disturbance flow records and to six rivers that have only long post-regulation flow records. Lastly, we analyzed six unregulated streams to establish the regional natural flow regime and to test whether it has changed significantly over time in the context of an increase in forest cover from less than 20 percent historically to greater than 80 percent at present. We found significant hydrologic adjustments associated with both impoundments and land use change. On average, maximum peak flows decrease by 32 percent in impounded rivers, but the effect decreases with increasing flow duration. One-day minimum low flows increase following regulation, except for the hydroelectric facility on the mainstem. Hydrograph reversals occur more commonly now on the mainstem, but the tributary flood control structures experience diminished reversals. Major shifts in flood frequency occur, with the largest effect occurring downstream of tributary flood control impoundments and less so downstream of the mainstem's hydroelectric facility. These overall results indicate that the hydrologic impacts of dams in humid environments can be as significant as those of large, multiple-purpose reservoirs in more arid environments.

13.
ABSTRACT: A simple simulation-type approach and a statistical method are proposed for determining the confidence interval of the T-year frequency rainfall percentiles (or precipitation extremes) for generalized extreme value (GEV) distributions. The former method is based on the Monte Carlo testing procedure. To generate realizations, the covariance structure of the three GEV parameters is investigated using the observed information matrix of the likelihood function. For distributions with realistic parameters, the correlation between the location and scale parameters is practically constant when the shape parameter varies around values close to its optimum. The latter method is based on likelihood ratio statistics. When the joint confidence surface for the shape parameter and the percentile estimate is plotted together with the line of best estimates, the region from which the best percentile estimate can plausibly be chosen forms part of that joint confidence surface. The projection of this bounded region onto the percentile axis is defined here as the effective confidence interval. The use of this effective interval as the confidence interval of the T-year frequency rainfall percentile is particularly recommended because it is stable with respect to T and appropriately reflects variation in all three GEV parameters.
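The paper works with the full three-parameter GEV and likelihood-based intervals; as a much simpler illustration of attaching a confidence interval to a T-year percentile, here is a method-of-moments Gumbel fit (the shape-zero limit of the GEV) with a parametric bootstrap. All numerical choices below are illustrative assumptions.

```python
import numpy as np

EULER = 0.5772156649  # Euler-Mascheroni constant

def gumbel_fit(x):
    """Method-of-moments Gumbel fit (GEV as shape -> 0): a deliberate
    simplification of the paper's three-parameter GEV."""
    scale = np.std(x, ddof=1) * np.sqrt(6.0) / np.pi
    loc = np.mean(x) - EULER * scale
    return loc, scale

def return_level(loc, scale, T):
    """T-year return level: the quantile with annual exceedance
    probability 1/T under the fitted Gumbel distribution."""
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

def bootstrap_ci(x, T, n_boot=2000, alpha=0.1, seed=1):
    """Parametric-bootstrap confidence interval for the T-year percentile:
    simulate samples from the fitted distribution, refit, and take
    quantiles of the re-estimated return levels."""
    rng = np.random.default_rng(seed)
    loc, scale = gumbel_fit(x)
    sims = loc - scale * np.log(-np.log(rng.random((n_boot, len(x)))))
    levels = [return_level(*gumbel_fit(s), T) for s in sims]
    return np.quantile(levels, [alpha / 2.0, 1.0 - alpha / 2.0])
```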

14.
ABSTRACT: Previous studies on multiyear droughts have often been limited to the analysis of historic annual flow series. A major disadvantage of this approach is the unavailability of long historic flow records needed to obtain a significant number of drought events for the analysis. To overcome this difficulty, the present study proposes to use synthetically generated annual flow series. A methodology is presented to model annual flows based on an analysis of the harmonic and stochastic properties of the observed flows. Once the model is determined, it can be utilized to generate a flow series of desired length so as to include many hydrologic cycles within the process. The key parameter for a successful drought study is the truncation level used to distinguish low flows from high flows. In this paper, a concept of selecting the truncation level is also presented. The drought simulation procedure is illustrated by a case study of the Pequest watershed in New Jersey. For this watershed, multiyear droughts were derived from both historic and generated flow series. Three important drought parameters, namely, the duration, severity, and magnitude, were determined for each drought event, and their probability distributions were studied. It was found that gamma and log-normal probability functions produce the best fit for the duration and severity, respectively. The derived probability curves from generated flows can be reliably used to predict the longest drought duration and the largest drought severity within a given return period.
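Extracting drought events from an annual flow series given a truncation level is a runs analysis; a minimal sketch (the flow values in the test are hypothetical, and "magnitude" is taken here as severity divided by duration, one common convention):

```python
import numpy as np

def drought_events(annual_flows, truncation):
    """Identify multiyear droughts by the runs method: consecutive years
    below the truncation level form one event. Returns one
    (duration, severity, magnitude) tuple per event, where severity is
    the cumulative deficit and magnitude is severity / duration."""
    deficit = truncation - np.asarray(annual_flows, dtype=float)
    events, run = [], []
    for d in deficit:
        if d > 0:
            run.append(d)          # drought year: accumulate the deficit
        elif run:
            events.append((len(run), sum(run), sum(run) / len(run)))
            run = []
    if run:                         # series ends mid-drought
        events.append((len(run), sum(run), sum(run) / len(run)))
    return events
```

Applying this to a long synthetic series yields the samples of duration, severity, and magnitude whose distributions the study fits.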

15.
16.
ABSTRACT: Having determined various statistical parameters for five mean monthly hydrometeorological time series of the United States, this paper presents the directional variation of the autocorrelation numbers, their spatial distribution over the United States, and their statistical significance. Two conceptually different approaches (one using directional strips of large combined watersheds and the other using analysis of variances) are employed to explore the geographic variation of the statistical parameters (autocorrelations and explained variances) in question. Results adequately indicate the correspondence between these variations and the observed topographic, climatologic, and hydrologic characteristics over the United States.

17.
ABSTRACT: A stochastic estimation of low flow in the upper reaches of streams is needed for the planning, development, and management of water resources and/or water use systems. In this paper, the definition and development procedure for the stochastic flow duration curve is presented and applied to five catchments located in eastern Japan and to two catchments in western Thailand. The probability distribution of N-year daily discharge data is extracted at various percentages of time for which specified discharges are equaled or exceeded in a water year. Such a distribution is usually represented by a straight line on log-normal probability paper. However, some of the probability plots for the annual minimum daily discharge are best represented by a straight line on Weibull probability paper. The effectiveness of the stochastic flow duration curve for the evaluation of flow regime is illustrated through its application. The ten-year probability for the discharge exceeded 97 percent of the time may be recognized as an index of low flow. The recession shape of the lower part of the flow duration curve depends on the strength of low flow persistence.
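The deterministic building block of this analysis, the flow duration curve, reads off the discharge equaled or exceeded a given percentage of the time; a sketch using Weibull-style exceedance positions (one common convention, chosen here as an assumption):

```python
import numpy as np

def flow_duration_curve(daily_q, exceed_pcts=(50, 75, 90, 95, 97)):
    """Discharge equaled or exceeded a given percent of the time:
    sort descending and read off empirical exceedance quantiles."""
    q = np.sort(np.asarray(daily_q, dtype=float))[::-1]
    n = len(q)
    out = {}
    for p in exceed_pcts:
        # Weibull-style exceedance position: m / (n + 1) = p / 100.
        m = int(round(p / 100.0 * (n + 1)))
        m = min(max(m, 1), n)
        out[p] = q[m - 1]
    return out
```

Repeating this for each water year and fitting a distribution to the N yearly values at a fixed exceedance percentage gives the stochastic flow duration curve the paper defines; the 97 percent value at the ten-year probability level is the low-flow index noted above.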

18.
ABSTRACT: Bivariate and trivariate distributions have been derived from the logistic model for the multivariate extreme value distribution. Marginals in the models are extreme value type I distributions for two-component mixture variables (mixed Gumbel distribution). This paper is a continuation of previous work on multivariate distributions in hydrology. Interest is focused on the analysis of floods which are generated by different types of storms. The construction of their corresponding probability distributions and density functions is described. In order to obtain the parameters of such a bivariate or trivariate distribution, a generalized maximum likelihood estimation procedure is proposed to allow for the cases of samples with different lengths of record. A region in Northern Mexico with 42 gauging stations, grouped into two homogeneous regions, has been selected to apply the models. Results produced by the multivariate distributions have been compared with those obtained by the Normal, log-Normal-2, log-Normal-3, Gamma-2, Gamma-3, log-Pearson-3, Gumbel, TCEV and General Extreme Value distributions. Goodness of fit is measured by the criterion of standard error of fit. Results suggest that the proposed models are a suitable option to be considered when performing flood frequency analysis.

19.
ABSTRACT: A time series of annual flow of the Sacramento River, California, is reconstructed to A.D. 869 from tree rings for a long-term perspective on hydrologic drought. Reconstructions derived by principal components regression of flow on time-varying subsets of tree-ring chronologies account for 64 to 81 percent of the flow variance in the 1906 to 1977 calibration period. A Monte Carlo analysis of reconstructed n-year running means indicates that the gaged record contains examples of drought extremes for averaging periods of perhaps 6 to 10 years, but not for longer and shorter averaging periods. For example, the estimated probability approaches 1.0 that the flow in A.D. 1580 was lower than the lowest single-year gaged flow. The tree-ring record also suggests that persistently high or low flows over 50-year periods characterize some parts of the long-term flow history. The results should contribute to sensible water resources planning for the Sacramento Basin and to the methodology of incorporating tree-ring data in the assessment of the probability of hydrologic drought.

20.
ABSTRACT: Low-flow estimates, as determined by probabilistic modeling of observed data sequences, are commonly used to describe certain streamflow characteristics. Unfortunately, however, reliable low-flow estimates can be difficult to come by, particularly for gaging sites with short record lengths. The shortness of records leads to uncertainties not only in the selection of a distribution for modeling purposes but also in the estimates of the parameters of a chosen model. In flood frequency analysis, the common approach to mitigating some of these problems is regionalization of frequency behavior. The same general approach is applied here to the case of low-flow estimation, with the intent of not only improving low-flow estimates but also illustrating the gains that might be attained in doing so. Data used in this study were systematically observed at 128 streamflow gaging sites across the State of Alabama. Our conclusions are that the log Pearson Type 3 distribution is a suitable candidate for modeling Alabama low flows, and that the shape parameter of that distribution can be estimated on a regional basis. Low-flow estimates based on the regional estimator are compared with estimates based on the use of only at-site estimation techniques.


Copyright © Beijing Qinyun Science and Technology Development Co., Ltd. (京ICP备09084417号)