Similar Articles
20 similar articles found (search time: 93 ms)
1.
Regional procedures to estimate flood magnitudes for ungaged watersheds typically ignore available site-specific historic flood information, such as high water marks and the corresponding flow estimates, otherwise referred to as limited site-specific historic (LSSH) flood data. A procedure to construct flood frequency curves on the basis of LSSH flood observations is presented. Simple inverse variance weighting is employed to systematically combine flood estimates obtained from the LSSH database with those from a regional procedure to obtain improved estimates of flood peaks on the ungaged watershed. For the region studied, the variance weighted estimates of flow had a lower logarithmic standard error than either the regional or the LSSH flow estimates, when compared to the estimates determined by three standard distributions for the gaged watersheds investigated in the development of the methodology. Use of the simple inverse variance weighting procedure is recommended when “reliable” estimates of LSSH floods for the ungaged site are available.
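The weighting step described above can be sketched in a few lines; this is a minimal illustration of inverse variance weighting, with the function name and example numbers hypothetical rather than taken from the paper:

```python
def inverse_variance_weight(estimates, variances):
    """Combine independent flood-peak estimates by inverse variance weighting.

    Each estimate is weighted by the reciprocal of its error variance, so
    the more reliable estimate dominates the combined value.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined = sum(w * x for w, x in zip(weights, estimates)) / total
    # The combined variance is never larger than the smallest input variance.
    combined_variance = 1.0 / total
    return combined, combined_variance
```

For example, blending a regional estimate of 500 cfs (log variance 0.04) with an LSSH estimate of 450 cfs (log variance 0.02) pulls the result toward the lower-variance LSSH value.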

2.
ABSTRACT: The literature abounds with procedures for estimating the magnitude and frequency of floods at ungaged locations. Unfortunately, the large number of available procedures leaves potential users with the daunting task of sorting through them and selecting a method for immediate use. The objectives of this paper are to present (1) criteria necessary to evaluate the usefulness of hydrologic procedures, (2) a classification system for categorizing the multitude of available procedures, (3) a summary of the findings of the literature review, and (4) recommendations on the reporting of flood frequency estimation procedures for ungaged watersheds.

3.
ABSTRACT: Regional hydrologic procedures such as generalized least squares regression and streamflow record augmentation have been advocated for obtaining estimates of both flood-flow and low-flow statistics at ungaged sites. While such procedures are extremely useful in regional flood-flow studies, no evaluation of their merit in regional low-flow estimation has been made using actual streamflow data. This study develops generalized regional regression equations for estimating the d-day, T-year low-flow discharge, Qd,T, at ungaged sites in Massachusetts, where d = 3, 7, 14, and 30 days. A two-parameter lognormal distribution is fit to sequences of annual minimum d-day low-flows and the estimated parameters of the lognormal distribution are then related to two drainage basin characteristics: drainage area and relief. The resulting models are general, simple to use, and about as precise as most previous models that only provide estimates of a single statistic such as Q7,10. Comparisons are provided of the impact of using ordinary least squares regression, generalized least squares regression, and streamflow record augmentation procedures to fit regional low-flow frequency models in Massachusetts.
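The lognormal fitting step underlying Qd,T can be sketched as follows; this is a method-of-moments fit in log space, with the function name hypothetical and the regression step (relating parameters to drainage area and relief) omitted:

```python
import math
from statistics import NormalDist, mean, stdev

def lognormal_low_flow(annual_minima, T):
    """Estimate the T-year low flow from a two-parameter lognormal fit
    to annual minimum d-day flows (method of moments in log space)."""
    logs = [math.log(q) for q in annual_minima]
    mu, sigma = mean(logs), stdev(logs)
    # A T-year low flow corresponds to nonexceedance probability 1/T.
    z = NormalDist().inv_cdf(1.0 / T)
    return math.exp(mu + z * sigma)
```

Because low flows use the lower tail, longer recurrence intervals give smaller discharges (Q7,10 is below the median annual minimum).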

4.
ABSTRACT: Considerable effort is expended each year in making flood peak estimates at both gaged and ungaged sites. Many methods, both simplistic and complex, have been proposed for making such estimates. The hydrologist who must make an estimate at a particular site is interested in the accuracy of the estimate. Most methods are developed using either statistical analyses or analytical optimization schemes. While publications describing these methods often include some statistical measure of goodness-of-fit, the terminology often does not provide the potential user with an answer to the question, ‘How accurate is the estimate?’ That is, statistical terminology is often not used properly, which may lead to a false sense of security. The use of correct terminology will help potential users evaluate the usefulness of a proposed method and provide a means of comparing different methods. This study provides definitions for terms often used in the literature on flood peak estimation and provides an interpretation for these terms. Specific problems discussed include the use of arbitrary levels of significance in statistical tests of hypotheses, the identification of both random and systematic variation in estimates from hydrologic methods, and the difference between accuracy of model calibration and accuracy of prediction.

5.
ABSTRACT: An evaluation of flood frequency estimates simulated from a rainfall/runoff model is based on (1) computation of the equivalent years of record for regional estimating equations based on 50 small stream sites in Oklahoma and (2) computation of the bias for synthetic flood estimates as compared to observed estimates at 97 small stream sites with at least 20 years of record in eight eastern states. Because of the high intercorrelation of synthetic flood estimates between watersheds, little or no regional (spatial) information may be added to the network as a result of the modeling activity. The equivalent years of record for the regional estimating equations based totally on synthetic flood discharges is shown to be considerably less than the length of rainfall record used to simulate the runoff. Furthermore, the flood estimates from the rainfall/runoff model consistently underestimate the flood discharges based on observed record, particularly for the larger floods. Depending on the way bias is computed, the synthetic estimate of the 100-year flood discharge varies from 11 to 29 percent less than the value based on observed record. In addition, the correlation between observed and synthetic flood frequency estimates at the same site is also investigated. The degree of correlation between these estimates appears to vary with recurrence interval. Unless the correlation between these two estimates is known, it is not possible to compute a weighted estimate with minimum variance.

6.
One of the problems that often arises in engineering hydrology is to estimate data at a given site because either the data are missing or the site is ungaged. Such estimates can be made by spatial interpolation of data available at other sites. A number of spatial interpolation techniques are available today with varying degrees of complexity. It is the intent of this paper to compare the applicability of various proposed interpolation techniques for estimating annual precipitation at selected sites. The interpolation techniques analyzed include the commonly used Thiessen polygon, the classical polynomial interpolation by least-squares or Lagrange approach, the inverse distance technique, the multiquadric interpolation, the optimal interpolation and the Kriging technique. Thirty years of annual precipitation data at 29 stations located in Region II of the North Central continental United States have been used for this study. The comparison is based on the error of estimates obtained at five selected sites. Results indicate that the Kriging and optimal interpolation techniques are superior to the other techniques. However, the multiquadric technique is almost as good as those two. The inverse distance interpolation and the Thiessen polygon gave fairly satisfactory results while the polynomial interpolation did not produce good results.
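Of the techniques compared above, inverse distance weighting is the simplest to sketch; the following is a minimal, self-contained illustration (function name and power parameter are illustrative assumptions):

```python
import math

def idw(target, stations, power=2.0):
    """Inverse distance weighted estimate of annual precipitation at an
    ungaged point from surrounding stations given as [((x, y), value), ...]."""
    num = den = 0.0
    for (x, y), value in stations:
        d = math.hypot(target[0] - x, target[1] - y)
        if d == 0.0:
            return value  # coincident station: use its value directly
        w = d ** -power
        num += w * value
        den += w
    return num / den
```

Kriging and optimal interpolation differ from this chiefly by deriving the weights from a fitted spatial covariance model rather than from distance alone.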

7.
Abstract: A mix of causative mechanisms may be responsible for floods at a site. Floods may be caused by extreme rainfall or by rain falling on antecedent rainfall events. The statistical attributes of these events differ according to the watershed characteristics and the causes. Traditional methods of flood frequency analysis are adequate only for specific situations. Also, to address the uncertainty of flood frequency estimates for hydraulic structures, a series of probabilistic analyses of rainfall-runoff and flow routing models, and their associated inputs, are used. This is a complex problem in that the probability distributions of multiple independent and derived random variables must be estimated to evaluate the probability of floods. Therefore, the objectives of this study were to develop a flood frequency curve derivation method driven by multiple random variables and to develop a tool that can consider the uncertainties of design floods. This study focuses on developing a flood frequency curve based on nonparametric statistical methods for the estimation of probabilities of rare floods that are more appropriate in Korea. To derive the frequency curve, rainfall generation using the nonparametric kernel density estimation approach is proposed. Many flood events are simulated by nonparametric Monte Carlo simulation coupled with the centered Latin hypercube sampling method to estimate the associated uncertainty. This study applies the described methods to a Korean watershed. The results are more physically appropriate and provide reasonable estimates of the design flood.
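The kernel density rainfall generation step can be sketched as a smoothed bootstrap: resample observed depths and perturb each draw with Gaussian kernel noise. The Silverman rule-of-thumb bandwidth and the positivity rejection step below are assumptions of this sketch, not details from the paper:

```python
import random
from statistics import stdev

def kde_rain_sample(observed, n, seed=0):
    """Generate n synthetic rainfall depths by smoothed bootstrap from a
    Gaussian kernel density estimate of the observed amounts."""
    rng = random.Random(seed)
    # Silverman's rule-of-thumb bandwidth for a Gaussian kernel.
    h = 1.06 * stdev(observed) * len(observed) ** -0.2
    out = []
    while len(out) < n:
        x = rng.choice(observed) + rng.gauss(0.0, h)
        if x > 0.0:  # rainfall depths must stay non-negative
            out.append(x)
    return out
```

Each synthetic rainfall realization would then be routed through the rainfall-runoff model to build up the simulated flood population.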

8.
ABSTRACT: Low-flow estimates, as determined by probabilistic modeling of observed data sequences, are commonly used to describe certain streamflow characteristics. Unfortunately, however, reliable low-flow estimates can be difficult to come by, particularly for gaging sites with short record lengths. The shortness of records leads to uncertainties not only in the selection of a distribution for modeling purposes but also in the estimates of the parameters of a chosen model. In flood frequency analysis, the common approach to mitigating some of these problems is regionalization of frequency behavior. The same general approach is applied here to the case of low-flow estimation, with the general intent of not only improving low-flow estimates but also illustrating the gains that might be attained in so doing. The data used in this study were systematically observed at 128 streamflow gaging sites across the State of Alabama. Our conclusions are that the log Pearson Type 3 distribution is a suitable candidate for modeling of Alabama low-flows, and that the shape parameter of that distribution can be estimated on a regional basis. Low-flow estimates based on the regional estimator are compared with estimates based on the use of only at-site estimation techniques.

9.
ABSTRACT: Five methods of developing regional regression models to estimate flood characteristics at ungaged sites in Arkansas are examined. The methods differ in the manner in which the State is divided into subregions. Each successive method (A to E) is computationally more complex than the previous method. Method A makes no subdivision. Methods B and C define two and four geographic subregions, respectively. Method D uses cluster/discriminant analysis to define subregions on the basis of similarities in watershed characteristics. Method E, the new region of influence method, defines a unique subregion for each ungaged site. Split-sample results indicate that, in terms of root-mean-square error, method E (38 percent error) is best. Methods C and D (42 and 41 percent error) were in a virtual tie for second, and methods B (44 percent error) and A (49 percent error) were fourth and fifth best.
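The core of the region of influence idea (method E) is selecting, for each ungaged site, the gaged sites nearest to it in watershed-characteristic space and fitting the regression only to that subset. A minimal sketch, assuming characteristics are already standardized so Euclidean distance is meaningful (names and the value of k are illustrative):

```python
import math

def region_of_influence(ungaged, gaged_sites, k=3):
    """Select the k gaged sites nearest the ungaged site in (standardized)
    watershed-characteristic space; a regression would then be fit to this
    unique subregion rather than to a fixed geographic zone.

    gaged_sites is a list of (name, characteristics_tuple) pairs.
    """
    def dist(chars):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(ungaged, chars)))
    ranked = sorted(gaged_sites, key=lambda site: dist(site[1]))
    return [name for name, _ in ranked[:k]]
```

Because every ungaged site gets its own subregion, there are no hard boundaries at which the estimating equation changes abruptly.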

10.
ABSTRACT: A climate factor, CT (T = 2-, 25-, and 100-year recurrence intervals), that delineates regional trends in small-basin flood frequency was derived using data from 71 long-term rainfall record sites. Values of CT at these sites were developed by a regression analysis that related rainfall-runoff model estimates of T-year floods to a sample set of 50 model calibrations. CT was regionalized via kriging to develop maps depicting its geographic variation for a large part of the United States east of the 105th meridian. Kriged estimates of CT and basin-runoff characteristics were used to compute regionalized T-year floods for 200 small drainage basins. Observed T-year flood estimates also were developed for these sites. Regionalized floods are shown to account for a large percentage of the variability in observed flood estimates with coefficients of determination ranging from 0.89 for 2-year floods to 0.82 for 100-year floods. The relative importance of the factors comprising regionalized flood estimates is evaluated in terms of scale (size of drainage area), basin-runoff characteristics (rainfall-runoff model parameters), and climate (CT).

11.
The flood frequency characteristics of 18 watersheds in southeastern Arizona were studied using the log-Boughton and the log-Pearson Type 3 distribution. From the flood frequency study, a generalized envelope for Q100 for watersheds 0.01 to 4000 mi2 in area has been produced for southeastern Arizona. The generalized envelope allows comparisons to be made among the relative flood characteristics of the watersheds used in the study and provides a conservative estimate of Q100 for ungaged watersheds in the region.

12.
ABSTRACT: Data splitting is used to compare methods of determining “homogeneous” hydrologic regions. The methods compared use cluster analysis based on similarity of hydrologic characteristics or similarity of characteristics of a stream's drainage basin. Data for 221 stations in Arizona are used to show that the methods, which are a modification of DeCoursey's scheme for defining regions, improve the fit of estimation data to the model, but that it is necessary to have an independent measure of predictive accuracy, such as that provided by data splitting, to demonstrate improved predictive accuracy. The methods used the complete linkage algorithm for cluster analysis and computed weighted average estimates of hydrologic characteristics at ungaged sites.

13.
ABSTRACT: Baseflow, or water that enters a stream from slowly varying sources such as ground water, can be critical to humans and ecosystems. We evaluate a simple method for estimating baseflow parameters at ungaged sites. The method uses one or more baseflow discharge measurements at the ungaged site and long-term streamflow data from a nearby gaged site. A given baseflow parameter, such as the median, is estimated as the product of the corresponding gage site parameter and the geometric mean of the ratios of the measured baseflow discharges and the concurrent discharges at the gage site. If baseflows at gaged and ungaged sites have a bivariate lognormal distribution with high correlation and nearly equal log variances, the estimated baseflow parameters are very accurate. We tested the proposed method using long-term streamflow data from two watershed pairs in the Driftless Area of southwestern Wisconsin. For one watershed pair, the theoretical assumptions are well met; for the other the log variances are substantially different. In the first case, the method performs well for estimating both annual and long-term baseflow parameters. In the second, the method performs remarkably well for estimating annual mean and annual median baseflow discharge, but less well for estimating the annual lower decile and the long-term mean, median, and lower decile. In general, the use of four measurements in a year is not substantially better than the use of two.
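The transfer rule stated above (gage parameter times the geometric mean of measurement ratios) is compact enough to sketch directly; the function name and example values are hypothetical:

```python
import math

def transfer_baseflow(gage_param, ungaged_meas, concurrent_gage):
    """Estimate a baseflow parameter (e.g., the median) at an ungaged site
    as the gaged-site parameter times the geometric mean of the ratios of
    measured ungaged baseflows to concurrent gaged discharges."""
    ratios = [u / g for u, g in zip(ungaged_meas, concurrent_gage)]
    gm = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
    return gage_param * gm
```

The geometric mean is the natural choice here because the underlying model is multiplicative (bivariate lognormal), so the ratios are averaged in log space.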

14.
Abstract: The determination of sediment and nutrient loads is typically based on the collection and analysis of grab samples. The frequency and regularity of traditional sampling may not provide representation of constituent loading, particularly in systems with flashy hydrology. At two sites in the Little Bear River, Utah, continuous, high-frequency turbidity data were used with surrogate relationships to generate estimates of total phosphorus and total suspended solids concentrations, which were paired with discharge to estimate annual loads. The high-frequency records were randomly subsampled to represent hourly, daily, weekly, and monthly sampling frequencies and to examine the effects of timing, and the resulting annual load estimates were compared to the reference loads. Higher frequency sampling resulted in load estimates that better approximated the reference loads. The degree of bias was greater at the more hydrologically responsive site in the upper watershed, which required a higher sampling frequency than the lower watershed site to achieve the same level of accuracy in estimating the reference load. The hour of day and day of week of sampling impacted load estimation, depending on site and hydrologic conditions. The effects of sampling frequency on the determination of compliance with a water quality criterion were also examined. These techniques can be helpful in determining the sampling frequency necessary to meet the objectives of a water quality monitoring program.
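The subsampling experiment can be sketched as follows: compute a reference load from the full record, then recompute it from every Nth sample scaled to the full period. The uniform-stride subsampling below is a simplification of the paper's random subsampling, and the helper names are hypothetical:

```python
def annual_load(conc, flow, dt_hours=1.0):
    """Constituent load as the sum of concentration x discharge x interval."""
    return sum(c * q * dt_hours for c, q in zip(conc, flow))

def subsampled_load(conc, flow, every, dt_hours=1.0):
    """Load estimated from every Nth paired sample, scaled to the full
    period -- a crude stand-in for less frequent grab sampling."""
    sub_c, sub_q = conc[::every], flow[::every]
    return annual_load(sub_c, sub_q, dt_hours) * every
```

For a steady record the two agree; for a flashy record, sparse sampling that misses concentration spikes biases the load estimate, which is the effect the study quantifies.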

15.
ABSTRACT: Estimates of mean annual precipitation (MAP) over areas are the starting point for all computations of water and chemical balances for drainage basins and surface water bodies. Any errors in the estimates of MAP are propagated through the balance computations. These errors can be due to: (1) failures of individual gages to collect the amount of precipitation that actually falls; (2) operator errors; and (3) failure of the raingage network to adequately sample the region of interest. This paper attempts to evaluate the last of these types of error by applying kriging in two different approaches to estimating MAP in New Hampshire and Vermont, USA. The data base is the 1951–1980 normal precipitation at 120 raingages in the two states and in adjacent portions of bordering states and provinces. In the first approach, kriging is applied directly to the MAP values, while in the second, kriging is applied to a “precipitation delivery factor” that represents the MAP with the orographic effect removed. The first approach gives slightly better kriged estimates of MAP at seven validation stations that were not included in the original analysis, but results in an error surface that is highly contorted and in larger maximum errors over most of the region. The second approach had a considerably smoother error surface and, thus, is generally preferable as a basis for point and areal estimates of MAP. MAP estimates in the region have 95 percent confidence intervals of about 20 cm/yr at low and moderate elevations, and up to 35 cm/yr at high elevations. These uncertainties amount to about 20 percent of estimated MAP values.

16.
ABSTRACT: A frequency analysis approach for the prediction of flow characteristics at ungaged locations is applied to a region of high annual precipitation and low topography in north and central Florida. Stationary time series of annual flows are fitted with the lognormal distribution and the estimated parameters of the distribution are fitted by third-order trend surfaces. These explain 65 and 74 percent of the observed variances in the mean and standard deviation, respectively. Predictions of parameters are then made for several locations previously unused in the study and they are used to estimate the return periods of various flows from the lognormal distribution. Application of the Kolmogorov-Smirnov goodness-of-fit test suggests that only one of the five test stations can be considered significantly different from the observed data, confirming the applicability of this technique.
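The goodness-of-fit check used above can be sketched as the Kolmogorov-Smirnov D statistic computed against a lognormal distribution fitted to the same sample; note that when the parameters are estimated from the sample itself, the standard KS critical values are only approximate. The function name is hypothetical:

```python
import math
from statistics import NormalDist, mean, stdev

def ks_lognormal(flows):
    """Kolmogorov-Smirnov D statistic for annual flows against a
    lognormal distribution fitted to the same sample."""
    logs = sorted(math.log(q) for q in flows)
    mu, sigma = mean(logs), stdev(logs)
    nd = NormalDist(mu, sigma)
    n = len(logs)
    d = 0.0
    for i, x in enumerate(logs):
        cdf = nd.cdf(x)
        # Compare the fitted CDF to the empirical CDF just before and
        # just after each order statistic.
        d = max(d, abs(cdf - (i + 1) / n), abs(cdf - i / n))
    return d
```

A small D indicates the lognormal model is consistent with the observed annual flows at that station.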

17.
ABSTRACT: Recent work has found that a one-parameter model of wet day precipitation amount based on the Weibull distribution provides a better fit to historical daily precipitation data for eastern U.S. sites than other one-parameter models. The general two-parameter Weibull distribution was compared in this study to other widely used distributions for describing the distribution of daily precipitation event sizes at 99 sites from the U.S. Pacific Northwest. Surprisingly little performance was sacrificed by reducing the two-parameter Weibull to a single-parameter distribution. Advantages of the single-parameter model included requiring only the mean wet day precipitation amount for calibration, invertibility for simulation purposes, and ease of analytical manipulation. The fit of the single-parameter Weibull to the 99 stations included in this study was significantly better than other single-parameter models tested, and performed as well as the widely endorsed, more cumbersome, two-parameter gamma model. Both the one- and two-parameter Weibull distributions are shown to have b-moments that are consistent with historical precipitation data, while the ratio of b-skew and b-variance in the gamma model is inconsistent with the historical record by this measure. In addition, it was found that the two-parameter gamma distribution was better fit using the method of moments estimators than maximum likelihood estimates. These findings suggest that the distributions of precipitation among sites in the Pacific Northwest with dramatically different settings are nearly identical if expressed in proportion to the mean site event size.
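The invertibility advantage mentioned above can be sketched with the Weibull quantile function: if the shape is held fixed (the "one-parameter" calibration), the scale follows from the mean wet-day amount alone via the gamma function. The fixed shape of 0.75 below is an assumed illustrative value, not the one from the paper:

```python
import math

def weibull_quantile(u, mean_wet, shape=0.75):
    """Invert a Weibull wet-day precipitation model whose scale is set
    from the mean wet-day amount, with the shape held fixed.

    mean = scale * Gamma(1 + 1/shape), so scale = mean / Gamma(1 + 1/shape);
    the quantile is scale * (-ln(1 - u)) ** (1/shape).
    """
    scale = mean_wet / math.gamma(1.0 + 1.0 / shape)
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)
```

Feeding uniform random numbers through this quantile function simulates wet-day amounts directly, which is the simulation convenience the abstract points to; with shape = 1 the model reduces to the exponential distribution.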

18.
ABSTRACT: As part of the U.S. Environmental Protection Agency's effort to determine the long-term effects of acidic deposition on surface water chemistry, annual runoff was estimated for about 1000 ungaged sites in the eastern U.S. using runoff contour maps. One concern in using contour maps was that a bias may be introduced in the runoff estimates due to the size of the 1000 ungaged sites relative to the size of the watersheds used in developing the maps. To determine if a bias was present, the relationship between annual runoff (expressed as depth) and watershed area for the Northeast (NE) and Southern Blue Ridge Province (SBRP) was tested using five regional databases. One short-term database (1984 Water Year, n = 531) and two long-term databases (1940–57, n = 134 and 1951–80, n = 342) were used in the NE. In the SBRP one short-term database (1984 Water Year, n = 531) and one long-term database (1951–80, n = 60) were used. For the NE and the SBRP, runoff was not directly correlated with watershed area using the five regional databases. Also, runoff normalized by precipitation was not related to watershed area.
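The bias check described above reduces to testing whether runoff depth is correlated with watershed area; a near-zero correlation supports applying contour-map runoff estimates to basins of very different sizes. A minimal sketch of the Pearson correlation used for such a test (function name is illustrative):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient, e.g. of annual runoff depth
    against watershed area across a regional set of gaged basins."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)
```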

19.
ABSTRACT: Equations were developed to transform peak flows and to adapt design hydrographs and unit hydrographs from gaged watersheds to ungaged watersheds with similar hydrologic characteristics. Dimensional analysis was used to develop adjustment equations for peak flow and time base, and these equations were reinforced with results from regional flood frequency research. The authors believe that the use of these transformation equations should yield more reliable flood peak values and hydrographs than the common use of empirical flood estimating curves or equations.
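Transfer equations of this kind commonly take a drainage-area-ratio power-law form; the sketch below uses that common form as a stand-in for the paper's dimensional-analysis equations, and both exponents are assumed illustrative values:

```python
def transfer_peak(q_gaged, area_ungaged, area_gaged, exponent=0.7):
    """Transfer a flood peak from a gaged to a hydrologically similar
    ungaged watershed by drainage-area-ratio scaling (exponent assumed)."""
    return q_gaged * (area_ungaged / area_gaged) ** exponent

def transfer_time_base(t_gaged, area_ungaged, area_gaged, exponent=0.35):
    """Corresponding adjustment of the hydrograph time base, with its own
    assumed scaling exponent."""
    return t_gaged * (area_ungaged / area_gaged) ** exponent
```

Scaling the peak up and stretching the time base by separate exponents preserves an approximately consistent hydrograph volume between the two watersheds, which is the kind of constraint dimensional analysis supplies.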

20.
Abstract: Long‐term flow records for watersheds with minimal human influence have shown trends in recent decades toward increasing streamflow at regional and national scales, especially for low flow quantiles like the annual minimum and annual median flows. Trends for high flow quantiles are less clear, despite recent research showing increased precipitation in the conterminous United States over the last century that has been brought about primarily by an increased frequency and intensity of events in the upper 10th percentile of the daily precipitation distribution, particularly in the Northeast. This study investigates trends in 28 long‐term annual flood series for New England watersheds with dominantly natural streamflow. The flood series are an average of 75 years in length and are continuous through 2006. Twenty‐five series show upward trends via the nonparametric Mann‐Kendall test, 40% (10) of which are statistically significant (p < 0.1). Moreover, an average standardized departures series for 23 of the study gages indicates that increasing flood magnitudes in New England occurred as a step change around 1970. The timing of this is broadly synchronous with a phase change in the low frequency variability of the North Atlantic Oscillation, a prominent upper atmospheric circulation pattern that is known to affect climate variability along the United States east coast. Identifiable hydroclimatic shifts should be considered when the affected flow records are used for flood frequency analyses. Special treatment of the flood series can improve the analyses and provide better estimates of flood magnitudes and frequencies under the prevailing hydroclimatic condition.
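The Mann-Kendall trend test applied to the flood series above counts concordant minus discordant pairs and converts that count to a normal-approximation z score. A minimal sketch without the tie correction (the function name is illustrative):

```python
import math

def mann_kendall(series):
    """Nonparametric Mann-Kendall trend test: returns the S statistic and
    the normal-approximation z score (no tie correction in this sketch)."""
    n = len(series)
    # S counts later-vs-earlier pairs: +1 if increasing, -1 if decreasing.
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)  # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

At the study's p < 0.1 level (one-sided), an upward trend is flagged when z exceeds about 1.28.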

