Similar Documents
20 similar documents found.
1.
A statistical procedure is developed to adjust natural streamflows simulated by dynamical models in downstream reaches to account for anthropogenic impairments to flow that are not considered in the model. The resulting normalized downstream flows are appropriate for use in assessments of future anthropogenically impaired flows in downstream reaches. The normalization is applied to assess the potential effects of climate change on future water availability on the Rio Grande at a gage just above the major storage reservoir on the river. Model-simulated streamflow values were normalized using a statistical parameterization based on two constants that relate observed and simulated flows over a 50-year historical baseline period (1964–2013). The first normalization constant is the ratio of the means, and the second is the ratio of the interannual standard deviations of annual gaged and simulated flows. This procedure forces the gaged and simulated flows to have the same mean and variance over the baseline period. The normalization constants can be kept fixed for future flows, which effectively assumes that upstream water management does not change in the future, or projected management changes can be parameterized by adjusting the constants. At the gage considered in this study, the effect of the normalization is to reduce simulated historical flow values by an average of 72% over an ensemble of simulations, indicative of the large fraction of natural flow diverted from the river upstream of the gage. A weak tendency toward declining flow emerges upon averaging over a large ensemble, with tremendous variability among the simulations. By the end of the 21st century, the higher-emission scenarios show more pronounced declines in streamflow.
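The abstract specifies the two normalization constants but not the explicit transform, so the following is a minimal Python sketch of one form consistent with the description (it reproduces the gaged mean and variance over the baseline period); the function name and array inputs are assumptions.

```python
import numpy as np

def normalize_flows(sim, obs_baseline, sim_baseline):
    """Two-constant normalization (hypothetical form consistent with the
    abstract): rescale simulated flows so that, over the baseline period,
    the normalized series has the gaged mean and variance."""
    r_mean = np.mean(obs_baseline) / np.mean(sim_baseline)               # ratio of means
    r_std = np.std(obs_baseline, ddof=1) / np.std(sim_baseline, ddof=1)  # ratio of std devs
    mu_sim = np.mean(sim_baseline)
    # Anchor the mean with r_mean and the spread with r_std.
    return r_mean * mu_sim + r_std * (np.asarray(sim) - mu_sim)
```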

2.
Streamflow monitoring in the Colorado River Basin (CRB) is essential to ensure diverse needs are met, especially during periods of drought or low flow. Existing stream gage networks, however, provide a limited record of past and current streamflow. Modeled streamflow products with more complete spatial and temporal coverage (including the National Water Model [NWM]) have primarily focused on flooding rather than sustained drought or low-flow conditions. The objectives of this study are to (1) evaluate the historical performance of NWM streamflow estimates (particularly with respect to droughts and seasonal low flows) and (2) identify characteristics relevant to model inputs and suitability for future applications. Comparisons of retrospective flows from the NWM to observed flows from the United States Geological Survey stream gage network over 22 years in the CRB reveal a tendency to underestimate low-flow frequency, the locations experiencing low flows, and the number of years with low flows. We found model performance to be more accurate for the Upper CRB and at sites with higher precipitation, snow percentage, baseflow index, and elevation. Underestimation of low flows and variable model performance have important implications for future applications: inaccurate evaluations of historical low flows and droughts, and less reliable performance outside of specific watershed/stream conditions. This highlights the characteristics on which to focus future model development efforts.
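As a rough illustration of the retrospective-versus-observed low-flow comparison described above, the sketch below counts how often each daily series falls below a percentile-based low-flow threshold; the threshold choice and function name are assumptions, not the study's actual metrics.

```python
import numpy as np

def low_flow_comparison(obs, mod, pct=10):
    """Compare how often modeled vs. observed daily flows drop below a
    low-flow threshold (here the observed `pct`th percentile)."""
    obs, mod = np.asarray(obs), np.asarray(mod)
    threshold = np.percentile(obs, pct)
    return {
        "threshold": threshold,
        "obs_low_frequency": np.mean(obs <= threshold),  # fraction of observed days
        "mod_low_frequency": np.mean(mod <= threshold),  # fraction of modeled days
    }
```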

3.
ABSTRACT: The maximum concentration of a regulated substance that is allowed in a wastewater effluent usually is determined from the amount of dilution provided by the receiving water. Dilution flow is estimated from historical data by application of statistical criteria that define low flow conditions for regulatory purposes. Such use of historical data implies that the past is a good indicator of future conditions, at least for the duration of a discharge permit. Short records, however, introduce great uncertainty into the estimation of low flows because they are unlikely to capture events with recurrence frequencies of multiple years (e.g., ENSO events or droughts). We conducted an analysis of daily flows at several gages with long records in the South Platte River basin of Colorado. Low flows were calculated for successive time blocks of data (3-, 5-, 10-, and 20-year), and these were compared with low flows calculated for the entire period of record (> 70 years). In unregulated streams, time blocks of three or five years produce estimates of low flows that are highly variable and consistently greater than estimates derived from a longer period of record. Estimates of low flow from 10-year blocks, although more stable, differ from the long-term estimates by as much as a factor of two because of climate variation. In addition, the hydrographs of most streams in Colorado have been influenced by dams, diversions, or water transfers. These alterations to the natural flow regime shorten the record that is useful for analysis, but also tend to increase the calculated low flows. The presence of an upward trend in low flows caused by water use represents an unanticipated risk because it fails to incorporate societal response to severe drought conditions. Thus, climate variability poses a significant risk for water quality both directly, because it may not be represented adequately in the short periods of the hydrologic record that are typically used in permits, and indirectly, through its potential to cause altered use of water during times of scarcity.
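The abstract does not name the exact low-flow statistic, so the sketch below uses the common 7-day minimum flow as a stand-in to show the block-wise procedure: split a long daily record into fixed-length blocks and compute the statistic in each block for comparison against the full-record value.

```python
import numpy as np

def seven_day_min(daily_flows):
    """Smallest 7-day moving-average flow in a daily record."""
    kernel = np.ones(7) / 7.0
    return np.convolve(daily_flows, kernel, mode="valid").min()

def blockwise_low_flows(daily_flows, days_per_block):
    """Split a long daily record into consecutive blocks and compute the
    7-day minimum within each block (a stand-in for the regulatory low-flow
    statistics in the study, which the abstract does not fully specify)."""
    daily_flows = np.asarray(daily_flows)
    n_blocks = len(daily_flows) // days_per_block
    return [seven_day_min(daily_flows[i * days_per_block:(i + 1) * days_per_block])
            for i in range(n_blocks)]
```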

4.
ABSTRACT: A simple simulation-type approach and a statistical method are proposed for determining the confidence interval of T-year frequency rainfall percentiles (or precipitation extremes) for generalized extreme value (GEV) distributions. The former method is based on a Monte Carlo testing procedure. To generate realizations, the covariance structure of the three GEV parameters is investigated using the observed information matrix of the likelihood function. For distributions with realistic parameters, the correlation between the location and scale parameters is practically constant when the shape parameter varies around values close to its optimal value. The latter method is based on likelihood ratio statistics. When the joint confidence surface for the shape parameter and the percentile estimates is plotted together with the lines of best estimates, the region in which the estimated best percentile value can be chosen as a possible estimate forms part of the joint confidence surface. The projection of this bounded region onto the percentile axis is defined here as the effective confidence interval. Use of this effective interval as the confidence interval for the T-year frequency rainfall percentile is particularly recommended because it is stable across T and appropriately reflects variation in all three GEV parameters.
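In the spirit of the paper's simulation-based approach (though not its exact procedure), a Monte Carlo (parametric bootstrap) confidence interval for a T-year GEV percentile can be sketched with scipy; note that scipy's genextreme parameterizes the shape as c = -xi.

```python
import numpy as np
from scipy.stats import genextreme

def gev_return_level_ci(annual_max, T=100, n_sim=1000, alpha=0.05, rng=None):
    """Parametric-bootstrap CI for the T-year percentile of a fitted GEV:
    fit, simulate samples from the fit, refit each, and take percentiles
    of the resulting return levels."""
    rng = np.random.default_rng(rng)
    c, loc, scale = genextreme.fit(annual_max)
    levels = []
    for _ in range(n_sim):
        sample = genextreme.rvs(c, loc=loc, scale=scale,
                                size=len(annual_max), random_state=rng)
        ch, lh, sh = genextreme.fit(sample)
        levels.append(genextreme.ppf(1 - 1 / T, ch, loc=lh, scale=sh))
    best = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    lo, hi = np.percentile(levels, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return best, (lo, hi)
```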

5.
ABSTRACT: An auto-regressive model has been developed for hydrologic data simulation. The model is computationally simpler, more parsimonious in its number of parameters, and more stable in its statistical characteristics than the existing auto-regressive model. The proposed model was used to synthesize 10 sequences, each 100 years in length, of monthly flows for the River Beas. The statistical parameters were calculated using the 49-year historical record for the river. The data were also synthesized using the existing auto-regressive model, and the synthesized sequences were compared. The results indicate that the proposed model is as good as the existing auto-regressive model in preserving the mean and standard deviation of the historical record. It is further shown that the proposed model requires fewer parameters than the auto-regressive model for simulation of long-term dependence.
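The proposed model itself is not specified in the abstract, but the class of existing auto-regressive models it is compared against can be illustrated; below is a minimal Thomas-Fiering-style lag-1 synthesis of monthly flows, with all names and structural choices assumed.

```python
import numpy as np

def synthesize_monthly_flows(history, n_years, rng=None):
    """Lag-1 autoregressive synthesis of monthly flows on standardized
    values with month-varying mean and standard deviation.
    `history` is shaped (years, 12)."""
    rng = np.random.default_rng(rng)
    history = np.asarray(history, dtype=float)
    mu, sd = history.mean(axis=0), history.std(axis=0, ddof=1)
    z = ((history - mu) / sd).ravel()             # standardized chronological series
    r1 = np.corrcoef(z[:-1], z[1:])[0, 1]         # lag-1 serial correlation
    out = np.empty((n_years, 12))
    zprev = 0.0
    for y in range(n_years):
        for m in range(12):
            # AR(1) recursion preserving unit variance of the standardized flows
            zprev = r1 * zprev + np.sqrt(1 - r1 ** 2) * rng.standard_normal()
            out[y, m] = mu[m] + sd[m] * zprev
    return out
```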

6.
ABSTRACT: The HEC-4 monthly streamflow simulation model, developed by the Hydrologic Engineering Center, Davis, California, is used to extend the available historical streamflow records in the Central Ohio area. The principal objective of this paper is to examine the effectiveness of the HEC-4 model in generating synthetic monthly flows. Important statistical parameters are evaluated in order to relate the statistical properties of the historical and generated flows. In doing so, it is observed that the mean, standard deviation, and skewness of the generated flows are consistently larger than the corresponding estimates based on historical flows. However, results show that these statistics, as well as the lag-1 serial correlation, are generally well maintained by the generated sequences. The degree to which any statistical dissimilarities would be critical, from an engineering design point of view, is demonstrated by utilizing their low-flow characteristics. Estimates of reservoir safe yields, based on a nonsequential mass-curve analysis of the historical and generated low flows, indicate a nominal difference in this particular study.
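The comparison statistics named above are simple to compute; a small sketch (function name assumed) that evaluates them for any flow sequence follows, so historical and generated sequences can be compared side by side.

```python
import numpy as np
from scipy.stats import skew

def flow_statistics(flows):
    """Statistics used to compare historical and generated sequences:
    mean, standard deviation, skewness, and lag-1 serial correlation."""
    flows = np.asarray(flows, dtype=float)
    return {
        "mean": np.mean(flows),
        "std": np.std(flows, ddof=1),
        "skew": skew(flows, bias=False),
        "lag1": np.corrcoef(flows[:-1], flows[1:])[0, 1],
    }
```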

7.
We evaluate and compare the performance of Bayesian Monte Carlo (BMC), Markov chain Monte Carlo (MCMC), and Generalized Likelihood Uncertainty Estimation (GLUE) for uncertainty analysis in hydraulic and hydrodynamic modeling (HHM) studies. The methods are evaluated in a synthetic 1D wave routing exercise based on the diffusion wave model, and in a multidimensional hydrodynamic study based on the Environmental Fluid Dynamics Code to simulate estuarine circulation processes in Weeks Bay, Alabama. Results show that BMC and MCMC provide similar estimates of uncertainty. The posterior parameter densities computed by the two methods are highly consistent, as are the calibrated parameter estimates and uncertainty bounds. Although some studies suggest that MCMC is more efficient than BMC, our results did not show a clear difference between the performance of the two methods. This seems to be due to the small number of model parameters typically involved in HHM studies and the use of the same likelihood function. In fact, for these studies, the implementation of BMC is simpler and provides results similar to those of MCMC. The results of GLUE are, on the other hand, less consistent with the results of BMC and MCMC in both applications. Its posterior probability densities tend to be flat and similar to the uniform priors, which can result in calibrated parameter estimates centered in the parameter space.
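A minimal sketch of the Bayesian Monte Carlo idea for a single parameter, assuming a Gaussian likelihood and a user-supplied model and prior sampler; the paper's hydrodynamic models and likelihood settings are not reproduced here.

```python
import numpy as np

def bayesian_monte_carlo(model, observed, prior_sampler, sigma, n=10000, rng=None):
    """Bayesian Monte Carlo: sample parameters from the prior, weight each
    draw by a Gaussian likelihood of the residuals, and summarize the
    weighted (posterior) distribution. `model(theta)` must return an array
    the same shape as `observed`."""
    rng = np.random.default_rng(rng)
    observed = np.asarray(observed)
    theta = np.array([prior_sampler(rng) for _ in range(n)])
    loglik = np.array([-0.5 * np.sum(((observed - model(t)) / sigma) ** 2)
                       for t in theta])
    w = np.exp(loglik - loglik.max())
    w /= w.sum()                      # importance weights = posterior masses
    post_mean = np.sum(w * theta)     # posterior mean of the parameter
    return theta, w, post_mean
```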

8.
ABSTRACT: Regression and time-series techniques have been used to synthesize and predict the streamflow at the Foresta Bridge gage from information at the upstream Pohono Bridge gage on the Merced River near Yosemite National Park. Using the available data from two time periods (calendar year 1979 and water year 1986), we evaluated the two techniques in their ability to model the variation in the observed flows and to predict streamflow at the Foresta Bridge gage for the 1979 time period with data from the 1986 time period. Both techniques produced reasonably good estimates and forecasts of the flow at the downstream gage. However, the regression model was found to have a significant amount of autocorrelation in the residuals, which the time-series model was able to eliminate. The time-series technique presented can be of great assistance in arriving at reasonable estimates of flow in data sets with large missing portions.
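A quick way to detect the residual autocorrelation the regression model exhibited is the Durbin-Watson statistic; the sketch below is the standard formulation (values near 2 indicate independent residuals; values well below 2 indicate positive autocorrelation).

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: sum of squared successive residual
    differences divided by the sum of squared residuals."""
    residuals = np.asarray(residuals, dtype=float)
    d = np.diff(residuals)
    return np.sum(d * d) / np.sum(residuals * residuals)
```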

9.
A graphical inverse method for determining the regional transmissivity distribution was applied to three field problems. The study areas were the Hanford Site, Washington; the Rocky Mountain Arsenal, Colorado; and the Nevada Test Site, Nevada. This method can aid in flow system conceptualization by revealing the location of bedrock controls on groundwater flow. It is a valuable tool for helping the hydrogeologist probe the nature of trends in the pattern of transmissivity values. Quantitative estimates of regional transmissivities can be used as starting points for further parameter refinement. Sensitivity analysis using Monte Carlo simulation shows that quantitative estimates of transmissivity can be obtained when measurement error in the hydraulic head does not cause a large error in the hydraulic gradient.

10.
ABSTRACT: The probability distributions of annual peak flows used in flood risk analysis quantify the risk that a design flood will be exceeded. But the parameters of these distributions are themselves uncertain to a degree, and this uncertainty increases the risk that the flood protection provided will in fact prove to be inadequate. The increase in flood risk due to parameter uncertainty is small when a fairly long record of data is available and the annual flood peaks are serially independent, which is the standard assumption in flood frequency analysis. But standard tests for serial independence are insensitive to the kind of grouping of high and low values in a time series that is measured by the Hurst coefficient. This grouping increases the parameter uncertainty considerably. A study of 49 annual peak flow series for Canadian rivers shows that many have a high Hurst coefficient. The corresponding increase in flood risk due to parameter uncertainty is shown to be substantial even for rivers with a long record, and therefore should not be neglected. The paper presents a method for rationally combining parameter uncertainty due to serial correlation and the stochastic variability of peak flows into a single risk assessment. In addition, a relatively simple time series model that is capable of reproducing the observed serial correlation of flood peaks is presented.
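One standard way to screen a peak-flow series for the grouping behavior the paper discusses is rescaled-range (R/S) analysis; the sketch below is a textbook estimator of the Hurst coefficient, not the paper's own procedure, and needs a reasonably long series.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst coefficient as the slope of log(R/S) versus
    log(n) over doubling window sizes; H > 0.5 indicates persistence
    (grouping of high and low values)."""
    x = np.asarray(x, dtype=float)
    sizes, rs_vals = [], []
    n = min_chunk
    while n <= len(x) // 2:
        rs = []
        for i in range(0, len(x) - n + 1, n):
            c = x[i:i + n]
            dev = np.cumsum(c - c.mean())       # cumulative departures
            s = c.std(ddof=1)
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        if rs:
            sizes.append(n)
            rs_vals.append(np.mean(rs))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope
```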

11.
Nitrogen flows impacted by human activities in the Day-Nhue River Basin in northern Vietnam have been modeled using an adapted material flow analysis (MFA). This study introduces a modified uncertainty analysis procedure and demonstrates its importance in MFA. We generated probability distributions using a Monte Carlo simulation, calculated the nitrogen budget for each process, and then evaluated plausibility under three different criterion sets. The third criterion, with one standard deviation of the budget value as the confidence interval and 68% as the confidence level, could be applied to effectively identify hidden uncertainties in the MFA system. Sensitivity analysis was conducted to revise parameters, followed by reassessment of the model structure by revising equations or the flow regime where necessary. The number of processes that passed the plausibility test increased from five to nine after reassessment of model uncertainty, with improved model quality. Applying the uncertainty analysis approach to this case study revealed that reassessing the equations in the aquaculture process substantially changed the estimated nitrogen flows to the environment. The significant differences were an increased nitrogen load to the atmosphere and to soil/groundwater (17% and 41%, respectively) and a 58% decrease in the nitrogen load to surface water. Thus, the modified uncertainty analysis is considered an important screening system for ensuring the quality of MFA modeling.
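A minimal sketch of the third plausibility criterion as described: propagate each flow's mean and standard deviation through a Monte Carlo simulation and test whether the process budget is zero within one standard deviation; the normal error model and function names are assumptions.

```python
import numpy as np

def budget_plausibility(inflows, outflows, n=10000, rng=None):
    """Monte Carlo plausibility check for one MFA process. Each flow is a
    (mean, sd) pair; the budget is sum(in) - sum(out). The process passes
    if zero lies within one standard deviation of the simulated budget."""
    rng = np.random.default_rng(rng)

    def draw(flows):
        return sum(rng.normal(m, s, n) for m, s in flows)

    budget = draw(inflows) - draw(outflows)
    return abs(budget.mean()) <= budget.std(ddof=1)
```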

12.
ABSTRACT: Bivariate and trivariate distributions have been derived from the logistic model for the multivariate extreme value distribution. The marginals in the models are extreme value type I distributions for two-component mixture variables (mixed Gumbel distribution). This paper is a continuation of previous work on multivariate distributions in hydrology. Interest is focused on the analysis of floods that are generated by different types of storms. The construction of the corresponding probability distribution and density functions is described. To obtain the parameters of such a bivariate or trivariate distribution, a generalized maximum likelihood estimation procedure is proposed to allow for samples with different lengths of record. A region in Northern Mexico with 42 gauging stations, grouped into two homogeneous regions, has been selected to apply the models. Results produced by the multivariate distributions have been compared with those obtained by the Normal, log-Normal-2, log-Normal-3, Gamma-2, Gamma-3, log-Pearson-3, Gumbel, TCEV, and General Extreme Value distributions. Goodness of fit is measured by the criterion of standard error of fit. Results suggest that the proposed models are a suitable option to be considered when performing flood frequency analysis.
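The mixed Gumbel marginal described above is a two-component mixture of extreme value type I distributions; a sketch of its CDF follows, with generic parameter names (the paper's estimation procedure is not reproduced).

```python
import numpy as np

def mixed_gumbel_cdf(x, p, mu1, beta1, mu2, beta2):
    """CDF of a two-component Gumbel mixture: a weight-p blend of two
    extreme value type I distributions with location mu and scale beta."""
    x = np.asarray(x, dtype=float)

    def gumbel(mu, beta):
        return np.exp(-np.exp(-(x - mu) / beta))

    return p * gumbel(mu1, beta1) + (1 - p) * gumbel(mu2, beta2)
```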

13.
ABSTRACT: Historical flow records are used to estimate the regulatory low flows that serve a key function in setting discharge permit limits through the National Pollutant Discharge Elimination System, which provides a nationwide mechanism for protecting water quality. Use of historical records creates an implicit connection between water quality protection and climate variability. The longer the record, the more likely the low flow estimate will be based on a broad set of climate conditions and thus will provide adequate water quality protection in the future. Unfortunately, a long record often is not available at a specific location. This analysis examines the connection between climate variability and the variability of biologically based and hydrologically based low flow estimates at 176 sites from the Hydro-Climatic Data Network, a collection of stream gages identified by the USGS as relatively free of anthropogenic influences. Results show that a record of 10 to 20 years is necessary for satisfactory estimates of regulatory low flows. Although it is possible to estimate a biologically based low flow from a record of less than 10 years, these estimates are highly uncertain and incorporate a bias that undermines water quality protection.

14.
ABSTRACT: A time series of annual flow of the Sacramento River, California, is reconstructed to A.D. 869 from tree rings for a long-term perspective on hydrologic drought. Reconstructions derived by principal components regression of flow on time-varying subsets of tree-ring chronologies account for 64 to 81 percent of the flow variance in the 1906 to 1977 calibration period. A Monte Carlo analysis of reconstructed n-year running means indicates that the gaged record contains examples of drought extremes for averaging periods of roughly 6 to 10 years, but not for longer and shorter averaging periods. For example, the estimated probability approaches 1.0 that the flow in A.D. 1580 was lower than the lowest single-year gaged flow. The tree-ring record also suggests that persistently high or low flows over 50-year periods characterize some parts of the long-term flow history. The results should contribute to sensible water resources planning for the Sacramento Basin and to the methodology of incorporating tree-ring data into the assessment of the probability of hydrologic drought.
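A minimal sketch of principal components regression in the spirit of the reconstruction: regress gaged flow on the leading principal components of the tree-ring chronologies over a calibration period, then apply the fitted model to the full record. The number of retained components and the assumption that the calibration years occupy the first rows are illustrative.

```python
import numpy as np

def pcr_reconstruct(chronologies, flow, n_components=3):
    """Principal components regression: standardize the predictor matrix,
    take leading PCs via SVD, fit flow on the PC scores over the
    calibration rows (assumed to be the first len(flow) rows), and
    predict over the full record."""
    X = np.asarray(chronologies, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt[:n_components].T              # PC scores, shape (years, k)
    A = np.column_stack([np.ones(len(flow)), scores[:len(flow)]])
    coef, *_ = np.linalg.lstsq(A, np.asarray(flow, dtype=float), rcond=None)
    full = np.column_stack([np.ones(len(scores)), scores])
    return full @ coef                            # reconstructed flow series
```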

15.
ABSTRACT: A model for estimating the probability of exceeding groundwater quality standards at environmental receptors, based on a simple contaminant transport model, is described. The model is intended for locations where knowledge about site-specific hydrogeologic conditions is limited. An efficient implementation methodology using numerical Monte Carlo simulation is presented. The uncertainty in the contaminant transport system due to uncertainty in the hydraulic conductivity is directly calculated in the Monte Carlo simulations. Numerous variations of the deterministic parameters of the model provide an indication of the change in exceedance probability with change in parameter value. The results of these variations for a generic example are presented in a concise graphical form that provides insight into the topology of the exceedance probability surface. This surface can be used to assess the impact of the various parameters on exceedance probability.
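A sketch of the Monte Carlo exceedance calculation using a deliberately simple transport model (advection with first-order decay) and uncertainty carried only in a lognormal hydraulic conductivity; all parameter names and the model form are assumptions, not the paper's actual model.

```python
import numpy as np

def exceedance_probability(c0, standard, L, gradient, porosity, decay,
                           logK_mean, logK_sd, n=100000, rng=None):
    """Probability that the concentration at a receptor exceeds a standard,
    with hydraulic conductivity as the only uncertain input."""
    rng = np.random.default_rng(rng)
    K = rng.lognormal(logK_mean, logK_sd, n)   # hydraulic conductivity [m/d]
    v = K * gradient / porosity                # seepage velocity [m/d]
    t = L / v                                  # travel time to receptor [d]
    conc = c0 * np.exp(-decay * t)             # first-order decay en route
    return np.mean(conc > standard)            # P(exceed standard)
```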

16.
ABSTRACT: Gaging stations established in 1895 at Millville, West Virginia, and in 1882 at Harpers Ferry, West Virginia, record flows ranging from a maximum of 6,509 m³/s to a minimum of 2 m³/s. Historical and botanical indicators were used to extend the systematic flood record of the Shenandoah River for a study reach approximately 7.5 km long. The long systematic record at the site provides a good opportunity to assess the accuracy of these sources of paleoflood information. Habitation of the area by settlers of European descent began in 1733, and historical flood records extend from 1748. Qualitative historical records from different sources were compared to yield the most complete flood history; the correlation between the various sources was extremely high. Botanical flood evidence preserved as adventitious sprouts, tree scars, and ring anomalies was documented in 37 trees. A flood chronology established from these data extended from 1896 to 1955. Botanical indicators provided an accurate, although incomplete, flood chronology. The ability to determine accurate flood stages from paleohydrologic indicators varied: historical data yielded relatively accurate stages, to within 1–2 m, whereas only minimum values of flood stage could be obtained from botanical indicators. These results illustrate some of the strengths and weaknesses of paleohydrologic investigations in the eastern United States.

17.
ABSTRACT: The purpose of this article is to discuss the importance of uncertainty analysis in water quality modeling, with an emphasis on identification of the correct model specification. A wetland phosphorus retention model is used as an example to illustrate the procedure of using a filtering technique for model structure identification. Model structure identification is typically done through model parameter estimation. However, because of the many sources of error in both model parameterization and the observed variables and data, errors-in-variables are often a problem, and it is therefore not appropriate to use the least squares method for parameter estimation. Two alternative methods for parameter estimation are presented. The first is a maximum likelihood estimator, which assumes independence of the observed response variable values. Anticipating possible violation of the independence assumption, the second method couples a maximum likelihood estimator with a Kalman filter. Furthermore, a Monte Carlo simulation algorithm is presented as a preliminary method for judging whether the model structure is appropriate.
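A minimal sketch of the first method, maximum likelihood estimation, applied to a hypothetical first-order retention model with independent Gaussian errors; the model form, error structure, and names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def fit_retention_mle(inflow_load, outflow_load, tau):
    """Maximum likelihood fit of out = in * exp(-k * tau) with independent
    Gaussian errors; jointly estimates the rate k and error sigma by
    minimizing the negative log-likelihood."""
    inflow_load = np.asarray(inflow_load, dtype=float)
    outflow_load = np.asarray(outflow_load, dtype=float)

    def negloglik(theta):
        k, log_sigma = theta
        sigma = np.exp(log_sigma)                 # keep sigma positive
        resid = outflow_load - inflow_load * np.exp(-k * tau)
        return 0.5 * np.sum((resid / sigma) ** 2) + len(resid) * log_sigma

    res = minimize(negloglik, x0=[0.1, 0.0], method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])             # (k_hat, sigma_hat)
```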

18.
ABSTRACT: For a single multipurpose reservoir, a deterministic optimal operating policy can readily be devised by dynamic programming. However, that method can only be applied to sets of deterministic streamflows, as might be used repetitively in a Monte Carlo study or possibly in a historical study. This paper reports a study in which an optimal operating policy for a multipurpose reservoir was determined using a stochastic dynamic programming approach; the policy is stated in terms of the state of the reservoir, indicated by the storage volume and the river flow in the preceding month. Such a policy could be implemented in real-time operation on a monthly basis, or it could be used in a design study. In contrast with deterministic dynamic programming, this method avoids the artificiality of relying on a single set of streamflows. The data for this study are the conditional probabilities of the streamflow in successive months, the physical features of the reservoir in question, and the return functions and constraints under which the system operates.
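A minimal value-iteration sketch of the stochastic dynamic programming formulation described: the state is (storage class, previous inflow class) and inflow transitions follow conditional probabilities; for simplicity the transition matrix here is stationary rather than month-dependent, and all structures and names are illustrative, not the paper's data.

```python
import numpy as np

def sdp_policy(n_storage, inflows, trans, reward, n_iter=200, beta=0.99):
    """Value iteration for a single reservoir. `inflows` holds integer
    class-representative inflow volumes; `trans[i, j]` is P(next inflow
    class j | current class i); `reward(s, q, r)` is the return from
    releasing r given storage s and inflow q. Returns a stationary
    release policy over (storage, inflow-class) states."""
    nS, nQ = n_storage, len(inflows)
    V = np.zeros((nS, nQ))
    policy = np.zeros((nS, nQ), dtype=int)
    for _ in range(n_iter):
        V_new = np.empty_like(V)
        for s in range(nS):
            for qi in range(nQ):
                best = -np.inf
                for r in range(0, s + int(inflows[qi]) + 1):   # feasible releases
                    s_next = min(s + int(inflows[qi]) - r, nS - 1)
                    # Expected future value over the next inflow class
                    val = reward(s, inflows[qi], r) + beta * trans[qi] @ V[s_next]
                    if val > best:
                        best, policy[s, qi] = val, r
                V_new[s, qi] = best
        V = V_new
    return policy, V
```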

19.
Channel roughness, often described by Manning's n, represents the resistance that flow encounters and has direct implications for velocity and discharge. Ideally, n is calculated from a long-term record of channel discharge and hydraulic geometry. In the absence of these data, a combination of photo references and a validated qualitative method is preferable to simply choosing n arbitrarily or from a table. The purpose of this study was to use United States Geological Survey (USGS) streamflow data to calculate roughness coefficients for streams in the mountains of North Carolina. Five USGS gage stations were selected for this study, representing drainage areas between 71.5 and 337 km². Photo references of the study sites are presented. Measured discharges were combined with hydraulic geometry at a cross-section to calculate roughness coefficients for flows of interest. At bankfull flow, n ranged between 0.039 and 0.064 for the five study sites. Roughness coefficients were not constant for all flows in a channel and fluctuated over a large range. At all sites, roughness was highest during low-flow conditions, then quickly decreased as flow increased, up to the bankfull elevation.
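Back-calculating n is a direct rearrangement of Manning's equation, which is the relation the study applies to USGS discharge measurements; the sketch below uses SI units, and the example values are made up.

```python
def mannings_n(Q, area, hydraulic_radius, slope):
    """Back-calculate Manning's n from a measured discharge and surveyed
    cross-section geometry (SI units): n = A * R^(2/3) * S^(1/2) / Q."""
    return area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5 / Q

# Example with plausible (made-up) bankfull values:
# Q = 85 m^3/s, A = 50 m^2, R = 1.4 m, S = 0.003 -> n is about 0.04
print(mannings_n(85.0, 50.0, 1.4, 0.003))
```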

20.
Abstract: A mix of causative mechanisms may be responsible for floods at a site. Floods may be caused by extreme rainfall or by rain falling on earlier rainfall events. The statistical attributes of these events differ according to the watershed characteristics and the causes, and traditional methods of flood frequency analysis are adequate only for specific situations. Furthermore, to address the uncertainty of flood frequency estimates for hydraulic structures, a series of probabilistic analyses of rainfall-runoff and flow routing models, and their associated inputs, is used. This is a complex problem in that the probability distributions of multiple independent and derived random variables must be estimated to evaluate the probability of floods. The objectives of this study, therefore, were to develop a flood frequency curve derivation method driven by multiple random variables and to develop a tool that can account for the uncertainties of design floods. The study focuses on developing a flood frequency curve based on nonparametric statistical methods for estimating the probabilities of rare floods, an approach more appropriate for Korea. To derive the frequency curve, rainfall generation using a nonparametric kernel density estimation approach is proposed. Many flood events are simulated by nonparametric Monte Carlo simulation coupled with centered Latin hypercube sampling to estimate the associated uncertainty. The methods are applied to a Korean watershed; the results show greater physical plausibility and provide reasonable design flood estimates.
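A sketch of the nonparametric rainfall generator described: sample a Gaussian kernel density fit to observed rainfall, with the underlying uniform draws stratified by Latin hypercube sampling; the bandwidth rule and the pairing of strata are assumptions.

```python
import numpy as np
from scipy.stats import norm, qmc

def kde_lhs_rainfall(obs, n, rng=None):
    """Generate n synthetic rainfall values from a Gaussian KDE of the
    observations, using Latin hypercube strata for the random draws."""
    obs = np.asarray(obs, dtype=float)
    # Silverman's rule-of-thumb bandwidth for a 1-D Gaussian KDE
    h = 1.06 * obs.std(ddof=1) * len(obs) ** (-1 / 5)
    # Two stratified uniforms per draw: one picks the kernel, one the offset
    u = qmc.LatinHypercube(d=2, seed=rng).random(n)
    centers = obs[(u[:, 0] * len(obs)).astype(int)]
    sample = centers + h * norm.ppf(u[:, 1])
    return np.clip(sample, 0.0, None)   # rainfall cannot be negative
```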
