Similar Articles
20 similar articles found (search time: 290 ms)
1.
ABSTRACT: A convenient method for the statistical analysis of hydrologic extremes is to use probability papers to fit selected theoretical distributions to extremal observations. Three commonly accepted statistical distributions of extreme hydrologic events are the double exponential distribution, the bounded exponential distribution, and the Log Pearson Type III distribution. In most cases, probability papers are distribution-specific. For the Log Pearson Type III distribution, however, the probability paper is characterized by a population-specific parameter, namely the coefficient of skewness, and it is not practicable to procure probability papers for all possible values of this parameter. Therefore, a computer program is developed to generate population-specific probability papers and to perform statistical analysis of the data using computer graphics. Probability papers covering return periods up to 1000 years or more are generated for the three distributions mentioned above. Using a plot routine, available extremal observations are plotted on selected probability papers and a linear regression analysis is used to fit a straight line to the data. Predictions of hydrologic extremes for higher recurrence intervals can be made by extrapolating the fitted straight lines.
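The probability-paper approach amounts to linearizing a distribution's quantile function and fitting a straight line to the plotted observations. A minimal sketch for the double exponential (Gumbel) case, assuming Weibull plotting positions and illustrative peak-flow values (not data from the study):

```python
import numpy as np

# Hypothetical annual peak flows (m^3/s); values are illustrative only.
peaks = np.sort(np.array([120., 145., 160., 180., 210., 250., 300., 410.]))
n = len(peaks)

# Weibull plotting positions: F_i = i / (n + 1)
F = np.arange(1, n + 1) / (n + 1)

# Gumbel (double exponential) reduced variate: y = -ln(-ln(F))
# On this scale a Gumbel sample plots as a straight line.
y = -np.log(-np.log(F))

# Fit a straight line (least squares) on the probability-paper scale
slope, intercept = np.polyfit(y, peaks, 1)

# Extrapolate to the 1000-year event via F = 1 - 1/T
T = 1000.0
y_T = -np.log(-np.log(1.0 - 1.0 / T))
q_1000 = slope * y_T + intercept
print(round(q_1000, 1))
```

The 1000-year estimate comes from extending the fitted line to the reduced variate at F = 0.999, mirroring the extrapolation step the abstract describes.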

2.
3.
ABSTRACT: The minimization of the sum of absolute deviations and the minimization of the absolute maximum deviation (mini-max) were transformed into equivalent linear programs for the estimation of parameters in a transient and linear hydrologic system. It is demonstrated that these two methods yield viable parameter estimates that are globally optimal and reproduce properly the timing and magnitude of hydrologic events and associated variables such as total runoff. The two linear estimation methods compared favorably with the popular least-squares nonlinear estimation method. The generality of the theoretical developments shows that linear program equivalents are adequate competitors of nonlinear methods of hydrologic estimation and parameter calibration.
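The least-absolute-deviations criterion becomes a linear program by splitting each residual into nonnegative parts and minimizing their sum. A minimal sketch with synthetic data (the dataset and variable names are illustrative, not from the study):

```python
import numpy as np
from scipy.optimize import linprog

# Synthetic linear system y ~ X @ b, standing in for a linear hydrologic model
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.uniform(0, 10, 20)])
b_true = np.array([2.0, 1.5])
y = X @ b_true + rng.normal(0, 0.1, 20)

n, p = X.shape
# Decision variables: [b+ , b- , e+ , e-], with b = b+ - b-.
# Minimize sum(e+ + e-) subject to X(b+ - b-) + e+ - e- = y.
c = np.concatenate([np.zeros(2 * p), np.ones(2 * n)])
A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * p + 2 * n))
b_hat = res.x[:p] - res.x[p:2 * p]
print(np.round(b_hat, 2))
```

At the optimum each pair (e+, e-) has at most one nonzero entry, so the objective equals the sum of absolute residuals, which is what makes this LP equivalent to the absolute-deviation criterion.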

4.
ABSTRACT: In recent years, several approaches to hydrologic frequency analysis have been proposed that enable one to direct attention to that portion of an overall probability distribution that is of greatest interest. The majority of the studies have focused on the upper tail of a distribution for flood analyses, though the same ideas can be applied to low flows. This paper presents an evaluation of the performances of five different estimation methods that place an emphasis on fitting the lower tail of the lognormal distribution for estimation of the ten-year low-flow quantile. The methods compared include distributional truncation, MLE treatment of censored data, partial probability weighted moments, LL-moments, and expected moments. It is concluded that while there are some differences among the alternative methods in terms of their biases and root mean square errors, no one method consistently performs better than the others, particularly with recognition that the underlying population distribution is unknown. Therefore, it seems perfectly legitimate to make a selection of a method on the basis of other criteria, such as ease of use. It is also shown in this paper that the five alternative methods can perform about as well as, if not better than, an estimation strategy involving fitting the complete lognormal distribution using L-moments.
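For context, under a fitted two-parameter lognormal the ten-year low-flow quantile is just the 10 percent non-exceedance quantile. A minimal sketch of the baseline complete-distribution strategy (here via moments of the logs rather than L-moments; the annual minimum flows are hypothetical):

```python
import numpy as np
from scipy import stats

# Hypothetical annual minimum flows (m^3/s); values are illustrative only.
q_min = np.array([1.2, 0.8, 1.5, 0.9, 1.1, 0.7, 1.3, 1.0, 0.6, 1.4,
                  0.85, 1.25, 0.95, 1.05, 0.75])

# Fit the two-parameter lognormal from the sample moments of the logs
mu = np.log(q_min).mean()
sigma = np.log(q_min).std(ddof=1)

# Ten-year low-flow quantile: non-exceedance probability 1/10
q10 = np.exp(mu + sigma * stats.norm.ppf(0.1))
print(round(q10, 3))
```

The tail-emphasis methods the paper compares (truncation, censored-data MLE, partial probability weighted moments, LL-moments, expected moments) alter how mu and sigma are estimated, not how the quantile is read off.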

5.
ABSTRACT: Most hydrologic models require input parameters which represent the variability found across an entire landscape. The estimation of such parameters is very difficult, particularly on rangeland. Improved model parameter estimation procedures are needed which incorporate the small-scale and temporal variability found on rangeland. This study investigates the use of a surface soil classification scheme to partition the spatial variability in hydrologic and interrill erosion processes in a sagebrush plant community. Four distinct microsites were found to exist within the sagebrush coppice-dune/dune-interspace complex. The microsites explained the majority of variation in hydrologic and interrill erosion response found on the site and were discernible based on readily available soil and vegetation information. The variability within each microsite was quite low and was not well correlated with soil and vegetation properties. The surface soil classification scheme defined in this study can be quite useful for defining sampling procedures, for understanding hydrologic and erosion processes, and for parameterizing hydrologic models for use on sagebrush rangeland.

6.
ABSTRACT: Kriging methods of geostatistical analysis provide valuable techniques for analysis of sediment contamination problems, including interpolation of concentration maps from point data and estimation of global mean concentrations. Sample collection efforts frequently include preliminary screening data of considerably more extensive coverage than the laboratory analyses on which estimation is usually based. How should these be incorporated in kriging? Screening and laboratory analysis constitute two separate estimates of the same spatial field but of very different characteristics. A modified version of co-kriging is developed to include the imprecise screening information in the analysis of contaminant distribution. Use of the method is demonstrated on a data set of sediment PCB samples from the Upper Hudson River, for which preliminary categorical mass spectrometry screening was used to select a smaller set of samples for gas chromatograph analysis. The method is widely applicable to many situations of contaminant and natural resource estimation.

7.
Yang Hua. Sichuan Environment (四川环境), 2004, 23(1): 45-47
The entropy-based parameter estimation method, founded on the principle of maximum information entropy, is a new estimation method with rigorous physical and mathematical meaning. Using long-term monitoring data on major pollutant concentrations in the Guangzhou reach of the Pearl River, this paper compares parameters of the four-parameter gamma distribution estimated by the entropy method against those estimated by the conventional method of moments, judging the fits by the criterion of minimum sum of absolute frequency deviations. The results show that the entropy estimates are on the whole very close to those of the method of moments, and that for most samples the entropy-estimated parameters are superior. The entropy method is therefore practical and worth promoting for frequency analysis of environmental monitoring data.

8.
Abstract: A stochastic, spatially explicit method for assessing the impact of land cover classification error on distributed hydrologic modeling is presented. One hundred land cover realizations were created by systematically altering the North American Landscape Characterization land cover data according to the dataset's misclassification matrix. The matrix indicates the probability of errors of omission in land cover classes and is used to assess the uncertainty in hydrologic runoff simulation resulting from parameter estimation based on land cover. These land cover realizations were used in the GIS-based Automated Geospatial Watershed Assessment tool in conjunction with topography and soils data to generate input to the physically based Kinematic Runoff and Erosion model. Uncertainties in modeled runoff volumes resulting from these land cover realizations were evaluated in the Upper San Pedro River basin for 40 watersheds ranging in size from 10 to 100 km2 under two rainfall events of differing magnitudes and intensities. Simulation results show that model sensitivity to classification error varies directly with watershed scale, varies inversely with rainfall magnitude, and is mitigated or magnified by landscape variability depending on landscape composition.

9.
ABSTRACT: With the increased use of models in hydrologic design, there is an immediate need for a comprehensive comparison of hydrologic models, especially those intended for use at ungaged locations (i.e., where measured data are either not available or inadequate for model calibration). But some past comparisons of hydrologic models have used the same data base for both calibration and testing of the different models or implied that the results of model calibration are indicative of the accuracy at ungaged locations. This practice was examined using both the regression equation approach to peak discharge estimation and a unit hydrograph model that was intended for use in urban areas. The results suggested that the lack of data independence in the calibration and testing of regression equations may lead to both biased results and misleading statements about prediction accuracy. Additionally, although split-sample testing is recognized as desirable, the split-samples should be selected using a systematic-random sampling scheme, rather than random sampling, because random sampling with small samples may lead to a testing sample that is not representative of the population. A systematic-random sampling technique should lead to more valid conclusions about model reliability. For models like a unit hydrograph model, which are more complex and for which calibration is a more involved process, data independence is not as critical because the data fitting error variation is not as dominant as the error variation due to the calibration process and the inability of the model structure to conform with data variability.

10.
ABSTRACT: Growing interest in water quality has resulted in the development of monitoring networks and intensive sampling for various constituents. Common purposes are regulatory, source and sink understanding, and trend observations. Water quality monitoring involves monitoring system design; sampling site instrumentation; and sampling, analysis, quality control, and assurance. Sampling is a process to gather information with the least cost and least error. Various water quality sampling schemes have been applied for different sampling objectives and time frames. In this study, a flow proportional composite sampling scheme is applied to variable flow remote canals where the flow rate is not known a priori. In this scheme, historical weekly flow data are analyzed to develop high flow and low flow sampling trigger volumes for auto-samplers. The median flow is used to estimate the low flow sampling trigger volume, and the five percent exceedance probability flow is used for the high flow sampling trigger volume. A computer simulation of high resolution sampling is used to demonstrate the comparative bias in load estimation and operational cost among four sampling schemes. Weekly flow proportional composite auto-sampling resulted in the least bias in load estimation with competitive operational cost compared with daily grab sampling, weekly grab sampling, and time proportional auto-sampling.
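The trigger-volume idea can be sketched directly: analyze historical weekly flow volumes, derive the low-flow trigger from the median and the high-flow trigger from the five percent exceedance (95th percentile) volume. The flow record and the per-composite aliquot count below are assumptions for illustration:

```python
import numpy as np

# Hypothetical historical weekly flow volumes (m^3), ~10 years of record
rng = np.random.default_rng(42)
weekly_vol = rng.lognormal(mean=10.0, sigma=0.8, size=520)

target_aliquots = 24  # assumed aliquots collected per weekly composite

# Low-flow trigger: the auto-sampler draws one aliquot each time this
# cumulative volume passes the station, sized from the median week.
low_trigger = np.median(weekly_vol) / target_aliquots

# High-flow trigger: sized from the 5 percent exceedance weekly volume,
# so high-flow weeks do not overfill the composite bottle.
high_trigger = np.percentile(weekly_vol, 95) / target_aliquots

print(round(low_trigger, 1), round(high_trigger, 1))
```

Because each aliquot represents a fixed volume of flow, the composite concentration is inherently flow-weighted, which is why this scheme yields low bias in load estimation.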

11.
ABSTRACT: Resolution of the input GIS data used to parameterize distributed-parameter hydrologic/water quality models may affect uncertainty in model outputs and impact the subsequent application of model results in watershed management. In this study we evaluated the impact of varying spatial resolutions of DEM, land use, and soil data (30 × 30 m, 100 × 100 m, 150 × 150 m, 200 × 200 m, 300 × 300 m, 500 × 500 m, and 1,000 × 1,000 m) on the uncertainty of SWAT predicted flow, sediment, NO3-N, and TP transport. Inputs included measured hydrologic, meteorological, and watershed characteristics as well as water quality data from the Moores Creek watershed in Washington County, Arkansas. The SWAT model output was most affected by input DEM data resolution. A coarser DEM data resolution resulted in decreased representation of watershed area and slope and increased slope length. Distribution of pasture, forest, and urban areas within the watershed was significantly affected at coarser resolution of land use and resulted in significant uncertainty in predicted sediment, NO3-N, and TP output. Soils data resolution had no significant effect on flow and NO3-N predictions; however, sediment was overpredicted by 26 percent, and TP was underpredicted by 26 percent at 1,000 m resolution. This may be due to a change in the relative distribution of hydrologic soil groups (HSGs) in the watershed. Minimum resolution for input GIS data to achieve less than 10 percent model output error depended upon the output variable of interest. For flow, sediment, NO3-N, and TP predictions, minimum DEM data resolution should range from 30 to 300 m, whereas minimum land use and soils data resolution should range from 300 to 500 m.

12.
ABSTRACT: The Gunnison River drains a mountainous basin in western Colorado and is a large contributor of water to the Colorado River. As part of a study to assess water resource sensitivity to alterations in climate in the Gunnison River basin, climatic and hydrologic processes are being modeled. A geographic information system (GIS) is being used in this study as a link between data and modelers: serving as a common data base for project personnel with differing specialties, providing a means to investigate the effects of scale on model results, and providing a framework for the transfer of parameter values among models. Specific applications presented include: (1) developing elevation grids for a precipitation model from digital elevation model (DEM) point-elevation values, and visualizing the effects of grid resolution on model results; (2) using a GIS to facilitate the definition and parameterization of a distributed-parameters watershed model in multiple basins; and (3) nesting atmospheric and hydrologic models to produce possible scenarios of climate change.

13.
Abstract: With the popularity of complex, physically based hydrologic models, the time consumed in running these models is increasing substantially. Using surrogate models to approximate the computationally intensive models is a promising way to save substantial time in parameter estimation. In this study, two learning machines [Artificial Neural Network (ANN) and support vector machine (SVM)] were evaluated and compared for approximating the Soil and Water Assessment Tool (SWAT) model. These two learning machines were tested in two watersheds (Little River Experimental Watershed in Georgia and Mahatango Creek Experimental Watershed in Pennsylvania). The results show that SVM in general exhibited better generalization ability than ANN. In order to effectively and efficiently apply SVM to approximate SWAT, the effect of cross-validation schemes, parameter dimensions, and training sample sizes on the performance of SVM was evaluated and discussed. It is suggested that 3-fold cross-validation is adequate for training the SVM model, and that reducing the parameter dimension through determining parameter values from field data and sensitivity analysis is an effective means of improving the performance of SVM. As for the training sample size, it is difficult to determine the appropriate number of samples for training SVM based on the test results obtained in this study. Simple examples were used to illustrate the potential applicability of combining the SVM model with an uncertainty analysis algorithm to reduce the effort of analyzing parameter uncertainty of SWAT. In the future, evaluating the applicability of SVM for approximating SWAT in other watersheds and combining SVM with different parameter uncertainty analysis algorithms and evolutionary optimization algorithms deserve further research.
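A surrogate of this kind can be sketched with scikit-learn's SVR, with a cheap analytic function standing in for the expensive SWAT run; the stand-in function, parameter ranges, sample size, and tuning grid are all illustrative assumptions, not details from the study:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Stand-in for an expensive simulator: maps a parameter set to a scalar
# response (e.g., an objective computed from simulated streamflow).
def simulator(theta):
    return np.sin(theta[:, 0]) + 0.5 * theta[:, 1] ** 2

# Sample parameter sets across an assumed feasible range
rng = np.random.default_rng(1)
theta = rng.uniform(-2, 2, size=(200, 2))
y = simulator(theta)

# 3-fold cross-validation to tune the surrogate, echoing the study's
# suggestion that 3-fold CV is adequate for training the SVM
grid = GridSearchCV(SVR(kernel="rbf"),
                    {"C": [1, 10, 100], "gamma": [0.1, 1.0]},
                    cv=3, scoring="r2")
grid.fit(theta, y)
print(round(grid.best_score_, 2))
```

Once trained, `grid.predict` can stand in for the simulator inside an uncertainty or calibration loop, which is where the time savings come from.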

14.
ABSTRACT: Developing a mass load estimation method appropriate for a given stream and constituent is difficult due to inconsistencies in hydrologic and constituent characteristics. The difficulty may be increased in flashy flow conditions such as karst. Many projects are constrained by budget and staffing and do not have the luxury of sophisticated sampling strategies. The objectives of this study were to: (1) examine two grab sampling strategies with varying sampling intervals and determine the error in mass load estimates, and (2) determine the error that can be expected when a grab sample is collected at a time of day when the diurnal variation is most divergent from the daily mean. Results show grab sampling with continuous flow to be a viable data collection method for estimating mass load in the study watershed. Comparing weekly, biweekly, and monthly grab sampling, monthly sampling produces the best results with this method. However, the time of day the sample is collected is important. Failure to account for diurnal variability when collecting a grab sample may produce unacceptable error in mass load estimates. The best time to collect a sample is when the diurnal cycle is nearest the daily mean.
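The effect of grab timing can be illustrated with a synthetic diurnal cycle: a grab taken when the concentration crosses its daily mean reproduces the true load, while one taken at the diurnal peak overestimates it. The sinusoidal cycle and constant flow below are assumptions, not data from the study:

```python
import numpy as np

# Simulated diurnal concentration cycle (mg/L) at 15-minute resolution
t = np.arange(0, 24, 0.25)                            # hours in one day
conc = 5.0 + 1.5 * np.sin(2 * np.pi * (t - 6) / 24)   # assumed cycle
flow = 0.5                                            # m^3/s, held constant

# True daily load: mg/L * m^3/s = g/s; integrate over the day -> kg/day
true_load = conc.mean() * flow * 86400 / 1000

# Grab at t = 6 h, where the cycle crosses the daily mean, vs t = 12 h,
# where the cycle is most divergent from the mean
grab_mean_time = conc[t == 6][0] * flow * 86400 / 1000
grab_peak_time = conc[t == 12][0] * flow * 86400 / 1000
print(round(true_load, 1), round(grab_mean_time, 1), round(grab_peak_time, 1))
```

The mean-crossing grab recovers the true load exactly here, while the peak-time grab inflates it by the full diurnal amplitude, which is the error mode the study warns about.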

15.
ABSTRACT: Data splitting is used to compare methods of determining "homogeneous" hydrologic regions. The methods compared use cluster analysis based on similarity of hydrologic characteristics or similarity of characteristics of a stream's drainage basin. Data for 221 stations in Arizona are used to show that the methods, which are a modification of DeCoursey's scheme for defining regions, improve the fit of estimation data to the model, but that it is necessary to have an independent measure of predictive accuracy, such as that provided by data splitting, to demonstrate improved predictive accuracy. The methods used the complete linkage algorithm for cluster analysis and computed weighted average estimates of hydrologic characteristics at ungaged sites.
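Complete-linkage clustering of basin characteristics can be sketched with SciPy; the station characteristics below are synthetic stand-ins for the 221 Arizona stations, constructed so two regions are recoverable:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical standardized basin characteristics for 12 stations
# (e.g., log drainage area, mean elevation, mean annual precipitation)
rng = np.random.default_rng(7)
chars = np.vstack([rng.normal(0, 0.2, (6, 3)),    # one synthetic region
                   rng.normal(2, 0.2, (6, 3))])   # a second region

# Complete-linkage hierarchical clustering on Euclidean distances:
# cluster distance = maximum pairwise distance between members
Z = linkage(pdist(chars), method="complete")

# Cut the dendrogram into two candidate "homogeneous" regions
regions = fcluster(Z, t=2, criterion="maxclust")
print(regions)
```

As the abstract stresses, a good within-sample fit of such regions is not evidence of predictive skill; the cluster assignment would still need to be validated on held-out stations.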

16.
17.
ABSTRACT: The principle of maximum entropy (POME) was used to derive an alternative method for parameter estimation for the three parameter lognormal (TPLN) distribution. Six sets of annual peak discharge data were used to evaluate this method and compare it with the methods of moments and maximum likelihood estimation.

18.
This paper examines the performance of a semi-distributed hydrology model (i.e., Soil and Water Assessment Tool [SWAT]) using Sequential Uncertainty Fitting (SUFI-2), generalized likelihood uncertainty estimation (GLUE), parameter solution (ParaSol), and particle swarm optimization (PSO). We applied SWAT to the Waccamaw watershed, a shallow aquifer dominated Coastal Plain watershed in the Southeastern United States (U.S.). The model was calibrated (2003-2005) and validated (2006-2007) at two U.S. Geological Survey gaging stations, using significant parameters related to surface hydrology, hydrogeology, hydraulics, and physical properties. SWAT performed best during intervals with wet and normal antecedent conditions, with varying sensitivity to effluent channel shape and characteristics. In addition, the calibration of all algorithms depended mostly on Manning's n-value for the tributary channels as the surface friction resistance factor governing runoff generation. SUFI-2 and PSO simulated the same relative probability distribution tails as those observed at an upstream outlet, while all methods (except ParaSol) exhibited longer tails at a downstream outlet. The ParaSol model exhibited large skewness, suggesting a global search algorithm was less capable of characterizing parameter uncertainty. Our findings provide insights regarding parameter sensitivity and uncertainty as well as modeling diagnostic analysis that can improve hydrologic theory and prediction in complex watersheds. Editor's note: This paper is part of the featured series on SWAT Applications for Emerging Hydrologic and Water Quality Challenges. See the February 2017 issue for the introduction and background to the series.

19.
Uncertainty Assessment for Management of Soil Contaminants with Sparse Data
In order for soil resources to be sustainably managed, it is necessary to have reliable, valid data on the spatial distribution of their environmental impact. However, in practice, one often has to cope with spatial interpolation achieved from few data that show a skewed distribution and uncertain information about soil contamination. We present a case study with 76 soil samples taken from a site of 15 square km in order to assess the usability of information gleaned from sparse data. The soil was contaminated with cadmium predominantly as a result of airborne emissions from a metal smelter. The spatial interpolation applies lognormal anisotropic kriging and conditional simulation for log-transformed data. The uncertainty of cadmium concentration acquired through data sampling, sample preparation, analytical measurement, and interpolation is a factor of 2 at 68.3% confidence. Uncertainty predominantly results from the spatial interpolation necessitated by low sampling density and spatial heterogeneity. The interpolation data are shown in maps presenting likelihoods of exceeding threshold values derived from a lognormal probability distribution. Although the results are not deterministic, this procedure yields a quantified and transparent estimation of the contamination, which can be used to delineate areas for soil improvement, remediation, or restricted use, based on the decision-maker's required probability of safety.

20.
In this study, a constrained minimization method, the flexible tolerance method, was used to solve the optimization problems for determining hydrologic parameters in the root zone: water uptake rate, spatial root distribution, infiltration rate, and evaporation. Synthetic soil moisture data were first generated using the Richards' equation and its associated initial and boundary conditions, and these data were then used for the inverse analyses. The results of the inverse simulation indicate the following. If the soil moisture data contain no noise, the estimated water uptake rate and spatial root distribution parameters equal the true values without using constraints. If there is noise in the observed data, constraints must be used to improve the quality of the estimates. In the estimation of rainfall infiltration and surface evaporation, interpolation methods should be used to reduce the number of unknowns; fewer variables improve the quality of inversely estimated parameters. Simultaneous estimation of spatial root distribution and water uptake rate, or of evaporation and water uptake rate, is possible. The method was used to estimate the water uptake rate, spatial root distribution, infiltration rate, and evaporation using long-term soil moisture data collected from Nebraska's Sand Hills.
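The inverse-estimation logic, constrained minimization against noisy synthetic observations, can be sketched on a toy problem. Here a simple exponential profile stands in for the Richards'-equation solution, and bounds play the role of the study's constraints; all parameter names and values are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

# Toy forward model standing in for the root-zone simulation:
# a = amplitude (uptake-like parameter), b = decay depth (distribution-like)
def model(p, z):
    a, b = p
    return a * np.exp(-z / b)

# Generate synthetic "observations" with noise, as in the study's design
z = np.linspace(0, 1, 30)           # depths (m), illustrative
p_true = (0.4, 0.3)
rng = np.random.default_rng(3)
obs = model(p_true, z) + rng.normal(0, 0.01, z.size)

# Constrained least squares: parameter bounds keep the noisy inversion
# from wandering to non-physical values
res = minimize(lambda p: np.sum((model(p, z) - obs) ** 2),
               x0=(0.2, 0.5), bounds=[(0.01, 1.0), (0.05, 1.0)])
print(np.round(res.x, 2))
```

This mirrors the study's finding qualitatively: with noiseless data the unconstrained fit recovers the true parameters, while with noise the bounds (constraints) keep the estimates well behaved.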
