Similar Articles
20 similar articles found (search time: 15 ms)
1.
Stone, Wesley W. and Robert J. Gilliom, 2012. Watershed Regressions for Pesticides (WARP) Models for Predicting Atrazine Concentrations in Corn Belt Streams. Journal of the American Water Resources Association (JAWRA) 48(5): 970‐986. DOI: 10.1111/j.1752‐1688.2012.00661.x Abstract: Watershed Regressions for Pesticides (WARP) models, previously developed for atrazine at the national scale, are improved for application to the United States (U.S.) Corn Belt region by developing region‐specific models that include watershed characteristics that are influential in predicting atrazine concentration statistics within the Corn Belt. WARP models for the Corn Belt (WARP‐CB) were developed for annual maximum moving‐average (14‐, 21‐, 30‐, 60‐, and 90‐day durations) and annual 95th‐percentile atrazine concentrations in streams of the Corn Belt region. The WARP‐CB models accounted for 53 to 62% of the variability in the various concentration statistics among the model‐development sites. Model predictions were within a factor of 5 of the observed concentration statistic for over 90% of the model‐development sites. The WARP‐CB residuals and uncertainty are lower than those of the National WARP model for the same sites. Although atrazine‐use intensity is the most important explanatory variable in the National WARP models, it is not a significant variable in the WARP‐CB models. The WARP‐CB models provide improved predictions for Corn Belt streams draining watersheds with atrazine‐use intensities of 17 kg/km2 of watershed area or greater.

2.
ABSTRACT: A general methodology is described for identifying and statistically modeling trends which may be contained in a water quality time series. A range of useful exploratory data analysis tools are suggested for discovering important patterns and statistical characteristics of the data such as trends caused by external interventions. To estimate the entries in an evenly spaced time series when data are available at irregular time intervals, a new procedure based upon seasonal adjustment is described. Intervention analysis is employed at the confirmatory data analysis stage to rigorously model changes in the mean levels of a series which are identified using exploratory data analysis techniques. Furthermore, intervention analysis can be utilized for estimating missing observations when they are not too numerous. The effects of cutting down a forest upon various water quality variables and also the consequences of acid rain upon the alkalinity in a stream provide illustrative applications which demonstrate the effectiveness of the methodology.

3.
ABSTRACT: A framework for sensitivity and error analysis in mathematical modeling is described and demonstrated. The Lake Eutrophication Analysis Procedure (LEAP) consists of a series of linked models which predict lake water quality conditions as a function of watershed land use, hydrologic variables, and morphometric variables. Specification of input variables as distributions (means and standard errors) and use of first-order error analysis techniques permits estimation of output variable means, standard errors, and confidence ranges. Predicted distributions compare favorably with those estimated using Monte-Carlo simulation. The framework is demonstrated by applying it to data from Lake Morey, Vermont. While possible biases exist in the models calibrated for this application, prediction variances, attributed chiefly to model error, are comparable to the observed year-to-year variance in water quality, as measured by spring phosphorus concentration, hypolimnetic oxygen depletion rate, summer chlorophyll-a, and summer transparency in this lake. Use of the framework provides insight into important controlling factors and relationships and identifies the major sources of uncertainty in a given model application.

4.
Spackman Jones, Amber, David K. Stevens, Jeffery S. Horsburgh, and Nancy O. Mesner, 2010. Surrogate Measures for Providing High Frequency Estimates of Total Suspended Solids and Total Phosphorus Concentrations. Journal of the American Water Resources Association (JAWRA) 1‐15. DOI: 10.1111/j.1752‐1688.2010.00505.x Abstract: Surrogate measures like turbidity, which can be observed with high frequency in situ, have potential for generating high frequency estimates of total suspended solids (TSS) and total phosphorus (TP) concentrations. In the semiarid, snowmelt‐driven, and irrigation‐regulated Little Bear River watershed of northern Utah, high frequency in situ water quality measurements were recorded in conjunction with periodic chemistry sampling. Site‐specific relationships were developed using turbidity as a surrogate for TP and TSS at two monitoring locations. Methods are presented for employing censored data and for investigating categorical explanatory variables (e.g., hydrologic conditions). Turbidity was a significant explanatory variable for TP and TSS at both sites, which differ in hydrologic and water quality characteristics. The relationship between turbidity and TP was stronger at the upper watershed site where TP is predominantly particulate. At both sites, the relationships between turbidity and TP varied between spring snowmelt and base flow conditions while the relationships between TSS and turbidity were consistent across hydrological conditions. This approach enables the calculation of high frequency time series of TP and TSS concentrations previously unavailable using traditional monitoring approaches. These methods have broad application for situations that require accurate characterization of fluxes of these constituents over a range of hydrologic conditions.
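The surrogate-regression idea summarized above can be sketched in a few lines: fit a log-log (power-law) relation between turbidity and TSS, then use it to convert high-frequency turbidity readings into TSS estimates. This is a minimal illustration with synthetic data and hypothetical coefficients, not the site-specific models of the paper.

```python
import math
import random

random.seed(42)

# Synthetic demonstration data (hypothetical, not Little Bear River values):
# assume TSS ~ a * turbidity^b with multiplicative lognormal noise.
turbidity = [random.uniform(2, 200) for _ in range(50)]
tss = [1.2 * t ** 0.95 * math.exp(random.gauss(0, 0.2)) for t in turbidity]

# Fit the log-log OLS regression: log(TSS) = alpha + beta * log(turbidity).
x = [math.log(t) for t in turbidity]
y = [math.log(s) for s in tss]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
beta = sxy / sxx
alpha = ybar - beta * xbar

def predict_tss(turb):
    """High-frequency TSS estimate from an in situ turbidity reading."""
    return math.exp(alpha + beta * math.log(turb))
```

In practice the paper also handles censored observations and tests categorical terms (e.g., snowmelt vs. base flow), which a plain OLS sketch like this omits.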

5.
Turner, Andy W., Jeff J. Hillis, and Charles F. Rabeni, 2012. A Sampler for Measuring Deposited Fine Sediments in Streams. Journal of the American Water Resources Association (JAWRA) 48(2): 366‐378. DOI: 10.1111/j.1752‐1688.2011.00618.x Abstract: Improvements and standardization of methodologies to quickly and accurately quantify deposited sediment in streams will allow advances in our understanding of biological effects of sedimentation. Most methods used to evaluate streambed conditions as part of biological monitoring or assessment programs are selected for ease of use, but can be subjective, inappropriate, and often of unknown accuracy. We developed a portable, light‐weight device to quantify deposited unconsolidated sediment (particles <2 mm) in wadeable streams. This deposited sediment sampler is a hand‐held unit that circumscribes an area of the streambed and through suction creates a force that suspends unconsolidated materials into a collector. Laboratory evaluations determined the efficiency (percent of available deposited sediment recovered) of the sampler to collect different sizes and concentrations of deposited sediment under differing streambed conditions, which allowed appropriate correction factors to be applied to each of four categories of streambed particle size. Field trials comparing our sampler to other methods commonly used by many state and federal agencies showed high comparability. The sampler can be constructed in just a few hours from inexpensive, easily obtained materials.

6.
ABSTRACT: The statistical analysis of data which have trace level measurements has traditionally been a two-step process in which data are first censored using criteria based on measurement precision, and then analyzed with statistical methods for censored data. The process might be more informative if data were left uncensored. In this paper, information loss attributable to censoring and measurement noise are assessed by comparing the sample mean and median of uncensored measurements with a log regression mean and median based on censored data. Measurements are derived from lognormal parent distributions which have random variability characteristic of trace level measurement. The relative performance of estimators used with error-free samples and with samples having measurement noise can be explained by differences between the probability distributions of parents and measurements. Measurement introduces bias and dispersion and transforms lognormal parent distributions toward greater symmetry. Estimates using uncensored data are less biased and more accurate than the log regression mean and median when censoring exceeds about 50 percent, and are not much worse at any fraction censored. For data with many (80 percent) results below the limit of detection, bias may be quite severe.
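The censoring comparison described above can be mimicked with a short simulation: draw a lognormal "parent" sample, add measurement noise, impose a detection limit, and compare an estimator that uses all (uncensored) measurements with one computed after censoring. For brevity this sketch substitutes the common half-detection-limit substitution for the paper's log-regression estimator; all parameter values are hypothetical.

```python
import math
import random
import statistics

random.seed(7)

# Lognormal parent distribution with additive measurement noise
# (hypothetical parameters chosen only for illustration).
parent = [math.exp(random.gauss(1.0, 0.8)) for _ in range(500)]
measured = [max(v + random.gauss(0, 0.3), 0.0) for v in parent]

dl = 2.0  # detection limit, placed so a substantial fraction is censored

# Estimator 1: mean of all measurements, left uncensored.
mean_uncensored = statistics.mean(measured)

# Estimator 2: censor below the detection limit, then substitute DL/2
# (a simpler stand-in for the paper's log-regression mean).
substituted = [m if m >= dl else dl / 2 for m in measured]
mean_substituted = statistics.mean(substituted)

frac_censored = sum(m < dl for m in measured) / len(measured)
```

Re-running with the detection limit raised so that 80 percent of results fall below it reproduces the qualitative point of the abstract: the censored-then-estimated mean drifts away from the uncensored sample mean.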

7.
ABSTRACT: Section 208 of the Federal Water Pollution Control Act Amendments of 1972 has provided the Southwestern Illinois Metropolitan and Regional Planning Commission (SIMAPC) with a unique opportunity for comprehensive planning of the region's water quality. SIMAPC initiated the 208 study by researching available technology for the analysis of point and nonpoint sources of pollution and establishing criteria by which to judge the various techniques. This led to SIMAPC's choice of continuous simulation of stream and reservoir water quality as the most appropriate analytical tool for their needs. A continuous simulation model was calibrated and verified on three basins in the SIMAPC region. It was then used to produce load source analysis, pollution event frequency analysis, and pollution event duration analysis for ten pollutants under existing stream conditions and then under alternative future conditions. These results enabled the weighting of pollutant sources, analysis of the effectiveness of control measures, and quantitative analysis of the marginal benefit of each alternative.

8.
Abstract: An optimization procedure combining zonation methods with the Tabu Search method is proposed to identify the spatial distribution of a hydraulic conductivity field. Three zonation methods, the Voronoi diagram (VD), the multiplicatively weighted Voronoi diagram (MWVD), and pattern zonation, are adopted for parameterization. Four spatial distributions of hydraulic conductivity, covering both homogeneous and heterogeneous cases, are designed to test whether the parameter structure can be successfully identified. The fitting residual error is first considered to determine an adequate number of zones without overparameterization. Parameter uncertainty is then evaluated to support the decision on the number of zones. The results indicate that the MWVD performs better than the other two methods because it has greater flexibility in describing the zonal boundaries with a small number of decision variables.

9.
ABSTRACT: Regulatory water quality monitoring has evolved to the point where it is a rather complex system encompassing many monitoring purposes and involving many monitoring activities. Lack of a system's perspective of regulatory monitoring hinders the development of effective and efficient monitoring programs to support water quality management. In this paper the regulatory water quality monitoring system is examined in a total systems context. The purposes of regulatory monitoring are reviewed and categorized according to their legal evolution. The activities of regulatory monitoring are categorized and organized into a system which follows the flow of information through the monitoring program. The monitoring purposes and activities are combined to form a monitoring system matrix - a framework within which the total regulatory water quality monitoring system is defined. The matrix, by defining the regulatory monitoring system and clarifying many interactions within the system, provides a basis upon which a more thorough approach to managing, evaluating, and eventually optimizing regulatory monitoring can be developed.

10.
ABSTRACT: The design and implementation of a national surface water quality monitoring network for New Zealand are described. Some of the lessons learned from the first year of operation are also addressed. Underpinning the design, and specified in advance, are the goal and objectives, the data quality assurance system, and the mechanism for data interpretation and reporting. Because of the difficulties associated with the use of a multitude of different agencies, only one agency is involved in field work and one laboratory undertakes the analysis. Staff training has been given a high priority. The network has been designed to give good trend detectability for regular sampling over a 5–10 year period.

11.
ABSTRACT: Nine surface water‐quality variables were analyzed for trend at 180 Virginia locations over the 1978 to 1995 period. Median values and seasonal Kendall's tau, a trend indicator statistic, were generated for dissolved oxygen saturation (DO), biochemical oxygen demand (BOD), pH (PH), total residue (TR), nonfilterable residue (NFR), nitrate‐nitrite nitrogen (NN), total Kjeldahl nitrogen (TKN), total phosphorus (TP), and fecal coliform (FC) at each location. Each location was assigned to one of four physiographic regions, and mean state and regional medians and taus were calculated. Widespread BOD and NFR improvements were detected and FC improvements occurred in the state's western regions. TR and TKN exhibited predominantly increasing trends at locations throughout the state. BOD, TKN, NFR, and TR medians were higher at coastal locations than in other regions. NN, TKN, and TR exhibited predominantly increasing trends in regions with high median concentrations, while declining trends predominated in regions with relatively high BOD, FC, and NFR medians. Appalachian locations exhibited the greatest regional water‐quality improvements for BOD, FC, NFR, and TKN. Factors responsible for regional differences appear to include geology, land use, and landscape features; these factors vary regionally.
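The seasonal Kendall's tau statistic used in this study compares observations only within the same season across years, so that seasonality does not masquerade as trend. A minimal sketch of the pooled S statistic (without the variance and significance machinery, and with hypothetical data) follows.

```python
# Seasonal Kendall: count concordant minus discordant pairs within each
# season's chronological series, then pool S over all seasons.
def seasonal_kendall_s(values_by_season):
    """values_by_season: list of lists, one chronological series per season."""
    s = 0
    for series in values_by_season:
        for i in range(len(series)):
            for j in range(i + 1, len(series)):
                diff = series[j] - series[i]
                s += (diff > 0) - (diff < 0)  # sign of the pairwise difference
    return s

# Hypothetical 4-season, 5-year record with an overall upward drift.
data = [[1.0, 1.1, 1.3, 1.2, 1.5],
        [2.0, 2.2, 2.1, 2.4, 2.6],
        [0.5, 0.6, 0.8, 0.7, 0.9],
        [1.5, 1.4, 1.7, 1.8, 2.0]]
s_stat = seasonal_kendall_s(data)  # positive S indicates an increasing trend
```

A full implementation would normalize S to tau and test it against its null variance; library routines for the non-seasonal statistic exist (e.g., `scipy.stats.kendalltau`).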

12.
ABSTRACT: The risks associated with a traditional wasteload allocation (WLA) analysis were quantified with data from a recent study of the Upper Trinity River (Texas). Risk is defined here as the probability of failing to meet an established in-stream water quality standard. The QUAL-TX dissolved oxygen (DO) water quality model was modified to a Monte Carlo framework. Flow augmentation coding was also modified to allow an exact match to be computed between the predicted and an established DO concentration standard, thereby providing an avenue for linking input parameter uncertainty to the assignment of a wasteload permit (allowable mass loading rate). Monte Carlo simulation techniques were employed to propagate input parameter uncertainties, typically encountered during WLA analysis, to the computed effluent five-day carbonaceous biochemical oxygen demand requirements for a single major wastewater treatment plant (WWTP). The risk of failing to meet an established in-stream DO criterion may be as high as 96 percent. The uncertainty associated with estimation of the future total Kjeldahl nitrogen concentration for a single tributary was found to have the greatest impact on the determination of allowable WWTP loadings.
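The risk definition above (probability of failing an in-stream DO standard) can be illustrated with a toy Monte Carlo around a classical Streeter-Phelps sag curve rather than the QUAL-TX model; every rate, load, and standard below is a hypothetical placeholder.

```python
import math
import random

random.seed(1)

DO_SAT = 8.0      # mg/L, assumed saturation concentration
STANDARD = 5.0    # mg/L, assumed in-stream DO criterion

def min_do(kd, ka, l0):
    """Minimum DO along a Streeter-Phelps sag, scanned hourly over 5 days."""
    lowest = DO_SAT
    for h in range(1, 121):
        t = h / 24.0  # travel time in days
        deficit = kd * l0 / (ka - kd) * (math.exp(-kd * t) - math.exp(-ka * t))
        lowest = min(lowest, DO_SAT - deficit)
    return lowest

# Monte Carlo: propagate uncertainty in the decay rate, reaeration rate,
# and ultimate BOD load to the probability of violating the DO standard.
trials = 2000
failures = 0
for _ in range(trials):
    kd = random.gauss(0.4, 0.05)   # deoxygenation rate, 1/day (hypothetical)
    ka = random.gauss(0.9, 0.10)   # reaeration rate, 1/day (hypothetical)
    l0 = random.gauss(20.0, 4.0)   # ultimate BOD, mg/L (hypothetical)
    if ka <= kd:
        continue  # skip unphysical draws (counted as non-failures here)
    if min_do(kd, ka, l0) < STANDARD:
        failures += 1
risk = failures / trials  # estimated probability of standard violation
```

The paper's point survives even in this caricature: with plausible input uncertainty, the violation probability can be far higher than a single deterministic run suggests.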

13.
ABSTRACT: Water quality monitoring cannot address every information need through one data collection procedure. This paper discusses the goals and related procedures for designing water quality monitoring programs. The discussion focuses on the broad information needs of those agencies operating water quality networks. These information needs include the ability to assess trends and environmental impacts, determine compliance with objectives or standards, estimate mass transport, and perform general surveillance. Each of these information needs has different data requirements. This paper outlines these goals and discusses factors to consider in developing a monitoring plan on a site by site basis.

14.
ABSTRACT: A first-order uncertainty technique is developed to quantify the relationship between field data collection and a modeling exercise involving both calibration and subsequent verification. A simple statistic (LTOTAL) is used to quantify the total likelihood (probability) of successfully calibrating and verifying the model. Results from the first-order technique are compared with those from a traditional Monte Carlo simulation approach using a simple Streeter-Phelps dissolved oxygen model. The largest single difference is caused by the filtering or removal of unrealistic outcomes within the Monte Carlo framework. The amount of bias inherent in the first-order approach is also a function of the magnitude of input variability and sampling location. The minimum bias of the first-order technique is approximately 20 percent for a case involving relatively large uncertainties. However, the bias is well behaved (consistent) so as to allow for correct decision making regarding the relative efficacy of various sampling strategies. The utility of the first-order technique is demonstrated by linking data collection costs with modeling performance. For a simple and inexpensive project, a wise and informed selection resulted in an LTOTAL value of 86 percent, while an uninformed selection could result in an LTOTAL value of only 55 percent.
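The first-order (FOSM) technique compared above propagates input variances through a linearized model: the output variance is the sum of squared sensitivities times input variances. A minimal sketch, checked against Monte Carlo on a one-step BOD decay (all parameter values hypothetical):

```python
import math
import random
import statistics

random.seed(3)

T = 2.0                      # travel time, days (hypothetical)
K_MEAN, K_SD = 0.4, 0.05     # decay rate, 1/day (hypothetical)
L_MEAN, L_SD = 20.0, 3.0     # initial BOD, mg/L (hypothetical)

def remaining_bod(k, l):
    return l * math.exp(-k * T)

# First-order variance propagation: partial derivatives at the means.
dfdk = -T * L_MEAN * math.exp(-K_MEAN * T)   # sensitivity to k
dfdl = math.exp(-K_MEAN * T)                 # sensitivity to l
fosm_sd = math.sqrt((dfdk * K_SD) ** 2 + (dfdl * L_SD) ** 2)

# Monte Carlo check of the same propagation.
samples = [remaining_bod(random.gauss(K_MEAN, K_SD), random.gauss(L_MEAN, L_SD))
           for _ in range(5000)]
mc_sd = statistics.stdev(samples)
```

For mildly nonlinear models the two standard deviations agree closely; the paper's larger discrepancies arise when the Monte Carlo framework filters out unrealistic outcomes, which the linearized approach cannot do.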

15.
ABSTRACT: The selection of sampling frequencies in order to achieve reasonably small and uniform confidence interval widths about annual sample means or sample geometric means of water quality constituents is suggested as a rational approach to regulatory monitoring network design. Methods are presented for predicting confidence interval widths at specified sampling frequencies while considering both seasonal variation and serial correlation of the quality time series. Deterministic annual cycles are isolated and serial dependence structures of the autoregressive, moving average type are identified through time series analysis of historic water quality records. The methods are applied to records for five quality constituents from a nine-station network in Illinois. Confidence interval widths about annual geometric means are computed over a range of sampling frequencies appropriate in regulatory monitoring. Results are compared with those obtained when a less rigorous approach, ignoring seasonal variation and serial correlation, is used. For a monthly sampling frequency the error created by ignoring both seasonal variation and serial correlation is approximately 8 percent. Finally, a simpler technique for evaluating serial correlation effects based on the assumption of AR(1) type dependence is examined. It is suggested that values of the lag-one autocorrelation parameter ρ1 in the AR(1) model should range from 0.75 to 0.90 for the constituents and region studied.
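The effect of AR(1) serial correlation on confidence interval width can be sketched with the standard large-sample variance inflation factor (1 + ρ1)/(1 - ρ1) for the mean of an AR(1) series; the numbers below are hypothetical, not the Illinois network values.

```python
import math

def ci_halfwidth(sd, n, rho1, z=1.96):
    """Approximate CI half-width for an annual mean of n samples that follow
    an AR(1) process with lag-one correlation rho1. For large n the variance
    of the mean is inflated by (1 + rho1) / (1 - rho1) relative to the
    independent-sample case."""
    inflation = (1 + rho1) / (1 - rho1)
    return z * sd * math.sqrt(inflation / n)

# Hypothetical monthly sampling of a constituent with sd = 0.3 (log units).
independent = ci_halfwidth(0.3, 12, 0.0)   # serial correlation ignored
correlated = ci_halfwidth(0.3, 12, 0.8)    # strong serial correlation
```

With ρ1 = 0.8 the inflation factor is 9, so the half-width triples; ignoring serial correlation at the 0.75-0.90 levels suggested in the abstract therefore badly understates interval widths.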

16.
ABSTRACT: The U.S. Environmental Protection Agency has proposed a sample survey design to answer questions about the ecological condition and trends in condition of U.S. ecological resources. To meet the objectives, the design relies on a probability sample of the resource population of interest (e.g., a random sample of lakes) each year on which measurements are made during an index period. Natural spatial and temporal variability and variability in the sampling process all affect the ability to describe the status of a population and the sensitivity for trend detection. We describe the important components of variance and estimate their magnitude for indicators of trophic condition of lakes to illustrate the process. We also describe models for trend detection and use them to demonstrate the sensitivity of the proposed design to detect trends. If the variance structure that develops during the probability surveys is like that synthesized from available databases and the literature, then the trends in common indicators of trophic condition of the specified magnitude should be detectable within about a decade for Secchi disk transparency (0.5–1 percent/year) and total phosphorus (2–3 percent/year), but not for chlorophyll-a (> 3–4 percent/year), which will take longer.

17.
ABSTRACT: Electronic instruments are increasingly being used to gather water quality data. Quality assurance protocols are needed which provide adequate documentation of the procedures followed in calibration, collection, and validation of electronically acquired data. The level of precision of many data loggers exceeds the technology which is commonly used to make field measurements. Overcoming this problem involves using laboratory quality equipment in the field or enhanced quality control at the time of instrument servicing. Time control procedures for data loggers are needed to allow direct comparisons of data between instruments. Electronic instruments provide a mechanism to study transient events in great detail, but, without time controls, multiple loggers produce data which contain artifacts due to timing errors. Individual sensors deployed with data loggers are subject to different degrees of drift over time. Certain measurements can be measured with defined precision and accuracy for long periods of time, while other sensors are subject to loss of both precision and accuracy with increasing time of use. Adequate quality assurance requires the levels of precision and accuracy be documented, particularly those which vary with increasing time deployment.

18.
The impoundment of the Kootenai River by Libby Dam caused changes in discharge and water quality in the river downstream from Lake Koocanusa. The changes observed downstream were largely attributable to the depth of withdrawal from the reservoir and the reservoir's ability to store and mix various influent water masses. The preimpoundment and postimpoundment time series of discharge and six water quality variables were autocorrelated and exhibited strong seasonality. Intervention analysis, a technique employing Box-Jenkins time series models, was used to quantify the nature and magnitude of the changes in water quality after the construction of Libby Dam. The models were developed with data from June 1967 through February 1981 and were able to satisfactorily forecast riverine conditions from March 1981 through January 1982.

19.
Abstract: Consistency in determining Rosgen stream types was evaluated in 12 streams within the John Day Basin, northeastern Oregon. The Rosgen classification system is commonly used in the western United States and is based on the measurement of five stream attributes: entrenchment ratio, width‐to‐depth ratio, sinuosity, slope, and substrate size. Streams were classified from measurements made by three monitoring groups, with each group fielding multiple crews that conducted two to three independent surveys of each stream. In only four streams (33%) did measurements from all crews in all monitoring groups yield the same stream type. Most differences found among field crews and monitoring groups could be attributed to differences in estimates of the entrenchment ratio. Differences in entrenchment ratio were likely due to small discrepancies in determination of maximum bankfull depth, leading to potentially large differences in determination of Rosgen’s flood‐prone width and consequent values of entrenchment. The result was considerable measurement variability among crews within a monitoring group, and because entrenchment ratio is the first discriminator in the Rosgen classification, differences in the assessment of this value often resulted in different determination of primary stream types. In contrast, we found that consistently evaluated attributes, such as channel slope, rarely resulted in any differences in classification. We also found that the Rosgen method can yield nonunique solutions (multiple channel types), with no clear guidance for resolving these situations, and we found that some assigned stream types did not match the appearance of the evaluated stream. Based on these observations we caution against using Rosgen stream classes for communicating conditions of a single stream or as strata when analyzing many streams, due to the reliance of the Rosgen approach on bankfull estimates, which are inherently uncertain.

20.
ABSTRACT: Models developed in Ohio to predict water quality conditions resulting from various land uses associated with the surface mining of coal are employed to ascertain their transferability to Maryland conditions. Discriminant analysis is employed to assess patterns of association between water quality and land use variables, and predictive models were then constructed with which to quantify changes in stream quality to be expected from the changing mosaic of upstream land uses in the Georges Creek basin of western Maryland. Data collected under procedures specified by the regulatory authority in Maryland may have accounted for the lack of statistically significant results from these models. Suggested changes in the collection of data are made for the coal region of Maryland.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)