Similar Documents (20 results)
1.
Montgomery and Loftis (1987) have listed several situations for which the t-test does not accurately reproduce Type I errors and should therefore be avoided. Characteristics common to water quality data (skewness or other non-normality, the presence of outliers and less-thans, i.e., censored values) also reduce the power of the t-test relative to nonparametric alternatives. Thus, if one is interested in reaching correct decisions when trends or differences exist, and not just when they do not, the t-test should not be considered “robust” (in the sense of being generally applicable) when its assumptions are violated. Further, t-tests assume that differences in means are relevant (that the mean is a good measure of central tendency) and that data groups differ by some additive amount. When all of these assumptions are recognized, and in light of the availability of truly robust and comparatively powerful nonparametric alternatives, we believe there is little applicability of the t-test for detecting trends or differences in water quality variables.
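The nonparametric alternative this abstract favors can be illustrated with the rank-sum (Mann-Whitney U) test. The sketch below is a minimal stdlib implementation of the standard large-sample form, not code from the paper; it ignores ties (continuous water quality data rarely tie exactly).

```python
import math

def mann_whitney_z(x, y):
    """Rank-sum (Mann-Whitney U) statistic with its large-sample normal
    approximation; a nonparametric alternative to the t-test.
    Ties are ignored in this sketch (continuous data rarely tie)."""
    pooled = sorted(x + y)
    rank = {v: i + 1 for i, v in enumerate(pooled)}
    r_x = sum(rank[v] for v in x)          # rank sum of the first sample
    n, m = len(x), len(y)
    u = r_x - n * (n + 1) / 2              # U statistic for sample x
    mu = n * m / 2                         # E[U] under no difference
    sigma = math.sqrt(n * m * (n + m + 1) / 12.0)
    return (u - mu) / sigma                # approximately N(0, 1) under H0

# Toy check: x entirely below y gives the most negative attainable z.
z = mann_whitney_z([1.2, 2.1, 3.3], [4.0, 5.5, 6.1])
```

Because the test uses only ranks, it is unaffected by skewness, and censored "less-thans" can be handled by assigning them the lowest ranks, which the t-test cannot do.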

2.
Water quality monitoring involves sampling a population, water quality, that is changing over time. Sample statistics (e.g., the sample mean) computed from data collected by a monitoring network can be affected by three general factors: (1) random changes due to storms, rainfall, etc.; (2) seasonal changes in temperature, rainfall, etc.; and (3) serial correlation, or duplication of information from sample to sample (closely spaced samples will tend to give similar information).

In general, these effects have been noted, but their specific effects on water quality monitoring network design have not been well defined quantitatively. The purpose of this paper is to examine these effects with a specific data set and draw conclusions relative to sampling frequency determinations in network design. The design criterion adopted for this study is the width of confidence intervals about annual sample geometric means of water quality variables. The data base for the study consisted of a daily record of 5 water quality variables at 9 monitoring stations in Illinois for a period of 1 year.

Three general regions of sampling frequencies were identified: (1) greater than approximately 30 samples per year, where serial correlation plays a dominant role; (2) between approximately 10 and 30 samples per year, where the effects of seasonal variation and serial correlation tended to cancel each other out; and (3) less than approximately 10 samples per year, where seasonal variation plays a dominant role. In region 2, seasonal variation and serial correlation should either both be considered or both be ignored; considering seasonal variation alone introduces more error than ignoring both. These results are network averages (over variables and stations) from one network; results for individual variables may deviate considerably from the average and from those for other networks.

Financial support for this study was provided, in part, by the U.S. Environmental Protection Agency, grant number R805759-01-0.
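The design criterion above, a confidence interval about an annual sample geometric mean, can be sketched as a normal-theory interval on the log scale, back-transformed. This is a generic illustration under lognormal assumptions, not the paper's exact procedure, and it ignores the serial-correlation widening the paper quantifies (z is used in place of Student's t).

```python
import math
import statistics

def geomean_ci(samples, z=1.96):
    """Approximate 95% CI for the geometric mean: a normal-theory interval
    on the mean of the logs, back-transformed. Serial correlation in the
    record would widen the true interval beyond this sketch."""
    logs = [math.log(v) for v in samples]
    mid = statistics.fmean(logs)
    se = statistics.stdev(logs) / math.sqrt(len(logs))
    return math.exp(mid - z * se), math.exp(mid + z * se)

lo, hi = geomean_ci([2.0, 4.0, 8.0, 16.0])   # geometric mean is about 5.66
```

The interval is symmetric on the log scale, so its geometric midpoint equals the sample geometric mean; its width is the quantity the network-design criterion seeks to control.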

3.
ABSTRACT: The selection of sampling frequencies in order to achieve reasonably small and uniform confidence interval widths about annual sample means or sample geometric means of water quality constituents is suggested as a rational approach to regulatory monitoring network design. Methods are presented for predicting confidence interval widths at specified sampling frequencies while considering both seasonal variation and serial correlation of the quality time series. Deterministic annual cycles are isolated, and serial dependence structures of the autoregressive, moving average type are identified through time series analysis of historic water quality records. The methods are applied to records for five quality constituents from a nine-station network in Illinois. Confidence interval widths about annual geometric means are computed over a range of sampling frequencies appropriate in regulatory monitoring. Results are compared with those obtained when a less rigorous approach, ignoring seasonal variation and serial correlation, is used. For a monthly sampling frequency, the error created by ignoring both seasonal variation and serial correlation is approximately 8 percent. Finally, a simpler technique for evaluating serial correlation effects, based on the assumption of AR(1) type dependence, is examined. It is suggested that values of the parameter ρ1 in the AR(1) model should range from 0.75 to 0.90 for the constituents and region studied.
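Under the AR(1) assumption mentioned above, the effect of serial correlation on confidence interval width has a closed form: the variance of a mean of n equally spaced samples is inflated by a factor depending on the lag-one autocorrelation. The sketch below gives that textbook factor as an illustration of the simpler technique, not the paper's full ARMA analysis.

```python
def variance_inflation(n, rho):
    """Variance inflation factor for the mean of n equally spaced samples
    from an AR(1) process with lag-one autocorrelation rho:
    Var(xbar) = (sigma**2 / n) * VIF, so CI width scales as sqrt(VIF)."""
    tail = sum((1 - k / n) * rho ** k for k in range(1, n))
    return 1 + 2 * tail

# With rho in the 0.75-0.90 range reported above, monthly sampling (n = 12)
# inflates the variance of the annual mean severalfold.
vif_monthly = variance_inflation(12, 0.8)
```

For large n the factor approaches (1 + rho) / (1 - rho), which is why increasing sampling frequency beyond roughly 30 samples per year yields diminishing returns in CI width.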

4.
ABSTRACT: A framework for sensitivity and error analysis in mathematical modeling is described and demonstrated. The Lake Eutrophication Analysis Procedure (LEAP) consists of a series of linked models which predict lake water quality conditions as a function of watershed land use, hydrologic variables, and morphometric variables. Specification of input variables as distributions (means and standard errors) and use of first-order error analysis techniques permit estimation of output variable means, standard errors, and confidence ranges. Predicted distributions compare favorably with those estimated using Monte Carlo simulation. The framework is demonstrated by applying it to data from Lake Morey, Vermont. While possible biases exist in the models calibrated for this application, prediction variances, attributed chiefly to model error, are comparable to the observed year-to-year variance in water quality, as measured by spring phosphorus concentration, hypolimnetic oxygen depletion rate, summer chlorophyll-a, and summer transparency in this lake. Use of the framework provides insight into important controlling factors and relationships and identifies the major sources of uncertainty in a given model application.
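The comparison of first-order error analysis against Monte Carlo simulation described above can be sketched generically. The model below (load = concentration × flow) is a hypothetical stand-in, not one of the LEAP models; the propagation formula is the standard delta method for independent inputs.

```python
import random

def first_order_var(f, means, sds, h=1e-5):
    """First-order (delta-method) output variance for independent inputs:
    Var(y) ~ sum_i (df/dx_i)**2 * sd_i**2, derivatives by central difference."""
    var = 0.0
    for i, (m, s) in enumerate(zip(means, sds)):
        up, dn = list(means), list(means)
        up[i], dn[i] = m + h, m - h
        d = (f(up) - f(dn)) / (2 * h)      # numeric partial derivative
        var += (d * s) ** 2
    return var

# Hypothetical two-input model (not from the paper): load = conc * flow.
load = lambda x: x[0] * x[1]
fo_var = first_order_var(load, [10.0, 5.0], [1.0, 0.5])

# Monte Carlo check of the same propagation.
random.seed(0)
mc = [load([random.gauss(10.0, 1.0), random.gauss(5.0, 0.5)])
      for _ in range(20000)]
mean = sum(mc) / len(mc)
mc_var = sum((v - mean) ** 2 for v in mc) / (len(mc) - 1)
```

For mildly nonlinear models with modest input variances the two estimates agree closely, which is the "compare favorably" result the abstract reports; strong nonlinearity widens the gap.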

5.
ABSTRACT: The detection of gradual trends in water quality time series is increasing in importance as concern grows for diffuse sources of pollution such as acid precipitation and agricultural non-point sources. A significant body of literature has arisen dealing with trend detection in water quality variables that exhibit seasonal patterns. Much of the literature has dealt with seasonality of the first moment. However, little has been mentioned about seasonality in the variance, and its effect upon the performance of trend detection techniques. In this paper, eight methods of trend detection that arise from both the statistical literature as well as the water quality literature have been compared by means of a simulation study. Varying degrees of seasonality in both the variances and the means have been introduced into the artificial data, and the performances of these procedures are analyzed. Since the focus is on lake and ground water quality monitoring, quarterly sampling and short to moderate record lengths are examined.

6.
ABSTRACT: The use of nonparametric tests for monotonic trend has flourished in recent years to support routine water quality data analyses. The validity of an assumption of independent, identically distributed error terms is an important concern in selecting the appropriate nonparametric test, as is the presence of missing values. Decision rules are needed for choosing between alternative tests and for deciding whether and how to pre-process data before trend testing. Several data pre-processing procedures in conjunction with the Mann-Kendall tau and the Seasonal Kendall test (with and without serial correlation correction) are evaluated using synthetic time series with generated serial correlation and missing data. A composite test (pre-testing for serial correlation followed by one of two trend tests) is evaluated and was found to perform satisfactorily.
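The Mann-Kendall test evaluated above can be sketched in a few lines. This is the standard textbook form without the tie correction or the serial-correlation adjustment the paper studies; the Seasonal Kendall variant computes S and Var(S) within each season and sums them before forming Z.

```python
import math

def mann_kendall(series):
    """Mann-Kendall S statistic and its normal-approximation Z for a
    monotonic trend (no tie correction in this sketch)."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)     # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

s_stat, z_stat = mann_kendall([3.1, 3.4, 3.3, 3.9, 4.2, 4.1, 4.6])
```

Because S depends only on the signs of pairwise differences, missing values simply drop pairs, which is one reason these tests suit patchy monitoring records.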

7.
A survey sampling approach is presented for estimating upper centiles of aggregate distributions of surface water pesticide measurements obtained from datasets with large sample sizes but variable sampling frequency. It is applied to three atrazine monitoring programs of Community Water Systems (CWS) that used surface water as their drinking water source: the nationwide Safe Drinking Water Act (SDWA) data, the Syngenta Voluntary Monitoring Program (VMP), and the Atrazine Monitoring Program (AMP). The VMP/AMP CWS were selected on the basis of atrazine monitoring history (CWS having at least one annual average concentration from SDWA ≥ 1.6 ppb atrazine since 1997 in the AMP).

Estimates of the raw water 95th, 99th, and 99.9th centile atrazine concentrations for the VMP/AMP CWS are 4.82, 11.85, and 34.00 ppb, respectively. The corresponding estimates are lower for the finished drinking water samples: 2.75, 7.94, and 22.66 ppb, respectively. Finished water centile estimates for the VMP/AMP CWS using only the SDWA data for these sites are consistent with these results. Estimates are also provided for the April through July period and for CWS grouped by surface water source type (static, flowing, or mixed). Requisite sample sizes are determined using statistical tolerance limits, relative standard error, and the Woodruff interval sample size criterion. These analyses provide 99.9% confidence that the existing data include the 99.9th centile atrazine concentration for CWS raw and finished water in the Midwest atrazine high-use areas and in the nationwide SDWA dataset. The general validity of the approach is established by a simulation showing estimates to be close to target quantities for weights based on sampling probabilities or on time intervals between samples. Recommendations are given for suitable effective sample sizes to reliably determine interval estimates.
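Estimating centiles from data with variable sampling frequency requires weighting each observation, e.g., by the time interval it represents, as in the simulation described above. The sketch below is one simple weighted-quantile rule under that assumption; the paper's survey-sampling estimator and its Woodruff intervals are more elaborate.

```python
def weighted_quantile(values, weights, q):
    """Weighted upper-quantile estimate: the smallest value whose
    cumulative normalized weight reaches q. Weights proportional to the
    time interval each sample represents (an assumption of this sketch)
    correct for variable sampling frequency."""
    pairs = sorted(zip(values, weights))
    total = sum(w for _, w in pairs)
    cum = 0.0
    for v, w in pairs:
        cum += w
        if cum >= q * total:
            return v
    return pairs[-1][0]

# A sample taken after a long gap (weight 8) dominates the upper tail.
p99 = weighted_quantile([0.5, 1.2, 3.4], [1.0, 1.0, 8.0], 0.99)
```

Unweighted quantiles would over-represent periods of intensive sampling, typically the high-concentration spring runoff, biasing upper-centile estimates.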

8.
Intervention analysis is a relatively new branch of time series analysis. The power of this technique, which gives the probability that changes in mean level can be distinguished from natural data variability, is quite sensitive to the way the data are collected. The principal independent variables influenced by the data collection design are overall sample size, sampling frequency, and the relative length of record before the occurrence of the event (intervention) that is postulated to have caused a change in mean process level.

For three of the four models investigated, data should be collected so that the post-intervention record is substantially longer than the pre-intervention record. This conflicts with the intuitive approach, which would be to collect equal amounts of data before and after the intervention. The threshold (minimum) level of change that can be detected is quite high unless sample sizes of at least 50, and preferably 100, are available; this minimum level depends on the complexity of the model required to describe the response of the process mean to the intervention. More complex models tend to require larger sample sizes for the same threshold detectable change level.

Uniformity of sampling frequency is a key consideration. Environmental data collection programs have not historically been oriented toward data analysis using time series techniques, thus eliminating a potentially powerful tool from many environmental assessment applications.

9.
Profiles of retained colloids in porous media have frequently been observed to be hyper-exponential or non-monotonic with transport depth under unfavorable attachment conditions, whereas filtration theory predicts an exponential profile. In this work we present a stochastic model for colloid transport and deposition that allows various hypotheses for such deviations to be tested. The model is based on the conventional advective dispersion equation that accounts for first-order kinetic deposition and release of colloids. One or two stochastic parameters can be considered in this model, including the deposition coefficient, the release coefficient, and the average pore water velocity. In the case of one stochastic parameter, the probability density function (PDF) is characterized using log-normal, bimodal log-normal, or a simple two species/region formulation. When two stochastic parameters are considered, then a joint log-normal PDF is employed. Simulation results indicated that variations in the deposition coefficient and the average pore water velocity can both produce hyper-exponential deposition profiles. Bimodal formulations for the PDF were also able to produce hyper-exponential profiles, but with much lower variances in the deposition coefficient. The shape of the deposition profile was found to be very sensitive to the correlation of deposition and release coefficients, and to the correlation of pore water velocity and deposition coefficient. Application of the developed stochastic model to a particular set of colloid transport and deposition data indicated that chemical heterogeneity of the colloid population could not fully explain the observed behavior. Alternative interpretations were therefore proposed based on variability of the pore size and the water velocity distributions.  相似文献
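The mechanism by which a stochastic deposition coefficient produces hyper-exponential profiles can be illustrated without the full advection-dispersion model: if each "stream tube" decays exponentially with its own lognormal rate k, the population-averaged profile is a mixture whose log-slope is steepest at the inlet and relaxes with depth. This sketch shows only that mixture effect, under assumed units, not the paper's model.

```python
import math
import random

random.seed(0)
v = 1.0   # pore water velocity (arbitrary units for this sketch)
# Lognormal deposition coefficients, one per "stream tube".
ks = [math.exp(random.gauss(0.0, 1.0)) for _ in range(5000)]

def profile(x):
    """Retained-profile proxy at depth x: average of exp(-k*x/v) over the
    lognormal k population. A single k gives a straight line in log space;
    the mixture is hyper-exponential (steepest near the inlet)."""
    return sum(math.exp(-k * x / v) for k in ks) / len(ks)

# Local log-slopes: the mixture's slope relaxes toward zero with depth,
# because fast-depositing colloids are removed first.
slope_near = math.log(profile(0.5)) - math.log(profile(0.0))
slope_far = math.log(profile(3.0)) - math.log(profile(2.5))
```

The same averaging argument applies to a stochastic pore water velocity, which is why both parameters can reproduce the observed hyper-exponential shape.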

10.
A common assumption in flood frequency analysis is that annual peak flows are independent events. This study was undertaken to investigate the validity of this assumption for Pennsylvania streams by statistically analyzing the dependence between annual peak flows and to determine whether basin carryover effects relate to the degree of dependence. Five tests of dependence (the autocorrelation test, the median crossing test, the turning points test, the rank difference test, and the Spearman rank order serial correlation coefficient test) were applied to the series of annual peak flows for 57 streams. Of the 57 streams analyzed, only two exhibited signs of dependence by at least two of the tests performed, and the baseflow component of annual peak flows was found to be unrelated to the degree of dependence exhibited between annual peak flows. It was concluded that the assumption of independence of annual peak flows is valid in flood frequency analysis for Pennsylvania streams.
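One of the five dependence tests named above, the median crossing test, is simple enough to sketch: count sign changes of the series about its median; under independence the count is approximately normal. This is a generic textbook form (values equal to the median are dropped), not the paper's exact implementation.

```python
import math
import statistics

def median_crossing_z(series):
    """Median crossing test: count sign changes of (x - median). Under
    independence, the number of crossings among n retained points is
    approximately Normal((n - 1) / 2, (n - 1) / 4)."""
    med = statistics.median(series)
    signs = [1 if v > med else -1 for v in series if v != med]
    m = sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    n = len(signs)
    return (m - (n - 1) / 2) / math.sqrt((n - 1) / 4)
```

Too few crossings (negative z) suggest persistence, e.g., wet and dry spells carrying over between years; too many suggest alternation. Applying several such tests, as the study does, guards against any single test's blind spots.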

11.
Many ecological studies use popular variables such as percentage cover of the vegetation to assess the effects of different treatments, environmental management, or conditions. Starting with sparse vegetation, the growth in percentage cover is likely to be sigmoidal and, unless repetitive cover is measured, will have an upper asymptote of 100%. If the initial cover values under different treatments or management regimes are not equal, then the different growth rates due to the unequal starting values will be confounded with the different treatments.

A family of suitable growth curve models can be fitted to the data arising under different treatments, so that one or more of the interpretable fitted parameters of the model can be considered as "responses" to the different treatments. These responses can be analysed to compare the effects of different treatments or environmental conditions, using either parametric or non-parametric methods. The suggested approach is illustrated by application to a particular data set from the literature. The implications for the design of field studies and for the analysis of other vegetational variables are discussed.
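A minimal member of such a growth-curve family is the logistic with a fixed 100% asymptote, which becomes linear on the logit scale and can be fitted by ordinary regression. This sketch assumes that simple form (the paper's family may be richer) and treats the fitted rate r as the "response" to be compared across treatments.

```python
import math

def fit_logistic_cover(times, cover_pct):
    """Fit p(t) = 100 / (1 + exp(-r*(t - t0))) by least squares on the
    logit scale, where the model is linear: logit(p/100) = r*t - r*t0.
    Requires 0 < p < 100. r (growth rate) and t0 (time of 50% cover)
    are the interpretable parameters."""
    ys = [math.log(p / (100.0 - p)) for p in cover_pct]
    n = len(times)
    mt = sum(times) / n
    my = sum(ys) / n
    r = (sum((t - mt) * (y - my) for t, y in zip(times, ys))
         / sum((t - mt) ** 2 for t in times))
    t0 = mt - my / r
    return r, t0
```

Comparing fitted r values across treatments separates the treatment effect from artifacts of unequal starting cover, which raw percentage differences confound.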

12.
Student scientists have analyzed groundwater used for drinking water in rural areas to understand groundwater quality, as part of a greater effort to understand risks to drinking water. The data produced by middle- and high-school students have not been accepted by experts because of concerns about method and student accuracy. We assessed the inherent errors associated with method accuracy, student precision, and sample variability to establish bounds for attainable trueness in water analyses. Analytical test kits and probes were evaluated for the determination of pH, conductivity, chloride, hardness, iron, total soluble metals, and nitrate. In terms of precision, all methods met or exceeded design specifications. Method trueness was variable and in general ranged from good to poor depending on the method. A gage reproducibility and repeatability analysis of the instrumental methods (pH and conductivity) partitioned the variance into student error (12-46%), instrumental error (8-21%), and random error (45-68%). Overall, student-generated data met some of the quality objectives consistent with the method limitations. Some methods exhibited a systematic bias, and data adjustment may be necessary. Given good management of the student analyst process, it is possible to make precise and accurate measurements consistent with the methods' specifications.

13.
ABSTRACT: With the advent of standards and criteria for water quality variables, there has been increasing concern about the changes of these variables over time. Thus, sound statistical methods for determining the presence or absence of trends are needed. A Trend Detection Method is presented that provides: 1) Hypothesis Formulation - statement of the problem to be tested; 2) Data Preparation - selection of water quality variable and data; 3) Data Analysis - exploratory data analysis techniques; and 4) Statistical Tests - tests for detecting trends. The method is utilized in a stepwise fashion and is presented in a nonstatistical manner to allow use by those not well versed in statistical theory. While the emphasis herein is on lakes, the method may be adapted easily to other water bodies.

14.
ABSTRACT: Baseflow, or water that enters a stream from slowly varying sources such as ground water, can be critical to humans and ecosystems. We evaluate a simple method for estimating baseflow parameters at ungaged sites. The method uses one or more baseflow discharge measurements at the ungaged site and long-term streamflow data from a nearby gaged site. A given baseflow parameter, such as the median, is estimated as the product of the corresponding gage site parameter and the geometric mean of the ratios of the measured baseflow discharges to the concurrent discharges at the gage site. If baseflows at the gaged and ungaged sites have a bivariate lognormal distribution with high correlation and nearly equal log variances, the estimated baseflow parameters are very accurate. We tested the proposed method using long-term streamflow data from two watershed pairs in the Driftless Area of southwestern Wisconsin. For one watershed pair the theoretical assumptions are well met; for the other the log variances are substantially different. In the first case, the method performs well for estimating both annual and long-term baseflow parameters. In the second, the method performs remarkably well for estimating annual mean and annual median baseflow discharge, but less well for estimating the annual lower decile and the long-term mean, median, and lower decile. In general, the use of four measurements in a year is not substantially better than the use of two.
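The estimator described above has a direct one-line form: the gaged-site parameter scaled by the geometric mean of the measurement ratios. The sketch below implements that stated formula; the numbers in the example are invented for illustration.

```python
import math

def ungaged_baseflow(gage_param, ungaged_meas, gage_concurrent):
    """Estimate a baseflow parameter (e.g., the median) at an ungaged site
    as the gaged-site parameter times the geometric mean of the ratios of
    the ungaged baseflow measurements to the concurrent gaged discharges."""
    ratios = [u / g for u, g in zip(ungaged_meas, gage_concurrent)]
    gm = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
    return gage_param * gm

# Two baseflow measurements at the ungaged site, paired with the gage
# site's concurrent discharges (hypothetical values).
est_median = ungaged_baseflow(10.0, [2.0, 4.0], [4.0, 4.0])
```

The geometric mean is the natural average here because, under the bivariate lognormal assumption, the ratios are themselves lognormal and multiplicative errors cancel on the log scale.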

15.
In Finland, the current water conservation policy sets equal incentives for water conservation regardless of environmental condition. Before any policy reform, it is vital to investigate the tendency of landowners to adopt water conservation measures. In this study, we were interested in examining adoption where the soil quality implies a high leaching risk and where the water quality is already poor. By combining survey data with GIS data, we analysed the effect of farm and farmer characteristics and attitudes on adoption. Our probit models indicated that financial variables were the key determinants of adoption for active farmers, whereas for passive owners adoption was also explained by attitudes. In contrast to our expectations, adoption in areas under risk was only weakly supported by our estimates. Environmental awareness, provided it increases with risk, is not strong enough to motivate adoption. Targeted agri-environmental measures, even though costly, cannot be avoided, and spatially tailored measures can attract adopters in hotspot areas.

16.
A multivariate statistical method for analyzing spatial patterns of water quality in Georgia and Kansas was tested using data in the US Environmental Protection Agency's STORET data system. Water quality data for Georgia and Kansas were organized by watersheds. We evaluated three questions: (a) can distinctive regional water quality patterns be detected and predicted using only a few water quality variables, (b) are regional water quality patterns correlated with terrestrial biotic regions, and (c) are regional water quality patterns correlated with fish distributions? Using existing data, this method can distinguish regions with water quality very different from the average conditions (as in Georgia), but it does not discriminate well between regions that do not have diverse water quality conditions (as in Kansas). Data that are spatially and temporally adequate for representing large regions and for multivariate statistical analysis are available for only a few common water quality parameters. Regional climate, lithology, and biotic regimes all have the potential to affect water quality, and terrestrial biotic regions and fish distributions do compare with regional water quality patterns, especially in a state like Georgia, where watershed characteristics are diverse. Thus, identifiable relationships between watershed characteristics and water quality should allow the development of an integrated land-aquatic classification system that would be a valuable tool for resource management. Because geographical distributions of species may be limited by zoogeographic and environmental factors, the recognition of patterns in fish distributions that correlate with regional water quality patterns could influence management strategies and aid regional assessments.

17.
This paper investigates long memory (or long-range dependence) in price returns and volatilities of energy futures contracts with different maturities. Based on a modified rescaled range analysis and three local Whittle methods, the results from rolling sample test suggest that the returns showed little or no long-range dependence over time but the volatilities displayed significant time-varying long-range dependence. Our evidence shows that some extreme events could cause long memory in returns and volatilities, leading to market inefficiency. Employing multiscale analysis, we find that the returns displayed no long-range dependence for any of the chosen time scales. Significant long-range dependence only existed in volatilities for daily time scales but not for monthly or yearly time scales.
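The rescaled range (R/S) statistic at the core of this analysis can be sketched in its classic form. The paper uses a modified version with a short-range-correlation correction; the sketch below is the uncorrected statistic, from which the Hurst exponent is estimated as the slope of log(R/S) against log(n) across subseries lengths.

```python
import math

def rescaled_range(series):
    """Classic R/S statistic: range of the cumulative mean-adjusted sums
    divided by the (population) standard deviation. Persistent series
    wander far from the mean and score high; anti-persistent ones score low."""
    n = len(series)
    mean = sum(series) / n
    cum, run = 0.0, []
    for value in series:
        cum += value - mean
        run.append(cum)          # cumulative departure from the mean
    r = max(run) - min(run)      # range of the cumulative departures
    sd = math.sqrt(sum((value - mean) ** 2 for value in series) / n)
    return r / sd
```

For independent data E[R/S] grows like n^0.5 (Hurst exponent H = 0.5); H above 0.5 in volatilities but not returns is the pattern of long memory the abstract reports.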

18.
Dilemmas of natural resources governance have been a central concern for scholars, policy makers, and users. Major debates occur over the implications of property rights for common resources management. After the Mexican Revolution (1910–1917), land was distributed mainly as ejidos, conceived as a hereditary but unalienable collective form of property. In 1992, a new Agrarian Law was decreed that allows individual ownership by removing various restrictions over the transfer of land. Scholars have examined the reform mainly focusing on land-tenure changes and environmental fragmentation. This study examines how the new ownership regime is affecting collective decision-making in ejidos located in a tropical dry forest (TDF) ecosystem. Information on decision-making processes before and after the 1992 reform was gathered through 52 interviews conducted in four ejidos selected along a gradient including agricultural, cattle-raising, and TDF use. The new individualized land property system reduced collective action in ejidos but did not trigger it. Collective action responses to the 1992 reform were buffered by the self-organization each ejido already had. Heterogeneous users who shared a short history and showed little understanding of TDF and low dependence on its resources seemed to explain why ejidos have not been able to share a sense of community that would shape the construction of institutions for the collective management of forest resources. However, when a resource is scarce and highly valuable, such as water, the same users showed capacities for undertaking costly co-operative activities.

19.
Using the lens of Lefebvre's spatial trialectics, we assess the utility of photo-elicited interviewing for environmental justice, recognising that a view to social spatial analysis is essential to engaging with the historical processes of exclusion and discrimination that are crucial to explaining why unequal distributions of environmental injustice are systemic and not random. Drawing on insights from our own photo-elicited interviewing-based work in the neighbourhood called Parkdale in Toronto, we make two main recommendations for future environmental justice work using photo-elicited interviewing. First, researchers must be open to a broader epistemology, one that draws on a more spatially nuanced and temporally evolving knowledge of the full range of environmental influences on communities. Second, in order to arrive at a more robust critical analysis of social space, researchers should complement photo-elicited interviewing with historical research about the relevant communities and include participants from other comparative communities.

20.
What size sample is sufficient for spatially sampling ambient groundwater quality? Water quality data are only as spatially accurate as the geographic sampling strategies used to collect them. This research used sequential sampling and regression analysis to evaluate groundwater quality spatial sampling policy changes proposed by California's Department of Water Resources. Iterative or sequential sampling of a hypothetical groundwater basin's water quality produced data sets with sample sizes ranging from 2.8% to 95% coverage of the available point sample sites. Contour maps based on these sample data sets were compared to an original (control) mapped hypothetical data set, to determine the point at which map information content and pattern portrayal are no longer improved by increasing sample sizes. Comparing series of contour maps of groundwater quality concentration is a common means of evaluating the geographic extent of groundwater quality change. Comparisons included visual inspection of contour maps and statistical tests on digital versions of these map files, including correlation and regression products. This research demonstrated that, down to about 15% sample site coverage, there is no difference between contour maps produced from the different sampling strategies and the contour map of the original data set.
