Similar Articles
Found 20 similar articles.
1.
Intervention analysis is a relatively new branch of time series analysis. The power of this technique, which gives the probability that changes in mean level can be distinguished from natural data variability, is quite sensitive to the way the data are collected. The principal independent variables influenced by the data collection design are overall sample size, sampling frequency, and the relative length of record before the occurrence of the event (intervention) that is postulated to have caused a change in mean process level. For three of the four models investigated, data should be collected so that the post-intervention record is substantially longer than the pre-intervention record. This is in conflict with the intuitive approach, which would be to collect equal amounts of data before and after the intervention. The threshold (minimum) level of change that can be detected is quite high unless sample sizes of at least 50 and preferably 100 are available; this minimum level is dependent on the complexity of the model required to describe the response of the process mean to the intervention. More complex models tend to require larger sample sizes for the same threshold detectable change level. Uniformity of sampling frequency is a key consideration. Environmental data collection programs have not historically been oriented toward data analysis using time series techniques, thus eliminating a potentially powerful tool from use in many environmental assessment applications.
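
For reference, the simplest intervention model commonly used in this setting is a step change superimposed on a correlated noise process. This is standard Box-Tiao textbook notation, not the paper's own formulation; the abstract does not describe the four specific models investigated.

```latex
% Step-change intervention model (standard Box-Tiao form; illustrative only)
Y_t = \omega\, S_t^{(T)} + N_t, \qquad
S_t^{(T)} =
\begin{cases}
0, & t < T \ \text{(pre-intervention)}\\
1, & t \ge T \ \text{(post-intervention)}
\end{cases}
```

Here N_t is an ARMA/ARIMA noise process describing natural variability, and the detection question is whether the estimated step ω can be distinguished from zero; the sample-size and record-balance findings above concern the power of that test.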

2.
Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. Development of long-term monitoring programs demands survey designs that achieve statistical rigor and are efficient, yet remain flexible in the face of inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design.
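
A minimal sketch of the general idea behind reversed, randomized quadrant-recursive orderings is given below. It illustrates spatially balanced sampling on a small square raster only; the grid size, the per-level (rather than per-node) randomization, and the helper names are assumptions for illustration, and this is not the authors' GIS implementation.

```python
# Illustrative sketch of spatially balanced sampling by reversed, randomized
# quadrant-recursive addressing (the general idea behind RRQRR-style designs;
# simplified: quadrant labels are permuted per level, not per node).
import random

def quadrant_address(row, col, levels):
    """Hierarchical quadrant address (coarsest digit first) of a cell
    in a 2**levels x 2**levels grid."""
    digits = []
    size = 2 ** levels
    for _ in range(levels):
        size //= 2
        digits.append((row // size) * 2 + (col // size))  # quadrant 0-3 at this level
        row %= size
        col %= size
    return digits

def rrqrr_order(levels, seed=0):
    """Order grid cells so that any prefix of the list is spatially well spread."""
    rng = random.Random(seed)
    relabel = [rng.sample(range(4), 4) for _ in range(levels)]  # randomize quadrant labels
    n = 2 ** levels
    keyed = []
    for r in range(n):
        for c in range(n):
            digits = [relabel[i][d] for i, d in enumerate(quadrant_address(r, c, levels))]
            # 'Reversed': weight the finest-level digit most heavily, so consecutive
            # ranks jump between coarse quadrants instead of staying neighbors.
            key = sum(d * 4 ** i for i, d in enumerate(digits))
            keyed.append((key, (r, c)))
    keyed.sort()
    return [cell for _, cell in keyed]

# The first k cells of the ordering form a spatially balanced sample of size k.
sample = rrqrr_order(levels=4, seed=42)[:25]
print(sample)
```

Because any prefix of such an ordering stays well spread, inaccessible sites can be dropped and replaced by the next cells in the list without destroying spatial balance, which is one reason ordered spatially balanced designs remain flexible under field constraints.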

3.
This paper presents a dynamic framework for environmental assessment when the system under study is undergoing successional change. Successional differences between sites are essentially confounding factors when one wishes to detect a difference due to a treatment. We show how successional changes over the study period, or those resulting from differences in study-site plot ages, can be factored out by developing a null model of expected behavior over time. The null model for change in state with time is characterized in terms of a stochastic envelope around a nominal trajectory. Specific tests for the detection of trends associated with succession are described and illustrated on example data. It is concluded that the methods developed work particularly well for laboratory microcosm data.
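
The sketch below shows one minimal way to realize the "stochastic envelope around a nominal trajectory" idea, assuming the nominal trajectory and its variability are estimated empirically from replicate control plots; the paper's null model may be parametric, and the data here are synthetic.

```python
# Minimal sketch of a stochastic envelope: nominal trajectory and spread are
# estimated from replicate control plots (an assumption for illustration).
import numpy as np

def successional_envelope(control, z=2.0):
    """control: array (n_plots, n_times) of a state variable on control plots.
    Returns the nominal trajectory and a +/- z*sd envelope at each time."""
    nominal = control.mean(axis=0)
    spread = control.std(axis=0, ddof=1)
    return nominal, nominal - z * spread, nominal + z * spread

def outside_envelope(treated, lower, upper):
    """Flag time points where a treated plot's trajectory leaves the envelope."""
    return (treated < lower) | (treated > upper)

# Synthetic example: 8 control plots, 10 sampling times, plus one treated plot.
rng = np.random.default_rng(1)
times = np.arange(10)
control = 5 + 0.8 * times + rng.normal(0, 0.5, size=(8, 10))   # succession + noise
treated = 5 + 0.8 * times + rng.normal(0, 0.5, size=10)
treated[6:] += 3.0                                              # hypothetical treatment effect
nominal, lo, hi = successional_envelope(control)
print(outside_envelope(treated, lo, hi))
```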

4.
ABSTRACT: A linear filter (Kalman filter) technique was used with a streamflow-concentration model to minimize surface water quality sampling frequencies when determining annual mean solute concentrations with a predetermined allowable error. The Kalman filter technique used the stream discharge interval as a replacement for the more commonly used time interval. Using filter computations, the measurement error variance was minimized within the sample size constraints. The Kalman filter application proposed here is applicable only under several conditions, including: monitoring is solely to estimate annual mean concentration; discharge measurement errors are negligible; the streamflow-concentration model is valid; and monthly samples reflect the total variance of the solute in question.
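
For orientation, a generic scalar Kalman filter cycle is sketched below to show the error-variance bookkeeping that this kind of design exploits. It does not reproduce the paper's streamflow-concentration model or its discharge-interval indexing; the state model, noise variances, and data are made up.

```python
# Generic scalar Kalman filter sketch (random-walk state model assumed).
def kalman_step(x_est, p_est, z, q, r):
    """One predict/update cycle.
    x_est, p_est: prior state estimate and its error variance
    z: new concentration measurement, q: process noise var, r: measurement noise var"""
    # Predict
    x_pred = x_est              # persistence (random-walk) model
    p_pred = p_est + q
    # Update
    k = p_pred / (p_pred + r)   # Kalman gain
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred  # posterior error variance
    return x_new, p_new

# The posterior variance shrinks with each added sample, so one can ask how many
# samples are needed before it falls below a target (allowable) error.
x, p = 10.0, 4.0
for z in [9.5, 10.8, 10.1, 9.9]:
    x, p = kalman_step(x, p, z, q=0.1, r=1.0)
    print(round(x, 2), round(p, 3))
```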

5.
What size sample is sufficient for spatially sampling ambient groundwater quality? Water quality data are only as spatially accurate as the geographic sampling strategies used to collect them. This research used sequential sampling and regression analysis to evaluate groundwater quality spatial sampling policy changes proposed by California's Department of Water Resources. Iterative or sequential sampling of a hypothetical groundwater basin's water quality produced data sets from sample sizes ranging from 2.8% to 95% coverage of available point sample sites. Contour maps based on these sample data sets were compared to an original (control) mapped hypothetical data set to determine the point at which map information content and pattern portrayal are no longer improved by increasing sample sizes. Comparing series of contour maps of groundwater quality concentration is a common means of evaluating the geographic extent of groundwater quality change. Comparisons included visual inspection of contour maps and statistical tests on digital versions of these map files, including correlation and regression products. This research demonstrated that, down to about 15% sample site coverage, there is no difference between contour maps produced from the different sampling strategies and the contour map of the original data set.

6.
Environmental policies in the developing countries are the product of circumstances quite different from those found in advanced nations. As a result, such policies have thus far been rather permissive. There is, however, considerable variance among developing countries in this respect, as well as substantial change over time. Not much evidence has been uncovered pointing to a migration of industry and mining to ‘pollution havens’ in developing countries; the reverse in fact seems to be the case. Nevertheless, it is likely that environmental policies will remain less stringent in developing than in developed countries for the foreseeable future. Indeed, the gap may well widen substantially.

7.
Wildlife managers have little or no control over climate change. However, they may be able to alleviate potential adverse impacts of future climate change by adaptively managing wildlife for climate change. In particular, wildlife managers can evaluate the efficacy of compensatory management actions (CMAs) in alleviating potential adverse impacts of future climate change on wildlife species using probability-based or fuzzy decision rules. Application of probability-based decision rules requires managers to specify certain probabilities, which is not possible when they are uncertain about the relationships between observed and true ecological conditions for a species. Under such uncertainty, the efficacy of CMAs can be evaluated and the best CMA selected using fuzzy decision rules. The latter are described and demonstrated using three constructed cases that assume: (1) a single ecological indicator (e.g., population size for a species) in a single time period; (2) multiple ecological indicators for a species in a single time period; and (3) multiple ecological conditions for a species in multiple time periods.
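
As a hedged illustration of what a fuzzy decision rule for ranking CMAs can look like, the sketch below uses hypothetical indicators, membership functions, and scores; none of these come from the paper.

```python
# Hypothetical fuzzy decision rule for ranking compensatory management actions.
def mu_adequate_population(n):
    """Degree (0-1) to which a predicted population size counts as 'adequate'."""
    if n <= 200:
        return 0.0
    if n >= 1000:
        return 1.0
    return (n - 200) / 800.0

def mu_adequate_habitat(fraction_suitable):
    """Degree to which remaining suitable habitat counts as 'adequate'."""
    return max(0.0, min(1.0, fraction_suitable / 0.6))

def cma_score(pop_size, habitat_fraction):
    # Conjunctive rule: a CMA is only as good as its weakest indicator (min operator).
    return min(mu_adequate_population(pop_size), mu_adequate_habitat(habitat_fraction))

# Predicted outcomes under three candidate CMAs (made-up numbers); pick the best.
cmas = {"translocation": (650, 0.35),
        "habitat_restoration": (480, 0.55),
        "do_nothing": (300, 0.25)}
best = max(cmas, key=lambda k: cma_score(*cmas[k]))
print(best, {k: round(cma_score(*v), 2) for k, v in cmas.items()})
```

The min operator is one common aggregation choice; the extension to multiple time periods amounts to aggregating such scores across periods as well.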

8.
Sequential sampling is a method for monitoring benthic macroinvertebrates that can significantly reduce the number of samples required to reach a decision, and consequently, decrease the cost of benthic sampling in environmental impact assessments. Rather than depending on a fixed number of samples, this analysis cumulatively compares measured parameter values (for example, density, community diversity) from individual samples with thresholds that are based on specified degrees of precision. In addition to reducing sample size, a monitoring program based on sequential sampling can provide clear-cut decisions as to whether a priori-defined changes in the measured parameter(s) have or have not occurred. As examples, sequential sampling programs have been developed to evaluate the impact of geothermal energy development on benthic macroinvertebrate diversity at The Geysers, California, and for monitoring the impact of crude oil contamination on chironomid midge [Cricotopus bicinctus (Meigen) and C. mackenziensis Oliver] population densities in the Trail River, Northwest Territories, Canada.
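
Sequential sampling plans of this kind are commonly built on Wald's sequential probability ratio test (SPRT). The sketch below shows the generic normal-mean version with made-up hypotheses, error rates, and data; it is not the authors' specific plan.

```python
# Generic SPRT for a normal mean with known sigma (illustrative values only).
import math

def sprt_decision(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Return 'accept H0', 'accept H1', or 'continue sampling'."""
    a = math.log(beta / (1 - alpha))        # lower log-likelihood-ratio boundary
    b = math.log((1 - beta) / alpha)        # upper boundary
    llr = sum((mu1 - mu0) / sigma ** 2 * (x - (mu0 + mu1) / 2) for x in samples)
    if llr <= a:
        return "accept H0 (no a priori-defined change)"
    if llr >= b:
        return "accept H1 (change has occurred)"
    return "continue sampling"

# Example: densities (per sample unit) accumulated one sample at a time;
# H0: mean density 100, H1: mean density 60 (a defined impact).
densities = [85, 70, 66, 58, 52, 49]
for i in range(1, len(densities) + 1):
    decision = sprt_decision(densities[:i], mu0=100, mu1=60, sigma=20)
    print(i, decision)
    if decision != "continue sampling":
        break
```

Sampling stops as soon as either boundary is crossed, which is how sequential designs cut sample size relative to fixed-n designs.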

9.
The planning and execution of water quality management programs require careful collection and analysis of data, coupled with a systematic review and analysis of programmatic success. The environmental audit is a tool which facilitates improved water quality planning and management. This article demonstrates the utility of the environmental audit by examining portions of a comprehensive review of the water quality management program for the state of Idaho. The audit is a tool which forces careful design of a sampling program before data are collected. In the audit approach, program objectives are clearly stated prior to initiation of sampling. Stated objectives are also evaluated regularly to identify tension points, that is, conflicts between expectations and reality. In the example taken from Idaho, a management review team followed a directive to redesign the water quality monitoring program. We present a summary of the redesign as proposed by that team to illustrate the results of a typical review of monitoring programs. That summary is followed by an example of how the proposed program would differ if the audit approach had been used. The two approaches offered both coincident and conflicting recommendations. Management review team and audit recommendations for lake sampling programs were similar even though a different process was used to develop the recommendations. The most striking contrast between the two results lies in the review team's approach to the problem. The directives followed, and the team's responses, concentrate on tools, such as increasing biological monitoring or reliance on monthly BWMP stations. In contrast, the audit results stress addressing management questions for which clear objectives have been stated, depending on specific tools only as needed to meet stated objectives. Although the audit does integrate externalities in its structure, it is little affected by economic or political influences. A major strength of the audit approach is its ability to provide defensible data for management decision making.

10.
Abstract: Streamlined sampling procedures must be used to achieve a sufficient sample size with limited resources in studies undertaken to evaluate habitat status and potential management-related habitat degradation at a regional scale. At the same time, these sampling procedures must achieve sufficient precision to answer science and policy-relevant questions with an acceptable and statistically quantifiable level of uncertainty. In this paper, we examine precision and sources of error in streambed substrate characterization using data from the Environmental Monitoring and Assessment Program (EMAP) of the U.S. Environmental Protection Agency, which uses a modified “pebble count” method in which particle sizes are visually estimated rather than measured. While the coarse (2φ) size classes used in EMAP have little effect on the precision of estimated geometric mean (Dgm) or median (D50) particle diameter, variable classification bias among observers can contribute as much as 0.3φ, or about 15-20%, to the root-mean-square error (RMSE) of Dgm or D50 estimates. Dgm and D50 estimates based on EMAP data are nearly equal when fine sediments (<2 mm) are excluded, but otherwise can differ by up to a factor of 2 or more, with Dgm < D50 for gravel-bed streams. The RMSE of reach-scale particle size estimates based on visually classified particle count data from EMAP surveys, including variability associated with reoccupying unmarked sample reaches during revisits, is up to five to seven times higher than that reported for traditional measured pebble counts by multiple observers at a plot scale. Nonetheless, a variance partitioning analysis shows that the ratio of among-site to revisit variance for several EMAP substrate metrics exceeds 8 for many potential regions of interest, suggesting that the data have adequate precision to be useful in regional assessments of channel morphology, habitat quality, or ecological condition.
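
For readers unfamiliar with the phi scale used above, the standard sedimentological definitions are sketched below; these are general conventions assumed to match the paper's usage, not notation taken from it.

```latex
% Standard phi-scale relations (general definitions, assumed to match the paper's usage)
D\,[\mathrm{mm}] = 2^{-\phi}, \qquad \phi = -\log_2 \bigl(D\,[\mathrm{mm}]\bigr), \qquad
D_{gm} = 2^{-\bar{\phi}}, \quad \bar{\phi} = \frac{1}{n}\sum_{i=1}^{n}\phi_i ,
```

where D50 is the diameter at the 50th percentile of the cumulative size distribution. Because the geometric mean in millimetres corresponds to the arithmetic mean in phi units, a classification bias of 0.3φ corresponds to a multiplicative factor of about 2^0.3 ≈ 1.2 in diameter, which is how a phi-scale error maps onto the percentage error quoted above.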

11.
ABSTRACT: Among the many concerns associated with global climate change, the potential effects on water resources are frequently cited as the most worrisome. In contrast, those who manage water resources do not rate climatic change among their top planning and operational concerns. The difference in these views can be associated with how water managers operate their systems and the types of stresses, and the operative time horizons, that affect the Nation's water resources infrastructure. Climate, or more precisely weather, is an important variable in the management of water resources at daily to monthly time scales because water resources systems generally are operated on a daily basis. At decadal to centennial time scales, though, climate is much less important because (1) forecasts, particularly of regional precipitation, are extremely uncertain over such time periods, and (2) the magnitude of effects due to changes in climate on water resources is small relative to changes in other variables such as population, technology, economics, and environmental regulation. Thus, water management agencies find it difficult to justify changing design features or operating rules on the basis of simulated climatic change at the present time, especially given that reservoir-design criteria incorporate considerable buffering capacity for extreme meteorological and hydrological events.

12.
It is generally recognized that soil N2O emissions can exhibit pronounced day-to-day variations; however, measurements of soil N2O flux with soil chambers typically are made only at discrete points in time. This study evaluated the impact of sampling frequency on the precision of cumulative N2O flux estimates calculated from field measurements. Automated chambers were deployed in a corn/soybean field and used to measure soil N2O fluxes every 6 h from 25 Feb. 2006 through 11 Oct. 2006. The chambers were located in two positions relative to the fertilizer bands: directly over a band or between fertilizer bands. Sampling frequency effects on cumulative N2O-N flux estimation were assessed using a jackknife technique in which populations of N2O fluxes were constructed from the average daily fluxes measured in each chamber. These test populations were generated by selecting measured flux values at regular time intervals ranging from 1 to 21 d. It was observed that as the sampling interval increased from 7 to 21 d, variances associated with cumulative flux estimates increased. At relatively frequent sampling intensities (i.e., once every 3 d), N2O-N flux estimates were within +/-10% of the expected value at both sampling positions. As the time interval between sampling was increased, the deviation in estimated cumulative N2O flux increased, such that sampling once every 21 d yielded estimates within +60% and -40% of the actual cumulative N2O flux. The variance of potential fluxes associated with the between-band position was less than that of the over-band position, indicating that the underlying temporal variability affects the efficacy of a given sampling protocol.
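
The sketch below mirrors the spirit of that analysis: subsample a daily flux record every k days at each possible starting offset, interpolate between visits, and compare the resulting cumulative flux to the full-record value. The synthetic record, the linear interpolation, and the offsets are illustrative assumptions, not the study's exact jackknife procedure or data.

```python
# Subsampling a daily N2O flux record at interval k and computing the relative
# error of the cumulative flux estimate for every starting offset.
import numpy as np

def cumulative_flux_error(daily_flux, interval):
    """Relative error (%) of the cumulative flux for every starting offset."""
    days = np.arange(len(daily_flux))
    true_total = daily_flux.sum()
    errors = []
    for offset in range(interval):
        idx = days[offset::interval]
        est = np.interp(days, idx, daily_flux[idx]).sum()  # linear interpolation between visits
        errors.append(100.0 * (est - true_total) / true_total)
    return np.array(errors)

rng = np.random.default_rng(0)
# Synthetic season: low background flux with a few fertilization/rain-driven pulses.
flux = 2 + rng.gamma(shape=0.5, scale=2.0, size=230)
flux[[40, 41, 42, 95, 96]] += 60
for k in (3, 7, 14, 21):
    e = cumulative_flux_error(flux, k)
    print(f"every {k:2d} d: error range {e.min():+.0f}% to {e.max():+.0f}%")
```

Longer intervals widen the spread of possible errors because whether a pulse is captured depends on the starting offset, which is the pattern the study reports.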

13.
Transgenic or genetically modified plants possess novel genes that impart beneficial characteristics such as herbicide resistance. One of the least understood areas in the environmental risk assessment of genetically modified crops is their impact on soil- and plant-associated microbial communities. The potential for interaction between transgenic plants, plant residues, and the soil microbial community is not well understood. The recognition that these interactions could change microbial biodiversity and affect ecosystem functioning has initiated a limited number of studies in the area. To date, studies have shown that transgenes could in principle be transferred to native soil microorganisms through horizontal gene transfer, although there is no evidence of this occurring in the soil. Furthermore, novel proteins have been shown to be released from transgenic plants into the soil ecosystem, and their presence can influence the biodiversity of the microbial community by selectively stimulating the growth of organisms that can use them. Microbial diversity can be altered when associated with transgenic plants; however, these effects are both variable and transient. Soil- and plant-associated microbial communities are influenced not only by plant species and transgene insertion but also by environmental factors such as field site and sampling date. Minor alterations in the diversity of the microbial community could affect soil health and ecosystem functioning; therefore, the impact that plant variety may have on the dynamics of rhizosphere microbial populations, and in turn on plant growth, plant health, and ecosystem sustainability, requires further study.

14.
This study investigates how environmental strategies change over time. We submit evidence from the US steel industry that firms have modified their strategies over time. We offer that the US industry has passed through three stages: cost minimization, cost-effective compliance, and beneficial environmental controls. We compare typologies of environmental strategies and choose that of C. Oliver as the most appropriate. We investigate how environmental strategies in the steel industry changed over a four-year period. We offer that a further understanding of Oliver's strategies may increase understanding of the relationship between business and government on environmental issues. One overarching problem in our field is the need to adequately operationalize how firms change strategies and pass through different stages. We hope that our study will help future researchers and practitioners better articulate the concepts of environmental strategies over time. Our study focused on the steel industry in the United States. We chose the US steel industry because it is one of the major environmental actors in the United States: the United States Environmental Protection Agency ranks the iron and steel industry as the largest industrial source of toxic environmental contamination. We encourage researchers to evaluate and test our methodology and findings in other contexts, both in other nations and in different industries.

15.
Methods that are more cost-effective and objective are needed to detect important vegetation change within acceptable error rates. The objective of this research was to compare visual estimation with three new methods for determining vegetation cover in the sagebrush steppe. Fourteen management units at the US Sheep Experiment Station were identified for study. In each unit, 20 data collection points were selected for measuring plant cover using visual estimation, laser-point frame (LPF), 2-m above-ground-level (AGL) digital imagery, and 100-m AGL digital imagery. In 11 of the 14 management units, determinations of vegetation cover differed (P < 0.05); however, when combined, overall determinations of vegetation cover did not differ. Standard deviation, corrected sums of squares, coefficient of variation, and standard error for the 100-m AGL method were half as large as those for the LPF, and smaller than those for the 2-m AGL and visual-estimation methods. For the purpose of measuring plant cover, all three new methods are as good as or better than visual estimation in terms of speed, standard deviation, and cost. The acquisition of a permanent image of a location is an important advantage of the 2-m and 100-m AGL methods because vegetation can be reanalyzed with improved software or to answer different questions, and changes in vegetation over time can be determined more accurately. The reduction in cost per sample, the increased speed of sampling, and the smaller standard deviation associated with 100-m AGL digital imagery are compelling arguments for adopting this vegetation sampling method.

16.
ABSTRACT: Developing a mass load estimation method appropriate for a given stream and constituent is difficult due to inconsistencies in hydrologic and constituent characteristics. The difficulty may be increased in flashy flow conditions such as karst. Many projects undertaken are constrained by budget and manpower and do not have the luxury of sophisticated sampling strategies. The objectives of this study were to: (1) examine two grab sampling strategies with varying sampling intervals and determine the error in mass load estimates, and (2) determine the error that can be expected when a grab sample is collected at a time of day when the diurnal variation is most divergent from the daily mean. Results show grab sampling with continuous flow to be a viable data collection method for estimating mass load in the study watershed. Comparing weekly, biweekly, and monthly grab sampling, monthly sampling produces the best results with this method. However, the time of day the sample is collected is important. Failure to account for diurnal variability when collecting a grab sample may produce unacceptable error in mass load estimates. The best time to collect a sample is when the diurnal cycle is nearest the daily mean.
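
A hedged sketch of one common estimator implied by "grab sampling with continuous flow" is given below: occasional grab-sample concentrations are paired with a continuous discharge record and integrated over time. The interpolation choice, unit conversion, and data are assumptions for illustration, not the paper's method.

```python
# Mass load from grab-sample concentrations paired with continuous daily discharge.
import numpy as np

def annual_load_kg(sample_days, sample_conc_mg_l, daily_q_m3s):
    """Estimate annual mass load (kg) from grab-sample concentrations (mg/L)
    and a continuous record of daily mean discharge (m^3/s)."""
    days = np.arange(len(daily_q_m3s))
    conc = np.interp(days, sample_days, sample_conc_mg_l)  # fill between grab samples
    # mg/L = g/m^3, so C*Q is g/s; times 86400 s/day, divided by 1000 g/kg -> factor 86.4
    return float(np.sum(conc * daily_q_m3s * 86.4))

# Made-up example: monthly grab samples over one year of daily flows.
rng = np.random.default_rng(3)
q = np.clip(rng.lognormal(mean=0.5, sigma=0.6, size=365), 0.1, None)
grab_days = np.arange(15, 365, 30)
grab_conc = 2.0 + 0.5 * np.sin(grab_days / 58.0)
print(round(annual_load_kg(grab_days, grab_conc, q), 1), "kg/yr")
```

Collecting the grab sample when concentration sits near its daily mean keeps the interpolated record, and hence the load estimate, closer to the truth, which is the time-of-day effect the study quantifies.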

17.
This study examined the relationship between environmental concern and ratings of acceptability of environmental impacts among visitors at two national park settings. Based on the concept of a social ecological paradigm shift, it was hypothesized that individuals with greater levels of environmental concern are less accepting of environmental impacts in national parks than individuals with lesser degrees of concern. Sample data came from Cape Lookout National Seashore (N=392) and Moores Creek National Battlefield (N=236), two national park units in the south-eastern U.S.A. Environmental concern was measured by the New Ecological Paradigm scale. Acceptability was measured by visitor responses to 25 items covering different types of environmental park impacts. Analysis of variance and Tukey's means comparison procedure were used to test for differences between groups defined by levels of environmental concern on impact acceptability. Significant relationships were found between environmental concern and 15 of the 25 specific impacts in the Cape Lookout sample and 13 significant relationships were found in the Moores Creek sample. However, the relationships between environmental concern and acceptability varied somewhat across the two samples. These findings suggested that individuals with greater environmental concern were less accepting (or tolerant) of certain types of park impacts, while individuals with lesser degrees of environmental concern were more accepting of certain park impacts. Differences across the study settings were attributed to the different orientations of park visitors between the two national park units and recency effects. While the data reported are preliminary, they should be informative for park management purposes, particularly in the determination of standards for park impacts.

18.
Determining a remeasurement frequency of variables over time is required in monitoring environmental systems. This article demonstrates methods based on regression modeling and spatio-temporal variability to determine the time interval to remeasure the ground and vegetation cover factor on permanent plots for monitoring a soil erosion system. The spatio-temporal variability methods include use of historical data to predict semivariograms, modeling average temporal variability, and temporal interpolation by two-step kriging. The results show that for the cover factor, the relative errors of the prediction increase with an increased length of time interval between remeasurements when using the regression and semivariogram models. Given precision or accuracy requirements, appropriate time intervals can be determined. However, the remeasurement frequency also varies depending on the prediction interval time. As an alternative method, the range parameter of a semivariogram model can be used to quantify average temporal variability that approximates the maximum time interval between remeasurements. This method is simpler than regression and semivariogram modeling, but it requires a long-term dataset based on permanent plots. In addition, the temporal interpolation by two-step kriging is also used to determine the time interval. This method is applicable when remeasurements in time are not sufficient. If spatial and temporal remeasurements are sufficient, it can be expanded and applied to design spatial and temporal sampling simultaneously.
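
The sketch below illustrates the "range of a temporal semivariogram as an approximate maximum remeasurement interval" idea: an empirical semivariogram is computed from a plot's time series and a spherical model is fitted. The synthetic data, the spherical model choice, and the fitting details are assumptions for illustration, not the article's implementation.

```python
# Fit a temporal semivariogram and read off its range as a candidate
# maximum remeasurement interval.
import numpy as np
from scipy.optimize import curve_fit

def empirical_semivariogram(t, y, max_lag):
    lags, gammas = [], []
    for h in range(1, max_lag + 1):
        d = y[h:] - y[:-h]
        lags.append(t[h] - t[0])
        gammas.append(0.5 * np.mean(d ** 2))
    return np.array(lags), np.array(gammas)

def spherical(h, nugget, sill, vrange):
    h = np.asarray(h, float)
    g = nugget + sill * (1.5 * h / vrange - 0.5 * (h / vrange) ** 3)
    return np.where(h < vrange, g, nugget + sill)

# Synthetic yearly cover-factor measurements on one permanent plot.
rng = np.random.default_rng(7)
t = np.arange(30)                                    # years
y = 0.4 + 0.15 * np.sin(t / 4.0) + rng.normal(0, 0.03, 30)
lags, gam = empirical_semivariogram(t, y, max_lag=12)
(nugget, sill, vrange), _ = curve_fit(spherical, lags, gam, p0=[0.001, 0.01, 5.0],
                                      bounds=([0, 0, 0.5], [0.1, 1.0, 20.0]))
print(f"fitted range ~ {vrange:.1f} years (approximate maximum remeasurement interval)")
```

Beyond the fitted range, successive measurements are effectively uncorrelated, so sampling less often than that loses the ability to interpolate between visits.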

19.
The problems posed by adaptive management for improved ecosystem health are reviewed. Other kinds of science-informed ecosystem management are needed for those regions of conflict between rapid human population growth, increased resource extraction, and the rising demand for better environmental amenities, where large-scale experiments are not feasible. One new framework is threshold-based resource management. Threshold-based resource management guides management choices among four major science and engineering approaches to achieve healthier ecosystems: self-sustaining ecosystem management, adaptive management, case-by-case resource management, and high-reliability management. As resource conflicts increase over a landscape (i.e., as the ecosystems in the landscape move through different thresholds), management options change for the environmental decision-maker in terms of what can and cannot be attained by way of ecosystem health. The major policy and management implication of the framework is that the exclusive use or recommendation of any one management regime, be it self-sustaining, adaptive, case-by-case, or high-reliability management, across all categories of ecosystems within a heterogeneous landscape that is variably populated and extractively used is not only inappropriate, it is fatal to the goals of improved ecosystem health. The article concludes with detailed proposals for environmental decision-makers to undertake “bandwidth management” in ways that blend the best of adaptive management and high-reliability management for improved ecosystem health while at the same time maintaining highly reliable flows of ecosystem services, such as water.

20.
Resource management issues continually change over time in response to coevolving social, economic, and ecological systems. Under these conditions adaptive management, or “learning by doing,” offers an opportunity for more proactive and collaborative approaches to resolving environmental problems. In turn, this will require the implementation of learning-based extension approaches alongside more traditional linear technology transfer approaches within the area of environmental extension. In this paper the Integrated Systems for Knowledge Management (ISKM) approach is presented to illustrate how such learning-based approaches can be used to help communities develop, apply, and refine technical information within a larger context of shared understanding. To outline how this works in practice, we use a case study involving pest management. Particular attention is paid to the issues that emerge as a result of multiple stakeholder involvement within environmental problem situations. Finally, the potential role of the Internet in supporting and disseminating the experience gained through ongoing adaptive management processes is examined.

