Similar documents
1.
The US Environmental Protection Agency's Office of Research and Development has initiated the Environmental Monitoring and Assessment Program (EMAP) to monitor status and trends in the condition of the nation's near coastal waters, forests, wetlands, agro-ecosystems, surface waters, deserts and rangelands. The programme is also intended to evaluate the effectiveness of Agency policies in protecting ecological resources occurring in these systems. Monitoring data collected for all ecosystems will be integrated for regional and national status and trends assessments. The near coastal component of EMAP consists of estuaries, coastal waters, and the Great Lakes. Near coastal ecosystems have been regionalized and classified, and an integrated sampling strategy has been developed. EPA and NOAA have agreed to coordinate and, to the extent possible, integrate the near coastal component of EMAP with the NOAA National Status and Trends Program. A demonstration project was conducted in estuaries of the mid-Atlantic region (Chesapeake Bay to Cape Cod) in the summer of 1990. In 1991, monitoring continued in mid-Atlantic estuaries and was initiated in estuaries of a portion of the Gulf of Mexico. Preliminary results indicate that there are no insurmountable logistical problems with sampling on a regional scale; several of the selected indicators are practical and sensitive on the regional scale; and an efficient effort in future years will provide valuable information on the condition of estuarine resources at regional scales.

2.
Resampling from stochastic simulations
To model the uncertainty of an estimate of a global property, the estimation process is repeated on multiple simulated fields, with the same sampling strategy and estimation algorithm. As opposed to the conventional bootstrap, this resampling scheme allows for spatially correlated data and the common situation of preferential and biased sampling. The practice of this technique is developed on a large data set where the reference sampling distributions are available. Comparison of the resampled distributions to that reference shows the probability intervals obtained by resampling to be reasonably accurate and conservative, provided the original and actual sample has been corrected for the major biases induced by preferential sampling.

Andre G. Journel is a Professor of Petroleum Engineering at Stanford University with a joint appointment in the Department of Geological and Environmental Sciences. He is also Director of the Stanford Center for Reservoir Forecasting. Professor Journel has pioneered applications of geostatistical techniques in the mining/petroleum industry and extended his expertise to environmental applications and repository site characterization. Most notably, he developed the concept of non-parametric geostatistics and stochastic imaging with application to modeling uncertainty in reservoir/site characterization. Although the research described in this article has been supported by the United States Environmental Protection Agency under Cooperative Agreement CR819407, it has not been subjected to Agency review and therefore does not necessarily reflect the views of the Agency and no official endorsement should be inferred.
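To make the resampling scheme concrete, the sketch below repeats the same (deliberately preferential) sampling strategy and the same estimator on many simulated spatially correlated fields and reads off a probability interval from the resampled estimates. All field parameters, sample sizes and the preference rule are invented for illustration; the paper itself works with geostatistical simulations conditioned on real data.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_field(n=100, corr_len=10):
    """Crude spatially correlated 1-D field via moving-average smoothing (illustrative only)."""
    white = rng.normal(size=n + corr_len)
    kernel = np.ones(corr_len) / corr_len
    return np.convolve(white, kernel, mode="valid")[:n]

def preferential_sample(field, n_samples=20):
    """Sample locations with probability increasing in the local value (preferential sampling)."""
    p = np.exp(field)
    p /= p.sum()
    idx = rng.choice(field.size, size=n_samples, replace=False, p=p)
    return field[idx]

def estimate_global_mean(sample):
    """The estimation algorithm under study; here simply the sample mean."""
    return sample.mean()

# Repeat the same sampling strategy and estimator on many simulated fields.
estimates = np.array([estimate_global_mean(preferential_sample(simulate_field()))
                      for _ in range(500)])

# Resampled distribution of the estimate -> approximate 90% probability interval.
print("90% interval for the global-mean estimate:", np.percentile(estimates, [5, 95]))
```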

3.
Long-term environmental monitoring places a set of demands on a sampling strategy not present in a survey designed for a single time period. The inevitability that a sample will become out of date must be a dominant consideration in planning a long-term monitoring programme. The sampling strategy must be able to accommodate periodic frame update and sample restructuring in order to address changes in the composition of the universe and changes in the perception of issues leading to new questions and concerns. The sampling strategy must be capable of adapting to such changes while maintaining its identification as a probability sample and its capacity to detect trends that span the update occasions. These issues are examined with respect to sub-population estimation, post-stratification via conditioning, and sample enlargement and reduction. Design features that involve complex sample structure create potentially serious difficulties, whereas an equal probability design permits greater adaptability and flexibility. Structure should be employed sparingly and in awareness of its undesirable effects.

4.
Adaptive two-stage one-per-stratum sampling
We briefly describe adaptive cluster sampling designs in which the initial sample is taken according to a Markov chain one-per-stratum design (Breidt, 1995) and one or more secondary samples are taken within strata if units in the initial sample satisfy a given condition C. An empirical study of the behavior of the estimation procedure is conducted for three small artificial populations for which adaptive sampling is appropriate. The specific sampling strategy used in the empirical study was a single random-start systematic sample with predefined systematic samples within strata when the initially sampled unit in that stratum satisfies C. The bias of the Horvitz-Thompson estimator for this design is usually very small when adaptive sampling is conducted in a population for which it is suited. In addition, we compare the behavior of several alternative estimators of the standard error of the Horvitz-Thompson estimator of the population total. The best estimator of the standard error is population-dependent, but it is not unreasonable to use the Horvitz-Thompson estimator of the variance. Unfortunately, the distribution of the estimator is highly skewed; hence the usual approach of constructing confidence intervals assuming normality cannot be used here.
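For reference, a generic Horvitz-Thompson total and its Horvitz-Thompson variance estimator are sketched below; the inclusion probabilities and joint inclusion probabilities are made-up numbers, whereas in the adaptive one-per-stratum design they would follow from the Markov chain initial design and the adaptive rule.

```python
import numpy as np

def horvitz_thompson_total(y, pi):
    """Horvitz-Thompson estimator of a population total.
    y  : observed values of the sampled units
    pi : first-order inclusion probabilities of those units
    """
    y, pi = np.asarray(y, float), np.asarray(pi, float)
    return np.sum(y / pi)

def ht_variance_estimate(y, pi, pi_joint):
    """Horvitz-Thompson variance estimator of the HT total.
    pi_joint[i, j] is the joint inclusion probability of sampled units i and j
    (with pi_joint[i, i] = pi[i]); it must be strictly positive for the
    estimator to be defined.
    """
    y, pi = np.asarray(y, float), np.asarray(pi, float)
    n = y.size
    var = np.sum((1 - pi) * (y / pi) ** 2)
    for i in range(n):
        for j in range(n):
            if i != j:
                var += (pi_joint[i, j] - pi[i] * pi[j]) / pi_joint[i, j] \
                       * (y[i] / pi[i]) * (y[j] / pi[j])
    return var

# Toy illustration with made-up inclusion probabilities.
y = [3.0, 0.0, 12.0]
pi = [0.2, 0.25, 0.1]
pi_joint = np.array([[0.20, 0.05, 0.02],
                     [0.05, 0.25, 0.03],
                     [0.02, 0.03, 0.10]])
print(horvitz_thompson_total(y, pi), ht_variance_estimate(y, pi, pi_joint))
```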

5.
Addressing onsite sampling in recreation site choice models
Independent experts and politicians have criticized statistical analyses of recreation behavior that rely upon onsite samples because of their potential for biased inference. The use of onsite sampling usually reflects data or budgetary constraints, but can lead to two primary forms of bias in site choice models. First, the strategy entails sampling site choices rather than sampling individuals—a form of bias called endogenous stratification. Under these conditions, sample choices may not reflect the site choices of the true population. Second, exogenous attributes of the individuals sampled onsite may differ from the attributes of individuals in the population—the most common form in recreation demand is avidity bias. We propose addressing these biases by combining two existing methods: Weighted Exogenous Stratification Maximum Likelihood estimation and propensity score estimation. We use the National Marine Fisheries Service's Marine Recreational Fishing Statistics Survey to illustrate methods of bias reduction, employing both simulated and empirical applications. We find that propensity score based weights can significantly reduce bias in estimation. Our results indicate that failure to account for these biases can overstate anglers' willingness to pay for improvements in fishing catch, but weighted models exhibit higher variance of parameter estimates and willingness to pay.
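A toy sketch of the exogenous weighting idea for avidity bias is given below: weights equal to the population share of an avidity class divided by its share in the onsite sample down-weight over-represented avid anglers. All classes, shares and willingness-to-pay values are hypothetical, and the full approach in the paper embeds such weights (and propensity-score-based weights) in a site choice model rather than in a simple mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical avidity classes (trips per year), their assumed population shares,
# and the counts observed in an onsite (intercept) sample of 100 anglers.
pop_share = np.array([0.60, 0.30, 0.10])      # casual, regular, avid
onsite_n = np.array([25, 35, 40])             # avid anglers are over-represented onsite
onsite_share = onsite_n / onsite_n.sum()

# Exogenous-stratification weights: population share / onsite-sample share per class.
class_weight = pop_share / onsite_share

# Hypothetical willingness-to-pay responses, higher for more avid anglers.
wtp = np.concatenate([rng.normal(mu, 5, n) for mu, n in zip([20, 35, 60], onsite_n)])
w = np.repeat(class_weight, onsite_n)

print("naive onsite mean WTP:", round(wtp.mean(), 1))
print("weighted mean WTP:    ", round(np.average(wtp, weights=w), 1))
```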

6.
A design-based strategy for estimating wild ungulate abundance in a Mediterranean protected area (Maremma Regional Park) is considered. The estimation is based on pellet group counts (clearance count technique) in a set of plots, whose size and number are established on the basis of practical considerations and available resources. The sampling scheme involves a preliminary stratification and subsequent two-stage sampling. In the first stage, large strata (defined through habitat features) are partitioned into spatial units and a sample of units is selected by means of a sampling scheme ensuring inclusion probabilities proportional to unit size, but avoiding the selection of contiguous units. Then, the abundances of the selected units are estimated in a second stage, in which plots are located using a random scheme ensuring an even coverage of the units. In small strata, only the second stage is performed. Unbiased estimators of abundance and conservative estimators of their variances are derived for each stratum and for the whole study area. The proposed strategy has been applied since the summer of 2006, and the estimation results reveal a substantial improvement over the previous results obtained by means of an alternative strategy.
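The arithmetic behind a clearance (pellet group) count is simple enough to sketch; all numbers below (plot size, accumulation period, defecation rate, counts, stratum area) are illustrative assumptions, and the actual strategy adds the two-stage probability-proportional-to-size selection and design-based variance estimation described above.

```python
import numpy as np

# Illustrative clearance-count calculation for one stratum (all parameters assumed).
plot_area_ha = 0.01          # 100 m^2 cleared plots
accumulation_days = 30       # days between clearing and recounting
defecation_rate = 20         # pellet groups per animal per day (species-specific assumption)
stratum_area_ha = 1200.0

counts = np.array([3, 0, 5, 2, 1, 4, 0, 2])   # pellet groups counted on the plots of one unit

groups_per_ha = counts.mean() / plot_area_ha
animals_per_ha = groups_per_ha / (defecation_rate * accumulation_days)
print(f"density: {animals_per_ha:.2f} animals/ha, "
      f"stratum abundance: {animals_per_ha * stratum_area_ha:.0f} animals")
```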

7.
Thompson (1990) introduced the adaptive cluster sampling design. This design has been shown to be a useful sampling method for parameter estimation in populations that are clustered but sparse (Roesch, 1993; Smith et al., 1995; Thompson and Seber, 1996). Two estimators, the modified Hansen-Hurwitz (HH) and Horvitz-Thompson (HT) estimators, are available to estimate the mean or total of a population. Empirical results from previous studies indicate that the modified HT estimator has smaller variance than the modified HH estimator. We analytically compare the properties of these two estimators. Some results are obtained in favor of the modified HT estimator, so practitioners are strongly recommended to use the HT estimator despite the computational simplicity of the HH estimator.
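A compact sketch of the two estimators, using invented network data: the modified HH estimator averages the network means attached to the initial sample units, while the modified HT estimator inflates each distinct intersected network total by the probability that an initial simple random sample of size n intersects that network.

```python
import numpy as np
from math import comb

def modified_hh_mean(network_means):
    """Modified Hansen-Hurwitz estimator of the population mean under ACS.
    network_means: for each of the n initially sampled units, the mean y-value
    of the network that unit belongs to.
    """
    return float(np.mean(network_means))

def modified_ht_mean(network_totals, network_sizes, N, n):
    """Modified Horvitz-Thompson estimator of the population mean under ACS.
    network_totals / network_sizes describe the *distinct* networks intersected
    by the initial simple random sample of size n from N population units.
    """
    est = 0.0
    for total, size in zip(network_totals, network_sizes):
        alpha = 1.0 - comb(N - size, n) / comb(N, n)   # P(network is intersected)
        est += total / alpha
    return est / N

# Toy example: N = 100 units, initial SRS of n = 10 units intersecting 3 networks.
print(modified_hh_mean([4.0, 4.0, 0.0, 0.0, 0.0, 12.5, 0.0, 0.0, 0.0, 0.0]))
print(modified_ht_mean(network_totals=[8.0, 25.0, 0.0],
                       network_sizes=[2, 2, 1], N=100, n=10))
```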

8.
Markov Chain Monte Carlo on optimal adaptive sampling selections
Under a Bayesian population model with a given prior distribution, the optimal sampling strategy with a fixed sample size n is an n-phase adaptive one. That is, the selection of the next sampling units should sequentially depend on the information obtained from the previously selected units, including the observed values of interest. Such an optimal strategy is in general not executable in practice due to its intensive computation. In many survey sampling situations, an important problem is that one would like to select a set of units in addition to a certain number of sampling units which have already been observed. If the optimal strategy is an adaptive one, the selection of the additional units should take both the labels and the observed values of the already selected units into account. Hence, a simpler optimal two-phase adaptive sampling strategy under a Bayesian population model is proposed in this article for practical use. A Markov chain Monte Carlo method is used to approximate the posterior joint distribution of the unobserved population units after the first-phase sampling, for the optimal selection of the second-phase sample. This approximation method is found to be successful in selecting the optimal second-phase sample. Finally, the optimal strategy is applied to a set of data from a study of geothermal CO2 emissions in Yellowstone National Park as a practical illustrative example.
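The sketch below illustrates the second-phase selection idea under a simplified Gaussian spatial model: the joint distribution of the unobserved units given the first-phase data is sampled by plain Monte Carlo (the paper uses MCMC for more general population models), and the additional units are chosen where posterior uncertainty is largest, which is one possible criterion rather than necessarily the paper's optimality criterion. All locations, covariance parameters and observed values are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 1-D population of 50 unit locations under a Gaussian spatial model.
locs = np.arange(50, dtype=float)
cov = np.exp(-np.abs(locs[:, None] - locs[None, :]) / 8.0)   # exponential covariance

observed = np.array([3, 17, 34, 46])          # first-phase unit labels
y_obs = np.array([2.1, 5.4, 1.0, 3.3])        # their observed values
unobserved = np.setdiff1d(np.arange(50), observed)

# Conditional (posterior) distribution of the unobserved units given the first phase.
C_oo = cov[np.ix_(observed, observed)]
C_uo = cov[np.ix_(unobserved, observed)]
C_uu = cov[np.ix_(unobserved, unobserved)]
K = C_uo @ np.linalg.inv(C_oo)
mu_cond = K @ y_obs                           # prior mean taken as zero
cov_cond = C_uu - K @ C_uo.T

# Monte Carlo draws from the posterior joint distribution of the unobserved units.
draws = rng.multivariate_normal(mu_cond, cov_cond, size=2000)

# Choose the second-phase units where posterior uncertainty is largest.
post_var = draws.var(axis=0)
second_phase = unobserved[np.argsort(post_var)[-5:]]
print("second-phase units:", np.sort(second_phase))
```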

9.
Estimates of a population’s growth rate and process variance from time-series data are often used to calculate risk metrics such as the probability of quasi-extinction, but temporal correlations in the data from sampling error, intrinsic population factors, or environmental conditions can bias process variance estimators and detrimentally affect risk predictions. It has been claimed (McNamara and Harding, Ecol Lett 7:16–20, 2004) that estimates of the long-term variance that incorporate observed temporal correlations in population growth are unaffected by sampling error; however, no estimation procedures were proposed for time-series data. We develop a suite of such long-term variance estimators, and use simulated data with temporally autocorrelated population growth and sampling error to evaluate their performance. In some cases, we get nearly unbiased long-term variance estimates despite ignoring sampling error, but the utility of these estimators is questionable because of large estimation uncertainty and difficulties in estimating correlation structure in practice. Process variance estimators that ignored temporal correlations generally gave more precise estimates of the variability in population growth and of the probability of quasi-extinction. We also found that the estimation of probability of quasi-extinction was greatly improved when quasi-extinction thresholds were set relatively close to population levels. Because of precision concerns, we recommend using simple models for risk estimates despite potential biases, and limiting inference to quantifying relative risk; e.g., changes in risk over time for a single population or comparative risk among populations.
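In the spirit of the recommendation to keep risk models simple, the sketch below fits the basic diffusion approximation (mean and variance of annual log growth rates, ignoring sampling error and temporal correlation) to a hypothetical abundance series and simulates the probability of crossing a quasi-extinction threshold. Every number is invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical annual abundance index.
N = np.array([120, 105, 118, 90, 97, 83, 88, 70, 75, 64, 61.0])

# Simple estimates that ignore sampling error and autocorrelation:
# mean and variance of the annual log growth rates.
r = np.diff(np.log(N))
mu_hat = r.mean()
sigma2_hat = r.var(ddof=1)       # process-variance estimate

# Probability of falling below a quasi-extinction threshold within 20 years,
# by simulating the fitted stochastic growth model.
threshold, horizon, n_sims = 20.0, 20, 10_000
log_n0 = np.log(N[-1])
paths = log_n0 + np.cumsum(rng.normal(mu_hat, np.sqrt(sigma2_hat),
                                      size=(n_sims, horizon)), axis=1)
p_quasi_ext = np.mean((paths <= np.log(threshold)).any(axis=1))
print(f"mu = {mu_hat:.3f}, sigma^2 = {sigma2_hat:.3f}, "
      f"P(quasi-extinction in 20 yr) = {p_quasi_ext:.2f}")
```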

10.
In phased sampling, data obtained in one phase are used to design the sampling network for the next phase. Given N total observations, 1, ..., N phases are possible. Experiments were conducted with one-phase, two-phase, and N-phase design algorithms on surrogate models of sites with contaminated soils. The sampling objective was to identify, through interpolation, subunits of the site that required remediation. The cost-effectiveness of alternate methods was compared by using a loss function. More phases are better, but in economic terms the improvement is marginal. The optimal total number of samples is essentially independent of the number of phases. For two-phase designs, placing 75% of the samples in the first phase is near optimal; 20% or less is actually counterproductive.

The U.S. Environmental Protection Agency (EPA), through its Office of Research and Development (ORD), partially funded and collaborated in the research described here. It has been subjected to the Agency's peer review and has been approved as an EPA publication. The U.S. Government has a non-exclusive, royalty-free licence in and to any copyright covering this article.
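A toy one-dimensional version of the phased-sampling experiment is sketched below: a fixed total of N samples is split between a systematic first phase and a second phase targeted at the highest first-phase values, subunits needing remediation are flagged by interpolation, and a loss function scores the result. The surrogate site, the 75/25 split, the noise level and the loss weights are all illustrative assumptions; the study's own surrogate models and interpolation method are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Surrogate 1-D site: contamination peaks in one region; subunits above the
# action level are the ones that truly require remediation.
x = np.linspace(0.0, 1.0, 200)
true = 5 + 8 * np.exp(-((x - 0.7) / 0.08) ** 2)
action_level = 9.0

def flag_subunits(sx, sy):
    """Nearest-sample interpolation, then flag subunits above the action level."""
    nearest = np.abs(x[:, None] - sx[None, :]).argmin(axis=1)
    return sy[nearest] > action_level

def loss(flags):
    """Missed contaminated subunits cost 5x as much as needless remediation (illustrative)."""
    truth = true > action_level
    return 5 * np.sum(truth & ~flags) + np.sum(~truth & flags)

N_total, frac_phase1 = 24, 0.75            # the near-optimal two-phase split reported above
n1 = int(N_total * frac_phase1)
n2 = N_total - n1

# Phase 1: systematic coverage; Phase 2: extra samples near the highest phase-1 values.
x1 = np.linspace(0.02, 0.98, n1)
y1 = np.interp(x1, x, true) + rng.normal(0, 0.5, n1)
x2 = np.clip(x1[np.argsort(y1)[-n2:]] + rng.normal(0, 0.03, n2), 0.0, 1.0)
y2 = np.interp(x2, x, true) + rng.normal(0, 0.5, n2)

flags = flag_subunits(np.concatenate([x1, x2]), np.concatenate([y1, y2]))
print("two-phase loss:", loss(flags))
```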

11.
Ratio estimation of the parametric mean for a characteristic measured on plants sampled by a line intercept method is presented and evaluated via simulation using different plant dispersion patterns (Poisson, regular cluster, and Poisson cluster), plant width variances, and numbers of lines. The results indicate that on average the estimates are close to the parametric mean under all three dispersion patterns. Given a fixed number of lines, variability of the estimates is similar across dispersion patterns, with variability under the Poisson pattern slightly smaller than variability under the cluster patterns. No variance estimates were negative under the Poisson pattern, but some estimates were negative under the cluster patterns for smaller numbers of lines. Variance estimates become closer to zero similarly for all spatial patterns as the number of lines increases. Ratio estimation of the parametric mean in line intercept sampling works better, from the viewpoint of approximate unbiasedness and variability of estimates, under the Poisson pattern with larger numbers of lines than under other combinations of spatial patterns, plant width variances and numbers of lines.
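Because a plant is intercepted by a randomly placed line with probability proportional to its width, both the total of the characteristic and the number of plants can be estimated by summing values inflated by 1/width, and their ratio estimates the parametric mean per plant. A single-line sketch with invented measurements is shown below; with several lines, the per-line estimates are combined and their spread supplies the variance estimate studied in the simulations.

```python
import numpy as np

def ratio_mean_per_plant(y, w):
    """Ratio-type estimate of the mean of y per plant under line intercept sampling.
    A plant is intercepted with probability proportional to its width w, so both
    the numerator (total of y) and the denominator (number of plants) are
    estimated by summing values inflated by 1/w; the unknown proportionality
    constant cancels in the ratio.
    """
    y, w = np.asarray(y, float), np.asarray(w, float)
    return np.sum(y / w) / np.sum(1.0 / w)

# Plants intercepted by one line: y = biomass, w = plant width (units assumed).
y = [1.2, 0.4, 2.5, 0.8]
w = [0.9, 0.3, 1.6, 0.5]
print(ratio_mean_per_plant(y, w))
```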

12.
Cost-effective hotspot identification is an important issue in hazardous waste site characterization and evaluation. Composite sampling techniques are known to be cost effective when the cost of measurement is substantially higher than the cost of sampling. Although compositing incurs no loss of information on the means, information on individual sample values is lost due to compositing. In particular, if the interest is in identifying the largest individual sample value, composite sampling techniques are not able to do so. Under certain assumptions, it may be possible to satisfactorily predict individual sample values using the composite sample data, but it is not generally possible to identify the largest individual sample value. In this paper, we propose two methods of identifying the largest individual sample value with some additional measurement effort. Both methods are modifications of the simple sweep-out method proposed earlier. Since analytical results do not seem to be feasible, performance of the proposed methods is assessed via simulation. The simulation results show that both the proposed methods, namely the locally sequential sweep-out and the globally sequential sweep-out, are better than the simple sweep-out method.

Prepared with partial support from the Statistical Analysis and Computing Branch, Environmental Statistics and Information Division, Office of Policy, Planning, and Evaluation, United States Environmental Protection Agency, Washington, DC under Cooperative Agreement Number CR-821531. The contents have not been subjected to Agency review and therefore do not necessarily reflect the views of the Agency and no official endorsement should be inferred.
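The simple sweep-out idea rests on one inequality: for non-negative concentrations, no individual value inside a composite can exceed the composite total (composite size times the composite measurement). The sketch below uses that bound to break up composites in decreasing order of total until the maximum is certain; the locally and globally sequential variants proposed in the paper refine this by measuring individual samples one at a time. The data and composite size are invented.

```python
import numpy as np

rng = np.random.default_rng(11)

def sweep_out_max(values, k):
    """Identify the largest individual value from composite measurements.

    values : individual sample values (unknown to the analyst until measured)
    k      : composite size
    Returns the identified maximum and the number of individual measurements used.
    Logic: for non-negative values, the maximum inside a composite cannot exceed
    the composite total, so composites whose totals fall below the current best
    candidate never need to be broken up.
    """
    values = np.asarray(values, float)
    composites = values.reshape(-1, k)
    comp_totals = composites.sum(axis=1)        # k x measured composite mean
    measured = 0
    best = -np.inf
    for idx in np.argsort(comp_totals)[::-1]:   # break up composites, largest total first
        if comp_totals[idx] <= best:
            break                               # no larger individual can hide here
        best = max(best, composites[idx].max()) # measure the k individual samples
        measured += k
    return best, measured

x = rng.lognormal(mean=0.0, sigma=1.0, size=40)   # 40 individual samples
print(sweep_out_max(x, k=4), x.max())
```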

13.
The implementation of an adaptive cluster sampling design often becomes logistically challenging because variation in the final sampling effort introduces uncertainty into survey planning. To overcome this drawback, an inexpensive and easy-to-measure auxiliary variable can be used in a two-phase survey strategy, called adaptive cluster double sampling (Félix-Medina and Thompson, Biometrika 91:877–891, 2004). In this paper, a two-phase sampling strategy is proposed which combines the idea of adaptive cluster double sampling with the principle of post-stratification. In the first phase, an adaptive cluster sample is selected by means of an inexpensive auxiliary variable. Networks from the first-phase sampling are then post-stratified according to their size. In the second phase, the network structure is used to select a subsample of units by means of stratified random sampling. The proposed sampling strategy employs stratification without requiring an a priori delineation of the strata. Indeed, the stratum sizes are estimated in the course of the two-phase sampling process. Therefore, it is suitable for situations where stratification is suspected to be efficient but strata cannot be easily delineated in advance. In this framework, a new type of estimator for the population mean, which mimics the stratified sampling mean estimator, and an estimator of the sampling variance are proposed. The results of a simulation study confirm, as expected, that the use of post-stratification leads to a gain in precision for the estimator. The proposed sampling strategy is applied to target the epiphytic lichen community Lobarion pulmonariae in a forest area of the Northern Apennines (northern Italy), characterized by several species of conservation concern.
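The final estimator mimics the usual stratified mean, except that the stratum (network-size class) sizes come out of the first phase instead of being fixed in advance. A bare-bones numerical sketch with invented stratum sizes and second-phase means:

```python
import numpy as np

# Post-stratification by network size: stratum sizes are *estimated* from the
# first-phase adaptive cluster sample (all numbers below are invented).
est_stratum_sizes = np.array([420.0, 130.0, 50.0])   # estimated units per size class
second_phase_means = np.array([0.4, 2.1, 6.3])       # second-phase sample means per class

N_hat = est_stratum_sizes.sum()
mean_estimate = np.sum(est_stratum_sizes / N_hat * second_phase_means)
print(f"post-stratified estimate of the population mean: {mean_estimate:.3f}")
```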

14.
This paper reviews design-based estimators for two- and three-stage sampling designs to estimate the mean of finite populations. This theory is then extended to spatial populations with continuous, infinite populations of sampling units at the later stages. We then assume that the spatial pattern is the result of a spatial stochastic process, so that the sampling variance of the estimators can be predicted from the variogram. A realistic cost function is then developed, based on several factors including laboratory analysis, time of fieldwork, and numbers of samples. Simulated annealing is used to find designs with minimum sampling variance for a fixed budget. The theory is illustrated with a real-world problem dealing with the volume of contaminated bed sediments in a network of watercourses. Primary sampling units are watercourses, secondary units are transects perpendicular to the axis of the watercourse, and tertiary units are points. Optimal designs had one point per transect, from one to three transects per watercourse, and a number of watercourses that varied depending on the budget. However, if laboratory costs are reduced by grouping all samples within a watercourse into one composite sample, it appeared to be efficient to sample more transects within a watercourse.
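A stripped-down version of the optimization is sketched below: a three-stage variance model (between watercourses, between transects, between points) and a matching cost model stand in for the variogram-based variance prediction and detailed cost function of the paper, and simulated annealing searches over integer allocations that respect the budget. All variance components, unit costs and the budget are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed variance components and unit costs (illustrative numbers only).
v1, v2, v3 = 4.0, 2.0, 1.0          # watercourse, transect, point components
c1, c2, c3 = 200.0, 50.0, 120.0     # visit a watercourse, a transect, analyse a sample
budget = 6000.0

def variance(n):
    n1, n2, n3 = n
    return v1 / n1 + v2 / (n1 * n2) + v3 / (n1 * n2 * n3)

def cost(n):
    n1, n2, n3 = n
    return c1 * n1 + c2 * n1 * n2 + c3 * n1 * n2 * n3

def anneal(start, steps=20000, temp0=1.0):
    current = np.array(start)
    best = current.copy()
    for t in range(steps):
        cand = np.maximum(current + rng.integers(-1, 2, size=3), 1)  # perturb the allocation
        if cost(cand) > budget:
            continue
        delta = variance(cand) - variance(current)
        temp = temp0 * (1 - t / steps) + 1e-6
        if delta < 0 or rng.random() < np.exp(-delta / temp):
            current = cand
            if variance(current) < variance(best):
                best = current.copy()
    return best

n_opt = anneal(start=(5, 2, 2))
print("watercourses, transects/watercourse, points/transect:", n_opt,
      "variance:", round(variance(n_opt), 3), "cost:", cost(n_opt))
```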

15.
Ranked set sampling: an annotated bibliography
The paper provides an up-to-date annotated bibliography of the literature on ranked set sampling. The bibliography includes all pertinent papers known to the authors, and is intended to cover applications as well as theoretical developments. The annotations are arranged in chronological order and are intended to be sufficiently complete and detailed that a reading from beginning to end would provide a statistically mature reader with a state-of-the-art survey of ranked set sampling, including historical development, current status, and future research directions and applications. A final section of the paper gives a listing of all annotated papers, arranged in alphabetical order by author.

This paper was prepared with partial support from the United States Environmental Protection Agency under a Cooperative Agreement Number CR-821531. The contents have not been subject to Agency review and therefore do not necessarily reflect the views or policies of the Agency and no official endorsement should be inferred.
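For readers new to the topic, a minimal illustration of balanced ranked set sampling is given below (perfect ranking and an invented skewed population are assumed); it illustrates the method the bibliography covers and is not taken from any of the annotated papers.

```python
import numpy as np

rng = np.random.default_rng(8)

def ranked_set_sample_mean(population, set_size=3, cycles=10):
    """Balanced ranked set sampling estimator of the population mean.

    In each cycle, set_size random sets of set_size units are drawn; the i-th
    set contributes its i-th ranked unit for actual measurement.  Ranking here
    uses the true values (perfect ranking); in practice it relies on judgment
    or a cheap auxiliary variable.
    """
    measured = []
    for _ in range(cycles):
        for rank in range(set_size):
            s = rng.choice(population, size=set_size, replace=False)
            measured.append(np.sort(s)[rank])     # quantify only one unit per set
    return np.mean(measured), len(measured)

pop = rng.lognormal(mean=1.0, sigma=0.6, size=5000)
rss_mean, n_measured = ranked_set_sample_mean(pop)
srs_mean = rng.choice(pop, size=n_measured, replace=False).mean()
print(f"true mean {pop.mean():.3f}, RSS {rss_mean:.3f} ({n_measured} measured), "
      f"SRS {srs_mean:.3f}")
```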

16.
Although not design-unbiased, the ratio estimator is recognized as more efficient when a certain degree of correlation exists between the variable of primary interest and the auxiliary variable. Meanwhile, the Rao–Blackwell method is another commonly used procedure to improve estimation efficiency. Various improved ratio estimators under adaptive cluster sampling (ACS) that make use of the auxiliary information together with the Rao–Blackwellized univariate estimators have been proposed in past research studies. In this article, the variances and the associated variance estimators of these improved ratio estimators are proposed for a thorough framework of statistical inference under ACS. Performance of the proposed variance estimators is evaluated in terms of the absolute relative percentage bias and the empirical mean-squared error. As expected, results show that both the absolute relative percentage bias and the empirical mean-squared error decrease as the initial sample size increases for all the variance estimators. To evaluate the confidence intervals based on these variance estimators and the finite-population Central Limit Theorem, the coverage rate and the interval width are used. These confidence intervals suffer a disadvantage similar to that of the conventional ratio estimator. Hence, alternative confidence intervals based on a certain type of adjusted variance estimators are constructed and assessed in this article.
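As background for the variance estimators discussed above, the classical ratio estimator of a mean and a linearization (Taylor-series) variance estimate are sketched below in their simple random sampling form with invented data; the article's estimators instead use Rao-Blackwellized network-level quantities appropriate to ACS.

```python
import numpy as np

def ratio_estimate_and_variance(y, x, X_mean):
    """Ratio estimator of the mean of y using auxiliary x, with a linearization
    variance estimate (simple random sampling form, finite-population
    correction ignored)."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    n = y.size
    r_hat = y.mean() / x.mean()
    mean_est = r_hat * X_mean                      # ratio estimate of the y-mean
    d = y - r_hat * x                              # linearization residuals
    var_est = (X_mean / x.mean()) ** 2 * d.var(ddof=1) / n
    return mean_est, var_est

# Invented sample values and a known auxiliary-variable population mean.
print(ratio_estimate_and_variance(y=[3, 0, 7, 2, 5], x=[2, 1, 6, 2, 4], X_mean=3.1))
```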

17.
The statistical properties of two-stage plot sampling estimators of abundance are considered. In the first stage, some spatial units are selected over the whole study area according to a suitable sampling design, while in the second stage, the selected units are surveyed with floating plot sampling to estimate the abundance within them. Some insights into the accuracy of the resulting estimators are obtained by splitting the sample variance into its first- and second-stage components, while performance is empirically checked by means of a simulation study. Simulation results show that, in most situations, a substantial share of the overall variance is due to the second-stage sampling.
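The variance split can be reproduced numerically: simulate the full two-stage survey, then simulate the same first stage with the unit abundances measured exactly, and attribute the difference to the second (plot) stage. The population, design and plot model below are all invented, and the split here is Monte Carlo rather than the paper's analytical design-based decomposition.

```python
import numpy as np

rng = np.random.default_rng(13)

# Synthetic study area: 30 spatial units whose true abundances vary strongly.
true_abund = rng.gamma(shape=2.0, scale=20.0, size=30)

def second_stage_estimate(unit_abundance, n_plots=5, plot_frac=0.02):
    """Noisy plot-sampling estimate of abundance within one selected unit."""
    counts = rng.poisson(unit_abundance * plot_frac, size=n_plots)
    return counts.mean() / plot_frac

def survey(n_units=8):
    """Two-stage estimate of total abundance: SRS of units, plots within units."""
    chosen = rng.choice(true_abund.size, size=n_units, replace=False)
    unit_estimates = [second_stage_estimate(true_abund[i]) for i in chosen]
    return true_abund.size / n_units * np.sum(unit_estimates)

def survey_exact_second_stage(n_units=8):
    """Same first stage, but unit abundances measured without error."""
    chosen = rng.choice(true_abund.size, size=n_units, replace=False)
    return true_abund.size / n_units * true_abund[chosen].sum()

est = np.array([survey() for _ in range(4000)])
est_stage1 = np.array([survey_exact_second_stage() for _ in range(4000)])
print("overall variance:", est.var().round(0),
      "| first-stage only:", est_stage1.var().round(0),
      "| implied second-stage share:", round(1 - est_stage1.var() / est.var(), 2))
```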

18.
The combined mark-recapture and line transect sampling methodology proposed by Alpizar-Jara and Pollock [Journal of Environmental and Ecological Statistics, 3(4), 311–327, 1996; In Marine Mammal Survey and Assessment Methods Symposium, G.W. Garner, S.C. Amstrup, J.L. Laake, B.F.J. Manly, L.L. McDonald, and D.C. Robertson (Eds.), A.A. Balkema, Rotterdam, Netherlands, pp. 99–114, 1999] is used to illustrate the estimation of population size for populations with prominent nesting structures (i.e., bald eagle nests). In the context of a bald eagle population, the number of nests in a list frame corresponds to a pre-marked sample of nests, and an area frame corresponds to a set of transect strips that could be regularly monitored. Unlike previous methods based on dual-frame methodology using the screening estimator [Haines and Pollock (Journal of Environmental and Ecological Statistics, 5, 245–256, 1998a; Survey Methodology, 24(1), 79–88, 1998b)], we no longer need to assume that the area frame is complete (i.e., all the nests in the sampled sites do not need to be seen). One may use line transect sampling to estimate the probability of detection in a sampled area. Combining information from list and area frames provides more efficient estimators than those obtained by using data from only one frame. We derive an estimator for detection probability and generalize the screening estimator. A simulation study is carried out to compare the performance of the Chapman modification of the Lincoln–Petersen estimator to that of the screening estimator. Simulation results show that although the Chapman estimator is generally less precise than the screening estimator, the latter can be severely biased in the presence of uncertain detection. The screening estimator outperforms the Chapman estimator in terms of mean squared error when detection probability is near 1, whereas the Chapman estimator outperforms the screening estimator when detection probability is lower than a certain threshold value that depends on the particular scenario.
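The Chapman modification used as the comparison benchmark is simple enough to state in a few lines; the nest counts below are hypothetical, and the screening estimator and the detection-probability correction derived in the paper are not reproduced here.

```python
def chapman_estimate(n1, n2, m):
    """Chapman modification of the Lincoln-Petersen estimator, with the usual
    variance estimate.
    n1 : nests in the list frame (the pre-marked sample)
    n2 : nests detected in the area-frame (transect) survey
    m  : detected nests that were already on the list
    """
    n_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1
    var_hat = ((n1 + 1) * (n2 + 1) * (n1 - m) * (n2 - m)
               / ((m + 1) ** 2 * (m + 2)))
    return n_hat, var_hat

# Hypothetical bald eagle nest survey: 60 listed nests, 45 detected on transects,
# 30 of which were already on the list.
print(chapman_estimate(60, 45, 30))
```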

19.
Evolutionary improvements in Geographic Information Systems (GIS) now routinely allow the management and mapping of spatial-temporal information. In response, the development of statistical models to combine information of different types and spatial support is of vital importance to environmental science. In this paper we develop a hierarchical spatial statistical model for environmental indicators of stream and river systems in the United States Mid-Atlantic Region by combining information from separate monitoring surveys, available contextual information on hydrologic units and remote sensing information. These models are used to estimate the indicators throughout the riverine system based on information from multiple sources and aggregate scales. The analysis is based on information underlying the Landscape Atlas of the mid-Atlantic region produced by the US Environmental Monitoring and Assessment Program (EMAP). We also combine information from two overlapping separate monitoring surveys, the EMAP Stream and River Survey and the Maryland Biological Streams Survey. We present a general framework for comparative distributional analysis based on the concept of a relative spatial distribution. As an application, the spatial model is used to predict spatial distributions and relative spatial distributions for a watershed.
