Similar Articles
20 similar articles retrieved.
1.
The adaptive two-stage sequential sampling (ATSSS) design was developed to observe more rare units and to gain higher efficiency, in the sense of a smaller-variance estimator, than conventional equal-effort sampling designs for rare and spatially clustered populations. For certain rare populations, incorporating auxiliary variables into a sampling design can further improve the observation of rare units and increase efficiency. In this article, we develop regression-type estimators for ATSSS so that auxiliary variables can be incorporated into the ATSSS design when warranted. Simulation studies on two populations show that the regression-type estimators can significantly increase the efficiency of ATSSS and the detection of rare units compared with conventional sampling counterparts. Simulated sampling of desert shrubs in Inner Mongolia (one of the two populations studied) showed that incorporating a GIS auxiliary variable into ATSSS with the regression estimators resulted in a gain in efficiency over ATSSS without the auxiliary variable. Further, we found that using the GIS auxiliary variable in a conventional two-stage design with a regression estimator did not yield a gain in efficiency.
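For orientation, the classical regression estimator that such designs build on adjusts the sample mean of the survey variable by the estimated slope times the gap between the known population mean of the auxiliary variable and its sample mean. The sketch below is a minimal single-stage Python illustration with simulated data, not the design-weighted ATSSS estimator developed in the article; all names and numbers are hypothetical.

```python
import numpy as np

def regression_estimate(y, x, x_pop_mean):
    """Generic survey regression estimator of the population mean of y,
    using an auxiliary variable x whose population mean is known.
    (Single-stage sketch; the article's ATSSS version is design-weighted.)"""
    b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)   # sample regression slope
    return np.mean(y) + b * (x_pop_mean - np.mean(x))

# Hypothetical example: y correlated with a GIS-derived auxiliary variable x.
rng = np.random.default_rng(1)
x = rng.gamma(2.0, 1.0, size=40)             # auxiliary values on sampled units
y = 3.0 + 1.5 * x + rng.normal(0, 0.5, 40)   # survey variable
print(regression_estimate(y, x, x_pop_mean=2.0))
```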

2.
Misidentification of animals is potentially important when naturally existing features (natural tags) such as DNA fingerprints (genetic tags) are used to identify individual animals. For example, when misidentification leads to multiple identities being assigned to an animal, traditional estimators tend to overestimate population size. Accounting for misidentification in capture–recapture models requires detailed understanding of the mechanism. Using genetic tags as an example, we outline a framework for modeling the effect of misidentification in closed population studies when individual identification is based on natural tags that are consistent over time (non-evolving natural tags). We first assume a single sample is obtained per animal for each capture event, and then generalize to the case where multiple samples (such as hair or scat samples) are collected per animal per capture occasion. We introduce methods for estimating population size and, using a simulation study, we show that our new estimators perform well for cases with moderately high capture probabilities or high misidentification rates. In contrast, conventional estimators can seriously overestimate population size when errors due to misidentification are ignored.

3.
The maximum likelihood (ML) method for regression analysis of censored data (values below the detection limit) with nonlinear models is presented. The proposed ML method has been translated into an equivalent least squares method (ML-LS), and a two-stage iterative algorithm is proposed to estimate the statistical parameters from the derived least squares formulation. The algorithm is applied to a nonlinear model for predicting ambient air CO concentration from concentrations of respirable suspended particulate matter (RSPM) and NO2. It is shown that ignoring censored data, or replacing them through simplifications such as (i) setting censored values equal to the detection limit, (ii) setting them to half of the difference between the detection limit and the lower limit (e.g., zero or a background level), or (iii) setting them equal to the lower limit, can cause significant bias in the estimated parameters. The developed ML-LS method provided better parameter estimates than any of these simplifications.
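The censoring idea can be made concrete with a small sketch: detected values contribute the normal density to the likelihood, while nondetects contribute the normal CDF evaluated at the detection limit. The example below fits a simple linear model with left-censored normal errors via scipy; it is a minimal illustration, not the paper's nonlinear CO model or its ML-LS translation, and the data are simulated.

```python
import numpy as np
from scipy import optimize, stats

def censored_negloglik(params, x, y, detected, dl):
    """Negative log-likelihood for a linear model with normal errors when some
    observations are only known to lie below a detection limit dl (left-censored)."""
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)
    mu = b0 + b1 * x
    ll_det = stats.norm.logpdf(y[detected], mu[detected], sigma).sum()
    ll_cens = stats.norm.logcdf((dl[~detected] - mu[~detected]) / sigma).sum()
    return -(ll_det + ll_cens)

# Hypothetical data: predictor x, response y, with values below dl censored.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y_true = 0.5 + 0.3 * x + rng.normal(0, 0.4, 100)
dl = np.full(100, 1.5)                   # detection limit
detected = y_true >= dl
y = np.where(detected, y_true, np.nan)   # censored values are unobserved
fit = optimize.minimize(censored_negloglik, x0=[0.0, 0.0, 0.0],
                        args=(x, y, detected, dl), method="Nelder-Mead")
print(fit.x[:2], np.exp(fit.x[2]))       # intercept, slope, error SD
```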

4.

For many clustered populations, prior information exists for an initial stratification, but the exact pattern of population concentration may not be predictable. In this situation, stratified adaptive cluster sampling (SACS) may provide more efficient estimates than other conventional sampling designs for estimating the parameters of rare and clustered populations. For practical use, we propose a generalized ratio estimator with a single auxiliary variable under the SACS design. Expressions for the approximate bias and mean squared error (MSE) of the proposed estimator are derived. Numerical studies compare the performance of the proposed generalized estimator with the usual mean and combined ratio estimators under conventional stratified random sampling (StRS), using a real population of redwood trees in California and an artificial population generated by a Poisson cluster process. Simulation results show that the proposed class of estimators may provide more efficient results than the other estimators considered in this article for estimating highly clumped populations.
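For reference, the combined ratio estimator used as a benchmark under StRS can be sketched as follows; this is the conventional estimator, not the proposed generalized SACS estimator, and the stratum data and weights are hypothetical.

```python
import numpy as np

def combined_ratio_estimate(strata, W, X_pop_mean):
    """Classical combined ratio estimator of the population mean under stratified
    random sampling: R_hat = sum_h W_h*ybar_h / sum_h W_h*xbar_h,
    then mean estimate = R_hat * X_pop_mean.
    `strata` is a list of (y_sample, x_sample) arrays; W are stratum weights."""
    ybar = sum(w * np.mean(y) for w, (y, _) in zip(W, strata))
    xbar = sum(w * np.mean(x) for w, (_, x) in zip(W, strata))
    return (ybar / xbar) * X_pop_mean

# Hypothetical two-stratum example with a known auxiliary population mean.
rng = np.random.default_rng(7)
strata = [(rng.poisson(4, 30).astype(float), rng.poisson(8, 30).astype(float) + 1),
          (rng.poisson(1, 20).astype(float), rng.poisson(2, 20).astype(float) + 1)]
print(combined_ratio_estimate(strata, W=[0.6, 0.4], X_pop_mean=5.0))
```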


5.
6.
Hierarchical models are considered for estimating the probability of agreement between two outcomes or endpoints from an environmental toxicity experiment. Emphasis is placed on generalized regression models, under which the prior mean is related to a linear combination of explanatory variables via a monotone function; this function defines the scale over which the systematic effects are modelled as additive. Specific illustration is provided for the logistic link function. The hierarchical model employs a conjugate beta prior that leads to parametric empirical Bayes estimators of the individual agreement parameters. An example from environmental carcinogenesis illustrates the methods, motivated by estimation of the concordance between carcinogenicity outcomes in two species. Based on a large database of carcinogenicity studies, the inter-species concordance is seen to be reasonably informative, i.e. in the range 67–84%. Stratification into pertinent potency-related sub-groups via the logistic model improves concordance estimation: for environmental stimuli at the extremes of the potency spectrum, concordance can reach well above 90%.
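A minimal version of the empirical Bayes step, without the logistic regression structure the paper emphasizes, estimates a common Beta(a, b) prior from the marginal beta-binomial likelihood and then shrinks each group's observed agreement proportion toward the prior mean. The counts below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import betaln, gammaln

def beta_binom_negloglik(log_ab, y, n):
    """Marginal (beta-binomial) negative log-likelihood for agreement counts y
    out of n, with a shared Beta(a, b) prior on the agreement probability."""
    a, b = np.exp(log_ab)
    log_choose = gammaln(n + 1) - gammaln(y + 1) - gammaln(n - y + 1)
    return -np.sum(log_choose + betaln(y + a, n - y + b) - betaln(a, b))

# Hypothetical agreement counts for several groups of studies.
y = np.array([14, 9, 18, 6, 11])     # studies where the two endpoints agree
n = np.array([20, 12, 22, 10, 15])   # total studies per group
fit = minimize(beta_binom_negloglik, x0=[0.0, 0.0], args=(y, n), method="Nelder-Mead")
a, b = np.exp(fit.x)
posterior_means = (y + a) / (n + a + b)   # empirical Bayes estimates, shrunk toward a/(a+b)
print(a, b, posterior_means)
```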

7.
Adaptive cluster sampling (ACS) has received much attention in recent years because it yields more precise estimates than conventional sampling designs when applied to rare and clustered populations. These results, however, depend on the availability of prior knowledge about the spatial distribution and absolute abundance of the population under study. This prior information helps the researcher select a suitable critical value to trigger the adaptive search, the neighborhood definition and the initial sample size. A poorly configured ACS design worsens the performance of the adaptive estimators. In particular, one of the greatest weaknesses of ACS is the inability to control the final sampling effort if, for example, the critical value is set too low. To overcome this drawback one can use ACS with clusters selected without replacement, where the number of distinct clusters to be selected can be fixed in advance, or ACS with a stopping rule, which stops the adaptive sampling when a predetermined sample-size limit is reached or when a given stopping condition is met. However, the stopping rule breaks the theoretical basis for the unbiasedness of the ACS estimators, introducing an unknown amount of bias into the estimates. The current study improves the performance of ACS when applied to patchy and clustered but not rare populations and/or less clustered populations. This is done by combining the stopping rule with ACS without replacement of clusters so as to further limit the sampling effort, in the form of traveling expenses, by avoiding repeat observations and by reducing the final sample size. The performance of the proposed design is investigated using simulated and real data.

8.
There are many established extraction techniques regularly used in the isolation and analysis of PAHs and similar organic compounds from various phases. These include Soxhlet or ultrasonic extraction from solids, and liquid-liquid or solid-phase extraction from aqueous samples. However, these methods have some inherent disadvantages: most require large volumes of organic solvents, they can be time-consuming, and many involve multi-step processes that always present the risk of losing some analytes (Zhang et al., 1994). Solid-phase micro-extraction (SPME) is a relatively new technique that has been used with much success in the analysis of a variety of compounds, including PAHs. Experiments are being carried out to determine the optimum range of conditions for the extraction of a range of PAHs. Parameters under investigation include temperature, equilibration time, salinity and compound concentration. Presented here are some preliminary experiments on the applicability of SPME for PAH analysis. Further work will investigate the reproducibility of the technique, limits of detection and matrix effects. Once an optimised method has been developed, the technique will be used to investigate PAH profiles in sediment cores.

9.
The limitations of traditional zooplankton grazing rate equations were analysed, and the relative advantages of taking time-series measurements or single end-point measurements of grazing rate were examined. For zooplankters with variable feeding rates, the time-series approach is the only acceptable method. Using end-point measurements to calculate feeding rates results in significant error if the clearance rate changes or feeding ceases during the experiment, i.e. when the grazing coefficient is not constant, as is assumed in the clearance rate equations. The use of time-series measurements is particularly important above the critical concentration for a saturated ingestion rate. The functional response plot of ingestion rate versus mean cell concentration is statistically inappropriate and should be modified to avoid compounded variables appearing on both axes of the plot.
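For context, the conventional single end-point clearance and ingestion calculations being critiqued (following Frost's widely used formulation, which assumes a constant grazing coefficient over the whole incubation) can be sketched as follows; the incubation numbers are hypothetical.

```python
import numpy as np

def endpoint_grazing(c1, c2, c1_ctrl, c2_ctrl, t, volume, n_animals):
    """Single end-point clearance and ingestion rates, assuming exponential cell
    growth (rate k, estimated from the control bottle) and a constant grazing
    coefficient g over the incubation of length t (the assumption the abstract
    criticizes when feeding varies or stops)."""
    k = np.log(c2_ctrl / c1_ctrl) / t                # algal growth rate, control bottle
    g = k - np.log(c2 / c1) / t                      # grazing coefficient, grazed bottle
    clearance = volume * g / n_animals               # volume swept clear per animal per unit time
    c_mean = c1 * (np.exp((k - g) * t) - 1.0) / ((k - g) * t)   # time-averaged cell concentration
    ingestion = clearance * c_mean                   # cells ingested per animal per unit time
    return clearance, ingestion

# Hypothetical 24-h incubation in a 1000-mL bottle with 5 copepods (cells mL^-1).
print(endpoint_grazing(c1=5000.0, c2=3200.0, c1_ctrl=5000.0, c2_ctrl=6100.0,
                       t=24.0, volume=1000.0, n_animals=5))
```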

10.
When sample observations are expensive or difficult to obtain, ranked set sampling is known to be an efficient method for estimating the population mean, and in particular to improve on the sample mean estimator. Using best linear unbiased estimators, this paper considers the simple linear regression model with replicated observations. Use of a form of ranked set sampling is shown to be markedly more efficient for normal data when compared with the traditional simple linear regression estimators.
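The basic efficiency gain of ranked set sampling can be seen in a small simulation: for each rank, a set of units is drawn, only the unit with that rank is measured, and the measured values are averaged. The sketch below assumes perfect ranking and normal data and compares the RSS mean with a simple random sample mean of the same size; it illustrates the general RSS idea rather than the paper's regression-specific BLUE estimators.

```python
import numpy as np

rng = np.random.default_rng(42)

def rss_mean(set_size, cycles):
    """One balanced ranked set sample mean: for each rank r, draw `set_size`
    units, keep only the r-th smallest (perfect ranking), repeat for each cycle."""
    kept = [np.sort(rng.normal(10, 2, set_size))[r]
            for _ in range(cycles) for r in range(set_size)]
    return np.mean(kept)

def srs_mean(n):
    return np.mean(rng.normal(10, 2, n))

m, k = 4, 5                        # set size 4, 5 cycles -> 20 measured units
rss = [rss_mean(m, k) for _ in range(2000)]
srs = [srs_mean(m * k) for _ in range(2000)]
print(np.var(rss), np.var(srs))    # RSS variance is markedly smaller for normal data
```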

11.
Thompson (1990) introduced the adaptive cluster sampling design. This sampling design has been shown to be a useful method for parameter estimation in clustered and scattered populations (Roesch, 1993; Smith et al., 1995; Thompson and Seber, 1996). Two estimators, the modified Hansen-Hurwitz (HH) and Horvitz-Thompson (HT) estimators, are available for estimating the mean or total of a population. Empirical results from previous research indicate that the modified HT estimator has smaller variance than the modified HH estimator. We analytically compare the properties of these two estimators. Some results are obtained in favor of the modified HT estimator, so practitioners are strongly recommended to use the HT estimator despite the computational simplicity of the HH estimator.
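The two estimators, as usually written for ACS with an initial simple random sample without replacement, can be sketched as follows: the modified HH estimator averages the network means attached to the initial units, while the modified HT estimator sums the distinct network totals divided by their intersection probabilities. Edge units are ignored for brevity, and the network data below are hypothetical.

```python
from math import comb

def modified_hh(network_means):
    """Modified Hansen-Hurwitz estimator of the population mean: the average,
    over the n1 initial units, of the mean of the network each initial unit belongs to."""
    return sum(network_means) / len(network_means)

def modified_ht(distinct_networks, N, n1):
    """Modified Horvitz-Thompson estimator of the population mean.
    `distinct_networks` lists each distinct intersected network as (y_total, size)."""
    total = 0.0
    for y_total, x in distinct_networks:
        alpha = 1.0 - comb(N - x, n1) / comb(N, n1)   # P(network intersected by the initial sample)
        total += y_total / alpha
    return total / N

# Hypothetical example with N = 400 units and an initial SRSWOR of n1 = 10 units.
# Per-initial-unit network means (zeros are units that fell in empty, size-1 networks):
print(modified_hh([0.0, 0.0, 12.5, 0.0, 3.2, 0.0, 0.0, 12.5, 0.0, 0.0]))
# Distinct non-empty networks intersected, as (network y-total, network size);
# empty networks contribute nothing to the sum.
print(modified_ht([(50.0, 4), (16.0, 5)], N=400, n1=10))
```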

12.
On parametric estimation of population abundance for line transect sampling
Despite recent advances in nonparametric methods for estimating animal abundance, parametric methods are still widely used by biometricians because of their simplicity. In this paper, we propose an optimal shrinkage-type estimator and an empirical Bayes estimator for estimating animal density from line transect sampling data. The performance of the proposed estimators is compared with that of the maximum likelihood estimator and a bias-corrected maximum likelihood estimator, both theoretically and numerically. Simulation results show that the optimal shrinkage-type estimator works best when the detection function has a very thin tail (for example, the half-normal detection function), while the maximum likelihood estimator is best when the detection function has a relatively thick tail (for example, the polynomial detection function).
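For orientation, the plain maximum likelihood estimator under a half-normal detection function has a closed form and is sketched below with simulated distances; this is the baseline estimator, not the shrinkage-type or empirical Bayes estimators proposed in the paper.

```python
import numpy as np

def halfnormal_density(perp_distances, total_line_length):
    """Line-transect density via the ML estimate under a half-normal detection
    function g(x) = exp(-x^2 / (2*sigma^2)):
    sigma^2_hat = mean(x^2), f(0) = sqrt(2 / (pi*sigma^2_hat)),
    D_hat = n * f(0) / (2 * L)."""
    x = np.asarray(perp_distances, dtype=float)
    sigma2 = np.mean(x ** 2)                  # ML estimate of sigma^2
    f0 = np.sqrt(2.0 / (np.pi * sigma2))      # pdf of perpendicular distances at 0
    return len(x) * f0 / (2.0 * total_line_length)

# Simulated example: true sigma = 20 m, 80 detections along 10 km of transect.
rng = np.random.default_rng(3)
distances = np.abs(rng.normal(0.0, 20.0, size=80))                 # metres
print(halfnormal_density(distances, total_line_length=10_000.0))   # animals per m^2
```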

13.
This paper describes the quantitative determination of F, Cl, Br, Cd and Pb in plastic materials. The concentrations of Cl, Br, Cd and Pb are first measured semi-quantitatively by X-ray fluorescence spectrometry (XRF) directly in the solid sample, with a detection limit of approximately 10 μg/g. Afterwards, F and any of the other elements that exceed the limit values for materials disposable without special precautions are measured after digestion of the material. The samples are digested either under pressure in an oxygen atmosphere (for F, Cl and Br) or under pressure with nitric acid (for Cd and Pb). The digestion converts the halides to anions, which are measured potentiometrically (F, Br) or by ion chromatography (Cl). Cd and Pb are measured by graphite furnace atomic absorption spectrometry (GF-AAS). The determination limits achieved are 20 μg/g for F and Br, 250 μg/g for Cl, 0.01 μg/g for Cd and 0.2 μg/g for Pb, all below the limit values set by current regulations in Switzerland.

14.
Rao-Blackwellization is used to improve the unbiased Hansen-Hurwitz and Horvitz-Thompson estimators in adaptive cluster sampling by taking the conditional expectation of the original estimators given the sufficient or minimal sufficient statistic. In principle, the same idea can be used to find better ratio estimators; however, the calculation that takes all possible combinations into account can be extremely tedious in practice, and simplified analytical forms of such ratio estimators are not currently available. For practical use, several improved ratio estimators in adaptive cluster sampling are proposed in this article. The proposed ratio estimators are not true Rao-Blackwellized versions of the original ones, but they make use of the Rao-Blackwellized univariate estimators. The calculation of the proposed estimators is illustrated, and their performance is evaluated using both a bivariate Poisson clustered process and a real data set. The simulation results indicate that the proposed improved ratio estimators provide considerably better estimates than the original ones.

15.
Soil metaproteomics holds broad promise for revealing the functions and metabolism of soil microorganisms and their interactions with the environment, but because of the particular nature of soil samples, the protein extraction step is one of the bottlenecks limiting its large-scale application. This paper reviews research progress in soil protein extraction with respect to sample preparation, extraction methods and influencing factors. In general, good extraction results are obtained only when the fractionation and collection strategy is designed according to the experimental aims, the protein types of interest and the downstream analytical methods. Total soil protein, intracellular protein and extracellular protein each call for different extraction methods: total protein is usually obtained by direct extraction, extracellular protein does not require cell lysis, and intracellular protein can be obtained by direct or indirect extraction. The methods used for lysis, concentration and humic substance removal, as well as the choice of extraction buffer and pH, also affect extraction efficiency. The applications of soil metaproteomics are briefly introduced, and prospects for future research are outlined.

16.
Many simulation studies have examined the properties of distance sampling estimators of wildlife population size. When the assumptions hold, and distances are generated from a detection model and fitted using the same model, the estimators are known to perform well. In practice, however, the true model is unknown, so standard practice includes model selection, typically using model comparison tools such as Akaike's Information Criterion (AIC). Here we examine the performance of standard distance sampling estimators under model selection. We compare line and point transect estimators with distances simulated from two detection functions, hazard-rate and exponential power series (EPS), over a range of sample sizes. To mimic the real-world situation where the true model may not be part of the candidate set, EPS models were not included as candidates, except for the half-normal parameterization. We found that median bias depended on sample size (estimators being asymptotically unbiased) and on the form of the true detection function: negative bias (up to 15% for line transects and 30% for point transects) when the shoulder of maximum detectability was narrow, and positive bias (up to 10% for line transects and 15% for point transects) when it was wide. Generating unbiased simulations requires careful choice of the detection function or very large datasets. Practitioners should collect data that yield detection functions with a shoulder similar to the half-normal and use the monotonicity constraint. Narrow-shouldered detection functions can be avoided through good field procedures, and those with a wide shoulder are unlikely to occur because of heterogeneity in detectability.
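A generic sketch of the model-selection step, fitting half-normal and hazard-rate detection functions to line-transect distances by maximum likelihood and choosing between them by AIC, is given below; it is a simplified illustration under assumed parameter bounds and simulated data, not the paper's simulation design or its EPS generating models.

```python
import numpy as np
from scipy import integrate, optimize

# Candidate detection functions g(x), with g(0) = 1, for perpendicular distances
# truncated at w.
def g_halfnorm(x, sigma):
    return np.exp(-x ** 2 / (2.0 * sigma ** 2))

def g_hazard(x, sigma, b):
    return 1.0 - np.exp(-(np.maximum(x, 1e-12) / sigma) ** (-b))

def negloglik(gfun, params, x, w):
    mu = integrate.quad(lambda u: float(gfun(u, *params)), 0.0, w)[0]  # integral of g over [0, w]
    return -(np.sum(np.log(np.maximum(gfun(x, *params), 1e-300))) - len(x) * np.log(mu))

rng = np.random.default_rng(11)
w = 100.0
x = np.abs(rng.normal(0.0, 25.0, 150))        # simulated perpendicular distances (m)
x = x[x <= w]

fits = {
    "half-normal": optimize.minimize(lambda p: negloglik(g_halfnorm, p, x, w),
                                     x0=[20.0], bounds=[(1.0, 500.0)]),
    "hazard-rate": optimize.minimize(lambda p: negloglik(g_hazard, p, x, w),
                                     x0=[20.0, 2.5], bounds=[(1.0, 500.0), (1.0, 20.0)]),
}
aic = {name: 2 * len(f.x) + 2 * f.fun for name, f in fits.items()}
print(aic, "selected:", min(aic, key=aic.get))
```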

17.
A new spatially balanced sampling design for environmental surveys is introduced, called Halton iterative partitioning (HIP). The design draws sample locations that are well spread over the study area. Spatially balanced designs are known to be efficient when surveying natural resources because nearby locations tend to be similar. The HIP design uses structural properties of the Halton sequence to partition a resource into nested boxes. Sample locations are then drawn from specific boxes in the partition to ensure spatial diversity. The method is conceptually simple and computationally efficient, draws spatially balanced samples in two or more dimensions and uses standard design-based estimators. Furthermore, HIP samples have an implicit ordering that can be used to define spatially balanced over-samples. This feature is particularly useful when sampling natural resources because we can dynamically add spatially balanced units from the over-sample to the sample as non-target or inaccessible units are discovered. We use several populations to show that HIP sampling draws spatially balanced samples and gives precise estimates of population totals.
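The Halton sequence that underlies HIP can be sketched as follows; this generates well-spread points in the unit square and scales them to a hypothetical study extent. It is only a building block, not the full iterative partitioning and nested-box selection of the design.

```python
def halton(index, base):
    """Radical-inverse (van der Corput) value of `index` in the given base."""
    result, f = 0.0, 1.0 / base
    while index > 0:
        result += f * (index % base)
        index //= base
        f /= base
    return result

def halton_points(n, bases=(2, 3)):
    """First n points of the 2-D Halton sequence: well spread over [0,1)^2."""
    return [tuple(halton(i, b) for b in bases) for i in range(1, n + 1)]

# Scale the unit-square points to a rectangular study area (hypothetical extent).
xmin, xmax, ymin, ymax = 0.0, 5000.0, 0.0, 3000.0
sample = [(xmin + u * (xmax - xmin), ymin + v * (ymax - ymin))
          for u, v in halton_points(10)]
print(sample)
```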

18.
Many environmental sampling problems involve a specified regulatory or contractual limit (RL). Often the interest is in estimating the percentile of the underlying contaminant concentration distribution corresponding to the RL. The focus of this paper is on obtaining a point estimate and a lower confidence limit for that percentile when all observations are nondetects, with the i-th observation known to be less than some detection limit DLi, where DLi ≤ RL. Since composite samples are being considered, it is not unreasonable to assume an underlying normal distribution.

19.
Geostatistics is a set of statistical techniques increasingly used to characterize spatial dependence in spatially referenced ecological data. A common feature of geostatistics is the prediction of values at unsampled locations from nearby samples using the kriging algorithm. Modeling the spatial dependence in the sampled data is necessary before kriging and is usually accomplished with the variogram and its traditional estimator. Other types of estimators, known as non-ergodic estimators, have been used in ecological applications. Non-ergodic estimators were originally suggested as the method of choice when sampled data are preferentially located and exhibit a skewed frequency distribution. Preferentially located samples can occur, for example, when areas with high values are sampled more intensively than other areas. Earlier studies compared the visual appearance of variograms from traditional and non-ergodic estimators; here we evaluate the estimators' relative performance in prediction. We also show algebraically that a non-ergodic version of the variogram is equivalent to the traditional variogram estimator. Simulations designed to investigate the effects of data skewness and preferential sampling on variogram estimation and kriging showed that the traditional variogram estimator outperforms the non-ergodic estimators under these conditions. We also analyzed data on carabid beetle abundance, which exhibited large-scale spatial variability (trend) and a skewed frequency distribution. Detrending the data followed by robust estimation of the residual variogram is demonstrated to be a successful alternative to the non-ergodic approach.
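The traditional estimator referred to here is Matheron's method-of-moments variogram: half the average squared difference between pairs of values separated by a given lag. A minimal sketch with simulated gridded data follows.

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Traditional (Matheron) variogram estimator:
    gamma(h) = 1/(2*N(h)) * sum over pairs in lag bin h of (z_i - z_j)^2."""
    coords, values = np.asarray(coords, float), np.asarray(values, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)            # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(sq[mask].mean() / 2.0 if mask.any() else np.nan)
    return np.array(gamma)

# Hypothetical spatially autocorrelated data on a small grid.
rng = np.random.default_rng(5)
xy = np.array([(i, j) for i in range(15) for j in range(15)], dtype=float)
z = np.sin(xy[:, 0] / 4.0) + np.cos(xy[:, 1] / 3.0) + rng.normal(0, 0.2, len(xy))
print(empirical_variogram(xy, z, bin_edges=np.arange(0, 11, 2)))
```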

20.
Closed capture-recapture (CR) estimators have been used extensively to estimate population size. Most closed CR approaches have been developed and evaluated for discrete-time models, and there has been little effort to evaluate their continuous-time counterparts. Continuous-time estimators, developed using maximum likelihood theory by Craig (1953) and Darroch (1958) and martingale theory by Becker (1984), that allow capture probabilities to vary over time were evaluated using Monte Carlo simulation. Overall, the ML estimators had the smaller MSE. The estimators performed well when the model assumptions were upheld and were somewhat robust to heterogeneity in capture probabilities; however, they were not robust to behavioural effects in the capture probabilities. Time-lag effects (periods when animals might be unavailable for immediate recapture) on continuous-time estimates were also investigated, and the results indicated a positive bias that was greater for smaller populations. There was no gain in performance when using a continuous-time estimator rather than a discrete-time estimator on the same simulated data. Usefulness of the continuous-time approach may be limited to study designs where animals are easier to sample using continuous-time methodology.
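For intuition only, one simple continuous-time formulation assumes every animal is captured according to the same homogeneous Poisson process; with n total captures of r distinct animals, the (approximate) MLE of N then solves r = N(1 - exp(-n/N)). The sketch below solves this equation numerically for hypothetical counts; it is not Becker's martingale estimator or the exact likelihoods evaluated in the paper.

```python
import numpy as np
from scipy.optimize import brentq

def poisson_ct_mle(n_captures, r_distinct):
    """Approximate MLE of population size N under a simple continuous-time model
    in which every animal is captured by the same homogeneous Poisson process:
    N solves r = N * (1 - exp(-n / N)). Requires n > r."""
    f = lambda N: N * (1.0 - np.exp(-n_captures / N)) - r_distinct
    return brentq(f, a=r_distinct + 1e-6, b=1e7)

# Hypothetical study: 250 captures of 180 distinct animals.
print(round(poisson_ct_mle(250, 180), 1))
```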
