Similar Documents (20 found)
1.
The statistical analysis of environmental data from remote sensing and Earth system simulations often entails the analysis of gridded spatio-temporal data, with a hypothesis test being performed for each grid cell. When the whole image or a set of grid cells are analyzed for a global effect, the problem of multiple testing arises. When no global effect is present, we expect α% of all grid cells to be false positives, and spatially autocorrelated data can give rise to clustered spurious rejections that can be misleading in an analysis of spatial patterns. In this work, we review standard solutions for the multiple testing problem and apply them to spatio-temporal environmental data. These solutions are independent of the test statistic, and any test statistic can be used (e.g., tests for trends or change points in time series). Additionally, we introduce permutation methods and show that they have more statistical power. Real-world data are used to provide examples of the analysis, and the performance of each method is assessed in a comprehensive simulation study that, unlike previous studies, compares the statistical power of all the presented methods. In conclusion, we present several statistically rigorous methods for analyzing spatio-temporal environmental data while controlling the false positives. These methods allow the use of any test statistic in a wide range of applications in environmental sciences and remote sensing.
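A minimal sketch of one standard multiple-testing correction such a per-grid-cell analysis might apply — the Benjamini-Hochberg step-up procedure for false discovery rate control. The grid size and p-values below are invented for illustration, not from the paper:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of rejected hypotheses under BH FDR control."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k with p_(k) <= (k / m) * alpha, then reject ranks 1..k
    thresholds = alpha * (np.arange(1, m + 1) / m)
    below = ranked <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[: k + 1]] = True
    return reject

# Example: a 10 x 10 grid of per-cell p-values, mostly null, three real effects
rng = np.random.default_rng(0)
pgrid = rng.uniform(size=(10, 10))
pgrid[0, :3] = [1e-6, 1e-5, 1e-4]   # three genuine signals
rejected = benjamini_hochberg(pgrid.ravel()).reshape(pgrid.shape)
```

Applied to a raveled grid, the mask can be reshaped back to map which cells survive the correction, which is what keeps clustered spurious rejections from dominating a spatial-pattern analysis.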

2.
The objective of mutagenicity assays in regulatory toxicology is the decision on non-mutagenicity or mutagenicity. An inherent problem of statistical tests is the possibility of false decisions: a mutagenic substance may be falsely labeled as non-mutagenic, or a non-mutagenic substance may be falsely labeled as mutagenic. These probabilities of a false negative (consumer's risk = type II error) and/or a false positive decision (producer's risk = type I error) can be limited by using suitable testing procedures as well as a design that includes an appropriate positive control. Under the proof-of-hazard concept, the well-known many-to-one procedures with total order restriction for increasing effect differences are used; under the proof-of-safety concept, equivalence procedures with total order restriction are discussed. Both approaches are demonstrated on a real data example.

3.
Testing electrophoretic data for agreement with Hardy-Weinberg expectations
H. A. Lessios 《Marine Biology》1992,112(3):517-523
Electrophoretic data from marine organisms are routinely tested for conformity to expectations of the Hardy-Weinberg rule, but the statistical procedures used and the interpretation of their results are often flawed. This paper summarizes the literature on statistical testing for Hardy-Weinberg equilibrium and suggests an analytical strategy based on carrying out computationally simple goodness-of-fit χ² tests (with pooling and correction factors for continuity if necessary) when appropriate, and resorting to computationally tedious exact tests when necessary. It recommends adjustments of significance levels to avoid the large Type-I error that may result from multiple tests for Hardy-Weinberg equilibrium, one for each locus and each population. It points out the obvious but common error of interpreting non-significant tests as evidence of conformity to Hardy-Weinberg expectations, and makes suggestions as to how tests that produce significance can be used to reach conclusions of biological relevance.
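The goodness-of-fit test discussed here can be sketched for the simplest case, a single locus with two alleles. With three genotype classes and one allele frequency estimated from the data, the statistic has one degree of freedom, so the 5% critical value is about 3.84. The genotype counts below are invented for illustration:

```python
def hwe_chi_square(n_AA, n_Aa, n_aa):
    """Chi-square goodness-of-fit statistic for Hardy-Weinberg proportions
    at a two-allele locus (1 degree of freedom after estimating p)."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)          # estimated frequency of allele A
    q = 1 - p
    expected = (n * p * p, 2 * n * p * q, n * q * q)
    observed = (n_AA, n_Aa, n_aa)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

in_hwe = hwe_chi_square(360, 480, 160)               # matches HW proportions
excess_homozygotes = hwe_chi_square(450, 300, 250)   # heterozygote deficit
```

When many such tests are run (one per locus and population, as the paper warns), the per-test significance level should be adjusted, e.g., with a Bonferroni or FDR correction.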

4.
Peres-Neto PR  Legendre P  Dray S  Borcard D 《Ecology》2006,87(10):2614-2625
Establishing relationships between species distributions and environmental characteristics is a major goal in the search for forces driving species distributions. Canonical ordinations such as redundancy analysis and canonical correspondence analysis are invaluable tools for modeling communities through environmental predictors. They provide the means for conducting direct explanatory analysis in which the association among species can be studied according to their common and unique relationships with the environmental variables and other sets of predictors of interest, such as spatial variables. Variation partitioning can then be used to test and determine the likelihood of these sets of predictors in explaining patterns in community structure. Although variation partitioning in canonical analysis is routinely used in ecological analysis, no effort has been reported in the literature to consider appropriate estimators so that comparisons between fractions or, eventually, between different canonical models are meaningful. In this paper, we show that variation partitioning as currently applied in canonical analysis is biased. We present appropriate unbiased estimators. In addition, we outline a statistical test to compare fractions in canonical analysis. The question addressed by the test is whether two fractions of variation are significantly different from each other. Such assessment provides an important step toward attaining an understanding of the factors patterning community structure. The test is shown to have correct Type I error rates and good power for both redundancy analysis and canonical correspondence analysis.

5.
The spatial scan statistic is a widely applied tool for cluster detection. It evaluates the significance of a series of potential circular clusters using Monte Carlo simulation to account for the multiplicity of comparisons. In most settings, the extent of the multiplicity problem varies across the study region. For example, urban areas typically have many overlapping clusters, while rural areas have few. The spatial scan statistic does not account for these local variations in the multiplicity problem. We propose two new spatially-varying multiplicity adjustments for spatial cluster detection, one based on a nested Bonferroni adjustment and one based on local averaging. Geographic variations in power for the spatial scan statistic and the two new statistics are explored through simulation studies, and the methods are applied to both the well-known New York leukemia data and data from a case–control study of breast cancer in Wisconsin.
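A minimal sketch of how Monte Carlo simulation handles the multiplicity of overlapping scan windows. This assumes a 1-D study region and uses a plain observed/expected ratio as the window statistic rather than the likelihood-ratio form used in practice; all counts are invented:

```python
import numpy as np

def scan_mc_pvalue(counts, expected, max_width=3, n_sims=999, seed=5):
    """Monte Carlo p-value for a simple 1-D scan statistic: the maximum
    observed/expected ratio over all windows of up to max_width regions.
    Simulating whole null datasets accounts for the many overlapping windows."""
    rng = np.random.default_rng(seed)
    expected = np.asarray(expected, float)

    def max_ratio(c):
        best = 0.0
        for w in range(1, max_width + 1):
            for i in range(len(c) - w + 1):
                best = max(best, c[i:i + w].sum() / expected[i:i + w].sum())
        return best

    observed = max_ratio(np.asarray(counts, float))
    exceed = sum(max_ratio(rng.poisson(expected).astype(float)) >= observed
                 for _ in range(n_sims))
    return (exceed + 1) / (n_sims + 1)

expected = np.full(20, 5.0)                          # null: Poisson(5) per region
counts = np.full(20, 5)
counts[8:11] = [18, 20, 17]                          # one strong cluster
p_cluster = scan_mc_pvalue(counts, expected)
```

Because the maximum is taken over all windows in both the observed and the simulated datasets, the p-value is automatically adjusted for the multiple overlapping comparisons.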

6.
Statistical Power of Presence-Absence Data to Detect Population Declines
Abstract: Population declines may be inferred from a decrease in the number of sites at which a species is detected. Although such presence-absence data often are interpreted informally, it is simple to test the statistical significance of changes in the number of sites occupied by a species. I used simulations to examine the statistical power (i.e., the probability of avoiding the Type II error of concluding that no population decline has occurred when the population actually has declined) of presence-absence designs. Most presence-absence designs have low power to detect declines of < 20–50% in populations but have adequate power to detect steeper declines. Power was greater if the population disappeared entirely from a subset of formerly occupied sites than if it declined evenly over its entire range. Power also rose with (1) increases in the number of sites surveyed; (2) increases in population density or sampling effort at a site; and (3) decreases in spatial variance in population density. Because of potential problems with bias and inadequate power, presence-absence designs should be used and interpreted cautiously.
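The kind of power simulation described can be sketched as follows, assuming Poisson-distributed counts per site, detection whenever the count is positive, and a one-sided two-proportion z-test; all parameter values are illustrative, not the paper's:

```python
import numpy as np

def presence_absence_power(n_sites=50, density=1.0, decline=0.5,
                           n_sims=2000, seed=1):
    """Estimate the power of a presence-absence survey to detect a uniform
    population decline, with Poisson counts at each site and 'presence'
    meaning at least one individual was counted."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        before = rng.poisson(density, n_sites) > 0
        after = rng.poisson(density * (1 - decline), n_sites) > 0
        # One-sided two-proportion z-test on the fraction of occupied sites
        p1, p2 = before.mean(), after.mean()
        pooled = (before.sum() + after.sum()) / (2 * n_sites)
        se = np.sqrt(2 * pooled * (1 - pooled) / n_sites)
        if se > 0 and (p1 - p2) / se > 1.645:    # alpha = 0.05, one-sided
            hits += 1
    return hits / n_sims

power_steep = presence_absence_power(decline=0.8)
power_mild = presence_absence_power(decline=0.1)
```

Consistent with the abstract's finding, the simulated design has high power for a steep (80%) decline and low power for a mild (10%) one.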

7.
T. Brey 《Marine Biology》1990,106(3):503-508
In field studies, somatic production of animals is often calculated by means of the increment summation method, which is based on consecutive samples from the population. The main disadvantage of this method is the lack of any measurement of variability; therefore, the statistical significance of the calculated production value is uncertain. This paper shows that in many cases a nonparametric statistical approach called the "bootstrap" can be used to overcome this problem. By means of this procedure, the natural variability of production and production-to-biomass ratios can be assessed via 95% confidence intervals, standard deviations or related parameters from a sample of limited size.
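A sketch of the percentile bootstrap the paper advocates. The production values and their units are invented for illustration; the statistic passed in could equally be a production-to-biomass ratio:

```python
import numpy as np

def bootstrap_ci(data, statistic=np.mean, n_boot=5000, ci=95, seed=7):
    """Percentile bootstrap confidence interval for any statistic of a sample:
    resample with replacement, recompute the statistic, take percentiles."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    reps = np.array([statistic(rng.choice(data, size=data.size, replace=True))
                     for _ in range(n_boot)])
    lo, hi = np.percentile(reps, [(100 - ci) / 2, 100 - (100 - ci) / 2])
    return lo, hi

# Hypothetical per-sample production estimates (arbitrary units)
production = [4.1, 5.3, 3.8, 6.0, 4.7, 5.1, 4.4, 5.8, 3.9, 5.5]
lo, hi = bootstrap_ci(production)
```

The appeal of the method is exactly what the abstract notes: it attaches an interval to a quantity (here the mean) without any distributional assumption, even from a sample of limited size.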

8.
It is well known that the total metal content in soils is not a good indicator of their harmful effects, leading to an overestimation of risks. Toxicological and environmental hazards depend on the chemical species and on its bioavailability to target organisms. Because a good estimation of bioavailability is difficult, a good compromise is to assess bioaccessibility, defined as the maximum amount of a pollutant which is potentially absorbable by a target organism. This study presents a comparison of different strategies to measure metal bioaccessibility in soils. Three procedures were applied to real soil samples with different levels of metal contamination: pseudo-total metal attack, selective sequential extractions and in vitro tests (deliberately developed to simulate human or mammalian digestion). Considering the first step of the selective extraction procedure, which can provide the bioaccessible fraction for deposit-feeder organisms, the data obtained for each metal were lower than those obtained from in vitro tests. It is therefore possible to conclude that this extraction tends to underestimate metal bioaccessibility in soils for humans, while in vitro tests will certainly overestimate bioaccessibility for organisms such as invertebrates. If the sum of the first and second steps of the sequential procedure is considered, the results are quite similar to those obtained from in vitro tests, but this kind of procedure requires two days of work rather than the few hours needed to perform an in vitro test. The results highlight the diversity among the differently defined bioaccessible fractions and the need to apply the most suitable procedure depending on the target organism.

9.
Shipley B 《Ecology》2010,91(9):2794-2805
Maximum entropy (maxent) models assign probabilities to states that (1) agree with measured macroscopic constraints on attributes of the states and (2) are otherwise maximally uninformative and are thus as close as possible to a specified prior distribution. Such models have recently become popular in ecology, but classical inferential statistical tests require assumptions of independence during the allocation of entities to states that are rarely fulfilled in ecology. This paper describes a new permutation test for such maxent models that is appropriate for very general prior distributions and for cases in which many states have zero abundance and that can be used to test for conditional relevance of subsets of constraints. Simulations show that the test gives correct probability estimates under the null hypothesis. Power under the alternative hypothesis depends primarily on the number and strength of the constraints and on the number of states in the model; the number of empty states has only a small effect on power. The test is illustrated using two empirical data sets to test the community assembly model of B. Shipley, D. Vile, and E. Garnier and the species abundance distribution models of S. Pueyo, F. He, and T. Zillio.

10.
Environmental regulatory standards are intended to protect human health and environmental welfare. Current standards are based on scientific and policy considerations but appear to lack rigorous statistical foundations and may have unintended regulatory consequences. We examine current and proposed U.S. environmental regulatory standards for ozone from the standpoint of their formulation and performance within a statistical hypothesis testing framework. We illustrate that the standards can be regarded as representing constraints on a percentile of the ozone distribution, where the percentile involved depends on the defined length of ozone season and the constraint is stricter in regions with greater variability. A hypothesis testing framework allows consideration of error rates (probability of false declaration of violation and compliance) and we show that the existing statistics on which the standards are based can be improved upon in terms of bias and variance. Our analyses also raise issues relating to network design and the possibilities of defining a regionally based standard that acknowledges and accounts for spatial and temporal variability in the ozone distribution.
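The percentile-style form of such a standard can be illustrated with a small sketch. The fourth-highest-daily-maximum form and the 0.070 ppm level follow the shape of the current U.S. ozone standard, but treat both, and all simulated numbers, as assumptions for illustration only:

```python
import numpy as np

def ozone_design_value(daily_max_by_year, standard=0.070):
    """Average of each year's fourth-highest daily maximum ozone value,
    compared against the standard. Because the fourth-highest value of a
    season is an upper order statistic, this is effectively a constraint
    on an upper percentile of the ozone distribution."""
    fourth_highest = [np.sort(np.asarray(year))[-4] for year in daily_max_by_year]
    design_value = float(np.mean(fourth_highest))
    return design_value, design_value <= standard

rng = np.random.default_rng(3)
# Three simulated ozone seasons of 180 daily maxima each (ppm)
seasons = [rng.normal(0.035, 0.010, 180).clip(min=0.0) for _ in range(3)]
dv, in_compliance = ozone_design_value(seasons)
```

Note the abstract's point about variability: for a fixed mean, a region with a more variable ozone distribution has a higher upper percentile, so the same standard binds it more tightly.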

11.
Cleanup standards at hazardous waste sites include (i) numeric standards (often risk-based), (ii) background standards in which the remediated site is compared with data from a supposedly clean region, and (iii) interim standards in which the remediated site is compared with preremediation data from the same site. The latter are especially appropriate for verifying progress when an innovative, but unproven, technology is used for remediation. Standards of type (i) require one-sample statistical tests, while those of type (ii) and type (iii) call for two-sample tests. This paper considers two-sample tests with an emphasis upon the type (iii) scenario. Both parametric (likelihood ratio) and nonparametric (linear rank) protocols are examined. The methods are illustrated with preremediation data from a site on the National Priorities List. The results indicate that nonparametric procedures can be quite competitive (in terms of power) with distributional modelling provided a near optimal rank test is selected. Suggestions are given for identifying such rank tests. The results also confirm the importance of sound baseline sampling; no amount of post-remediation sampling can overcome baseline deficiencies. This paper has been prepared with partial support from the United States Environmental Protection Agency under Cooperative Agreement Number CR-815273. The contents have not been subject to Agency review and therefore do not necessarily reflect the views or policies of the Agency and no official endorsement should be inferred.

12.
13.
The application of sequential extraction procedures to determine metal speciation in sediments is fraught with uncertainty regarding what is actually dissolving or re-precipitating at each stage. In order to choose an appropriate scheme for the investigation of contaminated anaerobic mud, two different sequential extraction procedures (Kersten and Förstner, 1986; Quevauviller, 1998) were investigated using a cryogenic SEM (CryoSEM) technique coupled with energy dispersive X-ray analysis (EDXA). This enabled assessment of the degree of reagent selectivity and any re-precipitation associated with the respective methods. Analysis of the non-leached sediment revealed the most abundant authigenic minerals, in order of decreasing abundance, to be the Fe2+-phosphate vivianite (Fe3(PO4)2·8H2O), mixed Fe, Zn and Cu sulphides, pyrite and calcite. After each stage of the sequential extraction the sediment residue was examined using CryoSEM. After extraction of the exchangeable fraction, no obvious evidence of mineral dissolution was observed. Calcite was not completely dissolved during the carbonate extraction stage of either procedure. Vivianite began to dissolve in the carbonate extraction stage of both procedures and was completely dissolved by the oxide extraction stage of both procedures. The sediment leached by acidified ammonium oxalate contained abundant Fe oxalate crystals, suggesting that a large proportion of the Fe released from the vivianite had been re-precipitated. The Fe oxalate was then dissolved during the subsequent sulphide extraction stage. The technique used to extract the sulphide and organic fraction is the same in both schemes, and no sulphide or metal-rich organic matter was found in either residue.

14.
Detection of discontinuities in landscape patterns is a crucial problem both in ecology and in environmental sciences, since discontinuities may indicate substantial scale changes in the processes generating and maintaining landscape patches. This paper presents a statistical procedure for detecting distinct scales of pattern for irregular patch mosaics using fractal analysis. The method suggested is based on a piecewise regression model given by fitting different regression lines to different ranges of patches ordered according to patch size (area). Proper shift-points, where discontinuities occur, are then identified by means of an iterative procedure. Further statistical tests are applied in order to verify the statistical significance of the best models selected. Compared to the method proposed by Krummel et al. (1987), the procedure described here is not influenced by subjective choices of initial parameters. The procedure was applied to landscape pattern analysis of irregular patch mosaics (CORINE biotopes) of a watershed within the Map of the Italian Nature Project. Results for three different CORINE patch types are presented herein, revealing different scaling properties with special pattern organizations linked to ecological traits of vegetation communities and human disturbance.
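The core of such a piecewise-regression search for a shift-point can be sketched as an exhaustive scan over candidate splits. The synthetic data below merely stand in for the size-ordered fractal measurements used in the paper; the iterative refinement and significance tests the paper applies are omitted:

```python
import numpy as np

def best_breakpoint(x, y, min_pts=3):
    """Fit separate least-squares lines to the two sides of every candidate
    split and return the last x of the left segment at the split that
    minimizes the total sum of squared errors."""
    order = np.argsort(x)
    x, y = np.asarray(x, float)[order], np.asarray(y, float)[order]

    def sse(xs, ys):
        coef = np.polyfit(xs, ys, 1)
        return float(((ys - np.polyval(coef, xs)) ** 2).sum())

    best_err, best_x = np.inf, None
    for k in range(min_pts, len(x) - min_pts + 1):
        err = sse(x[:k], y[:k]) + sse(x[k:], y[k:])
        if err < best_err:
            best_err, best_x = err, x[k - 1]
    return best_x

# Synthetic pattern: one linear regime up to x = 5, a different one beyond
x = np.arange(11, dtype=float)
y = np.where(x <= 5, x, 30.0 - 2.0 * x)
bp = best_breakpoint(x, y)
```

Because every candidate split is evaluated against the same least-squares criterion, the result does not depend on any subjective choice of starting parameters, which is the advantage the abstract claims over the Krummel et al. approach.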

15.
Abstract: We used socioeconomic models that included economic inequality to predict biodiversity loss, measured as the proportion of threatened plant and vertebrate species, across 50 countries. Our main goal was to evaluate whether economic inequality, measured as the Gini index of income distribution, improved the explanatory power of our statistical models. We compared four models that included the following: only population density, economic footprint (i.e., the size of the economy relative to the country area), economic footprint and income inequality (Gini index), and an index of environmental governance. We also tested the environmental Kuznets curve hypothesis, but it was not supported by the data. Statistical comparisons of the models revealed that the model including both economic footprint and inequality was the best predictor of threatened species. It significantly outperformed population density alone and the environmental governance model according to the Akaike information criterion. Inequality was a significant predictor of biodiversity loss and significantly improved the fit of our models. These results confirm that socioeconomic inequality is an important factor to consider when predicting rates of anthropogenic biodiversity loss.
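Model comparison by the Akaike information criterion, as used in this study, can be sketched with ordinary least squares. The variable names echo the study's predictors but the data are synthetic, and this is a generic AIC comparison rather than the authors' actual models:

```python
import numpy as np

def ols_aic(X, y):
    """AIC of an OLS fit under a Gaussian likelihood:
    AIC = n * ln(RSS / n) + 2k, where k counts the intercept,
    the slopes, and the error variance."""
    X = np.column_stack([np.ones(len(y)), np.atleast_2d(X).reshape(len(y), -1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(((y - X @ beta) ** 2).sum())
    n, k = len(y), X.shape[1] + 1
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(9)
n = 50
footprint = rng.normal(size=n)
inequality = rng.normal(size=n)
threat = footprint + inequality + rng.normal(scale=0.3, size=n)   # both matter

aic_footprint_only = ols_aic(footprint, threat)
aic_with_inequality = ols_aic(np.column_stack([footprint, inequality]), threat)
```

A lower AIC indicates the better-supported model; here adding the inequality predictor lowers AIC substantially because it carries real signal, mirroring the study's conclusion in miniature.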

16.
After several decades during which applied statistical inference in research on animal behaviour and behavioural ecology has been heavily dominated by null hypothesis significance testing (NHST), a new approach based on information theoretic (IT) criteria has recently become increasingly popular, and occasionally it has been considered generally superior to conventional NHST. In this commentary, I discuss some limitations the IT-based method may have under certain circumstances. In addition, I review some recent articles published in the fields of animal behaviour and behavioural ecology and point to some common failures, misunderstandings and issues frequently appearing in the practical application of IT-based methods. Based on this, I give some hints about how to avoid common pitfalls in the application of IT-based inference, discuss when to choose one or the other approach, and consider under which circumstances a mixing of the two approaches might be appropriate.

17.
Random denominators and the analysis of ratio data
Ratio data, observations in which one random value is divided by another random value, present unique analytical challenges. The best statistical technique varies depending on the unit on which the inference is based. We present three environmental case studies where ratios are used to compare two groups, and we provide three parametric models from which to simulate ratio data. The models describe situations in which (1) the numerator variance and mean are proportional to the denominator, (2) the numerator mean is proportional to the denominator but its variance is proportional to a quadratic function of the denominator and (3) the numerator and denominator are independent. We compared standard approaches for drawing inference about differences between two distributions of ratios: t-tests, t-tests with transformations, permutation tests, the Wilcoxon rank test, and ANCOVA-based tests. Comparisons between tests were based both on achieving the specified alpha-level and on statistical power. The tests performed comparably with a few notable exceptions. We developed simple guidelines for choosing a test based on the unit of inference and relationship between the numerator and denominator.
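One of the compared approaches, the permutation test, can be sketched directly. The ratio values below are invented for illustration:

```python
import numpy as np

def permutation_test(x, y, n_perm=5000, seed=11):
    """Two-sided permutation test for a difference in mean ratios between
    two groups: group labels are reshuffled to build the null distribution
    of the absolute difference in means."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    observed = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(pooled[:x.size].mean() - pooled[x.size:].mean()) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)   # add-one correction for the observed split

# Hypothetical ratio observations for two groups
group_a = [0.8, 1.1, 0.9, 1.2, 1.0, 0.95, 1.05, 0.85]
group_b = [1.6, 1.9, 1.7, 2.1, 1.8, 1.75, 1.95, 1.65]
p_value = permutation_test(group_a, group_b)
```

Because the null distribution is built from the data themselves, the test makes no assumption about how the numerator relates to the denominator, which is one reason permutation tests fared well across the three simulation models.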

18.
How simple can a model be that still captures essential aspects of wildfire ecosystems at large spatial and temporal scales? The Drossel-Schwabl model (DSM) is a metaphorical forest-fire model developed to reproduce only one pattern of real systems: a frequency distribution of fire sizes resembling a power law. Consequently, and because it appears oversimplified, it remains unclear what bearing the DSM has on reality. Here, we test whether the DSM is capable of reproducing a pattern that was not considered in its design, the hump-shaped relation between the diversity of succession stages and average annual area burnt. We found that the model, once reformulated to represent succession, produces realistic landscape diversity patterns. We investigated four succession scenarios of forest-fire ecosystems in the USA and Canada. In all scenarios, landscape diversity is highest at an intermediate average annual area burnt, as predicted by the intermediate disturbance hypothesis. These results show that a model based solely on the dynamics of the fuel mosaic has surprisingly high predictive power with regard to observed statistical properties of wildfire systems at large spatial scales. Parsimonious models such as the DSM can be used as starting points for systematic development of more structurally realistic but tractable wildfire models. Due to their simplicity they allow analytical approaches that further our understanding under increasing complexity.

19.
Abstract: Estimating the abundance of migratory species is difficult because sources of variability differ substantially among species and populations. Recently developed state-space models address this variability issue by directly modeling both environmental and measurement error, although their efficacy in detecting declines is relatively untested for empirical data. We applied state-space modeling, generalized least squares (with autoregression error structure), and standard linear regression to data on abundance of wetland birds (shorebirds and terns) at Moreton Bay in southeast Queensland, Australia. There are internationally significant numbers of 8 species of waterbirds in the bay, and it is a major terminus of the large East Asian-Australasian Flyway. In our analyses, we considered 22 migrant and 8 resident species. State-space models identified abundances of 7 species of migrants as significantly declining and abundance of one species as significantly increasing. Declines in migrant abundance over 15 years were 43–79%. Generalized least squares with an autoregressive error structure showed abundance changes in 11 species, and standard linear regression showed abundance changes in 15 species. The higher power of the regression models meant they detected more declines, but they also were associated with a higher rate of false detections. If the declines in Moreton Bay are consistent with trends from other sites across the flyway as a whole, then a large number of species are in significant decline.

20.
We derive some statistical properties of the distribution of two Negative Binomial random variables conditional on their total. This type of model can be appropriate for paired count data with Poisson over-dispersion such that the variance is a quadratic function of the mean. This statistical model is appropriate in many ecological applications, including comparative fishing studies of two vessels and/or gears. The parameter of interest is the ratio of pair means. We show that the conditional means and variances are different from the more commonly used Binomial model with variance adjusted for over-dispersion, or the Beta-Binomial model. The conditional Negative Binomial model is complicated because it does not eliminate nuisance parameters as it does in the Poisson case. Maximum likelihood estimation with the unconditional Negative Binomial model can result in biased estimates of the over-dispersion parameter and poor confidence intervals for the ratio of means when there are many nuisance parameters. We propose three approaches to deal with nuisance parameters in the conditional Negative Binomial model. We also study a random effects Binomial model for this type of data, and we develop an adjustment to the full-sample Negative Binomial profile likelihood to reduce the bias caused by nuisance parameters. We use simulations with these methods to examine bias, precision, and accuracy of estimators and confidence intervals. We conclude that the maximum likelihood method based on the full-sample Negative Binomial adjusted profile likelihood produces the best statistical inferences for the ratio of means when paired counts have Negative Binomial distributions. However, when there is uncertainty about the type of Poisson over-dispersion, a Binomial random effects model is a good choice.
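A sketch of the data-generating setup described (not the paper's estimators): paired Negative Binomial counts that share pair-specific nuisance means, with a naive pooled ratio-of-totals estimate of the ratio of means. All parameter values are invented, and the NB draws use the gamma-Poisson mixture, which gives exactly the quadratic mean-variance relation var = mu + mu²/shape:

```python
import numpy as np

rng = np.random.default_rng(42)
shape = 2.0          # NB dispersion: variance = mu + mu**2 / shape
true_ratio = 1.5     # ratio of pair means (e.g., vessel B relative to vessel A)
pair_means = rng.gamma(2.0, 10.0, size=200)   # one nuisance parameter per pair

def neg_binomial(mu, shape, rng):
    """NB draw via its gamma-Poisson mixture representation."""
    return rng.poisson(rng.gamma(shape, mu / shape))

catch_a = np.array([neg_binomial(m, shape, rng) for m in pair_means])
catch_b = np.array([neg_binomial(m * true_ratio, shape, rng) for m in pair_means])

ratio_of_totals = catch_b.sum() / catch_a.sum()   # naive pooled estimator
```

The 200 pair means are exactly the nuisance parameters the abstract warns about: a full unconditional likelihood must estimate all of them, which is what biases the dispersion estimate and motivates the conditional and adjusted-profile-likelihood approaches.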


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)