20 similar records found (search time: 31 ms)
1.
Model averaging (MA) has been proposed as a method of accommodating model uncertainty when estimating risk. Although the use
of MA is inherently appealing, little is known about its performance using general modeling conditions. We investigate the
use of MA for estimating excess risk using a Monte Carlo simulation. Dichotomous response data are simulated under various
assumed underlying dose–response curves, and nine dose–response models (from the USEPA Benchmark dose model suite) are fit
to obtain both model specific and MA risk estimates. The benchmark dose estimates (BMDs) from the MA method, as well as estimates
from other commonly selected models, e.g., best fitting model or the model resulting in the smallest BMD, are compared to
the true benchmark dose value to better understand both bias and coverage behavior in the estimation procedure. The MA method
has a small bias when estimating the BMD that is similar to the bias of BMD estimates derived from the assumed model. Further,
when a broader range of models are included in the family of models considered in the MA process, the lower bound estimate
provided coverage close to the nominal level, which is superior to the other strategies considered. This approach provides
an alternative method for risk managers to estimate risk while incorporating model uncertainty.
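The model-averaging workflow the abstract describes can be sketched roughly as follows. This is a minimal illustration with made-up quantal data and only two candidate one-parameter models fit by crude grid search, rather than the nine-model USEPA suite; Akaike weights stand in for the MA weighting scheme:

```python
import math

# Hypothetical quantal dose-response data (dose, number affected, group size)
doses = [0.0, 1.0, 2.0, 4.0]
affected = [1, 3, 6, 12]
group_n = [20, 20, 20, 20]
BMR = 0.10          # benchmark response (extra risk)
BACKGROUND = 0.05   # fixed background rate (a simplifying assumption)

def loglik(prob_at_dose):
    """Binomial log-likelihood of the data under a dose-response curve."""
    ll = 0.0
    for d, y, n in zip(doses, affected, group_n):
        p = min(max(prob_at_dose(d), 1e-9), 1 - 1e-9)
        ll += y * math.log(p) + (n - y) * math.log(1 - p)
    return ll

# Two candidate models, each parameterized by a single slope b.
def one_hit(b):
    return lambda d: BACKGROUND + (1 - BACKGROUND) * (1 - math.exp(-b * d))

def weibull2(b):  # Weibull model with shape fixed at 2
    return lambda d: BACKGROUND + (1 - BACKGROUND) * (1 - math.exp(-b * d * d))

# Closed-form BMD (dose giving extra risk BMR) for each model.
def bmd_one_hit(b):
    return -math.log(1 - BMR) / b

def bmd_weibull2(b):
    return math.sqrt(-math.log(1 - BMR) / b)

models = [(one_hit, bmd_one_hit), (weibull2, bmd_weibull2)]
grid = [0.005 * i for i in range(1, 1000)]  # crude grid-search MLE over b

fits = []
for curve, bmd_fn in models:
    b_hat = max(grid, key=lambda b: loglik(curve(b)))
    aic = -2 * loglik(curve(b_hat)) + 2  # one free parameter per model
    fits.append((aic, bmd_fn(b_hat)))

# Akaike weights, then the model-averaged BMD estimate.
aic_min = min(a for a, _ in fits)
raw = [math.exp(-(a - aic_min) / 2) for a, _ in fits]
weights = [r / sum(raw) for r in raw]
bmd_ma = sum(w * bmd for w, (_, bmd) in zip(weights, fits))
print(f"model-averaged BMD: {bmd_ma:.3f}")
```

Lower-bound (BMDL) coverage, the focus of the simulation study, would additionally require a bootstrap or profile-likelihood step per model.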
Corresponding author: Matthew W. Wheeler
2.
Benchmark calculations often are made from data extracted from publications. Such data may not be in a form most appropriate
for benchmark analysis, and, as a result, suboptimal and/or non-standard benchmark analyses are often applied. This problem
can be mitigated in some cases using Monte Carlo computational methods that allow the likelihood of the published data to
be calculated while still using an appropriate benchmark dose (BMD) definition. Such an approach is illustrated herein using
data from a study of workers exposed to styrene, in which a hybrid BMD calculation is implemented from dose response data
reported only as means and standard deviations of ratios of scores on neuropsychological tests from exposed subjects to corresponding
scores from matched controls. The likelihood of the data is computed using a combination of analytic and Monte Carlo integration
methods.
Corresponding author: Kenny S. Crump
3.
R. Webster West Daniela K. Nitcheva Walter W. Piegorsch 《Environmental and Ecological Statistics》2009,16(1):63-73
A primary objective in quantitative risk assessment is the characterization of risk, defined as the likelihood
of an adverse effect caused by an environmental toxin or chemical agent. In modern risk-benchmark analysis, attention centers
on the “benchmark dose” at which a fixed benchmark level of risk is achieved, with a lower confidence limit on this dose
being of primary interest. In practice, a range of benchmark risks may be under study, so that the individual lower confidence
limits on benchmark dose must be corrected for simultaneity in order to maintain a specified overall level of confidence.
For the case of quantal data, simultaneous methods have been constructed that appeal to the large sample normality of parameter
estimates. The suitability of these methods for use with small sample sizes will be considered. A new bootstrap technique
is proposed as an alternative to the large sample methodology. This technique is evaluated via a simulation study and examples
from environmental toxicology.
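A percentile bootstrap in the spirit of the abstract might look as follows for a single dose group. The counts are hypothetical, and this toy version only bootstraps a one-sided upper confidence limit on extra risk at one dose; the paper's actual procedure targets simultaneous lower limits on benchmark dose:

```python
import random

random.seed(42)
x0, n0 = 2, 50   # hypothetical control group: affected / group size
x1, n1 = 9, 50   # hypothetical dosed group: affected / group size

def extra_risk(a0, a1):
    """Extra risk beyond background: (p1 - p0) / (1 - p0), floored at 0."""
    p0, p1 = a0 / n0, a1 / n1
    return max(p1 - p0, 0.0) / (1 - p0)

# Parametric bootstrap: resample both groups from their observed proportions.
boot = []
for _ in range(2000):
    b0 = sum(random.random() < x0 / n0 for _ in range(n0))
    b1 = sum(random.random() < x1 / n1 for _ in range(n1))
    boot.append(extra_risk(b0, b1))
boot.sort()
ucl95 = boot[int(0.95 * len(boot))]  # one-sided 95% upper limit on extra risk
```

An upper limit on risk at a given dose corresponds to a lower limit on the dose achieving a given risk, which is why bootstrap limits of this kind translate into lower confidence limits on the benchmark dose.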
Corresponding author: R. Webster West
4.
Brooke E. Buckley Walter W. Piegorsch R. Webster West 《Environmental and Ecological Statistics》2009,16(1):53-62
In modern environmental risk analysis, inferences are often desired on those low dose levels at which a fixed benchmark risk
is achieved. In this paper, we study the use of confidence limits on parameters from a simple one-stage model of risk historically
popular in benchmark analysis with quantal data. Based on these confidence bounds, we present methods for deriving upper confidence
limits on extra risk and lower bounds on the benchmark dose. The methods are seen to extend automatically to the case where
simultaneous inferences are desired at multiple doses. Monte Carlo evaluations explore characteristics of the parameter estimates
and the confidence limits under this setting.
Corresponding author: R. Webster West
5.
The influence of multiple anchored fish aggregating devices (FADs) on the spatial behavior of yellowfin (Thunnus albacares) and bigeye tuna (T. obesus) was investigated by equipping all thirteen FADs surrounding the island of Oahu (HI, USA) with automated sonic receivers
(“listening stations”) and intra-peritoneally implanting individually coded acoustic transmitters in 45 yellowfin and 12 bigeye
tuna. Thus, the FAD network became a multi-element passive observatory of the residence and movement characteristics of tuna
within the array. Yellowfin tuna were detected within the FAD array for up to 150 days, while bigeye tuna were only observed
up to a maximum of 10 days after tagging. Only eight yellowfin tuna (out of 45) and one bigeye tuna (out of 12) visited FADs
other than their FAD of release. Those nine fish tended to visit nearest neighboring FADs and, in general, spent more time
at their FAD of release than at the others. Fish visiting the same FAD several times or visiting other FADs tended to stay
longer in the FAD network. A majority of tagged fish exhibited some synchronicity when departing the FADs but not all tagged
fish departed a FAD at the same time: small groups of tagged fish left together while others remained. We hypothesize that
tuna (at an individual or collective level) consider local conditions around any given FAD to be representative of the environment
on a larger scale (e.g., the entire island) and when those conditions become unfavorable the tuna move to a completely different
area. Thus, while the anchored FADs surrounding the island of Oahu might concentrate fish and make them more vulnerable to
fishing, at a meso-scale they might not entrain fish longer than if there were no (or very few) FADs in the area. At the existing
FAD density, the ‘island effect’ is more likely to be responsible for the general presence of fish around the island than
the FADs. We recommend further investigation of this hypothesis.
Contact authors: Laurent Dagorn (corresponding author), Kim N. Holland, David G. Itano
6.
B. Gail Ivanoff 《Environmental and Ecological Statistics》2009,16(2):153-171
The concept of the renewal property is extended to processes indexed by a multidimensional time parameter. The definition
given includes not only partial sum processes, but also Poisson processes and many other point processes whose jump points
are not totally ordered. Various properties of renewal processes are discussed. Renewal processes are proposed as a basis
for modelling the spread of a forest fire under a prevailing wind.
Corresponding author: B. Gail Ivanoff
7.
Frederic Paik Schoenberg Jamie Pompa Chien-Hsun Chang 《Environmental and Ecological Statistics》2009,16(2):251-269
This paper explores the use of, and problems that arise in, kernel smoothing and parametric estimation of the relationships
between wildfire incidence and various meteorological variables. Such relationships may be treated as components in separable
point process models for wildfire activity. The resulting models can be used for comparative purposes in order to assess the
predictive performance of the Burning Index.
Corresponding author: Frederic Paik Schoenberg
8.
Fiat boundaries: some implications for interpretation, decision-support, and multi-temporal analysis (cited by: 1; self-citations: 0, citations by others: 1)
Kim Lowell 《Environmental and Ecological Statistics》2008,15(4):369-383
Polygon-based thematic maps can be composed of boundaries that exist by definition—i.e., bona fide boundaries—or those that
exist relative to a specific interpretation of a spatial phenomenon—i.e., fiat boundaries. The construction of maps composed
of fiat boundaries is usually based on a subjective interpretive methodology that is affected by the data used to construct
the map and the minimum mapping unit employed. That fiat boundaries are not the same as bona fide boundaries affects their
use in computer-based spatial decision support tools. This is discussed both in terms of an analysis conducted at one specific
moment, and in respect to increasingly common multi-temporal analysis.
Corresponding author: Kim Lowell
9.
Chang Xuan Mao 《Environmental and Ecological Statistics》2007,14(4):473-481
Consider the removal experiment used to estimate population sizes. Statistical methods towards testing the homogeneity of
capture probabilities of animals, including a graphical diagnostic and a formal test, are presented and illustrated by real
biological examples. Simulation is used to assess the test and compare it with the χ² test.
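For the two-pass special case of such a removal experiment, the classical Moran–Zippin estimator has a closed form. The counts below are hypothetical, and the paper's graphical diagnostic and formal homogeneity test are not reproduced here:

```python
# Hypothetical removal counts: animals caught and removed on two passes
c1, c2 = 60, 24

p_hat = (c1 - c2) / c1       # estimated per-pass capture probability
N_hat = c1 ** 2 / (c1 - c2)  # two-pass removal estimate of population size
print(p_hat, N_hat)          # 0.6 100.0
```

The estimator assumes the capture probability is the same for all animals and all passes, which is exactly the homogeneity assumption the paper's test is designed to check.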
Corresponding author: Chang Xuan Mao
10.
Lance A. Waller 《Environmental and Ecological Statistics》2008,15(3):259-263
The three papers included in this special issue represent a set of presentations in an invited session on disease ecology
at the 2005 Spring Meeting of the Eastern North American Region of the International Biometric Society. The papers each address
statistical estimation and inference for particular components of different disease processes and, taken together, illustrate
the breadth of statistical issues arising in the study of the ecology and public health impact of disease. As an introduction,
we provide a very brief overview of the area of “disease ecology” and the variety of synonyms used for its different aspects,
and present a schematic structure illustrating general components of the underlying disease process, data collection
issues, and the different disciplinary perspectives, ranging from microbiology to public health surveillance.
Corresponding author: Lance A. Waller
11.
Den Boychuk W. John Braun Reg J. Kulperger Zinovi L. Krougly David A. Stanford 《Environmental and Ecological Statistics》2009,16(2):133-151
We consider a stochastic fire growth model, with the aim of predicting the behaviour of large forest fires. Such a model can
describe not only average growth, but also the variability of the growth. Implementing such a model in a computing environment
allows one to obtain probability contour plots, burn size distributions, and distributions of time to specified events. Such
a model also allows the incorporation of a stochastic spotting mechanism.
Corresponding author: Reg J. Kulperger
12.
Rudolf Izsák 《Environmental and Ecological Statistics》2008,15(2):143-156
In this paper some properties and analytic expressions regarding the Poisson lognormal distribution such as moments, maximum
likelihood function and related derivatives are discussed. The author provides a sharp approximation of the integrals related
to the Poisson lognormal probabilities and analyzes the choice of initial values in the fitting procedure. Based on these
results, he describes a new procedure for carrying out maximum likelihood fitting of the truncated Poisson lognormal distribution.
The method and results are illustrated on real data. The computer program for calculations is freely available.
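The Poisson lognormal probabilities themselves have no closed form. A straightforward numerical sketch of the defining integral, using plain quadrature over the mixing normal rather than the sharper approximation the paper derives, is:

```python
import math

def pln_pmf(k, mu, sigma, n_grid=2001, z_max=8.0):
    """P(K = k) for a Poisson lognormal: Poisson rate exp(mu + sigma*Z), Z ~ N(0, 1).
    Simple fixed-grid quadrature over z; adequate for moderate sigma."""
    h = 2 * z_max / (n_grid - 1)
    total = 0.0
    for i in range(n_grid):
        z = -z_max + i * h
        lam = math.exp(mu + sigma * z)
        log_pois = k * math.log(lam) - lam - math.lgamma(k + 1)  # Poisson log-pmf
        phi = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)    # standard normal density
        total += math.exp(log_pois) * phi
    return total * h

# Sanity check: the pmf should sum to (nearly) one over the support.
mass = sum(pln_pmf(k, mu=0.0, sigma=0.5) for k in range(60))
```

Maximum likelihood fitting then amounts to maximizing the sum of `log(pln_pmf(k_i, mu, sigma))` over the observed counts, which is where accurate and fast approximations to this integral matter.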
Corresponding author: Rudolf Izsák
13.
In this paper we examine the use of data augmentation techniques for simplifying iterative simulation in the context of both
Bayesian and classical statistical inference for survival rate estimation. We examine two distinct model families common in
population ecology to illustrate our ideas, ring-recovery models and capture–recapture models, and we present the computational
advantage of this approach. We also discuss how problems of identifiability in the classical framework
can be overcome using data augmentation, but highlight the dangers of doing so under both inferential paradigms.
Corresponding author: I. C. Olsen
14.
John E. Hathaway G. Bruce Schaalje Richard O. Gilbert Brent A. Pulsipher Brett D. Matzke 《Environmental and Ecological Statistics》2008,15(3):313-327
Composite sampling can be more cost effective than simple random sampling. This paper considers how to determine the optimum
number of increments to use in composite sampling. Composite sampling terminology and theory are outlined and a method is
developed which accounts for different sources of variation in compositing and data analysis. This method is used to define
and understand the process of determining the optimum number of increments that should be used in forming a composite. The
blending variance is shown to have a smaller range of possible values than previously reported when estimating the number
of increments in a composite sample. Accounting for differing levels of the blending variance significantly affects the estimated
number of increments.
Corresponding author: John E. Hathaway
15.
The Wadden Sea is an important habitat for harbour seals and grey seals. They regularly haul-out on sandbanks and islands
along the coast. Comparably little is known about the time seals spend at sea and how they use the remainder of the North
Sea. Yet, human activity in offshore waters is increasing and information on seal distribution in the North Sea is crucial
for conservation and management. Aerial line transect surveys were conducted in the German Bight from 2002 to 2007 to investigate
the distribution and abundance of marine mammals. Distance sampling methodology was combined with density surface modelling
for a spatially explicit analysis of seal distribution in the German North Sea. Depth and distance to coast were found to
be relevant predictor variables for seal density. Density surface modelling allowed for a depiction of seal distribution in
the study area as well as an abundance estimate. This is the first study to use aerial survey data to develop a density surface
model (DSM) for a spatially explicit distribution estimate of seals at sea.
Corresponding author: Helena Herr
16.
Mehdi Razzaghi 《Environmental and Ecological Statistics》2009,16(1):25-36
To establish allowable daily intakes for humans from animal bioassay experiments, benchmark doses corresponding to low levels
of risk have been proposed to replace the no-observed-adverse-effect level for non-cancer endpoints. When the experimental
outcomes are quantal, each animal can be classified with or without the disease. The proportion of affected animals is observed
as a function of dose and calculation of the benchmark dose is relatively simple. For quantitative responses, on the other
hand, one method is to convert the continuous data to quantal data and proceed with benchmark dose estimation. Another method
which has found more popularity (Crump, Risk Anal 15:79–89; 1995) is to fit an appropriate dose–response model to the continuous
data, and directly estimate the risk and benchmark doses. The normal distribution has often been used in the past as a dose–response
model. However, for non-symmetric data, the normal distribution can lead to erroneous results. Here, we propose the use of
the class of beta-normal distributions and demonstrate its application in risk assessment for quantitative responses. The most
important feature of this class of distributions is its generality, encompassing a wide range of distributional shapes including
the normal distribution as a special case. The properties of the model are briefly discussed and risk estimates are derived
based on the asymptotic properties of the maximum likelihood estimates. An example is used for illustration.
Corresponding author: Mehdi Razzaghi
17.
David Fletcher 《Environmental and Ecological Statistics》2008,15(2):175-189
Data that are skewed and contain a relatively high proportion of zeros can often be modelled using a delta-lognormal distribution.
We consider three methods of calculating a 95% confidence interval for the mean of this distribution, and use simulation to
compare the methods, across a range of realistic scenarios. The best method, in terms of coverage, is that based on the profile-likelihood.
This gives error rates that are within 1% (lower limit) or 3% (upper limit) of the nominal level, unless the sample size is
small and the level of skewness is moderate to high. Our results will also apply to the delta-lognormal linear model, when
we wish to calculate a confidence interval for the expected value of the response variable, given the value of one or more
explanatory variables. We illustrate the three methods using data on red cod densities, taken from a fisheries trawl survey
in New Zealand.
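As a reminder of the estimator involved, the delta-lognormal mean combines the probability of a nonzero observation with the lognormal mean of the nonzero part. The density data below are made up, and the paper's contribution, the profile-likelihood confidence interval, is not reproduced in this sketch:

```python
import math

# Hypothetical trawl densities with many zeros (e.g. kg per standard tow)
data = [0, 0, 0, 1.2, 0, 3.5, 0, 0.8, 10.1, 0, 2.2, 0, 0.5]

nonzero = [x for x in data if x > 0]
p_hat = len(nonzero) / len(data)                 # estimated P(nonzero)
logs = [math.log(x) for x in nonzero]
mu_hat = sum(logs) / len(logs)                   # mean of log(nonzero part)
s2 = sum((v - mu_hat) ** 2 for v in logs) / (len(logs) - 1)  # sample variance

# Delta-lognormal mean: p * exp(mu + sigma^2 / 2)
mean_hat = p_hat * math.exp(mu_hat + s2 / 2)
```

The confidence-interval methods the paper compares all target this quantity; the difficulty is that both the zero-proportion and the lognormal parameters contribute to its sampling variability.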
Corresponding author: David Fletcher
18.
We present a method for detecting the zones where an irregularly sampled variable changes abruptly in the plane. Such zones
are called Zones of Abrupt Change (ZACs). This method not only allows estimation of ZACs, but also testing of their statistical
significance against the null hypothesis of a stationary correlated random field. The sampling pattern, in particular its
local density, is crucial in the detection of potential ZACs. In this paper, we address the problem of evaluating the sampling
pattern by assessing the power of the local test used for detecting ZACs. It is shown that mapping the power allows us to
identify zones where ZACs may or may not be detected. The methodology is applied to a soil data set sampled at eight different
dates in an agricultural field. Detecting ZACs for the soil water content allowed us to identify permanent structures in the
agricultural field related to the boundaries between different soil types. Mapping the power for various sampling densities
proved to be useful to determine the minimal sampling density necessary for detecting ZACs.
Corresponding author: Edith Gabriel
19.
M. M. Manzano-Sarabia E. A. Aragón-Noriega C. A. Salinas-Zavala D. B. Lluch-Cota 《Marine Biology》2007,152(5):1021-1029
Life histories of penaeid shrimp have been classified according to the preferred habitats of postlarval, juvenile, and adult
stages, ranging from exclusively estuarine to exclusively offshore waters. Brown shrimp Farfantepenaeus californiensis migrate to an offshore habitat at the juvenile stage, or even at a smaller body size. This paper presents results of monthly
samplings from 24 stations over 1 year in the Agiabampo Lagoon complex, a hypersaline lagoon in northwestern Mexico. Five
species of penaeid shrimp were identified, with brown shrimp the most abundant during the year of sampling. Results suggest
that brown shrimp reside inside this lagoon longer than reported in previous studies. An interaction between length
and environmental variables (near-surface temperature, salinity, and rainfall) appears to provide cues for migration.
Corresponding author: C. A. Salinas-Zavala
20.
Dimitris Karlis Vassilis G. S. Vasdekis Maria Banti 《Environmental and Ecological Statistics》2009,16(3):355-367
Heteroscedastic additive and multiplicative models are proposed to disaggregate household data on water consumption from Athens
and provide individual consumption estimates. The models adjust for heteroscedasticity assuming that variances relate to covariates.
Household characteristics that can influence consumption are also included into models in order to allow for a clearer measurement
of individual characteristics effects. Estimation is accomplished through a penalized least squares approach. The method is
applied to a sample of real data on domestic water consumption in Athens. The results show greater water consumption
for males, while single-female households use the lowest quantities. Consumption curves
by age and gender are constructed, revealing differences between the sexes.
Corresponding author: Vassilis G. S. Vasdekis