Similar Literature
20 similar records retrieved
1.
In this paper, we explore a range of concerns that arise in measuring short-term effects of ozone on health. In particular, we tackle the problem of measuring exposure using alternative daily measures of ozone derived from hourly concentrations. We adopt the exposure paradigm of Chiogna and Bellini (Environmetrics 13:55–69, 2002), extend it to ozone concentrations, and compare its performance with traditional exposure measures via model selection. To investigate the stability of model selection, we then apply the idea of bootstrapping the modelling process.
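The abstract does not specify which daily summaries of hourly ozone were compared; a minimal sketch, assuming two conventional candidates (the daily 1-hour maximum and the maximum 8-hour running mean), could look like this:

```python
def daily_1h_max(hourly):
    """Daily maximum of the 24 hourly ozone concentrations."""
    return max(hourly)

def daily_8h_max(hourly):
    """Maximum 8-hour running mean over the 24 hourly values."""
    means = [sum(hourly[i:i + 8]) / 8 for i in range(len(hourly) - 7)]
    return max(means)
```

With `hourly = list(range(24))` as toy data, the 1-hour maximum is 23 and the maximum 8-hour mean is the average of hours 16-23, i.e. 19.5; in a model-selection exercise each such summary would serve as an alternative exposure covariate.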

2.
In this paper, we propose a semiparametric survival model to investigate the pattern of spatial and temporal variation in the prevalence of chronic wasting disease (CWD) in wild deer in Wisconsin between 2002 and 2006. The proposed semiparametric survival model is more flexible than a fully parametric model, requiring fewer parametric assumptions, because the baseline hazard is modeled with a Gamma process prior. Based on the proposed model, we investigate the geographical distribution of CWD and assess the effect of sex on disease prevalence. We use a Bayesian hierarchical framework in which latent parameters capture temporal and spatial trends in disease incidence, incorporating sex and spatially correlated random effects. We also propose a bivariate baseline hazard that changes over age and time simultaneously, accommodating their distinct effects. Inference is carried out using MCMC simulation techniques in a fully Bayesian framework. Our results suggest that the disease has spread mainly within the disease eradication zone and that male deer show a significantly higher infection probability than female deer.
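In practice, a Gamma process prior on the cumulative baseline hazard yields a piecewise-constant baseline hazard over a grid of intervals. A minimal sketch of the resulting survival function under a proportional-hazards covariate effect (the cut points and rates below are illustrative, not the paper's estimates):

```python
import math

def survival(t, cuts, rates, lp=0.0):
    """S(t) = exp(-H0(t) * exp(lp)) for a piecewise-constant baseline
    hazard: rates[j] applies on the interval [cuts[j], cuts[j+1]).
    lp is the linear predictor (covariate effects)."""
    H0 = 0.0
    for j, r in enumerate(rates):
        a, b = cuts[j], cuts[j + 1]
        if t <= a:
            break
        H0 += r * (min(t, b) - a)  # hazard accumulated on this interval
    return math.exp(-H0 * math.exp(lp))
```

For example, with cut points [0, 1, 2, inf) and rates [0.1, 0.2, 0.3], the cumulative hazard at t = 1.5 is 0.1 + 0.2 * 0.5 = 0.2, so S(1.5) = exp(-0.2) at baseline covariates.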

3.
We consider contracting between a principal and an agent when multilateral externalities are present. The motivating example is an international climate agreement with private information about the willingness-to-pay (WTP) for emissions abatement. Due to multilateral externalities, the principal uses her own emissions, besides subsidies, to incentivize the agent and to assure his participation. Optimal contracts equalize marginal abatement costs and can thus be implemented by a system of competitive permit trading. Moreover, optimal contracts can include a boundary part (i.e., the endogenous, type-dependent participation constraint is binding) that is not a copy of the outside option of no contract. Compared with this outside option, a contract can increase the principal's emissions for types with a low WTP, and reduce her payoff for high types. Subsidies can be constant or even decreasing in emission reductions, and can turn negative so that the agent reduces emissions and pays the principal.

4.
In this paper we present a hierarchical Bayesian analysis of a predator–prey model in ecology using Markov chain Monte Carlo (MCMC) methods. We introduce a random effect into the model and allow for a covariate vector. An ecological application is presented using a data set on the plankton dynamics of Lake Geneva for the year 1990. We also discuss some aspects of discriminating between the proposed models.

5.
Environmental Fluid Mechanics - The accurate simulation of wetting–drying processes in floodplains and coastal zones is a challenge for hydrodynamic modelling, especially for long time...

6.
The comparison of increasing doses of a treatment with a negative control is frequently part of toxicological studies. For normally distributed data, Williams (1971, 1972) introduced a maximum likelihood test under a total order restriction, but until now there has been no solution for the arbitrary unbalanced case. Following the idea proposed by Robertson et al. (1988), in this article we apply Williams' basic concept to the class of multiple contrast tests in the general unbalanced parametric set-up. Simulation results for size and power, and two examples for estimating the minimal toxic dose (MTD), are given.
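A Williams-type multiple contrast test compares the control with sample-size-weighted pooled means of the highest dose groups. A minimal sketch of the contrast coefficients for the unbalanced case (the construction follows the standard Williams-type scheme; it is not taken verbatim from the article):

```python
def williams_contrasts(n):
    """Contrast coefficient vectors for a Williams-type multiple
    contrast test. n = [n0, n1, ..., nK] are group sample sizes,
    control first. Contrast k pools the k highest dose groups,
    weighting each by its sample size."""
    K = len(n) - 1
    contrasts = []
    for k in range(1, K + 1):
        top = range(K - k + 1, K + 1)          # indices of pooled groups
        total = sum(n[i] for i in top)
        c = [0.0] * len(n)
        c[0] = -1.0                            # control coefficient
        for i in top:
            c[i] = n[i] / total                # weighted pooled mean
        contrasts.append(c)
    return contrasts
```

Each contrast sums to zero; for balanced designs the weights reduce to equal fractions, recovering the classical Williams contrasts.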

7.
8.
The objective of this study was to develop regression models to estimate the total concentration of polybrominated diphenyl ethers in serum based on the known concentrations of a limited number of congeners. Because of the possible adverse health effects associated with exposure to polybrominated diphenyl ethers, it is of interest to know their total concentrations. Data from the National Health and Nutrition Examination Survey for 2003–2004 (N = 1859) were used to develop regression models to estimate both wet weight and lipid-adjusted total concentrations. Only the knowledge of three congeners, namely, 2,2′,4,4′-tetrabromodiphenyl ether, 2,2′,4,4′,5-pentabromodiphenyl ether, and 2,2′,4,4′,5,5′-hexabromodiphenyl ether, was required to use these models. Other than the concentrations of these three congeners, age, gender, and smoking status were the only information needed to use these models. Optionally, models were developed that could also use the race/ethnicity of the participants. All models explained more than 98% of the known variability in the observed total concentration levels. Over 98% of the model-generated predicted values were found to be within 5% of the observed values.

9.
The statistical analysis of non-negative continuous data is a common task in quantitative ecology. An example, and our motivation, is the weight of a given fish species in a fish trawl. The analysis is complicated by the occurrence of exactly zero observations, which makes many statistical methods for continuous data inappropriate. In this paper we propose a model that extends a Tweedie generalised linear model. The proposed model exploits the fact that a Tweedie distribution is equivalent to the distribution obtained by summing a Poisson number of gamma random variables. In the proposed model, both the number of gamma variates and their average size are modelled separately. The model has a composite link and a flexible mean–variance relationship that can vary with covariates. We illustrate the model, and compare it with other models, using data from a fish trawl survey in south-east Australia.
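The compound Poisson-gamma construction mentioned in the abstract can be simulated directly: draw N from a Poisson distribution and sum N gamma variates, which produces an exact point mass at zero (when N = 0) alongside a continuous positive part. A minimal stdlib sketch (parameter values are illustrative):

```python
import math
import random

def poisson(lam, rng):
    """Knuth's multiplicative method for a Poisson(lam) draw."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def tweedie_sample(lam, shape, scale, rng):
    """Compound Poisson-gamma draw: sum of N ~ Poisson(lam) independent
    Gamma(shape, scale) variables; exactly zero when N == 0."""
    n = poisson(lam, rng)
    return sum(rng.gammavariate(shape, scale) for _ in range(n))
```

The proportion of exact zeros is exp(-lam) and the mean is lam * shape * scale, which is why modelling the Poisson and gamma components separately, as the paper proposes, gives separate control over occurrence and magnitude.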

10.
Traditional occupancy–abundance and abundance–variance–occupancy models do not take into account zero-inflation, which occurs when sampling rare species or in correlated counts arising from repeated measures. In this paper we propose a novel approach extending occupancy–abundance relationships to zero-inflated count data. This approach involves three steps: (1) selecting distributional assumptions and parsimonious models for the count data, (2) estimating abundance, occupancy and variance parameters as functions of site- and/or time-specific covariates, and (3) modelling the occupancy–abundance relationship using the parameters estimated in step 2. Five count datasets were used for comparing standard Poisson and negative binomial distribution (NBD) occupancy–abundance models. Zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) occupancy–abundance models were introduced for the first time, and these were compared with the Poisson, NBD, He and Gaston's, and Wilson and Room's abundance–variance–occupancy models. The percentage of zero counts ranged from 45 to 80% in the datasets analysed. For most of the datasets, the ZINB occupancy–abundance model performed better than the traditional Poisson, NBD and Wilson and Room's models. He and Gaston's model performed better than the ZINB in two out of the five datasets. However, the occupancy predicted by all models increased faster than the observed occupancy as density increased, resulting in a significant mismatch at the highest densities. Limitations of the various models are discussed, and the need for careful choice of count distributions and predictors in estimating abundance and occupancy parameters is indicated.
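The zero-inflated Poisson distribution underlying the ZIP model mixes a structural-zero component with an ordinary Poisson count. A minimal sketch of its probability mass function (pi and lam below are illustrative parameters, not estimates from the paper):

```python
import math

def zip_pmf(y, lam, pi):
    """Zero-inflated Poisson pmf: with probability pi the count is a
    structural zero; otherwise it is Poisson(lam). So
    P(Y=0) = pi + (1-pi)*exp(-lam), and for y > 0 the Poisson
    probability is simply deflated by (1-pi)."""
    pois = math.exp(-lam) * lam ** y / math.factorial(y)
    return pi * (y == 0) + (1 - pi) * pois
```

Under this model occupancy (the probability of a non-zero count) is (1 - pi) * (1 - exp(-lam)) rather than the Poisson value 1 - exp(-lam), which is how zero-inflation reshapes the occupancy-abundance relationship.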

11.
12.
13.
Judicious choice of candidate-generating distributions improves the efficiency of the Metropolis–Hastings algorithm. In Bayesian applications, it is sometimes possible to identify an approximation to the target posterior distribution; this approximate posterior distribution is a good choice for candidate generation. These observations are applied to the analysis of the Cormack–Jolly–Seber model and its extensions.
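Using an approximate posterior as the candidate distribution yields an independence Metropolis-Hastings sampler: candidates are drawn from the approximation regardless of the current state, and accepted with probability min(1, p(y)q(x) / (p(x)q(y))). A minimal stdlib sketch with a stand-in target (a Gamma(3, rate 2) "posterior" and a moment-matched normal approximation; the Cormack-Jolly-Seber likelihood itself is not reproduced here):

```python
import math
import random
from statistics import NormalDist

def target(x):
    """Unnormalized stand-in posterior: Gamma(shape=3, rate=2) kernel."""
    return x ** 2 * math.exp(-2 * x) if x > 0 else 0.0

# Moment-matched normal approximation used as the independence proposal:
# mean 3/2 = 1.5, variance 3/4.
prop = NormalDist(mu=1.5, sigma=math.sqrt(3) / 2)

def independence_mh(n, rng):
    x, out = 1.5, []
    for _ in range(n):
        y = prop.inv_cdf(rng.random())            # candidate draw
        num = target(y) * prop.pdf(x)
        den = target(x) * prop.pdf(y)
        if den > 0 and rng.random() < min(1.0, num / den):
            x = y                                  # accept candidate
        out.append(x)
    return out
```

The closer the proposal tracks the target, the closer the acceptance rate is to one, which is the efficiency gain the abstract refers to.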

14.
Most of the statistical techniques used to evaluate the data obtained from toxicity studies are based on the assumption that the data show a normal distribution and homogeneity of variance. A literature review of toxicity studies on laboratory animals reveals that in most cases homogeneity of variance alone is examined for the data obtained from these studies. But data that show homogeneity of variance need not always show a normal distribution. In fact, most of the data derived from toxicity studies, including hematological and biochemical parameters, show a non-normal distribution. On examining the normality of data obtained from various toxicity studies using different normality tests, we observed that the Shapiro-Wilk test is more appropriate than the Kolmogorov-Smirnov test, the Lilliefors test, normal probability paper analysis, and the Chi-square test. But there are situations, especially in long-term toxicity studies, where normality is not shown by one or more of the dosage groups. In this situation, we propose that the data may be analyzed using Dunnett's multiple comparison test after excluding the data of the groups that do not show normality. However, the biological relevance of the excluded data has to be carefully scrutinized. We also observed that the tendency of the data to show a normal distribution seems to be related to the age of the animals. The present paper describes various tests commonly used to test normality and their power, and also emphasizes the need to subject the data obtained from toxicity studies to both normality and homogeneity tests. A flow chart suggesting the statistical techniques that may be used for data showing a normal or non-normal distribution is also proposed.
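Of the tests compared in the abstract, the Lilliefors test is simple to illustrate: it is a Kolmogorov-Smirnov distance between the empirical CDF and a normal CDF whose mean and standard deviation are estimated from the same sample. A minimal stdlib sketch of the test statistic (the critical values needed for a formal decision are not reproduced here):

```python
from statistics import NormalDist, mean, stdev

def lilliefors_stat(data):
    """Max distance between the empirical CDF and a normal CDF fitted
    to the data (Lilliefors-type statistic). The empirical CDF steps
    from (i-1)/n to i/n at each sorted observation, so both sides of
    the step are checked."""
    xs = sorted(data)
    nd = NormalDist(mean(xs), stdev(xs))
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, 1):
        F = nd.cdf(x)
        d = max(d, abs(i / n - F), abs(F - (i - 1) / n))
    return d
```

A clearly non-normal sample (e.g. exponential) produces a markedly larger statistic than a normal sample of the same size, which is the behaviour a normality pre-check relies on.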

15.
The performance of discrete mathematical models in describing the population dynamics of the diamondback moth (DBM) (Plutella xylostella L.) and its parasitoid Diadegma semiclausum was investigated. The parameter values for several well-known models (Nicholson–Bailey; Hassell and Varley; Beddington, Free and Lawton; May; Holling type 2 and 3; and Getz and Mills functional responses) were estimated. The models were tested on 20 consecutive sets of time series data, collected at 14-day intervals for pest and parasitoid populations, obtained from a highland cabbage growing area in eastern Kenya. Model parameters were estimated by minimizing the squared difference between the numerical solution of the model equations and the empirical data using Powell's method. Maximum calculated DBM growth rates varied between 0.02 and 0.07. The carrying capacity of 16.5 DBM/plant determined by the Beddington et al. model was within the range of field data. However, all the estimated parameter values relating to the parasitoid, including the instantaneous searching rate (0.07–0.28), per capita searching efficiency (0.20–0.27), search time (5.20–5.33), handling time (0.77–0.90), and parasitism aggregation index (0.33), were well outside the range encountered empirically. Under the Durbin–Watson criterion, the residuals of all models evaluated for DBM, except the May model, were not autocorrelated. In contrast, the criterion applied to the parasitoid residuals showed strong autocorrelations. Thus, these models failed to estimate parasitoid dynamics. We conclude that the interactions of the DBM with its parasitoid cannot be explained by any of the models tested. Two factors may be associated with this failure. First, the parasitoid in this integrated biological control system may not be playing a major role in regulating the DBM population. Second, and perhaps more likely, the poor correlations reflect gross inadequacies in the theoretical assumptions that underlie the existing models.
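The simplest of the models listed, the Nicholson-Bailey model, can be iterated in a few lines: hosts escape parasitism with probability exp(-a P), escapees reproduce at rate lambda, and parasitized hosts yield new parasitoids. A minimal sketch (parameter values in the usage example are illustrative, not the paper's estimates):

```python
import math

def nicholson_bailey(H0, P0, lam, a, c, steps):
    """Discrete host-parasitoid dynamics:
    H[t+1] = lam * H[t] * exp(-a * P[t])        hosts escaping parasitism
    P[t+1] = c * H[t] * (1 - exp(-a * P[t]))    parasitized hosts -> new
                                                 parasitoids (c per host)"""
    H, P = [H0], [P0]
    for _ in range(steps):
        esc = math.exp(-a * P[-1])               # escape probability
        H.append(lam * H[-1] * esc)
        P.append(c * H[-2] * (1 - esc))          # uses the previous H
    return H, P
```

Fitting such a model as the paper did amounts to choosing (lam, a, c) to minimize the squared difference between these trajectories and the 14-day field counts.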

16.
● Data quality assessment criteria for MP/NPs in food products were developed. ● Data quality of 71 data records (69 of them focused only on MPs) was assessed. ● About 96% of the data records were considered unreliable in at least one criterion. ● Improvements need to be made regarding positive controls and polymer identification. ● A mismatch between MP/NPs used in toxicity studies and those in foods was recorded. Data on the occurrence of microplastics and nanoplastics (MP/NPs) in foods have been used to assess the human health risk caused by the consumption of MP/NPs. The reliability of the data, however, remains unclear because of the lack of international standards for the analysis of MP/NPs in foods. Therefore, the data quality needs to be assessed for accurate health risk assessment. This study developed 10 criteria applicable to the quality assessment of data on MP/NPs in foods. Accordingly, the reliability of 71 data records (69 of them focused only on MPs) was assessed by assigning a score of 2 (reliable without restrictions), 1 (reliable but with restrictions), or 0 (unreliable) on each criterion. The results showed that only three data records scored 2 or 1 on all criteria, and six data records scored 0 on as many as six criteria. A total of 58 data records did not include information on positive controls, and 12 data records did not conduct polymer identification, which could result in the overestimation or underestimation of MP/NPs. Our results also indicated that the data quality of unprocessed foods was more reliable than that of processed foods. Furthermore, we proposed a quality assurance and quality control protocol to investigate MP/NPs in foods. Notably, the characteristics of MP/NPs used in toxicological studies and those existing in foods showed a remarkable discrepancy, adding uncertainty to health risk assessments. Therefore, both the estimated exposure to MP/NPs and the claimed potential health risks should be treated with caution.

17.
Phosphorus-31 nuclear magnetic resonance (NMR) spectroscopy has become popular for the characterization of P species in environmental samples. However, samples are commonly made alkaline (pH > 13) to facilitate comparison and ease peak identification, which may cause hydrolysis of some compounds. This study examined the chemical shift of known P compounds and supplemented this with published data to determine the viability of examining samples at their native pH, thereby minimizing sample disturbance. A 31P NMR pH titration of known P compounds resulted in chemical shifts ranging from about −22 to 8 ppm over the pH range 5–13. Categorization and calculation of chemical shifts for over 100 naturally occurring compounds indicated that good distinction between orthophosphate diesters, orthophosphate monoesters, nucleotides, phosphonates, and phosphagens was achievable at pH ≥ 7, but unlikely below this pH. Analysis of several water extracts of soil and dung, overland flow samples, and lake water revealed a wide variety of well-defined peaks that were assigned to orthophosphate, orthophosphate monoesters, orthophosphate diesters, pyrophosphate, polyphosphate, or phosphonates. Raising the sample pH to > 13 caused many species (such as phosphonates, orthophosphate diesters, and polyphosphates) to decrease, either by hydrolysis or by precipitation. Hence, it is recommended that samples be analysed at their native pH but, if poorly resolved, have their pH raised to ≥ 7.
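The pH dependence of a 31P chemical shift near a single deprotonation step follows a fast-exchange weighted average of the protonated and deprotonated shifts, with Henderson-Hasselbalch weights. A minimal sketch (the pKa and limiting-shift values in the test are hypothetical, chosen only to illustrate the curve shape, not taken from the study):

```python
def chemical_shift(pH, pKa, delta_acid, delta_base):
    """Observed shift under fast exchange between the protonated form
    (delta_acid) and deprotonated form (delta_base); the fraction
    deprotonated is r/(1+r) with r = 10**(pH - pKa)."""
    r = 10 ** (pH - pKa)
    return (delta_acid + delta_base * r) / (1 + r)
```

The shift sits midway between the two limits at pH = pKa and plateaus at delta_base well above it, which is why spectra acquired at pH ≥ 7, away from the steep part of most titration curves, give more reproducible peak positions than spectra near a pKa.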

18.
Metal-organic frameworks (MOFs) are highly promising porous materials known for their exceptional porosity, extensive surface area, and customizable pore structures, making them an ideal solution for hydrogen storage. However, most MOF research remains confined to the laboratory, lacking practical applications. To address this, the author proposes a shift towards practical applications, the creation of a comprehensive MOF database, alignment of synthesis with practical considerations, and diversification of MOF applications. These steps are crucial for harnessing the full potential of MOFs in real-world energy challenges.

19.
Habitat connectivity is a key objective of current conservation policies and is commonly modeled by landscape graphs (i.e., sets of habitat patches [nodes] connected by potential dispersal paths [links]). These graphs are often built based on expert opinion or species distribution models (SDMs) and therefore lack empirical validation from data more closely reflecting functional connectivity. Accordingly, we tested whether landscape graphs reflect how habitat connectivity influences gene flow, one of the main ecoevolutionary processes. To that purpose, we modeled the habitat network of a forest bird (plumbeous warbler [Setophaga plumbea]) on Guadeloupe with graphs based on expert opinion, Jacobs' specialization indices, and an SDM. We used genetic data (712 birds from 27 populations) to compute local genetic indices and pairwise genetic distances. Finally, we assessed the relationships between genetic distances or indices and cost distances or connectivity metrics with maximum-likelihood population-effects distance models and Spearman correlations between metrics. Overall, the landscape graphs reliably reflected the influence of connectivity on population genetic structure; validation R2 was up to 0.30 and correlation coefficients were up to 0.71. Yet, the relationship among graph ecological relevance, data requirements, and construction and analysis methods was not straightforward, because the graph based on the most complex construction method (species distribution modeling) sometimes had less ecological relevance than the others. Cross-validation methods and sensitivity analyses allowed us to make the advantages and limitations of each construction method spatially explicit. We confirmed the relevance of landscape graphs for conservation modeling but recommend a case-specific consideration of the cost-effectiveness of their construction methods. We hope the replication of independent validation approaches across species and landscapes will strengthen the ecological relevance of connectivity models.
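The cost distances that the study correlates with genetic distances are least-cost path lengths over the weighted landscape graph. A minimal stdlib sketch using Dijkstra's algorithm on a toy habitat graph (node names and costs below are illustrative; the study's graphs and resistance weights are not reproduced):

```python
import heapq

def cost_distance(graph, src):
    """Least-cost distances from src over a weighted landscape graph
    given as {patch: [(neighbour, dispersal_cost), ...]} (Dijkstra)."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue                       # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

Pairwise cost distances computed this way for every pair of habitat patches are what get regressed against pairwise genetic distances in the maximum-likelihood population-effects models.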

20.
A systemic understanding of the marine and coastal environment needs data integration following a suitable concept, e.g., multi-dimensional and functional mapping. A number of new activities will improve the data supply for coasts and seas. These data need to be integrated and combined with socio-economic drivers and resulting pressures. The resulting knowledge base should be able to effectively inform ecosystem-based management actions, such as integrated coastal zone management, maritime spatial planning, extension of Natura 2000 areas, or climate change adaptation in coastal regions and maritime sectors. Assessments that aim to inform such processes will require a rethinking of priorities for spatial data collection and analysis, in particular building on data sharing and standardization, improved spatial data integration, promoting the interoperability of relevant information systems, and the possibility of assimilating different data types into models. The different aspects of spatial data should be addressed in a coherent implementation of spatial data infrastructure.
