Similar Articles
20 similar articles found (search time: 312 ms)
1.
How do additional data of the same and/or different type contribute to reducing model parameter and predictive uncertainties? Most modeling applications of soil organic carbon (SOC) time series in agricultural field trial datasets have been conducted without accounting for model parameter uncertainty. There have been recent advances with Monte Carlo-based uncertainty analyses in the field of hydrological modeling that are applicable, relevant and potentially valuable in modeling the dynamics of SOC. Here we employed a Monte Carlo method with threshold screening known as Generalized Likelihood Uncertainty Estimation (GLUE) to calibrate the Introductory Carbon Balance Model (ICBM) to long-term field trial data from Ultuna, Sweden, and Machang’a, Kenya. Calibration results are presented in terms of parameter distributions and credibility bands on time series simulations for a number of case studies. Using these methods, we demonstrate that widely uncertain model parameters, as well as strong covariance between inert pool size and rate constant parameters, exist when root mean square simulation errors were within uncertainties in input estimations and data observations. We show that even rough estimates of the inert pool (perhaps from chemical analysis) can be quite valuable in reducing uncertainties in model parameters. In fact, such estimates were more effective at reducing parameter and predictive uncertainty than an additional 16 years of time-series data at Ultuna. We also demonstrate an effective method to jointly, simultaneously and in principle more robustly calibrate model parameters to multiple datasets across different climatic regions within an uncertainty framework. These methods and approaches should also benefit other SOC models and datasets.
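The GLUE procedure this abstract applies is straightforward to sketch: sample parameters from wide priors, simulate, and keep only "behavioural" sets whose error falls below a screening threshold. The two-pool structure below follows the published ICBM equations, but the input rate, initial pools, priors, RMSE threshold and "observations" are invented for illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SOC observations (t C/ha) at 5-year intervals -- illustrative
# values only, not the Ultuna or Machang'a data.
years = np.arange(0, 40, 5)
obs = np.array([62.0, 60.5, 59.8, 58.6, 58.0, 57.1, 56.5, 56.0])

def icbm_total_soc(t, i, ky, ko, h, y0, o0, r=1.0):
    """Analytical solution of the two-pool ICBM (young pool Y, old pool O):
    dY/dt = i - ky*r*Y,  dO/dt = h*ky*r*Y - ko*r*O."""
    ky_r, ko_r = ky * r, ko * r
    y = i / ky_r + (y0 - i / ky_r) * np.exp(-ky_r * t)
    c = h * ky_r * (y0 - i / ky_r) / (ko_r - ky_r)
    o = h * i / ko_r + c * np.exp(-ky_r * t) \
        + (o0 - h * i / ko_r - c) * np.exp(-ko_r * t)
    return y + o

# GLUE: sample parameters from wide uniform priors and keep the "behavioural"
# sets whose RMSE falls below a screening threshold.
n = 20000
ky = rng.uniform(0.1, 2.0, n)     # young-pool decay rate (1/yr)
ko = rng.uniform(0.001, 0.02, n)  # old-pool decay rate (1/yr)
h = rng.uniform(0.1, 0.3, n)      # humification coefficient

rmse = np.array([
    np.sqrt(np.mean((icbm_total_soc(years, 2.0, ky[j], ko[j], h[j],
                                    y0=4.0, o0=58.0) - obs) ** 2))
    for j in range(n)
])
behavioural = rmse < 1.5          # threshold screening

print("behavioural sets:", behavioural.sum(), "of", n)
print("ko 90% credibility interval:", np.percentile(ko[behavioural], [5, 95]))
```

Percentiles of the behavioural sample give the credibility bands the abstract mentions; the strong ko–h covariance among behavioural sets is the kind of parameter interaction the paper reports.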

2.
The first step in developing travel time and water quality models in streams is to correctly model solute transport mechanisms. In this paper a comparison between two solute transport models is performed. The parameters of the Transient Storage (TS) model and the Aggregated Dead Zone (ADZ) model are estimated using data from thirty-seven tracer experiments carried out under different discharges in five mountain streams of the Colombian Andes. Calibration is performed with the Generalized Likelihood Uncertainty Estimation (GLUE) method based on Monte Carlo simulations. Aspects of model parameter identifiability and model parsimony are analyzed and discussed. The TS model, with four parameters, shows excellent results during calibration, but its parameters exhibit high interaction and poor identifiability. The ADZ model, with two independent and clearly identifiable parameters, gives sufficiently precise calibration results. We conclude that the ADZ model, with only two parameters, is a parsimonious model able to represent the solute transport mechanisms of advection and longitudinal dispersion in the studied mountain streams. A simple methodology for estimating model parameters as a function of discharge is proposed for use in predictive travel time and solute transport applications along mountain streams.

3.
Recent trends in lake and stream water quality modeling indicate a conflict between the search for improved accuracy through increasing model size and complexity, and the search for applicability through simplification of already existing models. Much of this conflict turns on the fact that what can be simulated in principle is simply not matched by what can be observed and verified in practice. This paper is concerned with that conflict. Its aim is to introduce and clarify some of the arguments surrounding two issues of key importance in resolving the conflict: uncertainty in the mathematical relationships hypothesized for a particular model (calibration and model structure identification); and uncertainty associated with the predictions obtained from the model (prediction error analysis). These are issues concerning the reliability of models and model-based forecasts. The paper argues, in particular, that there is an intimate relationship between prediction and model calibration. This relationship is especially important in accounting for uncertainty in the development and use of models. Using this argument it is possible to state a dilemma which captures some limiting features of both large and small models.

4.
Abstract:   In conservation biology, uncertainty about the choice of a statistical model is rarely considered. Model-selection uncertainty occurs whenever one model is chosen over plausible alternative models to represent understanding about a process and to make predictions about future observations. The standard approach to representing prediction uncertainty involves the calculation of prediction (or confidence) intervals that incorporate uncertainty about parameter estimates contingent on the choice of a "best" model chosen to represent truth. However, this approach to prediction based on statistical models tends to ignore model-selection uncertainty, resulting in overconfident predictions. Bayesian model averaging (BMA) has been promoted in a range of disciplines as a simple means of incorporating model-selection uncertainty into statistical inference and prediction. Bayesian model averaging also provides a formal framework for incorporating prior knowledge about the process being modeled. We provide an example of the application of BMA in modeling and predicting the spatial distribution of an arboreal marsupial in the Eden region of southeastern Australia. Other approaches to estimating prediction uncertainty are discussed.
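The core of BMA is weighting each candidate model's prediction by its (approximate) posterior model probability rather than committing to one "best" model. A common shortcut uses BIC differences to approximate those weights. The sketch below uses synthetic linear-regression data, not the marsupial dataset; the covariates, sample size and candidate set are all invented for illustration:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

# Synthetic data: three candidate covariates, only two truly matter.
n = 200
X = rng.normal(size=(n, 3))
y = X @ np.array([1.0, 0.0, -0.5]) + rng.normal(scale=1.0, size=n)

def fit_ols(Xs, y):
    """Least-squares fit; returns coefficients and BIC under Gaussian errors."""
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    bic = Xs.shape[1] * np.log(n) - 2 * loglik
    return beta, bic

# Candidate models: every non-empty subset of the covariates.
models = [c for r in (1, 2, 3) for c in combinations(range(3), r)]
x_new = np.array([0.5, 0.5, 0.5])          # a new site to predict
bics, preds = [], []
for cols in models:
    beta, bic = fit_ols(X[:, list(cols)], y)
    bics.append(bic)
    preds.append(x_new[list(cols)] @ beta)

# Approximate posterior model probabilities from BIC differences,
# then average predictions across models with those weights.
bics = np.array(bics)
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()
bma_pred = np.dot(w, preds)

print("model weights:", np.round(w, 3))
print("BMA prediction at x_new:", round(float(bma_pred), 3))
```

The spread of predictions across well-weighted models is exactly the model-selection uncertainty that single-model prediction intervals ignore.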

5.
Reliable prediction of the effects of landscape change on species abundance is critical to land managers who must make frequent, rapid decisions with long-term consequences. However, due to inherent temporal and spatial variability in ecological systems, previous attempts to predict species abundance in novel locations and/or time frames have been largely unsuccessful. The Effective Area Model (EAM) uses change in habitat composition and geometry coupled with response of animals to habitat edges to predict change in species abundance at a landscape scale. Our research goals were to validate EAM abundance predictions in new locations and to develop a calibration framework that enables absolute abundance predictions in novel regions or time frames. For model validation, we compared the EAM to a null model excluding edge effects in terms of accurate prediction of species abundance. The EAM outperformed the null model for 83.3% of species (N=12) for which it was possible to discern a difference when considering 50 validation sites. Likewise, the EAM outperformed the null model when considering subsets of validation sites categorized on the basis of four variables (isolation, presence of water, region, and focal habitat). Additionally, we explored a framework for producing calibrated models to decrease prediction error given inherent temporal and spatial variability in abundance. We calibrated the EAM to new locations using linear regression between observed and predicted abundance with and without additional habitat covariates. We found that model adjustments for unexplained variability in time and space, as well as variability that can be explained by incorporating additional covariates, improved EAM predictions. 
Calibrated EAM abundance estimates with additional site-level variables explained a significant amount of variability (P < 0.05) in observed abundance for 17 of 20 species, with R² values >25% for 12 species, >48% for six species, and >60% for four species when considering all predictive models. The calibration framework described in this paper can be used to predict absolute abundance in sites different from those in which data were collected if the target population of sites to which one would like to statistically infer is sampled in a probabilistic way.

6.
There is a need for decadal predictions of seabed evolution, for example to inform resurvey strategies when maintaining navigation channels. The understanding of the physical processes involved in morphological evolution, and the ability of process models to accurately simulate evolution over these time scales, are currently limited. As a result, statistical approaches are used to supply long-term forecasts. In this paper, we introduce a novel statistical approach for this problem: the autoregressive Hilbertian (ARH) model. This model naturally assesses the time evolution of spatially distributed measurements. We apply the technique to a coastal area on the East Anglian coast over the period 1846 to 2002, and compare it with two other statistical methods used recently for seabed prediction: the autoregressive model and the EOF model. We evaluate the performance of the three methods by comparing observations and predictions for 2002. The ARH model reduces root mean squared errors by 10%. Finally, we compute the variability in the predictions related to time sampling using the jackknife, a method that uses subsamples to quantify uncertainties.
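The jackknife idea used in the closing step is generic: refit the predictor on each leave-one-out subsample and use the spread of the resulting forecasts as an uncertainty estimate. The sketch below applies it to a simple scalar AR(1) forecast on an invented bed-level series (the paper's ARH model operates on whole spatial fields, which this toy version does not attempt):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical yearly bed-level series at one survey point (m above datum);
# purely illustrative, not the East Anglian bathymetry data.
series = 5.0 + np.cumsum(rng.normal(0.0, 0.05, 30))

pairs_x, pairs_y = series[:-1], series[1:]

def ar1_forecast(x, y, last):
    """One-step forecast from a fitted AR(1): y_hat = a + b * last."""
    b, a = np.polyfit(x, y, 1)    # returns slope, then intercept
    return a + b * last

full = ar1_forecast(pairs_x, pairs_y, series[-1])

# Delete-one jackknife over the (x_t, x_{t+1}) pairs: refit on each subsample
# and use the spread of the resulting forecasts to quantify uncertainty.
n = len(pairs_x)
jack = np.empty(n)
for i in range(n):
    keep = np.arange(n) != i
    jack[i] = ar1_forecast(pairs_x[keep], pairs_y[keep], series[-1])

se = np.sqrt((n - 1) / n * np.sum((jack - jack.mean()) ** 2))
print(f"forecast for next year: {full:.3f} m, jackknife SE: {se:.4f} m")
```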

7.
In the present study, we demonstrate an integrated modeling approach for predicting internal tissue concentrations of chemicals by coupling a multimedia environmental model with a generic physiologically based pharmacokinetic (PBPK) model. A case study was designed for a region on the Seine river watershed, downstream of the Paris megacity, and for benzo(a)pyrene (B(a)P) emitted from industrial zones in the region. In this case study, the two models are linked only through drinking-water intake: river water concentrations predicted by the multimedia model provide the exposure input to the PBPK model. The limited monitoring datasets of B(a)P concentrations in bottom sediment and in raw river water, obtained downstream of Paris, were used to reconstruct long-term daily concentrations of B(a)P in river water. The reconstruction of this long-term series of B(a)P levels played a key role in the intermediate model calibration (conducted on the multimedia model) and thus in improving the input to the PBPK model. To account for parametric uncertainty in the model inputs, some input parameters of the multimedia model were described by probability density functions (PDFs); some generic PDFs were updated with site-specific measurements by a Bayesian approach. The results of this study showed that the multimedia model fits actual annual measurements in sediments well over one decade. No accumulation of B(a)P in the organs was observed. In conclusion, this case study demonstrated the feasibility of a full-chain assessment combining multimedia environmental predictions and PBPK modeling, including uncertainty and sensitivity analyses.

8.
We investigate how the viability and harvestability predicted by population models are affected by details of model construction. Based on this analysis we discuss some of the pitfalls associated with the use of classical statistical techniques for resolving the uncertainties associated with modeling population dynamics. The management of the Serengeti wildebeest (Connochaetes taurinus) is used as a case study. We fitted a collection of age-structured and unstructured models to a common set of available data and compared model predictions in terms of wildebeest viability and harvest. Models that depicted demographic processes in strikingly different ways fitted the data equally well. However, upon further analysis it became clear that models that fit the data equally well could nonetheless have very different management implications. In general, model structure had a much larger effect on viability analysis (e.g., time to collapse) than on optimal harvest analysis (e.g., harvest rate that maximizes harvest). Some modeling decisions, such as including age-dependent fertility rates, did not affect management predictions, but others had a strong effect (e.g., choice of model structure). Because several suitable models of comparable complexity fitted the data equally well, traditional model selection methods based on the parsimony principle were not practical for judging the value of alternative models. Our results stress the need to implement analytical frameworks for population management that explicitly consider the uncertainty about the behavior of natural systems.

9.
Risk-Based Viable Population Monitoring
Abstract:  We describe risk-based viable population monitoring, in which the monitoring indicator is a yearly prediction of the probability that, within a given timeframe, the population abundance will decline below a prespecified level. Common abundance-based monitoring strategies usually have low power to detect declines in threatened and endangered species and are largely reactive to declines. Comparisons of the population's estimated risk of decline over time will help determine status in a more defensible manner than current monitoring methods. Monitoring risk is a more proactive approach; critical changes in the population's status are more likely to be demonstrated before a devastating decline than with abundance-based monitoring methods. In this framework, recovery is defined not as a single evaluation of long-term viability but as maintaining low risk of decline for the next several generations. Effects of errors in risk prediction techniques are mitigated through shorter prediction intervals, setting threshold abundances near current abundance, and explicitly incorporating uncertainty in risk estimates. Viable population monitoring also intrinsically adjusts monitoring effort relative to the population's true status and exhibits considerable robustness to model misspecification. We present simulations showing that risk predictions made with a simple exponential growth model can be effective monitoring indicators for population dynamics ranging from random walk to density dependence with stable, decreasing, or increasing equilibrium. In analyses of time-series data for five species, risk-based monitoring warned of future declines and demonstrated secure status more effectively than statistical tests for trend.
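The "simple exponential growth model" indicator the abstract describes can be sketched as follows: fit the mean and variance of the yearly log-growth increments, simulate forward trajectories, and report the fraction that dip below the threshold within the horizon. The count series, threshold and horizon below are invented for illustration, not the paper's five-species data:

```python
import numpy as np

rng = np.random.default_rng(3)

def decline_risk(counts, threshold, horizon, n_sims=10000):
    """P(abundance falls below `threshold` within `horizon` years), from a
    stochastic exponential growth model fitted to the log counts."""
    logs = np.log(counts)
    diffs = np.diff(logs)
    mu, sigma = diffs.mean(), diffs.std(ddof=1)  # log growth rate and noise
    # Simulate forward trajectories; flag any year dipping below threshold.
    steps = rng.normal(mu, sigma, size=(n_sims, horizon))
    paths = logs[-1] + np.cumsum(steps, axis=1)
    return float(np.mean((paths < np.log(threshold)).any(axis=1)))

# Hypothetical yearly abundance index with a mild downward drift.
counts = np.array([120, 115, 123, 108, 104, 110, 101, 97, 99, 95])
risk = decline_risk(counts, threshold=60, horizon=10)
print(f"P(decline below 60 within 10 yr) ~ {risk:.2f}")
```

Recomputing this probability each year as new counts arrive yields the proactive monitoring indicator: a rising risk estimate can flag trouble before an abundance trend test reaches significance.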

10.
Boreal forest soils such as those in Sweden contain a large active carbon stock. Hence, a relatively small change in this stock can have a major impact on the Swedish national CO2 balance. Understanding the uncertainties in estimates of soil carbon pools is critical for accurately assessing changes in carbon stocks in the national reports to the UNFCCC and the Kyoto Protocol. Our objective was to analyse the parameter uncertainties of simulated estimates of soil organic carbon (SOC) development between 1994 and 2002 in Swedish coniferous forests with the Q model. Both the sensitivity of model parameters and the uncertainties in simulations were assessed. Data for forests with Norway spruce, Scots pine and Lodgepole pine from the Swedish Forest Soil Inventory (SFSI) were used: data from 12 Swedish counties were used to calibrate parameter settings, and data from another 11 counties to validate. The “limits of acceptability” within GLUE were set at the 95% confidence interval for the annual mean measured SOC at county scale. The calibration procedure reduced the parameter uncertainties and reshaped the parameter distributions in a county-specific way. The average measured and simulated SOC amounts varied from 60 t C ha⁻¹ in northern Sweden to 140 t C ha⁻¹ in southern Sweden. The calibrated model simulated the soil carbon pool within the limits of acceptability for all calibration counties except one county during one year. The efficiency of the calibrated model varied strongly: for five of the 12 counties the model estimates agreed well with measurements, for two counties agreement was moderate, and for five counties agreement was poor. The lack of agreement can be explained by the high inter-annual variability of the down-scaled measured SOC estimates and by changes in forest areas over time.
We conclude that, although we succeeded in reducing the uncertainty in the model estimates, calibrating a regional-scale process-oriented model using a national-scale dataset is a sensitive balance between introducing and reducing uncertainties. Parameter distributions proved to be scale-sensitive and county-specific. Further analysis of the uncertainties in the methods used for reporting SOC changes to the UNFCCC and the Kyoto Protocol is recommended.

11.
This paper presents an uncertainty and sensitivity analysis of a pharmacokinetic model of inorganic arsenic deposition in rodents for a short-term exposure. Efforts to develop the pharmacokinetic model are directed towards predicting the kinetic behavior of inorganic arsenic in the body, including tissue and blood concentrations and, especially, the urinary excretion of arsenic and its methylated metabolites. However, the use of the model raises an important question when fixed values of model parameters are used: how do the collective uncertainties in the model inputs translate into uncertainty in the model prediction? This study focuses on “epistemic” uncertainty to handle this problem. In this case, the uncertainty refers to an input that has a single value which cannot be known with precision due to a lack of knowledge about the quantity or its measurement. The combination of the pharmacokinetic model and the uncertainty analysis would help in understanding the uncertainties in risk assessment associated with inorganic arsenic.

12.
We present a strategy for using an empirical forest growth model to reduce uncertainty in predictions made with a physiological process-based forest ecosystem model. The uncertainty reduction is carried out via Bayesian melding, in which information from prior knowledge and a deterministic computer model is conditioned on a likelihood function. We used predictions from the empirical forest growth model G-HAT in place of field observations of aboveground net primary productivity (ANPP) in a deciduous temperate forest ecosystem. Using Bayesian melding, priors for the inputs of the process-based forest ecosystem model PnET-II were propagated through the model, and likelihoods for the PnET-II output ANPP were calculated using the G-HAT predictions. Posterior distributions for ANPP and many PnET-II inputs obtained using the G-HAT predictions largely matched posteriors obtained using field data. Since empirical growth models are often more readily available than extensive field datasets, the method represents a potential gain in efficiency for reducing the uncertainty of process-based model predictions when reliable empirical models are available but high-quality data are not.

13.
Large, fine-grained samples are ideal for the predictive species distribution models used for management purposes, but such datasets are not available for most species and conducting such surveys is costly. We attempted to overcome this obstacle by updating previously available coarse-grained logistic regression models with small fine-grained samples using a recalibration approach. Recalibration involves re-estimating the intercept or slope of the linear predictor and may improve calibration (the level of agreement between predicted and actual probabilities). If reliable estimates of occurrence likelihood are required (e.g., for species selection in ecological restoration), calibration should be preferred to other model performance measures. This updating approach is not expected to improve discrimination (the ability of the model to rank sites according to species suitability), because the rank order of predictions is not altered. We tested different updating methods and sample sizes with tree distribution data from Spain. Updated models were compared to models fitted using only fine-grained data (refitted models). Updated models performed reasonably well at fine scales and outperformed refitted models with small samples (10–100 occurrences). If a coarse-grained model is available (or could be easily developed) and fine-grained predictions are to be generated from a limited sample size, updating a previous model may be a more accurate option than fitting a new one. Our results encourage further studies on model updating in other situations where species distribution models are used under conditions different from those of their training (e.g., different time periods or different regions).
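Recalibration of a logistic model amounts to re-estimating only an intercept and a slope on the old model's linear predictor, using the small new sample. The sketch below fits those two numbers by plain gradient ascent on the Bernoulli log-likelihood; the simulated "coarse model scores" and sample size are invented for illustration, not the Spanish tree data:

```python
import numpy as np

rng = np.random.default_rng(4)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def recalibrate(lp, y, iters=500, lr=0.1):
    """Re-estimate intercept (alpha) and slope (beta) of an existing model's
    linear predictor `lp` against new outcomes `y`, by gradient ascent on
    the Bernoulli log-likelihood of sigmoid(alpha + beta * lp)."""
    alpha, beta = 0.0, 1.0
    for _ in range(iters):
        p = sigmoid(alpha + beta * lp)
        alpha += lr * np.mean(y - p)          # intercept gradient
        beta += lr * np.mean((y - p) * lp)    # slope gradient
    return alpha, beta

# Hypothetical coarse-grained model scores and a small fine-grained sample
# of presence/absence outcomes generated from a shifted version of them.
lp = rng.normal(0.0, 1.5, 80)
y = (rng.random(80) < sigmoid(0.5 + 0.7 * lp)).astype(float)

alpha, beta = recalibrate(lp, y)
print(f"recalibrated intercept={alpha:.2f}, slope={beta:.2f}")
# With beta > 0, sigmoid(alpha + beta * lp) preserves the rank order of lp,
# so discrimination is unchanged -- only calibration shifts, as the paper notes.
```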

14.
Dynamic vegetation models are useful tools for analysing terrestrial ecosystem processes and their interactions with climate through variations in carbon and water exchange. Long-term changes in structure and composition (vegetation dynamics) caused by altered competitive strength between plant functional types (PFTs) are attracting increasing attention as controls on ecosystem functioning and potential feedbacks to climate. Imperfect process knowledge and limited observational data restrict the possibility to parameterise these processes adequately and potentially contribute to uncertainty in model results. This study addresses uncertainty among parameters scaling vegetation dynamic processes in a process-based ecosystem model, LPJ-GUESS, designed for regional-scale studies, with the objective to assess the extent to which this uncertainty propagates to additional uncertainty in the tree community structure (in terms of the tree functional types present and their relative abundance) and thus to ecosystem functioning (carbon storage and fluxes). The results clearly indicate that the uncertainties in parameterisation can lead to a shift in competitive balance, most strikingly among deciduous tree PFTs, with dominance of either shade-tolerant or shade-intolerant PFTs being possible, depending on the choice of plausible parameter values. Despite this uncertainty, our results indicate that the resulting effect on ecosystem functioning is low. Since the vegetation dynamics in LPJ-GUESS are representative for the more complex Earth system models now being applied within ecosystem and climate research, we assume that our findings will be of general relevance. We suggest that, in terms of carbon storage and fluxes, the heavier parameterisation requirement of the processes involved does not widen the overall uncertainty in model predictions.

15.
Natural capital is difficult to value, notably because of the high uncertainties surrounding the substitutability of its future ecosystem services. We examine a Lucas economy in which a consumption good is produced by combining different inputs, one of them being an ecosystem service that is partially substitutable with other inputs. The growth rates of these inputs and the elasticity of substitution evolve stochastically. We characterize the socially efficient ecological discount rates that should be used to value future ecosystem services at different time horizons. We show that the inverse of the elasticity of substitution can be interpreted as the CCAPM beta of natural capital. We also show that any increase in the risk of this beta reduces the ecological discount rate. If our collective beliefs about the elasticity of substitution of ecosystem services are Gaussian, the ecological discount rates go to minus infinity at finite maturities. In that case, a marginal increase in natural capital has an infinite value. We provide a realistic calibration of the model, coherent with observed asset prices, using the model of extreme events of Barro (2006). The bliss maturity for infinite discount factors is less than 100 years in this calibration.

16.
Ecological Modelling, 2005, 185(1): 13–27
This paper describes an approach for conducting spatial uncertainty analysis of spatial population models, and illustrates the ecological consequences of spatial uncertainty for landscapes with different properties. Spatial population models typically simulate birth, death, and migration on an input map that describes habitat. Typically, only a single “reference” map is available, but we can imagine that a collection of other, slightly different, maps could be drawn to represent a particular species’ habitat. As a first approximation, our approach assumes that spatial uncertainty (i.e., the variation among values assigned to a location by such a collection of maps) is constrained by characteristics of the reference map, regardless of how the map was produced. Our approach produces lower levels of uncertainty than alternative methods used in landscape ecology because we condition our alternative landscapes on local properties of the reference map. Simulated spatial uncertainty was higher near the borders of patches. Consequently, average uncertainty was highest for reference maps with equal proportions of suitable and unsuitable habitat, and no spatial autocorrelation. We used two population viability models to evaluate the ecological consequences of spatial uncertainty for landscapes with different properties. Spatial uncertainty produced larger variation among predictions of a spatially explicit model than those of a spatially implicit model. Spatially explicit model predictions of final female population size varied most among landscapes with enough clustered habitat to allow persistence. In contrast, predictions of population growth rate varied most among landscapes with only enough clustered habitat to support a small population, i.e., near a spatially mediated extinction threshold. 
We conclude that spatial uncertainty has the greatest effect on persistence when the amount and arrangement of suitable habitat are such that habitat capacity is near the minimum required for persistence.

17.
Ecological Modelling, 2005, 183(4): 463–476
A mass-balance model was developed to simulate organic matter (OM) dynamics in headwater stream ecosystems of south-western British Columbia, Canada. Empirical data from two streams were used to structure and test a mass-balance model of the riparian–stream system. The model was driven by data on inputs, outputs, processing rates, discharge and water temperature. Statistical sub-models were derived for different processes (e.g. decomposition rates and periphyton growth). Inputs and outputs of OM were modelled on the basis of a series of assumptions about system properties, such as temperature and hydrological regimes. The major uncertainties identified through Monte Carlo simulations of model predictions, and the variables most important in controlling OM dynamics in these streams, were dissolved OM (DOM) import and export, stream area and litterfall import. DOM was quantitatively the most important source of OM, accounting for 80% of total OM export, followed by export of fine particulate organic matter (FPOM) at 15%. Different scenarios of logging and temperature regimes were simulated to predict how these factors would affect the standing stock of OM in the stream. When inputs of riparian litterfall were reduced to mirror the reductions predicted from forest harvesting in the riparian area, particulate OM (POM) standing stock was reduced by almost 80%. In comparison, a 3 °C increase in water temperature resulted in only a 20% reduction of POM standing stock, due to enhanced mineralisation.

18.
19.
Ecological Modelling, 2007, 207(1): 34–44
A simple simulation model has been used to investigate whether large fires in Mediterranean regions are a result of extreme weather conditions or the cumulative effect of a policy of fire suppression over decades. The model reproduced the fire regime characteristics for a wide variety of regions of Mediterranean climate in California, France and Spain. The Generalised Likelihood Uncertainty Estimation (GLUE) methodology was used to assess the possibility of multiple model parameter sets being consistent with the available calibration data. The resulting set of behavioural models was used to assess uncertainty in the predictions. The results suggested that (1) for a given region, the total area burned is much the same whether suppression or prescribed fire policies are used or not; however, fire suppression enhances fire intensity and prescribed burning reduces it; and (2) the proportion of large fires can be reduced, but not eliminated, using prescribed fires, especially in areas which have the highest proportion of large fires.

20.
Making Consistent IUCN Classifications under Uncertainty
Abstract: The World Conservation Union (IUCN) defined a set of categories for conservation status supported by decision rules based on thresholds of parameters such as distributional range, population size, population history, and risk of extinction. These rules have received international acceptance and have become one of the most important decision tools in conservation biology because of their wide applicability, objectivity, and simplicity of use. The input data for these rules are often estimated with considerable uncertainty due to measurement error, natural variation, and vagueness in definitions of parameters used in the rules. Currently, no specific guidelines exist for dealing with uncertainty. Interpretation of uncertain data by different assessors may lead to inconsistent classifications because attitudes toward uncertainty and risk may have an important influence on the classification of threatened species. We propose a method of dealing with uncertainty that can be applied to the current IUCN criteria without altering the rules, thresholds, or intent of these criteria. Our method propagates the uncertainty in the input parameters and assigns the evaluated species either to a single category (as the current criteria do) or to a range of plausible categories, depending on the nature and extent of uncertainties.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号