Similar Literature
20 similar documents found (search time: 515 ms)
1.
2.
One of the main goals in decision-making for sustainable development is to identify and choose the most sustainable option among different alternatives. This process usually involves a large number of stakeholders with multiple, often conflicting objectives. Facilitating and resolving such difficult decision situations can be complex, so a more formal and systematic approach to decision-making may be necessary. This paper proposes an integrated multiple criteria decision-support framework specifically developed to provide systematic, step-by-step guidance to decision-makers. The framework, which is suitable for both corporate and public policy-making in the context of sustainable development, comprises three steps: problem structuring, problem analysis and problem resolution. This paper concentrates on problem analysis and resolution, where decision-makers articulate their preferences for different decision criteria. A suitable Multiple Criteria Decision Analysis (MCDA) technique, such as multi-objective optimisation, goal programming, value-based and outranking approaches, is then used to model the preferences. These techniques are discussed here in some detail, to provide guidance on the choice of the most appropriate MCDA method. Based on the outcome of preference modelling, which estimates the overall 'value' of each alternative being considered, decision-makers can then choose the 'best' or most sustainable option. Such an integrated decision-support framework is useful for providing structure to the debate, ensuring dialogue among decision-makers and showing trade-offs between conflicting objectives. In this way, it may be possible to create shared understanding about the issues, generate a sense of common purpose and, often, resolve 'difficult' decision problems.
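To make the preference-modelling step concrete, the sketch below shows the simplest of the value-based MCDA techniques mentioned in the abstract, a weighted additive value model. It is not taken from the paper: the criteria, weights and performance scores are hypothetical.

```python
# Illustrative sketch of a simple additive value model (not from the paper):
# criteria, stakeholder weights and alternative scores are hypothetical.
import numpy as np

criteria = ["cost", "CO2 emissions", "jobs created"]   # decision criteria (sustainability indicators)
weights = np.array([0.5, 0.3, 0.2])                    # stakeholder preference weights, summing to 1
benefit = np.array([False, False, True])               # True = higher is better

# Raw performance of three alternatives on each criterion (rows = alternatives A, B, C).
performance = np.array([
    [120.0, 40.0, 15.0],
    [ 90.0, 55.0, 10.0],
    [150.0, 25.0, 30.0],
])

# Normalise each criterion to [0, 1], flipping cost-type criteria so that 1 = best.
lo, hi = performance.min(axis=0), performance.max(axis=0)
scores = (performance - lo) / (hi - lo)
scores[:, ~benefit] = 1.0 - scores[:, ~benefit]

overall = scores @ weights                             # overall 'value' of each alternative
best = ["A", "B", "C"][int(np.argmax(overall))]
print(dict(zip("ABC", overall.round(3))), "-> most sustainable option:", best)
```

Other MCDA families named above (goal programming, outranking) replace the weighted sum with different aggregation rules, but the overall workflow of scoring alternatives against criteria is the same.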

3.
One of the main goals in decision-making for sustainable development is to identify and choose the most sustainable option from among different alternatives. This process usually involves a large number of stakeholders with multiple, often conflicting, objectives. Facilitating and resolving such difficult decision situations can be complex, so a more formal and systematic approach to decision-making may be necessary. This two-part paper proposes an integrated multiple criteria decision-support framework specifically developed to provide systematic, step-by-step guidance to decision-makers. The framework, which is suitable for both corporate and public policy-making in the context of sustainable development, comprises three steps: problem structuring, problem analysis and problem resolution. In this paper, the focus is on problem structuring, while Part II concentrates on problem analysis and resolution. Problem structuring includes identification of the stakeholders, sustainability issues and indicators relevant to a particular decision problem. Sustainability indicators are used as decision criteria for identifying and choosing the most sustainable option. In the problem analysis step, decision-makers articulate their preferences for different decision criteria. A suitable Multiple Criteria Decision Analysis (MCDA) technique, such as multi-objective optimisation, goal programming, value-based and outranking approaches, is then used to model the preferences. These techniques are discussed in Part II, which also gives guidance on the choice of the most appropriate MCDA method. Based on the outcome of preference modelling, which estimates the overall 'value' of each alternative being considered, decision-makers can then choose the 'best' or most sustainable option. Such an integrated decision-support framework is useful for providing structure to the debate, ensuring dialogue among decision-makers and showing trade-offs between conflicting objectives. In this way, it may be possible to create shared understanding about the issues, generate a sense of common purpose and, often, resolve 'difficult' decision problems.

4.
5.
The Eastern Arc Mountains (EAMs) of Tanzania and Kenya support some of the most ancient tropical rainforest on Earth. The forests are a global priority for biodiversity conservation and provide vital resources to the Tanzanian population. Here, we make a first attempt to predict the spatial distribution of 40 EAM tree species, using generalised additive models, plot data and environmental predictor maps at sub-1 km resolution. The results of three modelling experiments are presented, investigating predictions obtained by (1) two different procedures for the stepwise selection of predictors, (2) down-weighting absence data, and (3) incorporating an autocovariate term to describe fine-scale spatial aggregation. In response to recent concerns regarding the extrapolation of model predictions beyond the restricted environmental range of training data, we also demonstrate a novel graphical tool for quantifying envelope uncertainty in restricted-range niche-based models (envelope uncertainty maps). We find that, even for species with very few documented occurrences, useful estimates of distribution can be achieved. Initiating selection with a null model is found to be useful for explanatory purposes, while beginning with a full predictor set can over-fit the data. We show that a simple multimodel average of these two best-model predictions yields a superior compromise between generality and precision (parsimony). Down-weighting absences shifts the balance of errors in favour of higher sensitivity, reducing the number of serious mistakes (i.e., falsely predicted absences); however, response functions are more complex, exacerbating uncertainty in larger models. Spatial autocovariates help describe fine-scale patterns of occurrence and significantly improve explained deviance, though if important environmental constraints are omitted then model stability and explanatory power can be compromised. We conclude that the best modelling practice is contingent both on the intentions of the analyst (explanation or prediction) and on the quality of distribution data; generalised additive models have potential to provide valuable information for conservation in the EAMs, but methods must be carefully considered, particularly if occurrence data are scarce. Full results and details of all species models are supplied in an online Appendix.
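The down-weighting of absence records mentioned above can be illustrated with a minimal sketch. This is not the authors' model: the data are synthetic, a quadratic logistic regression stands in for the GAM smooth terms, and the absence weight of 0.25 is an arbitrary illustrative choice.

```python
# Sketch of the absence down-weighting idea on synthetic presence/absence data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
elev = rng.uniform(0.0, 2.0, n)                      # hypothetical predictor: elevation (km)
p_true = 1.0 / (1.0 + np.exp(-(-4.0 + 8.0 * elev - 3.0 * elev**2)))
y = rng.binomial(1, p_true)                          # presence (1) / absence (0) records

X = np.column_stack([elev, elev**2])                 # quadratic terms stand in for GAM smooths
w = np.where(y == 1, 1.0, 0.25)                      # down-weight absence records

m_equal = LogisticRegression(max_iter=1000).fit(X, y)
m_down  = LogisticRegression(max_iter=1000).fit(X, y, sample_weight=w)

# Down-weighting shifts predictions upwards, trading specificity for sensitivity.
print("mean p(presence), equal weights:", m_equal.predict_proba(X)[:, 1].mean().round(3))
print("mean p(presence), down-weighted:", m_down.predict_proba(X)[:, 1].mean().round(3))
```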

6.
Although predator–prey cycles can be easily predicted with mathematical models, it is only recently that oscillations observed in a chemostat predator–prey (rotifer–algal) experiment have offered an interesting workbench for testing model soundness. These new observations have highlighted the limitations of the conventional modelling approach in correctly reproducing some unexpected characteristics of the cycles. Simulations are improved when changes in algal community structure, resulting from natural selection operating on an assemblage of algal clones differing in competitive ability and defence against rotifer predation, are considered in multi-prey models. This approach, however, leads to extra complexity in terms of state variables and parameters. We show here that multi-prey models with one predator can be effectively approximated with a simpler model (only a few differential equations) derived in the context of adaptive dynamics and obtained with a moment-based approximation. The moment-based approximation has already been discussed in the literature, but mostly in a theoretical context; we therefore focus on the strength of this approach in downscaling model complexity by relating it to the chemostat predator–prey experiment. Being based on mechanistic concepts, our modelling framework can be applied to any community of competing species for which a trade-off between competitive ability and resistance to predators can be appropriately defined. We suggest that this approach can be of great benefit for reducing complexity in biogeochemical modelling studies at the basin or global ocean scale.
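A minimal sketch of the moment-based idea is given below. These are not the authors' equations: prey biomass, predator biomass and the mean prey defence trait are followed with a fixed trait variance (a simple moment closure), and all parameters and the defence/growth trade-off are hypothetical.

```python
# Sketch of a moment-closure predator-prey model with an evolving mean defence trait.
import numpy as np
from scipy.integrate import solve_ivp

r0, k, a0, c, m, V = 1.0, 0.002, 1.0, 0.5, 0.3, 0.05   # hypothetical parameters

def growth(x): return r0 * (1.0 - 0.5 * x)             # defence (x in [0, 1]) costs growth
def attack(x): return a0 * (1.0 - x)                   # defence reduces predation

def rhs(t, s):
    N, P, x = s                                        # prey biomass, predator biomass, mean trait
    dN = growth(x) * N * (1.0 - k * N) - attack(x) * N * P
    dP = c * attack(x) * N * P - m * P
    # Mean trait follows the local fitness gradient, scaled by the (fixed) trait
    # variance V; the x*(1-x) factor keeps the trait bounded between 0 and 1.
    dfit_dx = -0.5 * r0 * (1.0 - k * N) + a0 * P
    dx = V * x * (1.0 - x) * dfit_dx
    return [dN, dP, dx]

sol = solve_ivp(rhs, (0, 200), [50.0, 1.0, 0.1], max_step=0.1)
print("final state (N, P, mean trait):", np.round(sol.y[:, -1], 3))
```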

7.
We consider one- and two-dimensional minimal models of plankton dynamics. The influence of oscillating boundary forcing functions as agents for triggering pattern formation is discussed. In particular, it is found that under these conditions population waves arise in one-dimensional models, while in two-dimensional models different amplitudes and frequencies in the boundary forcing generate distinct patterns that mimic the boundary term. This happens even though the model we investigate is very simple. The emergence of these features is an interesting metaphor for the fundamental biological problem of how pattern formation processes may be inevitable in natural heterogeneous ecosystems.
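A one-dimensional toy version of this setup can be sketched as a reaction-diffusion line with an oscillating value imposed at one boundary. The growth and grazing terms, parameter values and forcing period below are purely illustrative, not those of the paper.

```python
# Sketch: 1-D plankton density with diffusion, local growth/grazing, and an
# oscillating boundary condition at x = 0 (explicit finite differences).
import numpy as np

L, nx, D, dt, T = 1.0, 100, 1e-3, 0.01, 200.0
dx = L / (nx - 1)
p = np.full(nx, 0.5)                               # phytoplankton density along the transect

def local_growth(p):
    return p * (1.0 - p) - 0.3 * p / (0.5 + p)     # logistic growth minus grazing (illustrative)

t = 0.0
while t < T:
    p[0] = 0.5 + 0.4 * np.sin(2 * np.pi * t / 20.0)   # oscillating boundary forcing
    p[-1] = p[-2]                                     # zero-flux condition at the far end
    lap = np.zeros_like(p)
    lap[1:-1] = (p[2:] - 2 * p[1:-1] + p[:-2]) / dx**2
    p += dt * (D * lap + local_growth(p))
    t += dt

print("spatial pattern (every 10th grid point):", np.round(p[::10], 3))
```

With these settings the imposed boundary oscillation propagates inward as a damped population wave, which is the qualitative behaviour described above for the one-dimensional case.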

8.
Conservation biologists increasingly rely on spatial predictive models of biodiversity to support decision-making. Therefore, highly accurate and ecologically meaningful models are required at relatively broad spatial scales. While statistical techniques have been optimized to improve model accuracy, less focus has been given to the question: how does the autecology of a single species affect model quality? We compare a direct modelling approach with a cumulative modelling approach for predicting plant species richness, where the latter gives more weight to the ecology of functional species groups. In the direct modelling approach, species richness is predicted by a single model calibrated for all species. In the cumulative modelling approach, the species were partitioned into functional groups, each group was calibrated separately, and the species richness of each group was cumulated to predict total species richness. We hypothesized that model accuracy depends on the ecology of individual species and that the cumulative modelling approach would predict species richness more accurately. The predictors explained ca. 25% of the deviance in plant species richness; however, depending on the functional group, the deviance explained varied from 3% to 67%. While both modelling approaches performed equally well, the models of the different functional groups varied widely in their quality and their spatial richness patterns. This variability helps to improve our understanding of how plant functional groups respond to ecological gradients.
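The contrast between the direct and the cumulative approach can be sketched in a few lines. The functional groups, gradients and count models below are synthetic stand-ins, not the study's data or model family.

```python
# Sketch: one Poisson model for total richness versus per-group models whose
# predictions are summed (synthetic data, hypothetical functional groups).
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(1)
n = 300
X = rng.normal(size=(n, 2))                                  # two environmental gradients
groups = {
    "grasses": rng.poisson(np.exp(1.0 + 0.8 * X[:, 0])),
    "shrubs":  rng.poisson(np.exp(0.5 - 0.6 * X[:, 1])),
    "annuals": rng.poisson(np.exp(0.3 + 0.4 * X[:, 0] + 0.4 * X[:, 1])),
}
total = sum(groups.values())

# Direct approach: a single model calibrated on total species richness.
direct = PoissonRegressor().fit(X, total).predict(X)

# Cumulative approach: one model per functional group, predictions summed.
cumulative = sum(PoissonRegressor().fit(X, y).predict(X) for y in groups.values())

for name, pred in [("direct", direct), ("cumulative", cumulative)]:
    print(name, "RMSE:", round(float(np.sqrt(np.mean((pred - total) ** 2))), 3))
```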

9.
In this article, we describe a parallel agent-based model of spatial opinion diffusion that is driven by graphics processing units (GPUs). Modeling opinion exchange and diffusion across landscapes often involves the simulation of large numbers of geographically located individual decision-makers and a massive number of individual-level interactions. This simulation requires substantial computational power. GPU-enabled computing resources provide a massively parallel processing platform based on a fine-grained shared memory paradigm. This massively parallel processing platform holds considerable promise for meeting the computing requirement of agent-based models of spatial problems. In this article, we focus on the parallelization of an agent-based spatial opinion model using GPU technologies. We discussed key algorithms designed for parallel agent-based opinion modeling: including domain decomposition and mutual exclusion. Experiments conducted to examine computing performance show that GPUs provide a computationally efficient alternative to traditional parallel computing architectures and substantially accelerate agent-based models of large-scale opinion exchange among individual decision makers.  相似文献   
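The data-parallel pattern that maps well onto a GPU can be illustrated without GPU code: every cell of an opinion grid is updated simultaneously from its neighbours in a synchronous, domain-decomposed step. The update rule and parameters below are a hypothetical stand-in, written here in plain NumPy rather than with the authors' GPU kernels.

```python
# Sketch of a synchronous, grid-parallel opinion update (the kind of step that
# a GPU kernel would apply to all cells at once).
import numpy as np

rng = np.random.default_rng(2)
grid = rng.uniform(-1, 1, size=(256, 256))        # each cell holds one agent's opinion

def step(g, mu=0.1):
    # Mean of the four von Neumann neighbours, computed for all cells at once.
    neigh = (np.roll(g, 1, 0) + np.roll(g, -1, 0) +
             np.roll(g, 1, 1) + np.roll(g, -1, 1)) / 4.0
    # Each agent moves a fraction mu towards the local mean opinion.
    return g + mu * (neigh - g)

for _ in range(100):
    grid = step(grid)

print("opinion spread after 100 steps:", round(float(grid.std()), 4))
```

Because each cell's new value depends only on the previous state of its neighbours, the update is free of write conflicts, which is the property that domain decomposition and mutual exclusion schemes are designed to preserve on shared-memory hardware.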

10.
Mining development can potentially lead to cumulative impacts on ecosystems and their services across a range of scales. Site-specific environmental impact assessments are commonly conducted for mining projects; however, the large-scale cumulative impacts of multiple mines that aggregate and interact in resource regions have received little attention in the literature, and there are few examples where regional-scale mining impacts have been assessed on ecosystem services. The objective of this study is to quantify regional-scale cumulative impacts of mining on sediment retention ecosystem services. We apply the sediment delivery ratio model of the Integrated Valuation of Ecosystem Services and Trade-offs (InVEST) tool to calculate and map sediment retention and export using a synthetic catchment model and a real case study under different mining scenarios in an Australian mining region. Two impact indices were created to quantify the cumulative impacts associated with a single mine and with the interactions between multiple mines. The indices clarified the magnitude of impacts and the positive or negative nature of impacts associated with regional-scale sediment retention and export. We found that cumulative impacts associated with interactions between multiple mines did occur, but the influence of these interactions was relatively weak. This research demonstrates the potential for utilising ecosystem services modelling for the quantitative assessment of cumulative impacts. Such research provides decision-makers and planners with a tool for sustainable regional and landscape planning that balances the needs of mining and the provision of ecosystem services.

11.
Charles Darwin aided his private decision making by an explicit deliberation, famously deciding whether or not to marry by creating a list of points in a table with two columns: “Marry” and “Not Marry”. One hundred seventy-two years after Darwin’s wedding, we reconsider whether this process of choice, under which individuals assign values to their options and compare their relative merits at the time of choosing (the tug-of-war model), applies to our experimental animal, the European Starling, Sturnus vulgaris. We contrast this with the sequential choice model, which postulates that decision-makers make no comparison between options at the time of choice. According to the latter, behaviour in simultaneous choices reflects adaptations to contexts with sequential encounters, in which the choice is whether to take an opportunity or let it pass. We postulate that, in sequential encounters, the decision-maker assigns (by learning) a subjective value to each option, reflecting its payoff relative to background opportunities. This value is expressed as the latency and/or probability of accepting each opportunity rather than continuing to search. In simultaneous encounters, choice occurs through each option being processed independently, by a race between the mechanisms that generate option-specific latencies. We describe these alternative models and review data supporting the predictions of the sequential choice model.

12.
In this paper I demonstrate some of the techniques for the analysis of spatial point patterns that have become available due to recent developments in point process modelling software. These developments permit convenient exploratory data analysis, model fitting, and model assessment. Efficient model fitting, in particular, makes possible the testing of statistical hypotheses of genuine interest, even when interaction between points is present, via Monte Carlo methods. The discussion of these techniques is conducted jointly with and in the context of some preliminary analyses of a collection of data sets which are of considerable interest in their own right. These data sets (which were kindly provided to me by the New Brunswick Department of Natural Resources) consist of the complete records of wildfires which occurred in New Brunswick during the years 1987 through 2003. In treating these data sets I deal with data-cleaning problems, methods of exploratory data analysis, means of detecting interaction, fitting of statistical models, and residual analysis and diagnostics. In addition to demonstrating modelling techniques, I include a discussion on the nature of statistical models for point patterns. This is given with a view to providing an understanding of why, in particular, the Strauss model fails as a model for interpoint attraction and how it has been modified to overcome this difficulty. All actual modelling of the New Brunswick fire data is done only with the intent of illustrating techniques. No substantive conclusions are or can be drawn at this stage. Realistic modelling of these data sets would require incorporation of covariate information which I do not so far have available.

13.
The paper deals with two major problems in ecological modelling today: how to obtain reliable parameters, and how to build ecosystem properties into our models. The use of new mathematical tools to answer these questions is mentioned briefly, but the main focus of the paper is on the development of structural dynamic models, i.e. models that use goal functions to reflect ongoing changes in the properties of the biological components. These changes in properties are due to the enormous adaptability of the biological components to the prevailing conditions. All species in an ecosystem attempt to gain as much biomass as possible, i.e. to move as far away as possible from thermodynamic equilibrium, a distance that can be measured by the thermodynamic concept of exergy. Consequently, exergy has been proposed as a goal function in ecological models with dynamic structure, meaning that the properties of the biological components, and in model language the parameters, are changed as conditions change. An equation to compute an exergy index of a model is presented. The theoretical considerations leading to this equation are not presented here, but references to the literature where the underlying theory can be found are given. Two case studies of structural dynamic modelling are presented: a shallow lake, where the structural dynamic changes had been determined before the model was developed, and the application of biomanipulation in lake management, where the structural dynamic changes are generally known. Moreover, it is also discussed how the same idea of using exergy as a goal function in ecological modelling may be applied to facilitate the estimation of parameters.
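Exergy-type goal functions of this kind are often written as a weighted sum of the model's biological state variables, with weights reflecting the information content of each component. The sketch below uses that general form only; the weighting factors and biomasses are placeholders, not the paper's exergy index equation or values.

```python
# Sketch of a weighted-sum exergy-type goal function used to compare
# parameterisations in a structurally dynamic model. Weights are placeholders.
biomass = {"detritus": 5.0, "phytoplankton": 2.0, "zooplankton": 0.8, "fish": 0.3}    # g/m3, illustrative
beta    = {"detritus": 1.0, "phytoplankton": 20.0, "zooplankton": 100.0, "fish": 500.0}  # placeholder weights

def exergy_index(state):
    return sum(beta[k] * state[k] for k in state)

# In a structurally dynamic model, the candidate parameter set (here represented
# simply by the biomass distribution it produces) giving the highest exergy
# index is retained as conditions change.
candidates = {
    "current structure": biomass,
    "shifted structure": {**biomass, "phytoplankton": 1.5, "zooplankton": 1.2},
}
scores = {name: exergy_index(state) for name, state in candidates.items()}
print(scores, "-> preferred:", max(scores, key=scores.get))
```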

14.
Nonequilibrium coexistence in a competition model with nutrient storage
Revilla T, Weissing FJ (2008) Ecology 89(3): 865–877
Resource competition theory predicts that, in equilibrium, the number of coexisting species cannot exceed the number of limiting resources. In some competition models, however, competitive interactions may result in nonequilibrium dynamics, allowing the coexistence of many species on few resources. The relevance of these findings is still unclear, since some assumptions of the underlying models are unrealistic. Most importantly, these models assume that individual growth directly reflects the availability of external resources, whereas real organisms can store resources, thereby decoupling their growth from external fluctuations. Here we study the effects of resource storage by extending the well-known Droop model to the context of multiple species and multiple resources. We demonstrate that the extended Droop model shows virtually the same complex dynamics as models without storage. Depending on the model parameters, one may obtain competitive exclusion, stable equilibrium coexistence, periodic and non-periodic oscillations, and chaos. Again, nonequilibrium dynamics allows for the coexistence of many species on few resources. We discuss our findings in the light of earlier work on resource competition, highlighting the role of luxury consumption, trade-offs in competitive abilities, and ecological stoichiometry.
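For readers unfamiliar with the Droop (cell-quota) formulation, the sketch below shows its basic one-species, one-resource chemostat form; the multi-species, multi-resource extension studied in the paper adds one quota equation per species-resource pair. Parameter values are illustrative only.

```python
# Sketch of the basic Droop cell-quota model in a chemostat.
import numpy as np
from scipy.integrate import solve_ivp

D, S_in = 0.3, 10.0            # dilution rate (1/day), inflow nutrient concentration
mu_inf, Q_min = 1.2, 0.05      # maximum growth rate, minimum (subsistence) cell quota
v_max, K = 0.8, 0.5            # maximum uptake rate, half-saturation constant

def rhs(t, y):
    S, Q, N = y                              # external nutrient, cell quota, population density
    uptake = v_max * S / (K + S)             # Michaelis-Menten nutrient uptake per cell
    mu = mu_inf * (1.0 - Q_min / Q)          # Droop growth rate, limited by the stored quota
    dS = D * (S_in - S) - uptake * N
    dQ = uptake - mu * Q                     # storage decouples growth from the external nutrient
    dN = (mu - D) * N
    return [dS, dQ, dN]

sol = solve_ivp(rhs, (0, 100), [S_in, 2 * Q_min, 0.1], max_step=0.05)
print("final (S, Q, N):", np.round(sol.y[:, -1], 3))
```

The key feature is that growth depends on the internal quota Q rather than directly on the external concentration S, which is exactly the storage effect whose consequences for coexistence the paper investigates.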

15.
Testing ecological models: the meaning of validation
The ecological literature reveals considerable confusion about the meaning of validation in the context of simulation models. The confusion arises as much from semantic and philosophical considerations as from the selection of validation procedures. Validation is not a procedure for testing scientific theory or for certifying the ‘truth’ of current scientific understanding, nor is it a required activity of every modelling project. Validation means that a model is acceptable for its intended use because it meets specified performance requirements.

Before validation is undertaken, (1) the purpose of the model, (2) the performance criteria, and (3) the model context must be specified. The validation process can be decomposed into several components: (1) operation, (2) theory, and (3) data. Important concepts needed to understand the model evaluation process are verification, calibration, validation, credibility, and qualification. These terms are defined in a limited technical sense applicable to the evaluation of simulation models, and not as general philosophical concepts. Different tests and standards are applied to the operational, theoretical, and data components. The operational and data components can be validated; the theoretical component cannot.

The most common problem with ecological and environmental models is failure to state what the validation criteria are. Criteria must be explicitly stated because there are no universal standards for selecting what test procedures or criteria to use for validation. A test based on comparison of simulated versus observed data is generally included whenever possible. Because the objective and subjective components of validation are not mutually exclusive, disagreements over the meaning of validation can only be resolved by establishing a convention.
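The operational definition of validation given above (stated criteria, then a simulated-versus-observed comparison) can be reduced to a very small sketch. The data and the RMSE threshold are invented for illustration; the point is only that the criterion is fixed before the test.

```python
# Sketch: state the performance requirement first, then test whether the
# simulated output meets it against observations (numbers are invented).
import numpy as np

observed  = np.array([2.1, 3.4, 4.0, 5.2, 6.8, 7.1])   # field measurements
simulated = np.array([2.4, 3.1, 4.3, 5.0, 6.2, 7.5])   # model output for the same cases

criterion_rmse = 0.5                                   # pre-specified performance requirement
rmse = float(np.sqrt(np.mean((simulated - observed) ** 2)))

print(f"RMSE = {rmse:.3f}; criterion = {criterion_rmse}")
print("model is acceptable for its intended use" if rmse <= criterion_rmse
      else "model fails the stated validation criterion")
```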

16.
17.
18.
We describe the development of a neural network model for estimating primary production of phytoplankton. Data from an enriched estuary in the eastern United States, Chesapeake Bay, were used to train, validate and test the model. Two error-backpropagation multilayer perceptrons were trained: a simpler one (3-5-1) and a more complex one (12-5-1). Both neural networks outperformed conventional empirical models, although only the latter network, which exploits a larger suite of predictive variables, provided truly accurate outputs. The application of this neural network model is thoroughly discussed and the results of a sensitivity analysis are also presented.
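A network of the 3-5-1 shape described above (three inputs, five hidden units, one output) can be sketched as follows. The predictors and the synthetic production data are hypothetical, not the Chesapeake Bay dataset, and scikit-learn stands in for the authors' backpropagation implementation.

```python
# Sketch of a small 3-5-1 multilayer perceptron for primary production (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 400
chl  = rng.uniform(1, 30, n)          # chlorophyll-a (assumed predictor)
par  = rng.uniform(5, 60, n)          # photosynthetically active radiation (assumed predictor)
temp = rng.uniform(5, 28, n)          # water temperature (assumed predictor)
X = np.column_stack([chl, par, temp])
pp = 0.8 * chl * np.log1p(par) * (1 + 0.02 * temp) + rng.normal(0, 5, n)   # synthetic primary production

X_tr, X_te, y_tr, y_te = train_test_split(X, pp, random_state=0)
net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(5,), max_iter=5000, random_state=0))
net.fit(X_tr, y_tr)
print("test R^2 of the 3-5-1 style network:", round(net.score(X_te, y_te), 3))
```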

19.
Population models for multiple species provide one of the few means of assessing the impact of alternative management options on the persistence of biodiversity, but they are inevitably uncertain. Is it possible to use population models in multiple-species conservation planning given the associated uncertainties? We use information-gap decision theory to explore the impact of parameter uncertainty on the conservation decision when planning for the persistence of multiple species. An information-gap approach seeks robust outcomes that are most immune to error. We assess the impact of uncertainty in key model parameters for three species, whose extinction risks under four alternative management scenarios are estimated using a metapopulation model. Three methods are described for making conservation decisions across the species, taking uncertainty into account. We find that decisions based on single species are relatively robust to uncertainty in parameters, although the estimates of extinction risk increase rapidly with uncertainty. When identifying the best conservation decision for the persistence of all species, the methods that rely on the rankings of the management options by each species result in decisions that are similarly robust to uncertainty. Methods that depend on absolute values of extinction risk are sensitive to uncertainty, as small changes in extinction risk can alter the ranking of the alternative scenarios. We discover that it is possible to make robust conservation decisions even when the uncertainties of the multiple-species problem appear overwhelming. However, the decision most robust to uncertainty is likely to differ from the best decision when uncertainty is ignored, illustrating the importance of incorporating uncertainty into the decision-making process.
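The information-gap logic can be sketched in a few lines: for a growing uncertainty horizon, evaluate the worst-case extinction risk of each management option and keep the option that meets the performance target over the largest horizon. The options, nominal risks, sensitivities and acceptable-risk target below are hypothetical, and worst-case risk is assumed to grow linearly with the horizon purely for illustration.

```python
# Sketch of an information-gap robustness comparison across management options.
import numpy as np

options = ["do nothing", "habitat corridor", "new reserve", "translocation"]
best_estimate = np.array([0.40, 0.22, 0.18, 0.25])   # nominal extinction risks (hypothetical)
sensitivity   = np.array([0.50, 0.30, 0.45, 0.20])   # how fast the worst case grows with horizon h

acceptable_risk = 0.35

# Worst-case risk at horizon h is assumed to be best_estimate + h * sensitivity,
# so robustness = the largest h at which an option still meets the target.
robustness = np.clip((acceptable_risk - best_estimate) / sensitivity, 0.0, None)

for o, b, r in zip(options, best_estimate, robustness):
    print(f"{o:>16}: nominal risk {b:.2f}, robust up to h = {r:.2f}")
print("most robust option:", options[int(np.argmax(robustness))])
```

With these made-up numbers the option with the best nominal risk ("new reserve") is not the most robust one ("translocation"), which mirrors the abstract's point that the robust decision can differ from the best decision when uncertainty is ignored.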

20.
Fitting generalised linear models (GLMs) with more than one predictor has become the standard method of analysis in evolutionary and behavioural research. Often, GLMs are used for exploratory data analysis, where one starts with a complex full model including interaction terms and then simplifies by removing non-significant terms. While this approach can be useful, it is problematic if significant effects are interpreted as if they arose from a single a priori hypothesis test. This is because model selection involves cryptic multiple hypothesis testing, a fact that has only rarely been acknowledged or quantified. We show that the probability of finding at least one ‘significant’ effect is high even if all null hypotheses are true (e.g. 40% when starting with four predictors and their two-way interactions). This probability is close to theoretical expectations when the sample size (N) is large relative to the number of predictors, including interactions (k). In contrast, type I error rates strongly exceed even those expectations when model simplification is applied to models that are over-fitted before simplification (low N/k ratio). The increase in false-positive results arises primarily from an overestimation of effect sizes among significant predictors, leading to upward-biased effect sizes that often cannot be reproduced in follow-up studies (‘the winner's curse’). Despite having their own problems, full-model tests and P value adjustments can be used as a guide to how frequently type I errors arise by sampling variation alone. We favour the presentation of full models, since they best reflect the range of predictors investigated and ensure a balanced representation of non-significant results as well.
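The roughly 40% figure quoted above is easy to reproduce with a small simulation: with four predictors and their six two-way interactions there are ten null tests per model, and even the full model alone yields at least one spurious 'significant' term in about 1 − 0.95¹⁰ ≈ 40% of runs (the stepwise-simplification case in the paper inflates this further for low N/k). The simulation below is an illustrative sketch, not the authors' code.

```python
# Sketch of cryptic multiple testing: fit a full model with four predictors and
# all two-way interactions to pure-noise data and count how often at least one
# term is 'significant' at p < 0.05.
import numpy as np
import statsmodels.api as sm
from itertools import combinations

rng = np.random.default_rng(4)
n, n_sim, hits = 100, 500, 0

for _ in range(n_sim):
    X = rng.normal(size=(n, 4))
    inter = np.column_stack([X[:, i] * X[:, j] for i, j in combinations(range(4), 2)])
    design = sm.add_constant(np.column_stack([X, inter]))
    y = rng.normal(size=n)                          # response is pure noise: all nulls are true
    pvals = sm.OLS(y, design).fit().pvalues[1:]     # drop the intercept
    hits += (pvals < 0.05).any()

print(f"runs with at least one 'significant' term: {hits / n_sim:.2%} (10 null tests per run)")
```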
