Similar literature: 20 matching records found.
1.
Turbulence closures are fundamental for modelling atmospheric diffusion in numerical codes, and the resulting eddy diffusivities are key parameters in describing transport and dispersion in the boundary layer. In this work, four turbulence closure schemes have been applied to reproduce a neutral flow over schematic complex terrain using the meteorological model RAMS. Two of the closures, a one-equation (E-l) and a two-equation (E-ε) model, have been implemented in RAMS as alternatives to those originally available. In these cases, an analytical method based on similarity theory for the atmospheric surface layer and boundary layer is adopted to calculate the empirical constants of the turbulence closures. Examples of numerical studies performed to simulate the flow and turbulence over a 3-D hill in a wind-tunnel experiment under neutral stratification are presented and discussed. An intercomparison of the simulations obtained with the different closures is carried out by analysing the main features of the flow over the hill and by comparing calculated vertical profiles of turbulent kinetic energy with measured ones.
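For reference, a schematic of how the eddy diffusivity K_m follows from the two implemented closures, in their standard neutral-stratification textbook form (the closure constants c_k, c_ε, c_μ stand for the empirical constants determined from the similarity-theory procedure mentioned above; the exact RAMS formulation is not reproduced here):

```latex
% Prognostic equation for turbulent kinetic energy e, common to both closures
\frac{\partial e}{\partial t} = P_s - \varepsilon
  + \frac{\partial}{\partial z}\!\left(K_e \frac{\partial e}{\partial z}\right)

% E-l closure: the mixing length l is diagnosed; dissipation and diffusivity follow from e and l
K_m = c_k \, l \, e^{1/2}, \qquad \varepsilon = c_\varepsilon \, \frac{e^{3/2}}{l}

% E-\varepsilon closure: a second prognostic equation is carried for the dissipation rate
\frac{\partial \varepsilon}{\partial t} =
  \frac{\varepsilon}{e}\left(c_{\varepsilon 1} P_s - c_{\varepsilon 2}\,\varepsilon\right)
  + \frac{\partial}{\partial z}\!\left(K_\varepsilon \frac{\partial \varepsilon}{\partial z}\right),
\qquad K_m = c_\mu \, \frac{e^{2}}{\varepsilon}
```

where P_s is the shear production term and K_e, K_ε are the diffusivities for e and ε.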

2.
Many species of platyrrhine primates are characterised by sex-linked color vision polymorphism. This presents an opportunity to study the biology and ecology of individuals with different phenotypes living in the same group. Several evolutionary processes could maintain polymorphic genes in populations. In this study, we evaluate the hypothesis that foraging niche divergence among monkeys explains the presence of multiple color vision phenotypes. Specifically, we test whether dichromats and trichromats differ in foraging time devoted to cryptic vs. brightly colored resources. We did not find any differences in foraging time spent on different food types by dichromatic and trichromatic monkeys in two groups of white-faced capuchins (Cebus capucinus) living in a tropical dry forest. We conclude that, insofar as these variables are concerned, niche divergence does not likely explain color vision polymorphism in our study population.

3.
There are presently few tools available for estimating epidemic risks from forest pathogens, and hence informing pro-active disease management. In this study we demonstrated that a bioclimatic niche model can be used to examine questions of epidemic risk in temperate eucalypt plantations. The bioclimatic niche model, CLIMEX, was used to identify regional variation in climate suitability for Mycosphaerella leaf disease (MLD), a major cause of foliage damage in temperate eucalypt plantations around the world. Using historical observations of MLD damage, we were able to convert the relative score of climatic suitability generated by CLIMEX into a severity ranking ranging from low to high, providing for the first time a direct link between risk and impact, and allowing us to explore disease severity in a way meaningful to forest managers. We determined that the ‘Compare Years’ function in CLIMEX could be used for site-specific risk assessment to identify severity, frequency and seasonality of MLD epidemics. We explored appropriate scales of risk assessment for forest managers. Applying the CLIMEX model of MLD using a 0.25° or coarser grid size to areas of sharp topographic relief frequently misrepresented the risk posed by MLD, because considerable variation occurred between individual forest sites encompassed within a single grid cell. This highlighted the need for site-specific risk assessment to address many questions pertinent to managing risk in plantations.
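A minimal sketch of the conversion step described above: a relative climatic-suitability score (such as the CLIMEX Ecoclimatic Index) is mapped onto a severity ranking by thresholds calibrated against historical damage observations. The thresholds and class labels below are illustrative placeholders, not the calibration obtained in this study.

```python
import numpy as np

# Illustrative thresholds linking a relative climatic-suitability score (0-100)
# to an MLD severity ranking; in practice these would be calibrated against
# historical observations of MLD damage.
SEVERITY_BREAKS = [(10.0, "negligible"), (25.0, "low"), (50.0, "moderate"), (101.0, "high")]

def severity_rank(suitability_score):
    """Map a relative climatic-suitability score onto a severity class."""
    for upper, label in SEVERITY_BREAKS:
        if suitability_score < upper:
            return label
    return SEVERITY_BREAKS[-1][1]

# e.g. annual suitability scores for one plantation site
site_scores = np.array([8.0, 22.0, 47.0, 63.0, 31.0])
print([severity_rank(s) for s in site_scores])
```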

4.
Ecological Modelling, 2007, 208(1): 9-16
Food webs are constructed as structural directed graphs that describe “who eats whom,” but it is common to interpret them as energy flow diagrams in which predation represents an energy transfer from the prey to the predator. The aim of this work is to demonstrate that food webs are incomplete as energy flow diagrams if they ignore passive flows to detritus (dead organic material). While many ecologists do include detritus in conceptual and mathematical models, the omission of detritus is still common. Often detritus is either ignored or treated as an unlimited energy source, yet all organisms contribute to the detritus pool, which can be an energy source for other species in the system. This feedback loop is highly important, since it increases the number of pathways available for energy flows, revealing the significance of indirect effects and making the functional role of the top predators less clear. In this work we propose a modified niche model, created by adding a detritus compartment to the niche model. We demonstrate the effect of the structural loops that result from feeding on detritus by comparing empirical data sets to five different assembly models: (1) cascade, (2) constant connectance, (3) niche, (4) modified niche (introduced in this work), and (5) cyber-ecosystem. Of these models, only the last two explicitly include detritus. We show that when passive flows to detritus are included in the food web structure, the structure becomes more robust to the removal of individual nodes or connections. In addition, we show that food web models that include the detritus feedback loop perform better with respect to several structural network metrics.
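A minimal Python sketch of the idea, using the standard niche model and appending a detritus compartment. The rule used for detritus here — every living node contributes a passive flow to it, and it can be consumed by any species whose feeding range reaches the bottom of the niche axis — is an illustrative assumption, not necessarily the exact construction of the modified niche model proposed in this work.

```python
import numpy as np

def niche_web(S, C, rng):
    """Standard niche model: S species with target connectance C.
    Returns niche values, range widths, range centres and the
    consumer-by-resource adjacency matrix (adj[i, j] = 1 if i eats j)."""
    n = rng.uniform(0.0, 1.0, S)                 # niche values
    beta = 1.0 / (2.0 * C) - 1.0
    r = n * rng.beta(1.0, beta, S)               # feeding-range widths
    c = rng.uniform(r / 2.0, n)                  # feeding-range centres
    adj = ((n[None, :] >= (c - r / 2.0)[:, None]) &
           (n[None, :] <= (c + r / 2.0)[:, None])).astype(int)
    return n, r, c, adj

def energy_flows_with_detritus(n, r, c, adj):
    """Energy-flow matrix F (F[i, j] = 1 means energy flows from i to j),
    extended with a detritus node at index S: feeding links flow from
    resource to consumer, every living node passively flows to detritus,
    and consumers whose range reaches niche value 0 feed on detritus."""
    S = len(n)
    F = np.zeros((S + 1, S + 1), dtype=int)
    F[:S, :S] = adj.T                            # prey -> predator flows
    F[:S, S] = 1                                 # passive flows to detritus
    F[S, :S] = (c - r / 2.0 <= 0.0).astype(int)  # detritus -> detritivores
    return F

rng = np.random.default_rng(42)
n, r, c, adj = niche_web(S=20, C=0.15, rng=rng)
F = energy_flows_with_detritus(n, r, c, adj)
print("links without detritus:", adj.sum(), " links with detritus:", F.sum())
```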

5.
The Chinese mitten crab (Eriocheir sinensis) is native to East Asia, is established throughout Europe, and is introduced but geographically restricted in North America. We developed and compared two separate environmental niche models using the Genetic Algorithm for Rule-set Prediction (GARP) and mitten crab occurrences in Asia and Europe to predict the species' potential distribution in North America. Since mitten crabs must reproduce in water with salinity >15‰, we limited the potential North American range to freshwater habitats within the highest documented dispersal distance (1260 km) and a more restricted dispersal limit (354 km) from the sea. Applying the higher dispersal distance, both models predicted the lower Great Lakes, most of the eastern seaboard, the Gulf of Mexico and southern extent of the Mississippi River watershed, and the Pacific Northwest as suitable environment for mitten crabs, but the environmental match for southern states (below 35° N) was much lower for the European model. Use of the lower range with both models reduced the expected range, especially in the Great Lakes, Mississippi drainage, and inland areas of the Pacific Northwest. To estimate the risk of introduction of mitten crabs, the amount of reported ballast water discharge into major United States ports from regions in Asia and Europe with established mitten crab populations was used as an index of introduction effort. Relative risk of invasion was estimated based on a combination of environmental match and volume of unexchanged ballast water received (July 1999-December 2003) for major ports. The ports of Norfolk and Baltimore were most vulnerable to invasion and establishment, making Chesapeake Bay the most likely location to be invaded by mitten crabs in the United States. The next highest risk was predicted for Portland, Oregon. Interestingly, the port of Los Angeles/Long Beach, which has a large shipping volume, had a low risk of invasion. Ports such as Jacksonville, Florida, had a medium risk owing to small shipping volume but high environmental match. This study illustrates that the combination of environmental niche- and vector-based models can provide managers with more precise estimates of invasion risk than can either of these approaches alone.
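A small sketch of the final combination step: for each port, an index of relative invasion risk is formed from the environmental match predicted by the niche model and the volume of unexchanged ballast water received from source regions. The multiplicative form, the log-scaling of ballast volume, and the port inputs below are illustrative assumptions echoing the abstract, not the paper's actual values or risk categories.

```python
import numpy as np

# Illustrative inputs per port: environmental match (0-1, from the niche model)
# and unexchanged ballast water received from source regions (arbitrary units).
ports = {
    "Baltimore": (0.9, 800.0),
    "Norfolk": (0.9, 900.0),
    "Portland (OR)": (0.8, 300.0),
    "Los Angeles/Long Beach": (0.3, 5000.0),
    "Jacksonville": (0.8, 20.0),
}

def relative_risk(match, ballast):
    """Illustrative index: environmental match times log-scaled introduction effort."""
    return match * np.log1p(ballast)

scores = {name: relative_risk(m, b) for name, (m, b) in ports.items()}
top = max(scores.values())
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: relative risk = {s / top:.2f}")
```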

6.
Research questions at the regional, national and global scales frequently require the upscaling of existing models. At large scales, simple model aggregation may have a prohibitive computational cost and lead to an over-detailed problem representation. Methods that guide model simplification and revision have the potential to support the choice of the appropriate level of detail or heterogeneity within upscaled models. Efficient upscaling will retain only the heterogeneity that contributes to accurate aggregated results. This approach to model revision is challenging, because automatic generation of alternative models is difficult and the set of possible revised models is very large. In the case where simplification alone is considered, there are at least 2^n−1 possible simplified models, where n is the number of model variables. Even with the availability of High Performance Computing, it is not possible to evaluate every possible simplified model if the number of model variables is greater than roughly 35. To address these issues, we propose a method that extends an existing procedure for simplifying and aggregating mechanistic models based on replacing model variables with constants. The method generates simplified models by selectively aggregating existing model variables, retaining the existing model structure while reducing the size of the set of possible models and ordering them into a search tree. The tree is then searched selectively. We illustrate the method using a catchment-scale optimization model with c. 50,000 variables (Farm-adapt) in the context of adaptation to climatic change. The method was successful in identifying redundant model variables and an adequate model 10% smaller than the original model. We discuss how the procedure can be extended to other large models and compare the method to those proposed by others. We conclude by urging model developers to regard their models as a starting point and to consider the need for alternative models during model development.
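A toy Python sketch of the search idea: candidate simplifications replace one more variable at a time with a constant (here its baseline mean), candidates are ordered by how little they perturb the model output across scenarios, and the tree is only expanded while the error stays below a tolerance. The toy model, the error measure (scenario-wise RMSE relative to the original output) and the tolerance are illustrative assumptions, not the Farm-adapt setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "mechanistic" model with four variables; two contribute little to the output.
def model(v):
    return 3.0 * v["rain"] + 2.0 * v["temp"] + 0.05 * v["soil"] + 0.01 * v["slope"]

base_means = {"rain": 10.0, "temp": 5.0, "soil": 2.0, "slope": 1.0}
scenarios = [{k: rng.normal(mu, 1.0) for k, mu in base_means.items()} for _ in range(200)]
baseline = np.array([model(s) for s in scenarios])
means = {k: np.mean([s[k] for s in scenarios]) for k in base_means}

def error(frozen):
    """Relative RMSE of the simplified model (variables in 'frozen' replaced
    by their scenario means) against the original model output."""
    out = np.array([model({**s, **{k: means[k] for k in frozen}}) for s in scenarios])
    return float(np.sqrt(np.mean((out - baseline) ** 2)) / abs(baseline.mean()))

# Selective, best-first search of the simplification tree: at each node, try
# freezing one more variable, follow the child with the smallest error, and
# stop expanding when no child stays below the tolerance.
tol, frozen = 0.01, set()
while len(frozen) < len(means):
    best_err, best_var = min((error(frozen | {k}), k) for k in means if k not in frozen)
    if best_err > tol:
        break
    frozen.add(best_var)

print("variables safely replaced by constants:", sorted(frozen),
      " relative error:", round(error(frozen), 4))
```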

7.
The area under the curve (AUC) of the receiver operating characteristic (ROC) has become a dominant tool in evaluating the accuracy of models predicting distributions of species. ROC has the advantage of being threshold-independent, and as such does not require decisions regarding thresholds of what constitutes a prediction of presence versus a prediction of absence. However, we show that, when comparing two ROCs, the AUC systematically undervalues models that do not provide predictions across the entire spectrum of proportional areas in the study area. Current ROC approaches in ecological niche modeling applications are also inappropriate because the two error components are weighted equally. We recommend a modification of ROC that remedies these problems, using partial-area ROC approaches to provide a firmer foundation for evaluation of predictions from ecological niche models. A worked example demonstrates that models that are evaluated favorably by traditional ROC AUCs are not necessarily the best when niche modeling considerations are incorporated into the design of the test.
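A minimal numpy sketch of a partial-area ROC statistic for presence-background predictions: the curve is built only from the thresholds the model actually produces, the area is integrated only over the high-sensitivity region a manager cares about, and it is reported as a ratio to the area under the 1:1 line over the same region. The acceptable-omission level used here (5%) is an illustrative choice, not the specific modification recommended in the paper.

```python
import numpy as np

def trapezoid(y, x):
    return float(np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0))

def roc_points(pres_scores, back_scores):
    """ROC over the model's own thresholds: x = proportion of background
    points predicted present (a proxy for proportional predicted area),
    y = sensitivity on the presence test points."""
    thresholds = np.unique(np.concatenate([pres_scores, back_scores]))[::-1]
    x = np.array([(back_scores >= t).mean() for t in thresholds])
    y = np.array([(pres_scores >= t).mean() for t in thresholds])
    return np.concatenate([[0.0], x]), np.concatenate([[0.0], y])

def partial_auc_ratio(pres_scores, back_scores, max_omission=0.05):
    """Area under the model ROC divided by the area under the diagonal,
    both restricted to the region where sensitivity >= 1 - max_omission.
    Values > 1 indicate better-than-random performance in that region."""
    x, y = roc_points(pres_scores, back_scores)
    keep = y >= 1.0 - max_omission
    if keep.sum() < 2:
        return float("nan")
    return trapezoid(y[keep], x[keep]) / trapezoid(x[keep], x[keep])

# toy check with a model that scores presences higher than background
rng = np.random.default_rng(1)
presence_scores = rng.beta(5, 2, 50)
background_scores = rng.beta(2, 5, 1000)
print(round(partial_auc_ratio(presence_scores, background_scores), 2))
```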

8.
9.
Various methods exist to model a species’ niche and geographic distribution using environmental data for the study region and occurrence localities documenting the species’ presence (typically from museums and herbaria). In presence-only modelling, geographic sampling bias and small sample sizes represent challenges for many species. Overfitting to the bias and/or noise characteristic of such datasets can seriously compromise model generality and transferability, which are critical to many current applications - including studies of invasive species, the effects of climatic change, and niche evolution. Even when transferability is not necessary, applications to many areas, including conservation biology, macroecology, and zoonotic diseases, require models that are not overfit. We evaluated these issues using a maximum entropy approach (Maxent) for the shrew Cryptotis meridensis, which is endemic to the Cordillera de Mérida in Venezuela. To simulate strong sampling bias, we divided localities into two datasets: those from a portion of the species’ range that has seen high sampling effort (for model calibration) and those from other areas of the species’ range, where less sampling has occurred (for model evaluation). Before modelling, we assessed the climatic values of localities in the two datasets to determine whether any environmental bias accompanies the geographic bias. Then, to identify optimal levels of model complexity (and minimize overfitting), we made models and tuned model settings, comparing performance with that achieved using default settings. We randomly selected localities for model calibration (sets of 5, 10, 15, and 20 localities) and varied the level of model complexity considered (linear versus both linear and quadratic features) and two aspects of the strength of protection against overfitting (regularization). Environmental bias indeed corresponded to the geographic bias between datasets, with differences in median and observed range (minima and/or maxima) for some variables. Model performance varied greatly according to the level of regularization. Intermediate regularization consistently led to the best models, with decreased performance at low and generally at high regularization. Optimal levels of regularization differed between sample-size-dependent and sample-size-independent approaches, but both reached similar levels of maximal performance. In several cases, the optimal regularization value was different from (usually higher than) the default one. Models calibrated with both linear and quadratic features outperformed those made with just linear features. Results were remarkably consistent across the examined sample sizes. Models made with few and biased localities achieved high predictive ability when appropriate regularization was employed and optimal model complexity was identified. Species-specific tuning of model settings can have great benefits over the use of default settings.
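A sketch of the species-specific tuning loop in Python. An L1-penalised logistic regression is used here as a rough, openly acknowledged stand-in for Maxent (the two are related but not equivalent), the synthetic localities and background are placeholders for the real data, and the inverse penalty C plays the role of the regularization setting (smaller C = stronger regularization): candidate settings are calibrated on localities from the well-sampled region and compared by their ability to predict the withheld, under-sampled region.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in data (two climate variables): calibration presences from the
# well-sampled region, evaluation presences from the under-sampled region, background.
calib_pres = rng.normal([1.0, 0.5], 0.4, size=(10, 2))
eval_pres = rng.normal([0.8, 0.7], 0.5, size=(15, 2))
background = rng.normal([0.0, 0.0], 1.0, size=(2000, 2))

def fit_and_score(degree, C):
    """Penalised presence-background model: degree 1 = linear features only,
    degree 2 = linear + quadratic features; smaller C = stronger regularization
    (playing the role of a larger Maxent regularization multiplier)."""
    X = np.vstack([calib_pres, background])
    y = np.concatenate([np.ones(len(calib_pres)), np.zeros(len(background))])
    model = make_pipeline(
        PolynomialFeatures(degree=degree, include_bias=False),
        StandardScaler(),
        LogisticRegression(penalty="l1", C=C, solver="liblinear", max_iter=5000),
    )
    model.fit(X, y)
    # transferability: discrimination of withheld-region presences against background
    scores = np.concatenate([model.predict_proba(eval_pres)[:, 1],
                             model.predict_proba(background)[:, 1]])
    labels = np.concatenate([np.ones(len(eval_pres)), np.zeros(len(background))])
    return roc_auc_score(labels, scores)

settings = {(d, C): fit_and_score(d, C) for d in (1, 2) for C in (0.01, 0.1, 1.0, 10.0)}
best = max(settings, key=settings.get)
print("best (feature degree, C):", best, " evaluation AUC:", round(settings[best], 3))
```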

10.
Ecological Modelling, 2005, 188(1): 112-136
This paper considers the state of the art of the numerical solution of age-structured population models. The different numerical approaches to this kind of problem and the stability and convergence results for them are reviewed. Both characteristic-curve methods and finite-difference methods are compared with regard to accuracy, efficiency and their qualitative behaviour depending on the compatibility conditions between the initial and boundary data of the problems. The paper is the first of a series of two considering the numerical solution of general structured population models.
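As a concrete example of the simplest scheme in this class, a first-order upwind finite-difference discretisation of the linear age-structured (McKendrick-von Foerster) equation u_t + u_a = -mu(a) u with renewal boundary condition u(0, t) = ∫ beta(a) u(a, t) da; the mortality and fertility functions below are illustrative, and the quadrature of the renewal integral is the crudest possible.

```python
import numpy as np

A, T = 10.0, 20.0                 # maximum age and final time
na = 200                          # number of age intervals
da = A / na
dt = da                           # characteristics have unit speed, so dt = da is natural
ages = np.linspace(0.0, A, na + 1)

mu = 0.1 + 0.02 * ages                                   # illustrative mortality rate
beta = np.where((ages > 2.0) & (ages < 8.0), 0.5, 0.0)   # illustrative fertility window

u = np.exp(-ages)                                        # initial age distribution
for _ in range(int(T / dt)):
    births = float(np.sum(beta * u) * da)                # quadrature of the renewal integral
    u_new = np.empty_like(u)
    # first-order upwind step along increasing age, plus the mortality sink
    u_new[1:] = u[1:] - (dt / da) * (u[1:] - u[:-1]) - dt * mu[1:] * u[1:]
    u_new[0] = births                                    # boundary (newborn) condition
    u = u_new

print("total population at t =", T, ":", round(float(np.sum(u) * da), 4))
```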

11.
Five regression models (Poisson, negative binomial, quasi-Poisson, the hurdle model and the zero-inflated Poisson) were used to assess the relationship between the abundance of a vulnerable plant species, Leionema ralstonii, and the environment. The methods differed in their capacity to deal with common properties of ecological data. They were assessed theoretically, and their predictive performance was evaluated with correlation, calibration and error statistics calculated within a bootstrap evaluation procedure that simulated performance for independent data.
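Three of the five model families can be fitted directly with statsmodels, and a simple bootstrap loop reproduces the flavour of the evaluation procedure; the quasi-Poisson and hurdle fits are omitted from this sketch, the simulated abundance data are illustrative, and the single evaluation statistic used here (Spearman correlation on out-of-bag observations) stands in for the fuller set of correlation, calibration and error statistics described above.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Illustrative zero-inflated abundance data along one environmental gradient
n = 300
env = rng.uniform(0.0, 1.0, n)
occupied = rng.random(n) < 0.4 + 0.4 * env
counts = np.where(occupied, rng.poisson(np.exp(0.5 + 2.0 * env)), 0)
X = sm.add_constant(env)

models = {
    "Poisson": (lambda y, x: sm.Poisson(y, x).fit(disp=0),
                lambda res, x: res.predict(x)),
    "NegBin": (lambda y, x: sm.NegativeBinomial(y, x).fit(disp=0),
               lambda res, x: res.predict(x)),
    "ZIP": (lambda y, x: ZeroInflatedPoisson(y, x, exog_infl=x).fit(disp=0, maxiter=500),
            lambda res, x: res.predict(x, exog_infl=x)),
}

def bootstrap_eval(fit, predict, reps=30):
    """Mean Spearman correlation between observed and predicted counts on
    out-of-bag observations, over bootstrap resamples of the data."""
    rhos = []
    for _ in range(reps):
        idx = rng.integers(0, n, n)
        oob = np.setdiff1d(np.arange(n), idx)
        res = fit(counts[idx], X[idx])
        rhos.append(spearmanr(counts[oob], predict(res, X[oob]))[0])
    return float(np.mean(rhos))

for name, (fit, predict) in models.items():
    print(name, round(bootstrap_eval(fit, predict), 3))
```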

12.
Williams RJ, Purves DW. Ecology, 2011, 92(9): 1849-1857
The structure of food webs, complex networks of interspecies feeding interactions, plays a crucial role in ecosystem resilience and function, and understanding food web structure remains a central problem in ecology. Previous studies have shown that key features of empirical food webs can be reproduced by low-dimensional "niche" models. Here we examine the form and variability of food web niche structure by fitting a probabilistic niche model to 37 empirical food webs, a much larger number of food webs than used in previous studies. The model relaxes previous assumptions about parameter distributions and hierarchy and returns parameter estimates for each species in each web. The model significantly outperforms previous niche model variants and also performs well for several webs where a body-size-based niche model performs poorly, implying that traits other than body size are important in structuring these webs' niche space. Parameter estimates frequently violate previous models' assumptions: in 19 of 37 webs, parameter values are not significantly hierarchical, 32 of 37 webs have nonuniform niche value distributions, and 15 of 37 webs lack a correlation between niche width and niche position. Extending the model to a two-dimensional niche space yields networks with a mixture of one- and two-dimensional niches and provides a significantly better fit for webs with a large number of species and links. These results confirm that food webs are strongly niche-structured but reveal substantial variation in the form of the niche structuring, a result with fundamental implications for ecosystem resilience and function.
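The heart of such a probabilistic niche model is a smooth link-probability kernel. A Gaussian kernel of the general form used in this line of work — the probability that consumer i eats resource j decays with the distance between the resource's niche value and the consumer's feeding centre, scaled by the consumer's niche width — and the Bernoulli log-likelihood a fitting routine would maximise can be sketched as follows; the exact parameterisation and the fitting machinery of the paper are not reproduced.

```python
import numpy as np

def link_probability(n, c, r, alpha):
    """P(consumer i eats resource j) under a Gaussian niche kernel.
    n[j]: niche value of resource j; c[i]: centre of consumer i's feeding
    range; r[i]: its niche width; alpha: maximum link probability."""
    d = (n[None, :] - c[:, None]) / (r[:, None] / 2.0)
    return alpha * np.exp(-d ** 2)

def log_likelihood(adj, n, c, r, alpha):
    """Bernoulli log-likelihood of an observed adjacency matrix
    (adj[i, j] = 1 if species i eats species j)."""
    p = np.clip(link_probability(n, c, r, alpha), 1e-12, 1.0 - 1e-12)
    return float(np.sum(adj * np.log(p) + (1 - adj) * np.log(1.0 - p)))

# Toy example with four species; fitting the model to an empirical web means
# maximising this quantity over the per-species parameters n, c and r.
rng = np.random.default_rng(0)
n = np.array([0.10, 0.30, 0.60, 0.90])
c = np.array([0.05, 0.10, 0.35, 0.60])
r = np.array([0.05, 0.20, 0.30, 0.40])
adj = (link_probability(n, c, r, alpha=0.9) > rng.random((4, 4))).astype(int)
print(round(log_likelihood(adj, n, c, r, alpha=0.9), 2))
```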

13.
Stomatal conductance (g) is a key parameter controlling energy and water exchanges between the canopy and the atmosphere. The stomatal conductance models proposed by Ball, Woodrow and Berry (BWB) and by Leuning have been increasingly used in land surface schemes. In a recent study, a new diagnostic index was developed by Wang et al. to examine the response of g to humidity, and new models were proposed to resolve problems identified in the BWB and Leuning models. This approach is theoretically sound, but relies on canopy latent heat and CO2 fluxes and environmental variables at the leaf surface, which are not available at most eddy correlation (EC) observation sites. In this study, we tested the diagnostic index by empirically correcting EC measurements to canopy-level fluxes and by replacing leaf-surface variables with their corresponding ambient air variables, and re-examined the stomatal conductance models of BWB, Leuning, and Wang et al. We found that the impact of the above modifications on the evaluation of g–humidity relationships is very small. This study provides a practical approach to investigating the stomatal response to humidity using routine EC measurements.
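For reference, the two formulations being re-examined, in their standard textbook forms (A_n is net assimilation, h_s and D_s the relative humidity and vapour pressure deficit at the leaf surface, c_s the CO2 mole fraction at the leaf surface, Γ the CO2 compensation point, and g_0, a_1, D_0 empirical parameters):

```latex
% Ball-Woodrow-Berry (BWB) model: conductance scales with assimilation and leaf-surface humidity
g_s = g_0 + a_1 \, \frac{A_n \, h_s}{c_s}

% Leuning model: the humidity response expressed through the vapour pressure deficit D_s
g_s = g_0 + \frac{a_1 \, A_n}{\left(c_s - \Gamma\right)\left(1 + D_s / D_0\right)}
```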

14.
This paper presents modeling methods for mapping fire hazard and fire risk using a research model called FIREHARM (FIRE Hazard and Risk Model), which computes common measures of fire behavior, fire danger, and fire effects to portray fire hazard over space. FIREHARM can also compute a measure of risk associated with the distribution of these measures over time, using 18 years of gridded DAYMET daily weather data to simulate the fuel moistures from which the fire variables are computed. We detail the background, structure, and application of FIREHARM and then present validation results for six of the FIREHARM output variables, which revealed accuracies ranging from 20 to 80% correct depending on the quality of input data and the behavior of the fire behavior simulation framework. Overall accuracies appeared acceptable for prioritization analysis and large-scale assessments because precision was high. We discuss advantages and disadvantages of the fire hazard and risk approaches and present a possible agenda for future development of comprehensive fire hazard and risk mapping.
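A small sketch of the risk idea described above: given a multi-year distribution of a simulated fire-behavior measure for a pixel (here 18 annual values, as with the DAYMET-driven simulations), hazard can be summarised by a percentile of the distribution and risk by the fraction of years in which a critical threshold is exceeded. The measure, threshold and simulated values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 18-year series of a simulated fire-behavior measure
# (e.g. flame length in m) for one pixel.
years = 18
flame_length = rng.gamma(shape=2.0, scale=0.8, size=years)

critical = 1.8                                   # illustrative critical flame length (m)
hazard = float(np.percentile(flame_length, 90))  # hazard: severe end of the distribution
risk = float(np.mean(flame_length > critical))   # risk: fraction of years exceeding the threshold

print(f"90th-percentile flame length: {hazard:.2f} m;"
      f" probability of exceeding {critical} m in a given year: {risk:.2f}")
```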

15.
Freshwater aquatic systems in North America are being invaded by many different species, ranging from fish, mollusks, and cladocerans to various bacteria and viruses. These invasions have serious ecological and economic impacts. Human activities such as recreational boating are an important pathway for dispersal. Gravity models are used to quantify the dispersal effect of human activity. The gravity models currently used in ecology are deterministic. This paper proposes the use of stochastic gravity models in ecology, which provides new capabilities both in model building and in potential model applications. These models allow us to use standard statistical inference tools such as maximum likelihood estimation and model selection based on information criteria. To facilitate prediction, we use only those covariates that are easily available from common data sources and can be forecast into the future. This is important for forecasting the spread of invasive species in the geographical and temporal domains. The proposed model is portable; that is, it can be used to estimate relative boater traffic, and hence relative propagule pressure, for lakes not covered by current boater surveys. This makes our results broadly applicable to various invasion prediction and management models.
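A minimal sketch of one way to make the gravity idea stochastic and fit it by maximum likelihood: origin-destination boater trips are modelled as Poisson counts with a log-linear gravity mean, fitted as a GLM, so that likelihood-based tools (standard errors, AIC-based model selection, prediction for unsurveyed lakes) apply directly. The covariates used here (source population, lake area, road distance) and their coefficients are illustrative stand-ins for the easily available covariates referred to above.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Illustrative origin-destination records: boater trips from a source town to a lake
m = 500
pop = rng.lognormal(10.0, 1.0, m)    # source population (origin "mass")
area = rng.lognormal(2.0, 1.0, m)    # lake surface area (destination attractiveness)
dist = rng.lognormal(4.0, 0.7, m)    # road distance between source and lake
mean_trips = np.exp(-6.0 + 0.8 * np.log(pop) + 0.5 * np.log(area) - 1.2 * np.log(dist))
trips = rng.poisson(mean_trips)      # observed counts: the stochastic component

# Stochastic gravity model: trips ~ Poisson(exp(b0 + b1 log pop + b2 log area + b3 log dist))
X = sm.add_constant(np.column_stack([np.log(pop), np.log(area), np.log(dist)]))
full = sm.GLM(trips, X, family=sm.families.Poisson()).fit()
no_distance = sm.GLM(trips, X[:, :3], family=sm.families.Poisson()).fit()

print("coefficients:", np.round(full.params, 2))
print("AIC with distance:", round(full.aic, 1), " AIC without distance:", round(no_distance.aic, 1))
```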

16.
17.
The ODD protocol: A review and first update

18.
Most performance criteria that have been applied to train ecological models focus on the accuracy of the model predictions. However, these criteria depend on the prevalence of the training set and often do not take into account ecological issues such as the distinction between omission and commission errors. Moreover, a previous study indicated that model training based on different performance criteria results in different optimised models. Therefore, model developers should train models based on different performance criteria and select the most appropriate model depending on the modelling objective. This paper presents a new approach to train fuzzy models based on an adjustable performance criterion, called the adjusted average deviation (aAD). This criterion was applied to develop a species distribution model for spawning grayling in the Aare River near Thun, Switzerland. To analyse the strengths and weaknesses of this approach, it was compared to model training based on other performance criteria. The results suggest that model training based on accuracy-based performance criteria may produce unrealistic models at extreme prevalences of the training set, whereas the aAD allows for the identification of more accurate and more reliable models. Moreover, the adjustable parameter in this criterion enables modellers to situate the optimised models in the search space and thus provides an indication of the ecological relevance of the model. Consequently, it may support modellers and river managers in the decision-making process by improving model reliability and insight into the modelling process. Due to the universality and the flexibility of the approach, it could be applied to any other ecosystem or species, and may therefore be valuable to ecological modelling and ecosystem management in general.

19.
Program MARK provides >65 data types in a common configuration for the estimation of population parameters from mark-encounter data. Encounter information from live captures, live resightings, and dead recoveries can be incorporated to estimate demographic parameters. Available estimates include survival (S or ϕ), rate of population change (λ), transition rates between strata (Ψ), emigration and immigration rates, and population size (N). Although N is the parameter most often desired by biologists, it is one of the most difficult parameters to estimate precisely and without bias, even for a geographically and demographically closed population. The set of closed population estimation models available in Program MARK incorporates time (t) and behavioral (b) variation and individual heterogeneity (h) in the estimation of capture and recapture probabilities in a likelihood framework. The full range of models from M0 (the null model with all capture and recapture probabilities equal) to Mtbh is possible, including the ability to use temporal, group, and individual covariates to model capture and recapture probabilities. Both the full likelihood formulation of Otis et al. (1978) and the conditional model formulation of Huggins (1989, 1991) and Alho (1990) are provided in Program MARK, and all of these models are incorporated into the robust design (Kendall et al. 1995, 1997; Kendall and Nichols 1995) and robust-design multistrata (Hestbeck et al. 1991, Brownie et al. 1993) data types. Model selection is performed with AICc (Burnham and Anderson 2002), and model averaging (Burnham and Anderson 2002) is available in Program MARK to provide estimates of N with standard errors that reflect model-selection uncertainty.
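To show why N is difficult and what the likelihood framework involves, here is a minimal sketch of the simplest member of the closed-captures family, M0 (constant capture probability p over t occasions), using the full-likelihood form of Otis et al. (1978) and a brute-force profile over integer N. This only illustrates the idea, not Program MARK's implementation, and the simulated capture histories are invented.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(0)

# Simulated closed-population capture histories under model M0
N_true, t, p_true = 200, 5, 0.25
histories = rng.random((N_true, t)) < p_true
observed = histories[histories.any(axis=1)]   # animals never caught are not observed
M = observed.shape[0]                         # number of distinct animals captured
n_dot = int(observed.sum())                   # total number of captures

def log_lik(N, p):
    """M0 full likelihood (up to a constant):
    N! / (N - M)! * p^n. * (1 - p)^(t*N - n.)"""
    return (gammaln(N + 1) - gammaln(N - M + 1)
            + n_dot * np.log(p) + (t * N - n_dot) * np.log(1.0 - p))

# Profile the likelihood over integer N, plugging in the conditional MLE p = n./(tN)
N_grid = np.arange(M, 3 * N_true)
profile = np.array([log_lik(N, n_dot / (t * N)) for N in N_grid])
N_hat = int(N_grid[np.argmax(profile)])
print("distinct animals caught:", M, " estimated population size N:", N_hat)
```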

20.
Aquatic biogeochemical models are widely used as tools for understanding aquatic ecosystems and predicting their response to various stimuli (e.g., nutrient loading, toxic substances, climate change). Due to the complexity of these systems, such models are often elaborate and include a large number of estimated parameters. However, correspondingly large data sets are rarely available for calibration purposes, leading to models that may be overfit and possess reduced predictive capabilities. We apply, for the first time, information-theoretic model-selection techniques to a set of spatially explicit (1D) algal dynamics models of varying parameter dimension. We demonstrate that increases in complexity tend to produce a better model fit to calibration data, but beyond a certain degree of complexity the benefits of adding parameters are diminished (the risk of overfitting becomes greater). The particular approach taken here is computationally expensive, but several suggestions are made as to how multimodel methods may practically be extended to more sophisticated models.
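The information-theoretic machinery referred to here is compact: given each candidate model's maximised log-likelihood, number of estimated parameters and the number of calibration observations, small-sample AICc and Akaike weights identify the complexity beyond which additional parameters are no longer supported. The numbers below are invented for illustration, not results from the algal dynamics models.

```python
import numpy as np

def aicc(loglik, k, n):
    """Akaike information criterion with small-sample correction."""
    return -2.0 * loglik + 2.0 * k + 2.0 * k * (k + 1) / (n - k - 1)

# Illustrative calibration results for models of increasing parameter dimension:
# the fit keeps improving, but the improvement shrinks as parameters are added.
n_obs = 60
candidates = {"4 params": (-152.0, 4), "8 params": (-138.0, 8),
              "12 params": (-135.5, 12), "20 params": (-133.0, 20)}

scores = {name: aicc(ll, k, n_obs) for name, (ll, k) in candidates.items()}
best = min(scores.values())
weights = {name: np.exp(-0.5 * (s - best)) for name, s in scores.items()}
total = sum(weights.values())
for name in candidates:
    print(f"{name}: AICc = {scores[name]:.1f}, dAICc = {scores[name] - best:.1f}, "
          f"Akaike weight = {weights[name] / total:.2f}")
```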
