Similar Documents
20 similar documents found (search time: 31 ms)
1.
This paper examines the consequences of using a static model of recreation trip-taking behavior when the underlying decision problem is dynamic. Specifically, we examine the implications for trip forecasting and welfare estimation using a panel dataset of Lake Michigan salmon anglers for the 1996 and 1997 fishing seasons. We derive and estimate both a structural dynamic model using Bellman's equation, and a reduced-form static model with trip probability expressions mimicking those of the dynamic model. We illustrate an inherent identification problem in the reduced-form model that creates biased welfare estimates, and we discuss the general implications of this for the interpretation of preference parameters in static models. We then use both models to simulate trip-taking behavior and show that although their in-sample trip forecasts are similar, their welfare estimates and out-of-sample forecasts are quite different.
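The Bellman-equation approach behind a structural dynamic trip-choice model can be illustrated with a deliberately tiny sketch. The season length, trip budget, per-trip utility, and discount factor below are all hypothetical, not the authors' estimated specification.

```python
# Minimal value-iteration sketch of a dynamic trip-choice problem
# (hypothetical utilities and budget; not the paper's actual model).
# State: number of trips remaining in the season's budget.
# Choice each period: take a trip (utility U_TRIP, consumes one trip)
# or stay home (utility 0). Bellman: V(s) = max{flow utility + BETA * V(s')}.

BETA = 0.95      # discount factor (assumed)
U_TRIP = 1.0     # per-trip utility (assumed)
N_STATES = 10    # trips remaining: 0..9
N_PERIODS = 20   # finite season length (assumed)

# Backward induction over the season.
V = [0.0] * N_STATES                 # terminal values
policy = []                          # optimal choice per (period, state)
for t in range(N_PERIODS):
    V_new = [0.0] * N_STATES
    pol_t = [0] * N_STATES
    for s in range(N_STATES):
        stay = 0.0 + BETA * V[s]
        go = (U_TRIP + BETA * V[s - 1]) if s > 0 else float("-inf")
        if go > stay:
            V_new[s], pol_t[s] = go, 1
        else:
            V_new[s], pol_t[s] = stay, 0
    V = V_new
    policy.append(pol_t)

# With positive trip utility, an angler with trips remaining always goes.
```

A static reduced-form model would instead estimate trip probabilities directly, without the continuation value `BETA * V(s')` that makes today's choice depend on the remaining budget.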

2.
This paper examines the effects of congestion on recreational behavior within a household production model of consumer behavior. We assume that congestion affects the household's ability to produce constant-quality recreational service flows and derive a reduced-form model for participation decisions in remote and developed camping. The effects of a congestion measure on the conditional probability of participation, as well as on the level of participation, are estimated for each activity by type of trip using information from the 1972 National Recreation Survey. The findings suggest that congestion was most likely to affect the decision to participate rather than the level of participation once that decision had been made. While differences in these effects were observed across the activities studied, it is not clear how they should be interpreted, since our congestion measure was a proxy variable likely to perform better for remote camping than for developed camping.

3.
Historically, the National Agricultural Statistics Service crop forecasts and estimates have been determined by a group of commodity experts called the Agricultural Statistics Board (ASB). The corn yield forecasts for the “speculative region,” ten states that account for approximately 85% of corn production, are based on two sets of monthly surveys: a farmer interview survey and a field measurement survey. The members of the ASB subjectively determine a forecast on the basis of a discussion of the survey data and auxiliary information about weather, average planting dates, and crop maturity. The ASB uses an iterative procedure, where initial state estimates are adjusted so that the weighted sum of the final state estimates is equal to a previously determined estimate for the speculative region. Deficiencies of the highly subjective ASB process are its lack of reproducibility and its lack of a measure of uncertainty. This paper describes the use of Bayesian methods to model the ASB process in a way that leads to objective forecasts and estimates of corn yield. First, we use small area estimation techniques to obtain state-level forecasts. Second, we describe a way to adjust the state forecasts so that their weighted sum is equal to a previously determined regional forecast. We use several diagnostic techniques to assess the goodness of fit of the various models and their competitors. We use Markov chain Monte Carlo methods to fit the models to both historic and current data from the two monthly surveys. Our results show that our methodology can provide reasonable and objective forecasts of corn yields for states in the speculative region.
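The adjustment step, forcing the weighted sum of state forecasts to equal a fixed regional forecast, can be illustrated with a simple proportional benchmarking sketch. The state yields, acreage weights, and regional target below are invented, and the paper's Bayesian adjustment is more sophisticated than this.

```python
# Proportional benchmarking sketch: scale state yield forecasts so that
# their production-weighted sum equals a previously determined regional
# forecast. All numbers are hypothetical, not NASS data.

state_yield = {"IA": 180.0, "IL": 175.0, "MN": 170.0}   # bu/acre (assumed)
acres = {"IA": 13.0, "IL": 11.0, "MN": 8.0}             # million acres (assumed)
regional_target = 178.0                                  # bu/acre (assumed)

total_acres = sum(acres.values())
weights = {s: a / total_acres for s, a in acres.items()}

# Current weighted average, then a single multiplicative adjustment.
current = sum(weights[s] * state_yield[s] for s in state_yield)
factor = regional_target / current
adjusted = {s: y * factor for s, y in state_yield.items()}

new_regional = sum(weights[s] * adjusted[s] for s in adjusted)
# new_regional now matches the regional target (up to rounding).
```

A Bayesian version would instead impose the constraint inside the posterior, so each state's adjustment reflects its own forecast uncertainty rather than a uniform scaling.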

4.
Several studies document that iterative question formats used in contingent valuation studies produce anomalies in respondent behavior that appear to threaten the validity of welfare estimates. By decomposing iterative question formats into their ascending and descending sequences, we show that these anomalies occur only in ascending sequences. We describe the conditions under which comparable patterns of behavior are likely to be found in other iterative question formats that begin with a discrete willingness-to-pay question. We then develop a unified explanation of these anomalies using our model of framing based on prospect theory. We provide the first head-to-head test of rival explanations of these behavioral patterns by developing refutable hypotheses for the strategic behavior, yea-saying, anchoring, and cost-expectations models. Finally, based on our own statistically robust model, we show how these anomalies can be eliminated without loss of the statistical efficiency of most iterative question formats.

5.
Model averaging (MA) has been proposed as a method of accommodating model uncertainty when estimating risk. Although the use of MA is inherently appealing, little is known about its performance under general modeling conditions. We investigate the use of MA for estimating excess risk using a Monte Carlo simulation. Dichotomous response data are simulated under various assumed underlying dose–response curves, and nine dose–response models (from the USEPA Benchmark Dose model suite) are fit to obtain both model-specific and MA risk estimates. The benchmark dose estimates (BMDs) from the MA method, as well as estimates from other commonly used selection strategies, e.g., the best-fitting model or the model resulting in the smallest BMD, are compared to the true benchmark dose value to better understand both bias and coverage behavior in the estimation procedure. The MA method has a small bias when estimating the BMD that is similar to the bias of BMD estimates derived from the assumed model. Further, when a broader range of models is included in the family of models considered in the MA process, the lower bound estimate provides coverage close to the nominal level, which is superior to the other strategies considered. This approach provides an alternative method for risk managers to estimate risk while incorporating model uncertainty.
Matthew W. Wheeler
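A minimal sketch of the model-averaging idea, using information-criterion weights over model-specific benchmark dose estimates. The BMDs and AIC values below are invented placeholders, not output from the USEPA benchmark-dose suite.

```python
import math

# Hypothetical model-specific benchmark dose estimates and AIC scores.
bmds = {"logistic": 12.0, "probit": 10.5, "weibull": 9.0}
aics = {"logistic": 140.2, "probit": 139.0, "weibull": 143.5}

# Akaike weights: w_i proportional to exp(-0.5 * (AIC_i - AIC_min)).
aic_min = min(aics.values())
raw = {m: math.exp(-0.5 * (aics[m] - aic_min)) for m in aics}
total = sum(raw.values())
weights = {m: r / total for m, r in raw.items()}

# Model-averaged BMD: weighted average of the model-specific estimates.
bmd_ma = sum(weights[m] * bmds[m] for m in bmds)
```

The averaged estimate always lies between the smallest and largest model-specific BMDs, which is why it tends to be less extreme than the "smallest BMD" selection strategy.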

6.
Most population viability analyses (PVA) assume that the effects of species interactions are subsumed by population-level parameters. We examine how robust five commonly used PVA models are to violations of this assumption. We develop a stochastic, stage-structured predator-prey model and simulate prey population vital rates and abundance. We then use simulated data to parameterize and estimate risk for three demographic models (static projection matrix, stochastic projection matrix, stochastic vital rate matrix) and two time series models (diffusion approximation [DA], corrupted diffusion approximation [CDA]). Model bias is measured as the absolute deviation between estimated and observed quasi-extinction risk. Our results highlight three generalities about the application of single-species models to multi-species conservation problems. First, our collective model results suggest that most single-species PVA models overestimate extinction risk when species interactions cause periodic variation in abundance. Second, the DA model produces the most (conservatively) biased risk forecasts. Finally, the CDA model is the most robust PVA to population cycles caused by species interactions. CDA models produce virtually unbiased and relatively precise risk estimates even when populations cycle strongly. High performance of simple time series models like the CDA owes to their ability to effectively partition stochastic and deterministic sources of variation in population abundance.
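The diffusion approximation (DA) mentioned above reduces quasi-extinction risk to a closed-form expression in the drift and variance of log-abundance. A sketch under hypothetical parameter values (not the simulated predator-prey data) follows.

```python
import math

# Diffusion-approximation sketch of quasi-extinction risk: model
# log-abundance as Brownian motion with drift mu and variance sigma2.
# Parameter values below are hypothetical.

def quasi_extinction_prob(mu, sigma2, n0, n_threshold):
    """Probability that log-abundance ever falls from log(n0)
    to log(n_threshold) under Brownian motion with drift mu."""
    d = math.log(n0 / n_threshold)      # log-distance to the threshold
    if mu <= 0:
        return 1.0                       # non-positive drift: crossing is certain
    return math.exp(-2.0 * mu * d / sigma2)

p_declining = quasi_extinction_prob(mu=-0.02, sigma2=0.04, n0=500, n_threshold=50)
p_growing = quasi_extinction_prob(mu=0.05, sigma2=0.04, n0=500, n_threshold=50)
```

Because the DA attributes all variation to diffusion noise, deterministic cycles caused by species interactions inflate the estimated `sigma2`, which is one route to the conservative bias the abstract reports.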

7.
In this paper we use a dynamic three sector model to examine the neutrality and welfare effects of land income taxes. We find that (1) taxes that are neutral in long run equilibrium need not be neutral in the short run; (2) short run neutrality depends upon the tax treatment of development costs and losses; and (3) many of the neutrality results hold under both static and rational expectations assumptions. We also find that, even without externality assumptions, nonneutrality in the short run does not necessarily entail a welfare cost and may be welfare-enhancing when agents have less-than-perfect foresight.

8.
Coral reefs are threatened ecosystems, so it is important to have predictive models of their dynamics. Most current models of coral reefs fall into two categories. The first is simple heuristic models, which provide an abstract understanding of the possible behaviour of reefs in general but do not describe real reefs. The second is complex simulations whose parameters are obtained from a range of sources, such as literature estimates. We cannot estimate the parameters of these models from a single data set, and we have little idea of the uncertainty in their predictions.

We have developed a compromise between these two extremes, which is complex enough to describe real reef data but simple enough that we can estimate parameters for a specific reef from a time series. In previous work, we fitted this model to a long-term data set from Heron Island, Australia, using maximum likelihood methods. To evaluate predictions from this model, we need estimates of the uncertainty in our parameters. Here, we obtain such estimates using Bayesian Metropolis-coupled Markov chain Monte Carlo. We do this for versions of the model in which corals are aggregated into a single state variable (the three-state model), and in which corals are separated into four state variables (the six-state model), in order to determine the appropriate level of aggregation. We also estimate the posterior distribution of predicted trajectories in each case.

In both cases, the fitted trajectories were close to the observed data, but we had doubts about the biological plausibility of some parameter estimates. We suggest that informative prior distributions incorporating expert knowledge may resolve this problem. In the six-state model, the posterior distribution of state frequencies after 40 years contained two divergent community types, one dominated by free space and soft corals, and one dominated by acroporid, pocilloporid, and massive corals. The three-state model predicts only a single community type. We conclude that the three-state model hides too much biological heterogeneity, but we need more data if we are to obtain reliable predictions from the six-state model. It is likely that there will be similarly large, but currently unevaluated, uncertainty in the predictions of other coral reef models, many of which are much more complex and harder to fit to real data.
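Metropolis-coupled MCMC runs several tempered chains that occasionally swap states; the single-chain Metropolis step at its core can be sketched on a hypothetical one-parameter posterior (a normal target, not the reef model's likelihood).

```python
import math
import random

# Single-chain Metropolis sketch (the paper uses the Metropolis-coupled
# variant with several tempered chains). Target density is hypothetical:
# an unnormalized N(2, 0.5^2) "posterior" for one parameter.

random.seed(1)

def log_target(theta):
    return -0.5 * (theta - 2.0) ** 2 / 0.25

theta = 0.0            # arbitrary starting point
samples = []
for _ in range(5000):
    proposal = theta + random.gauss(0.0, 0.5)   # symmetric random walk
    # Accept with probability min(1, target(proposal)/target(theta)).
    if math.log(random.random()) < log_target(proposal) - log_target(theta):
        theta = proposal
    samples.append(theta)

# Discard burn-in, then summarize the posterior.
posterior_mean = sum(samples[1000:]) / len(samples[1000:])
```

Coupling adds hotter chains with flattened targets and proposes swaps between them, which helps a multimodal posterior (such as the two community types above) mix properly.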

9.
The incidence function model (IFM) uses area and connectivity to predict metapopulation dynamics. However, false absences and missing data can lead to underestimates of the number of sites contributing to connectivity, resulting in overestimates of dispersal ability and turnovers (extinctions plus colonizations). We extend estimation methods for the IFM by using a hierarchical Bayesian model to account both for false absences due to imperfect detection and for missing data due to sites not surveyed in some years. We compare parameter estimates, measures of metapopulation dynamics, and forecasts using stochastic patch occupancy models (SPOMs) among three IFM models: (1) a Bayesian formulation assuming no false absences and omitting site-year combinations with missing data; (2) a hierarchical Bayesian formulation assuming no false absences but incorporating missing data; and (3) a hierarchical Bayesian formulation allowing for imperfect detection and incorporating missing data. We fit the models to multiyear occupancy data sets for two bird species that differ in body size and presumed dispersal ability but inhabit the same network of sites: the small Black Rail (Laterallus jamaicensis) and the medium-sized Virginia Rail (Rallus limicola). Incorporating missing data affected colonization parameters and led to lower estimates of dispersal ability for the Black Rail. Detection rates were high for the Black Rail in most years but moderate for the Virginia Rail. Incorporating imperfect detection resulted in higher occupancy and lower turnover rates for both species, with the largest effects for the Virginia Rail. Forecasts using SPOMs were sensitive to both missing data and false absences; forecasts of persistence from models assuming no false absences were more optimistic than those from robust models. Our results suggest that incorporating false absences and missing data into the IFM can improve (1) estimates of dispersal ability and the effect of connectivity on colonization, (2) the scaling of extinction risk with patch area, and (3) forecasts of occupancy and turnover rates.
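The way imperfect detection enters such a model can be sketched with the single-site occupancy likelihood, which mixes "occupied but missed" with "truly absent". The occupancy and detection probabilities below are illustrative, not the rail estimates.

```python
# Sketch of an occupancy likelihood with imperfect detection.
# history: list of 0/1 survey outcomes at one site across repeat visits.
# psi: occupancy probability; p: per-survey detection probability
# (both values hypothetical).

def history_likelihood(history, psi, p):
    detected = sum(history)
    n = len(history)
    # If occupied: each survey detects independently with probability p.
    occupied = psi * (p ** detected) * ((1 - p) ** (n - detected))
    # If unoccupied: only an all-zero history is possible.
    unoccupied = (1 - psi) if detected == 0 else 0.0
    return occupied + unoccupied

# With psi = 0.6 and p = 0.5, three non-detections are far more likely
# than naive "the site is empty" reasoning would suggest:
l_all_zero = history_likelihood([0, 0, 0], psi=0.6, p=0.5)
l_one_hit = history_likelihood([1, 0, 0], psi=0.6, p=0.5)
```

The hierarchical model in the abstract embeds this site-level mixture inside colonization-extinction dynamics, and treats unsurveyed site-years as missing rather than as zeros.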

10.
The existing literature (i) examines bycatch and discard behavior in a static framework and (ii) treats bycatch as a deterministic process uniform across vessels. Using a dynamic representative agent model in a two-stock resource, this paper explores strategic interactions between a social planner and two groups of harvesters, one of which imposes a stochastic “technological externality” (bycatch) on the other. In addition to limitations on entry and the number of trips taken in each industry, three bycatch control instruments are compared to the unconstrained case: taxes, trip limits, and value-based quotas. Implementation and enforcement costs aside, taxes dominate both types of quota, and value limits outperform trip limits by eliminating one type of discarding. In simulations, relative performance depends upon variance in the bycatch process, differences in the ex vessel prices of stocks, relative efficiency of the harvester types, and fixed costs on the trip and industry margins.

11.
As large carnivores recover throughout Europe, their distribution needs to be studied to determine their conservation status and assess the potential for human-carnivore conflicts. However, efficient monitoring of many large carnivore species is challenging due to their rarity, elusive behavior, and large home ranges. Their monitoring can include opportunistic sightings from citizens in addition to designed surveys. Two types of detection errors may occur in such monitoring schemes: false negatives and false positives. False-negative detections can be accounted for in species distribution models (SDMs) that deal with imperfect detection. False-positive detections, due to species misidentification, have rarely been accounted for in SDMs. Generally, researchers use ad hoc data-filtering methods to discard ambiguous observations prior to analysis. These practices may discard valuable ecological information on the distribution of a species. We investigated the costs and benefits of including data types that may contain false positives, rather than discarding them, when building SDMs for large carnivores. We used a dynamic occupancy model that simultaneously accounts for false negatives and false positives to jointly analyze data that included both unambiguous and ambiguous detections. We used simulations to compare the performance of our model with that of a model fitted to unambiguous data only, testing the two models in four scenarios in which the parameters that control false-positive and true detections varied. We applied our model to data from the monitoring of the Eurasian lynx (Lynx lynx) in the European Alps. The addition of ambiguous detections increased the precision of parameter estimates. For the Eurasian lynx, incorporating ambiguous detections produced more precise estimates of the ecological parameters and revealed additional occupied sites in areas where the species is likely expanding. Overall, we found that ambiguous data should be considered when studying the distribution of large carnivores, through the use of dynamic occupancy models that account for misidentification.

12.
In a permit market with endogenous emissions, both firms and citizens purchase permits. Presented here are static and dynamic models of pollution permit markets with endogenous emissions. The optimal permit endowments are characterized when the regulator faces uncertainty about damages and uncertainty about the severity of the citizens’ collective action problem. Due to the possibility of learning over time, the regulator issues a larger number of permits in the first period of the dynamic model than in the static model.

13.
Bashari et al. (2009) propose combining state and transition models (STMs) with Bayesian networks for decision support tools where the focus is on modelling the system dynamics. There is already an extension of Bayesian networks - so-called dynamic Bayesian networks (DBNs) - for explicitly modelling systems that change over time, which has also been applied in ecological modelling. In this paper we propose a combination of STMs and DBNs that overcomes some of the limitations of Bashari et al.’s approach, including providing an explicit representation of the next state, while retaining its advantages, such as the explicit representation of transitions. We then show that the new model can be applied iteratively to predict into the future, consistently across different time frames. We use Bashari et al.’s rangeland management problem as an illustrative case study. We present a comparative complexity analysis of the different approaches, based on the structure inherent in the problem being modelled. This analysis showed that models that explicitly represent all the transitions remain tractable only when there are natural constraints in the domain. We therefore recommend that modellers analyse these aspects of their problem before deciding whether to use the framework.
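Applying such a model iteratively to predict into the future amounts, in the simplest case, to repeatedly pushing a state distribution through a two-slice transition model. The vegetation states and transition probabilities below are invented, not Bashari et al.'s rangeland parameters.

```python
# Two-slice transition sketch: a DBN-style state-and-transition model
# applied iteratively. States and probabilities are hypothetical.

states = ["grassland", "shrubland", "woodland"]
# transition[i][j] = P(next = states[j] | current = states[i]); rows sum to 1.
transition = [
    [0.80, 0.15, 0.05],
    [0.10, 0.70, 0.20],
    [0.05, 0.10, 0.85],
]

def step(dist):
    """One time slice: push the state distribution through the transitions."""
    return [sum(dist[i] * transition[i][j] for i in range(3)) for j in range(3)]

dist = [1.0, 0.0, 0.0]        # start entirely in grassland
for _ in range(10):           # predict 10 steps ahead, consistently
    dist = step(dist)
```

A full DBN would condition these transitions on management actions and environmental variables; the point here is only that iterating the same two-slice structure gives forecasts that are consistent across time frames.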

14.
Does self-regulation improve social welfare? We develop a policy game featuring a regulator and a firm that can unilaterally commit to better environmental or social behavior in order to preempt future public policy efforts. We show that the answer depends on the set of policy instruments available to the regulator. Self-regulation improves welfare if the regulator can only use mandatory regulation, but it reduces welfare when the regulator opts for a voluntary agreement. This suggests that self-regulation and voluntary agreements are not good complements from a welfare point of view. We derive policy implications, and extend the basic model in several dimensions.

15.
16.
The development of approaches to estimate the vulnerability of biological communities and ecosystems to extirpations and reductions of species is a central challenge of conservation biology. One key aim of this challenge is to develop quantitative approaches to estimate and rank interaction strengths and the keystoneness of species and functional groups, i.e. to quantify the relative importance of species. Network analysis can be a powerful tool for this because certain structural aspects of ecological networks are good indicators of the mechanisms that maintain co-evolved, biotic interactions. A static view of ecological networks would lead us to focus research on highly central species in food webs (topological key players in ecosystems). There are a variety of centrality indices, developed for several types of ecological networks (e.g. for weighted and unweighted webs). However, truly understanding extinction and its community-wide effects requires the use of dynamic models. Deterministic dynamic models are feasible when population sizes are sufficiently large to minimize noise in the overall system. In models with small population sizes, stochasticity can be modelled explicitly. We present a stochastic simulation-based ecosystem model for identifying “dynamic key species” in situations where stochastic models are appropriate. To demonstrate this approach, we simulated ecosystem dynamics and performed sensitivity analysis using data from the Prince William Sound, Alaska ecosystem model. We then compare these results to those of purely topological analyses and deterministic dynamic (Ecosim) studies. We present the relationships between various topological and dynamic indices and discuss their biological relevance. The trophic group with the largest effect on others is nearshore demersals, the species most sensitive to others is halibut, and the group with both a considerable effect on and sensitivity to others is juvenile herring. The most important trophic groups in our dynamical simulations appear to have intermediate trophic levels.
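A minimal sketch of the topological side of this comparison: ranking species by a simple centrality index in a toy food web. The web and its links are hypothetical, not the Prince William Sound model.

```python
# Degree-centrality sketch on a tiny hypothetical food web
# (predator -> prey links; not the Prince William Sound data).

web = {
    "orca": ["seal", "herring"],
    "seal": ["herring", "demersal"],
    "halibut": ["herring"],
    "herring": ["plankton"],
    "demersal": ["plankton"],
    "plankton": [],
}

# Degree centrality: count every trophic link a species participates in,
# whether as consumer or as resource.
degree = {s: len(prey) for s, prey in web.items()}
for prey_list in web.values():
    for p in prey_list:
        degree[p] = degree.get(p, 0) + 1

most_central = max(degree, key=degree.get)
```

In this toy web the mid-trophic-level species comes out on top, echoing the abstract's finding that the most important groups sit at intermediate trophic levels; the dynamic analyses in the paper test whether such topological rankings survive once population dynamics are simulated.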

17.
Ensemble Bayesian model averaging using Markov chain Monte Carlo sampling
Bayesian model averaging (BMA) has recently been proposed as a statistical method to calibrate forecast ensembles from numerical weather models. Successful implementation of BMA, however, requires accurate estimates of the weights and variances of the individual competing models in the ensemble. In their seminal paper, Raftery et al. (Mon Weather Rev 133:1155–1174, 2005) recommended the Expectation–Maximization (EM) algorithm for BMA model training, even though global convergence of this algorithm cannot be guaranteed. In this paper, we compare the performance of the EM algorithm and the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) Markov chain Monte Carlo (MCMC) algorithm for estimating the BMA weights and variances. Simulation experiments using 48-hour ensemble data of surface temperature and multi-model streamflow forecasts show that both methods produce similar results, and that their performance is unaffected by the length of the training data set. However, MCMC simulation with DREAM is capable of efficiently handling a wide variety of BMA predictive distributions, and provides useful information about the uncertainty associated with the estimated BMA weights and variances.
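Whichever algorithm estimates them, the BMA weights and variances define a predictive density that is a weighted mixture of member forecast densities. A sketch with invented weights, forecasts, and a common variance follows.

```python
import math

# BMA predictive-density sketch: a weighted mixture of normal densities,
# one per ensemble member. Weights, member forecasts, and the shared
# variance below are hypothetical, not fitted values.

def normal_pdf(x, mean, var):
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

weights = [0.5, 0.3, 0.2]          # BMA weights, sum to 1 (assumed)
forecasts = [20.0, 21.5, 19.0]     # bias-corrected member forecasts (assumed)
var = 1.5                          # common BMA variance (assumed)

def bma_density(x):
    """Mixture predictive density evaluated at x."""
    return sum(w * normal_pdf(x, f, var) for w, f in zip(weights, forecasts))

# The BMA point forecast is the weight-averaged member forecast.
bma_mean = sum(w * f for w, f in zip(weights, forecasts))
```

EM maximizes the training-data likelihood of exactly this mixture to get point estimates of `weights` and `var`; MCMC instead samples their joint posterior, which is where the extra uncertainty information comes from.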

18.

Sustainable forest management on a regional scale requires accurate biomass estimation. At present, technologically comprehensive forecasting estimates are generated using process-based ecological models. However, isolating the ecological factors that cause uncertainty in model behavior is difficult. To solve this problem, this study aimed to construct a meliorization model evaluation framework to explain uncertainty in model behavior with respect to both the mechanisms and algorithms involved in ecological forecasting, based on the principle of landsenses ecology. We introduce a complicated ecological driving mechanism to the process-based ecological model using analytical software and algorithms. Subsequently, as a case study, we apply the meliorization model evaluation framework to detect Eucalyptus biomass forest patches at a regional scale (196,158 ha) using the 3PG2 (Physiological Principles in Predicting Growth) model. Our results show that this technique improves the accuracy of ecological simulation for ecological forecasting and prevents new uncertainties from being produced when a new driving mechanism is added to the original model structure. This result was supported by our Eucalyptus biomass simulation using the 3PG2 model, in which ecological factors caused 21.83% and 9.05% uncertainty in model behavior for temporal and spatial forecasting, respectively. In conclusion, the systematic meliorization model evaluation framework reported here provides a new method that could be applied to research requiring comprehensive ecological forecasting. Sustainable forest management on regional scales contributes to accurate forest biomass simulation through the principle of landsenses ecology, in which mix-marching data and a meliorization model are combined.

19.
Imperfectly optimal animals
Summary: We consider models of behavior that apply to two different problems: when a predator should leave a foraging site, and how a female should choose the best available male. In each case we derive rules for an optimal solution to the problem. We also derive models based on very simple, plausible rules of behavior that we suspect animals may actually use. Although the expected payoffs from optimality models always exceed the expected payoffs from our simpler behavioral models, under certain conditions the difference is not large. When good foraging sites last but a short time, and when females' mobility in their habitat is limited, the results of simple models and optimal models are very close indeed. Because of the difficulty of distinguishing between the results of each type of model, and because natural selection will presumably provide a best mix of solutions to a range of problems rather than a best solution to any one problem, we suggest that behavioral ecologists expend more effort on simple, plausible models of animal behavior. Such models provide ready-made testable hypotheses about the animal's approximation to optimality and about the actual mechanisms of behavior.

20.
There is a need for decadal predictions of seabed evolution, for example, to inform resurvey strategies when maintaining navigation channels. The understanding of the physical processes involved in morphological evolution, and the viability of process models for accurately modelling evolution over these time scales, are currently limited. As a result, statistical approaches are used to supply long-term forecasts. In this paper, we introduce a novel statistical approach for this problem: the autoregressive Hilbertian model (ARH). This model naturally assesses the time evolution of spatially distributed measurements. We apply the technique to a coastal area on the East Anglian coast over the period 1846 to 2002, and compare it with two other statistical methods used recently for seabed prediction: the autoregressive model and the EOF model. We evaluate the performance of the three methods by comparing observations and predictions for 2002. The ARH model reduces root-mean-squared error by 10%. Finally, we compute the variability in the predictions related to time sampling using the jackknife, a method that uses subsamples to quantify uncertainties.
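The autoregressive baseline and the jackknife step can both be sketched on a synthetic series; the lag-1, no-intercept regression below is a deliberate simplification of the models compared in the paper.

```python
# AR(1) one-step-ahead prediction plus a delete-one jackknife for the
# slope's variability. The series is synthetic, not seabed data.

series = [1.0, 1.3, 1.1, 1.6, 1.4, 1.9, 1.7, 2.2, 2.0, 2.5]

def ar1_slope(pairs):
    """Least-squares slope of y[t] on y[t-1], no intercept for brevity."""
    num = sum(y1 * y0 for y0, y1 in pairs)
    den = sum(y0 ** 2 for y0, y1 in pairs)
    return num / den

pairs = [(series[t - 1], series[t]) for t in range(1, len(series))]
phi = ar1_slope(pairs)
forecast = phi * series[-1]            # one-step-ahead prediction

# Jackknife: refit leaving out one lagged pair at a time; the spread of
# the leave-one-out slopes quantifies sampling variability.
jack = [ar1_slope(pairs[:i] + pairs[i + 1:]) for i in range(len(pairs))]
spread = max(jack) - min(jack)
```

The ARH model generalizes this idea from a scalar series to a sequence of spatial fields, regressing each year's whole bathymetry surface on the previous year's; the jackknife logic carries over unchanged.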


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号