Similar Literature
20 similar documents found (search time: 281 ms)
1.
A time series of estimates of irrigated area was developed for the Lower Rio Grande valley (LRG) in New Mexico from the mid‐1970s to the present. The objective of the project was to develop an independent, accurate, and scientifically justifiable evaluation of irrigated area in the region over that period. These area estimates were used in support of groundwater modeling of the LRG region, as well as for other analyses. This study used a remote‐sensing‐based methodology to evaluate overall irrigated area within the LRG. We applied a methodology that normalizes vegetation indices derived from satellite imagery to obtain more accurate estimates of irrigated area across multiple time periods and multiple Landsat platforms. The normalization allows more accurate evaluation of vegetation index data that span several decades. An accuracy assessment of the methodology and results was performed using field‐collected crop data from the 2008 growing season. The comparisons with field data indicate that the remote‐sensing‐based estimates of historical irrigated area are accurate, with rates of false positives (areas identified as irrigated that are not truly irrigated) of only about 4%, and rates of false negatives (areas identified as not irrigated that are truly irrigated) in the range of 0.6‐2.0%.
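The cross-platform normalization described above can be sketched as a simple linear mapping fitted on pixels observed by both sensors, followed by thresholding to flag irrigated area. This is a minimal illustration, not the study's calibrated procedure: the regression-based normalization, the pseudo-invariant sample values, and the 0.4 NDVI threshold are all assumptions introduced here.

```python
# Sketch: normalize one Landsat platform's NDVI onto a reference platform's
# scale via OLS on shared target pixels, then flag irrigated pixels.
# All sample values and the 0.4 threshold are illustrative assumptions.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red)

def fit_linear(x, y):
    """Ordinary least squares y = a + b*x over paired pseudo-invariant pixels."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def normalize(scene_ndvi, a, b):
    """Map a scene's NDVI values onto the reference platform's scale."""
    return [a + b * v for v in scene_ndvi]

def irrigated_mask(scene_ndvi, threshold=0.4):
    """Classify a pixel as irrigated where normalized NDVI exceeds the threshold."""
    return [v > threshold for v in scene_ndvi]

# Hypothetical pseudo-invariant targets observed by both platforms:
ref = [0.10, 0.30, 0.55, 0.70]
new = [0.08, 0.26, 0.50, 0.66]
a, b = fit_linear(new, ref)
adjusted = normalize([0.45, 0.20, 0.62], a, b)
print(irrigated_mask(adjusted))  # → [True, False, True]
```

Summing the resulting mask per scene would give the irrigated-area estimate for that date.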

2.
There is an increasing need to document the impacts of conservation‐related best management practices (BMPs) on water quality within a watershed. However, this impact analysis depends upon accurate geospatial locations of existing practices, which are difficult to obtain. This study demonstrates and evaluates three different methods for obtaining geospatial information for BMPs. This study was focused on the Eagle Creek Watershed, a mixed use watershed in central Indiana. We obtained geospatial information for BMPs through government records, producer interviews, and remote‐sensing aerial photo interpretation. Aerial photos were also used to validate the government records and producer interviews. This study shows the variation in results obtained from the three sources of information as well as the benefits and drawbacks of each method. Using only one method for obtaining BMP information can be incomplete, and this study demonstrates how multiple methods can be used for the most accurate picture.

3.
ABSTRACT: A review of methods for planning-level estimates of pollutant loads in urban stormwater focuses on transfer of characteristic runoff quality data to unmonitored sites, runoff monitoring, and simulation models. Load estimation by transfer of runoff quality data is the least expensive, but the accuracy of the estimates is unknown. Runoff monitoring methods provide the best estimates of existing loads, but cannot be used to predict load changes resulting from runoff controls or other changes to the urban system. Simulation models require extensive calibration for reliable application. Models with optional formulations of pollutant buildup, washoff, and transport can be better calibrated, and the selection of options should be based on a statistical analysis of calibration data. Calibrated simulation models can be used for evaluation of control alternatives.

4.
ABSTRACT: In this paper a new set of soil texture data is used to estimate the spatial distribution of saturated hydraulic conductivity values for a small rangeland catchment. The estimates of conductivity are used to re-excite and re-evaluate a quasi-physically based rainfall-runoff model. The performance of the model is significantly reduced with conductivity estimates gleaned from soil texture data rather than the infiltration data used in our previous efforts.

5.
Recent debate following the rejection of the Environment Agency case regarding an application for water abstraction at Axford on the River Kennet has focused upon the procedure employed for aggregating non-user benefits, which underpinned the economic case put forward by the Agency (although this was not the reason cited by the inquiry for rejecting the case). Commentators have seen this case as setting an unfortunate precedent for the use of economic assessments in such resource management issues. The paper presents a number of highly tractable alternative methods for aggregating benefit estimates, designed to address two central problems: defining a relevant aggregation population, and the potential decay of values with increasing distance from a given valuation site. These methods are tested using data obtained from a national survey of non-users of a specific natural area. Results from this application indicate that simpler approaches, such as that used at the Axford inquiry, may produce aggregate benefit estimates that are very substantially larger than those produced by our proposed alternatives.

6.
The United Kingdom, under the Large Combustion Plant Directive of the European Community, is committed to cutting sulphur dioxide (SO2) emissions by 60% relative to 1980 levels by 2003. In order to justify this action and to support new decisions on further emission reductions, policy makers require knowledge of the economic benefits of abatement. Benefit estimates for the recovery of freshwater fish populations present difficulties, since the effect of reduced acid deposition on environmental processes is complex and fishery records are often inadequate or absent. This paper predicts the economic benefits of acid rain abatement to the rod-and-line salmon fishery of Galloway, South West Scotland. It does so by linking output on long-term changes in water chemistry and fish population status from MAGIC, a process-based catchment model for acidification, with catch and market value data. Predicted increases in the market value of the fishery are presented, and the role of the model in economic analysis of environmental policy is discussed.

7.
Stakeholders developing water quality improvement plans for lakes and reservoirs are challenged by the sparsity of in-situ data and the uncertainty ingrained in management decisions. This study explores how satellite images can fill gaps in water quality databases and provide more holistic assessments of impairments. The study site is an impaired water body that is serving as a pilot for improving state-wide nutrient management planning processes. An existing in-situ database was used to calibrate semi-analytical models that relate satellite reflectance values to turbidity and total suspended solids (TSS). Landsat-7 images from 1999 to 2020 that overpass High Rock Lake, North Carolina were downloaded and processed, providing 42 turbidity and 39 TSS satellite and in-situ match-ups for model calibration and validation. Model r-squared values for the fitted turbidity and TSS models are 0.72 and 0.74, and the mean absolute errors are 14.6 NTU and 3.2 mg/L. The satellite estimates were compared to the in-situ data and simulated TSS values produced by a calibrated hydrologic-hydrodynamic model. The process-based model is considered less accurate than the satellite model based on statistical performance metrics. Comparisons between data sources are illustrated with time series plots, frequency curves, and aggregate decision metrics to highlight the dependence of lake impairment assessments on the spatial and temporal frequency of available data and model accuracy.
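The match-up evaluation step above reduces to computing r-squared and mean absolute error between in-situ observations and satellite-derived estimates. The sketch below shows those two metrics on a handful of hypothetical turbidity match-ups; the values are placeholders, not the study's 42 real match-ups.

```python
# Sketch: the two skill metrics reported for the satellite models,
# computed on hypothetical turbidity match-ups (NTU).

def r_squared(obs, est):
    """Coefficient of determination of estimates against observations."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - e) ** 2 for o, e in zip(obs, est))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

def mean_absolute_error(obs, est):
    return sum(abs(o - e) for o, e in zip(obs, est)) / len(obs)

# Hypothetical match-ups: in-situ turbidity vs. satellite-derived turbidity.
insitu    = [12.0, 30.0, 55.0, 80.0, 20.0]
satellite = [15.0, 26.0, 60.0, 72.0, 24.0]
print(round(r_squared(insitu, satellite), 3))           # → 0.958
print(round(mean_absolute_error(insitu, satellite), 2))  # → 4.8
```

The same two metrics can be computed for the process-based model's simulated TSS to support the accuracy comparison the abstract describes.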

8.
The flows of paper are analyzed throughout the papermaking processes, with the year 2007 and Korea defined as the system boundaries. In practice, the statistical data on the production, import and export of paper or pulp can be collected with relative ease from the government and industrial associations. However, the input data regarding the volumes of pulp and wastepaper used in different paper products, such as newsprint, printing papers, sanitary and household papers, specialty papers, and corrugating board base, are difficult to obtain because such information is generally kept confidential in the course of corporate operations. The production processes of paper products in Korea are modeled using information on raw materials, their compositions and production yields of products in order to identify and quantify the amounts of pulp and wastepaper used in each paper product. The material flows of paper are then analyzed based on the calculation model derived from the correlation of input and output flows between the individual processes throughout the entire paper lifecycle. Accuracy analysis using both mean absolute error (MAE) and mean absolute percent error (MAPE) is conducted to verify the amounts of pulp and wastepaper calculated from the proposed model against the volumes of domestically consumed pulp and wastepaper provided in the national statistics. Although the calculated values for the past (i.e., the 1980s and 1990s) differ to some degree from the statistical values, the data for the 2000s have a relatively higher level of accuracy, with the MAPE of the total pulp and recycling volume at 5.39% and 5.30%, respectively, thus validating the adequacy of the proposed modeling method. The proposed calculation model can be effectively used in the material flow analysis (MFA) of paper to reduce the burden of data collection and obtain relatively accurate results.
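The MAE/MAPE verification step above can be sketched directly. The annual volumes below are illustrative placeholders, not Korean national statistics.

```python
# Sketch: verify model-calculated pulp consumption against national
# statistics using MAE and MAPE, as in the accuracy analysis described.

def mae(actual, calc):
    """Mean absolute error in the units of the data."""
    return sum(abs(a - c) for a, c in zip(actual, calc)) / len(actual)

def mape(actual, calc):
    """Mean absolute percent error (%)."""
    return 100.0 * sum(abs(a - c) / a for a, c in zip(actual, calc)) / len(actual)

# Hypothetical annual pulp consumption (kt): national statistics vs. model.
stats_kt = [2000.0, 2100.0, 2250.0]
model_kt = [1900.0, 2180.0, 2300.0]
print(round(mae(stats_kt, model_kt), 1))   # → 76.7 (kt)
print(round(mape(stats_kt, model_kt), 2))  # → 3.68 (%)
```

A MAPE in the single digits, as in the study's 2000s results, would indicate good agreement between the calculation model and the statistics.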

9.
ABSTRACT: In order to make economically efficient decisions about water quality improvements, data on both the costs and benefits of these improvements are needed. However, there has been little research on the benefits of reducing phosphorus pollution, which means that policy makers cannot make the comparison of costs and benefits that is essential for economic efficiency. This research attempts to ameliorate this situation by providing an estimate of the benefits of a 40 percent reduction in phosphorus pollution in the Minnesota River. A 1997 mail survey gathered information on Minnesota residents' use of a recreational site on the Minnesota River, the Minnesota Valley National Wildlife Refuge, and their willingness to pay for phosphorus reductions in the Minnesota River. The random effects probit model used in this research to investigate household willingness to pay for phosphorus pollution reductions in the Minnesota River incorporates recent innovations in nonmarket valuation methodology by using both revealed and stated preference data. This model estimated annual household willingness to pay for phosphorus reductions in the Minnesota River at $140. These results may be used in combination with cost estimates to determine the economic efficiency of phosphorus cleanup.

10.
Abstract: We propose a step‐by‐step approach to quantify the sensitivity of ground‐water discharge by evapotranspiration (ET) to three categories of independent input variables. To illustrate the approach, we adopt a basic ground‐water discharge estimation model in which the volume of ground water lost to ET is computed as the product of the ground‐water discharge rate and the associated area. The ground‐water discharge rate is assumed to equal the ET rate minus local precipitation. The objective of this study is to outline a step‐by‐step procedure for quantifying the contributions of individual independent variable uncertainties to the uncertainty of total ground‐water discharge estimates; the independent variables include ET rates of individual ET units, areas associated with the ET units, and precipitation in each subbasin. The specific goal is to guide future characterization efforts by better targeting data collection for those variables most responsible for uncertainty in ground‐water discharge estimates. The influential independent variables to be included in the sensitivity analysis are first selected based on physical characteristics and model structure. Both regression coefficients and standardized regression coefficients for the selected independent variables are calculated using the results of sampling‐based Monte Carlo simulations. Results illustrate that, while as many as 630 independent variables potentially contribute to the calculation of total annual ground‐water discharge for the case study area, a selection of seven independent variables suffices to develop an accurate regression model, accounting for more than 96% of the total variance in ground‐water discharge. Results indicate that the variability of the ET rate for moderately dense desert shrubland contributes about 75% of the variance in the total ground‐water discharge estimates. These results point to a need to better quantify ET rates for moderately dense shrubland to reduce overall uncertainty in estimates of ground‐water discharge. While the approach proposed here uses a basic ground‐water discharge model taken from an earlier study, the procedure for quantifying uncertainty and sensitivity can be generalized to other types of environmental models involving large numbers of independent variables.
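The sampling-based sensitivity procedure above can be sketched for a single ET unit: draw Monte Carlo samples of the inputs, compute discharge as (ET rate − precipitation) × area, and rank each input's influence. Because the inputs here are sampled independently, each standardized regression coefficient is approximated by the simple correlation between that input and the output; the study fits a full regression instead. All distributions below are illustrative assumptions, not the case-study data.

```python
# Sketch: Monte Carlo sensitivity of ground-water discharge to ET rate,
# ET-unit area, and precipitation, ranking inputs by an SRC proxy.
# Input distributions are illustrative assumptions.
import random
import statistics

random.seed(1)

def corr(x, y):
    """Pearson correlation; for independent inputs this approximates the SRC."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

n = 5000
et   = [random.gauss(1.2, 0.30) for _ in range(n)]   # ET rate, m/yr
area = [random.gauss(50.0, 5.0) for _ in range(n)]   # ET-unit area, km^2
prec = [random.gauss(0.2, 0.02) for _ in range(n)]   # precipitation, m/yr

# Ground-water discharge rate = ET rate - precipitation; volume = rate * area.
q = [(e - p) * a for e, p, a in zip(et, prec, area)]

srcs = {"et": corr(et, q), "area": corr(area, q), "prec": corr(prec, q)}
ranked = sorted(srcs, key=lambda k: abs(srcs[k]), reverse=True)
print(ranked)  # ET-rate variability dominates the discharge variance
```

With these assumed spreads, the ET rate accounts for most of the output variance, mirroring the study's finding that a few influential variables explain nearly all of it.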

11.
Mountain biking is an increasingly popular leisure pursuit. Its consequences include trail degradation and conflicts with hikers and other users. Resource managers often attempt to resolve these problems by closing trails to mountain biking. To estimate the impact of such measures, a model has been devised that predicts the effects of changes in trail characteristics and the introduction of access fees, and relates these to bikers' preferences in trail selection. It estimates each individual's per-ride consumer's surplus associated with implementing different policies. The surplus varies significantly as a function of each individual's gender, budget, and interest in mountain biking. Estimation uses stated preference data, specifically choice experiments: hypothetical mountain bike trails were created, and each surveyed biker was asked to make five pair-wise choices. A benefit-transfer simulation shows how the model and parameter estimates can be transferred to estimate the benefits and costs to mountain bikers in a specific area.

12.
Reservoir management is a critical component of flood management, and information on reservoir inflows is particularly essential for reservoir managers to make real‐time decisions given that flood conditions change rapidly. This study's objective is to build real‐time data‐driven services that enable managers to rapidly estimate reservoir inflows from available data and models. We have tested the services using a case study of the Texas flooding events in the Lower Colorado River Basin in November 2014 and May 2015, which involved a sudden switch from drought to flooding. We have constructed two prediction models: a statistical model for flow prediction and a hybrid statistical and physics‐based model that estimates errors in the flow predictions from a physics‐based model. The study demonstrates that the statistical flow prediction model can be automated and provides acceptably accurate short‐term forecasts. However, for longer term prediction (2 h or more), the hybrid model fits the observations more closely than the purely statistical or physics‐based prediction models alone. Both the flow and hybrid prediction models have been published as Web services through Microsoft's Azure Machine Learning (AzureML) service and are accessible through a browser‐based Web application, enabling ease of use by both technical and nontechnical personnel.
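The hybrid idea above, a statistical model that predicts the *error* of a physics-based forecast and adds it back, can be sketched with a simple lag-1 persistence model of the residuals. This is a minimal illustration of the concept only: the persistence regression, the inflow numbers, and the forecast horizon are assumptions introduced here, not the study's models or data.

```python
# Sketch: correct a physics-based inflow forecast with a statistical model
# of its recent residuals (lag-1 persistence). All numbers are illustrative.

def fit_error_persistence(residuals):
    """Lag-1 regression through the origin: residual_t ~ phi * residual_{t-1}."""
    num = sum(r0 * r1 for r0, r1 in zip(residuals, residuals[1:]))
    den = sum(r0 * r0 for r0 in residuals[:-1])
    return num / den

def hybrid_forecast(physics_next, last_residual, phi):
    """Physics-based forecast plus the statistically predicted error."""
    return physics_next + phi * last_residual

# Hypothetical observed vs. physics-based inflows (m^3/s) over four steps:
obs     = [110.0, 135.0, 182.0, 240.0]
physics = [ 90.0, 120.0, 170.0, 230.0]
resid = [o - p for o, p in zip(obs, physics)]

phi = fit_error_persistence(resid)
print(round(hybrid_forecast(250.0, resid[-1], phi), 1))  # → 257.8
```

When the physics model has a persistent bias during an event, this kind of correction is what lets the hybrid track observations more closely than either component alone.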

13.
Valuing freshwater salmon habitat on the west coast of Canada
Changes in land use can potentially reduce the quality of fish habitat and affect the economic value of commercial and sport fisheries that rely on the affected stocks. Parks and protected areas that restrict land-use activities provide benefits, such as ecosystem services, in addition to recreation and preservation of wildlife. Placing values on these other benefits of protected areas poses a major challenge for land-use planning. In this paper, we present a framework for valuing benefits for fisheries from protecting areas from degradation, using the example of the Strait of Georgia coho salmon fishery in southern British Columbia, Canada. Our study improves upon previous methods used to value fish habitat in two major respects. First, we use a bioeconomic model of the coho fishery to derive estimates of value that are consistent with economic theory. Second, we estimate the value of changing the quality of fish habitat by using empirical analyses to link fish population dynamics with indices of land use in surrounding watersheds. In our example, we estimated that the value of protecting habitat ecosystem services is C$0.93 to C$2.63 per ha of drainage basin or about C$1322 to C$7010 per km of salmon stream length (C$1.00=US$0.71). Sensitivity analyses suggest that these values are relatively robust to different assumptions, and if anything, are likely to be minimum estimates. Thus, when comparing alternative uses of land, managers should consider ecosystem services from maintaining habitat for productive fish populations along with other benefits of protected areas.

14.
ABSTRACT: Low-flow estimates, as determined by probabilistic modeling of observed data sequences, are commonly used to describe certain streamflow characteristics. Unfortunately, reliable low-flow estimates can be difficult to come by, particularly for gaging sites with short record lengths. Short records lead to uncertainties not only in the selection of a distribution for modeling purposes but also in the estimates of the parameters of a chosen model. In flood frequency analysis, the common approach to mitigating some of these problems is regionalization of frequency behavior. The same general approach is applied here to low-flow estimation, with the intent of both improving low-flow estimates and illustrating the gains that might be attained in doing so. The data used for this study were systematically observed at 128 streamflow gaging sites across the State of Alabama. Our conclusions are that the log Pearson Type 3 distribution is a suitable candidate for modeling Alabama low flows, and that the shape parameter of that distribution can be estimated on a regional basis. Low-flow estimates based on the regional estimator are compared with estimates based on at-site estimation techniques alone.
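The regional idea above, fitting log Pearson Type 3 at a site but substituting a regionally estimated shape (skew) parameter, can be sketched with the standard Wilson-Hilferty frequency-factor approximation. The low-flow sample, the regional skew of 0.3, and the 0.1 non-exceedance probability are illustrative assumptions, not the paper's Alabama values.

```python
# Sketch: log Pearson Type 3 low-flow quantile using at-site mean and
# standard deviation of log flows but a regional skew, via the
# Wilson-Hilferty frequency-factor approximation. Data are illustrative.
import statistics
from math import log10

def lp3_quantile(log_flows, regional_skew, nonexceed_prob):
    """Flow with the given non-exceedance probability (small prob = low flow)."""
    mean = statistics.fmean(log_flows)
    sd = statistics.stdev(log_flows)
    z = statistics.NormalDist().inv_cdf(nonexceed_prob)
    g = regional_skew
    # Wilson-Hilferty approximation of the Pearson III frequency factor:
    k = (2.0 / g) * ((1.0 + g * z / 6.0 - g * g / 36.0) ** 3 - 1.0)
    return 10.0 ** (mean + k * sd)

# Hypothetical annual 7-day minimum flows (cfs) at one gaging site:
low_flows = [12.0, 8.0, 15.0, 10.0, 9.0, 14.0, 11.0, 7.0]
logs = [log10(q) for q in low_flows]
q10 = lp3_quantile(logs, regional_skew=0.3, nonexceed_prob=0.1)
```

Replacing the regional skew with an at-site skew computed from the same short record is exactly the alternative the paper compares against, and with eight observations that skew estimate would be very noisy.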

15.
A dichotomous-choice contingent-valuation survey was conducted in the State of Louisiana (USA) to estimate compensating surplus (CS) and equivalent surplus (ES) welfare measures for the prevention of future coastal wetland losses in Louisiana. Valuations were elicited using both willingness to pay (WTP) and willingness to accept compensation (WTA) payment vehicles. The mean CS (WTP) estimate, based on a probit model with a Box-Cox specification on income, was $825 per household annually, and the mean ES (WTA) estimate was $4444 per household annually. Regression results indicate that the major factors influencing support for land-loss prevention were income (positive, WTP model only), perceived hurricane protection benefits (positive), environmental and recreation protection (positive), distrust of government (negative), age (positive, WTA model only), and race (positive for whites).

16.
ABSTRACT: A travel cost model is developed to estimate the potential reductions in recreational benefits from sedimentation in Reelfoot Lake in northwestern Tennessee. In addition to the consumer surplus estimates generated by the model, three other aspects of the study were significant. First, the study applied a relatively untested methodology for deriving the opportunity cost of travel time. The study resulted in a value that is less than one-half of the Water Resource Council's “one-third of the wage rate” rule-of-thumb. Second, water quality perceptions were unsuccessfully incorporated into the model as a demand shifter. This raised questions as to the appropriate manner in which perceptions could be included in a travel cost model. Finally, a simple methodology was outlined by which estimates of the recreational value of Reelfoot Lake could be used to suggest how much cost could be justified for soil erosion control on agricultural land surrounding the lake.

17.
Radon gas occurs naturally in the environment with a variable distribution. In some areas radon concentrates sufficiently within the built environment that it is considered a public health risk. It is possible to reduce radon levels in the built environment successfully, and it has been shown that such remediation programmes can be justified in terms of the costs and benefits accruing. However, the estimated dose received by people in their homes depends on the time spent indoors. The research presented here uses data derived from time activity surveys in Northamptonshire, together with radon data from a representative home, to model potential exposures for different population sub-groups. Average home occupancy ranged from 14.8 h (probable error 2.5 h) for students to 17.7 (3.1) h for adults; schoolchildren spent an average of 14.9 (1.2) h at home. Over a quarter of adults, however, were in the home for 22 h or more. These differences in occupancy patterns lead to substantial differences in radon exposure. In a home with an average hourly ground-floor radon concentration of 467 Bq m(-3), modelled hourly average exposures ranged from ca. 250 Bq m(-3) for students and schoolchildren to over 340 Bq m(-3) for women based at home. Modelled exposures show a non-linear association with total time spent at home, suggesting that exposure estimates based on linear models may provide misleading estimates of the health risks from radon and the potential benefits of radon remediation. The highest hourly exposures are likely to be experienced by people with high occupancy living in single-storey, ground-floor accommodation (for example, the elderly, the infirm, and non-working young mothers). Since these groups may be least aware of radon risks, and least able to take up remediation measures, they should be specifically targeted for radon monitoring and for assistance in remediation schemes.
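The exposure modelling above can be sketched by combining an hourly radon profile for a representative home with a sub-group's occupancy schedule, averaging the occupied-hour concentrations over the 24-h day. The profile, the schedules, and the averaging convention are illustrative assumptions, not the Northamptonshire survey data.

```python
# Sketch: occupancy-weighted radon exposure for different sub-groups.
# The 24-h concentration profile (higher overnight, when ventilation is low)
# and the occupancy schedules are illustrative assumptions.

def hourly_avg_exposure(hourly_radon, occupied_hours):
    """Daily radon dose at home spread over 24 h (Bq/m^3, hourly average)."""
    return sum(hourly_radon[h] for h in occupied_hours) / 24.0

# Hypothetical ground-floor radon profile (Bq/m^3) by hour of day:
radon = [600] * 7 + [450] * 2 + [350] * 8 + [450] * 3 + [600] * 4

occupancy = {
    "student":    list(range(0, 8)) + list(range(18, 24)),  # 14 h at home
    "home_based": list(range(0, 24)),                       # 24 h at home
}
for group, hours in occupancy.items():
    print(group, round(hourly_avg_exposure(radon, hours)))
```

Because occupied hours cluster in the high-concentration overnight period, exposure rises faster than linearly with time at home, which is the non-linear association the abstract highlights.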

18.
In hydrology, projected climate change impact assessment studies typically rely on ensembles of downscaled climate model outputs. Due to large modeling uncertainties, the ensembles are often averaged to provide a basis for studying the effects of climate change. A key issue when analyzing averages of a climate model ensemble is whether to weight all models in the ensemble equally, often referred to as the equal-weights or unweighted approach, or to use a weighted approach, where, in general, each model would have a different weight. Many studies have advocated for the latter, based on the assumption that models that are better at simulating the past, that is, the models with higher hindcast accuracy, will give more accurate forecasts for the future and thus should receive higher weights. To examine this issue, observed and modeled daily precipitation frequency (PF) estimates for three urban areas in the United States, namely Boston, Massachusetts; Houston, Texas; and Chicago, Illinois, were analyzed. The comparison used the raw output of 24 Coupled Model Intercomparison Project Phase 5 (CMIP5) models. The PFs from these models were compared with the observed PFs for a specific historical training period to determine model weights for each area. The unweighted and weighted averaged model PFs from a more recent testing period were then compared with their corresponding observed PFs to determine if weights improved the estimates. These comparisons indeed showed that the weighted averages were closer to the observed values than the unweighted averages in nearly all cases. The study also demonstrated how weights can help reduce model spread in future climate projections by comparing the unweighted and weighted ensemble standard deviations in these projections. In all studied scenarios, the weights actually reduced the standard deviations compared to the equal-weights approach. Finally, an analysis of the results' sensitivity to the areal reduction factor used to allow comparisons between point station measurements and grid-box averages is provided.
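The weighting scheme above can be sketched by assigning each model a weight from its hindcast error and comparing weighted and equal-weight ensemble statistics. Inverse-error weighting is one common choice assumed here for illustration; the study derives its own weights, and the errors and projections below are placeholders, not CMIP5 output.

```python
# Sketch: hindcast-skill model weights (inverse-error, an assumed scheme)
# vs. equal weights for an ensemble of precipitation-frequency projections.

def weights_from_hindcast(errors):
    """Normalize inverse hindcast errors into weights summing to 1."""
    inv = [1.0 / e for e in errors]
    total = sum(inv)
    return [w / total for w in inv]

def weighted_mean(values, w):
    return sum(v * wi for v, wi in zip(values, w))

def weighted_std(values, w):
    """Weighted ensemble spread about the weighted mean."""
    m = weighted_mean(values, w)
    return sum(wi * (v - m) ** 2 for v, wi in zip(values, w)) ** 0.5

# Hypothetical hindcast errors (mm) and future PF projections (mm), per model:
hindcast_err = [2.0, 8.0, 4.0]
future_pf    = [95.0, 130.0, 105.0]

w = weights_from_hindcast(hindcast_err)
equal = [1.0 / 3.0] * 3
print(round(weighted_mean(future_pf, w), 1), round(weighted_mean(future_pf, equal), 1))
print(weighted_std(future_pf, w) < weighted_std(future_pf, equal))
```

Here the poorly hindcasting model is down-weighted, pulling the ensemble mean toward the better models and shrinking the ensemble spread, the two effects the study reports.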

19.
ABSTRACT: An evaluation of flood frequency estimates simulated from a rainfall/runoff model is based on (1) computation of the equivalent years of record for regional estimating equations based on 50 small stream sites in Oklahoma and (2) computation of the bias for synthetic flood estimates as compared to observed estimates at 97 small stream sites with at least 20 years of record in eight eastern states. Because of the high intercorrelation of synthetic flood estimates between watersheds, little or no regional (spatial) information may be added to the network as a result of the modeling activity. The equivalent years of record for the regional estimating equations based totally on synthetic flood discharges is shown to be considerably less than the length of rainfall record used to simulate the runoff. Furthermore, the flood estimates from the rainfall/runoff model consistently underestimate the flood discharges based on observed record, particularly for the larger floods. Depending on the way bias is computed, the synthetic estimate of the 100-year flood discharge varies from 11 to 29 percent less than the value based on observed record. In addition, the correlation between observed and synthetic flood frequency estimates at the same site is also investigated. The degree of correlation between these estimates appears to vary with recurrence interval. Unless the correlation between these two estimates is known, it is not possible to compute a weighted estimate with minimum variance.

20.
Geological surveys worldwide are involved with research in support of sustainable mineral resource development. The socio-economic benefits to be derived from these activities, however, continue to raise organisational and government sector questions. Fundamental questions include whether or not the resources committed are appropriate and in economic balance with the total benefits to be derived. Another question concerns the degree to which such services should be funded by the community at large. These questions in turn raise important issues regarding the role and cost of geological surveys, the impact of their services, and how they should maximise community benefit from their activities and expertise. To assess the value of geoscientific information, standard valuation processes need to be modified. This paper reports on a methodology designed to quantify the 'worth' of programmes upgrading regional geoscientific infrastructure. An interdisciplinary approach is used to measure the impact of geoscientific information using quantitative resource assessment, computer-based mineral potential modelling, statistical analysis and risk quantification to model decision-processes and assess the impact of additional data. These modelling stages are used to address problems of complexity, uncertainty and credibility in the valuation of geoscientific data. A case study demonstrates the application of the methodology to generate a dollar value for current regional data upgrade programmes in the Geological Survey of Queensland. The results obtained are used for strategic planning of future data acquisition programmes aimed at supporting mineral resource management and stimulating effective exploration activity.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号