Similar Literature
20 similar records found.
1.
The modeling of dissolved oxygen (DO) in streams is a widely used technique, on which a great deal of money has been spent. This paper concludes that the standard computer methods of DO modeling are unnecessarily complex and that, for some purposes, they can be replaced without loss of accuracy by desk-top BOD models. Taking as an example a set of data used in DO modeling, it is shown (a) that the data are grossly inconsistent, (b) that simultaneous gathering of data introduces errors in streams of long travel time, (c) that much more data on pollutant concentrations should have been obtained, and (d) that the 24-hour DO data could have been dispensed with.
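The classic desk-top calculation behind such simplified DO work is the Streeter-Phelps oxygen-sag equation; the paper does not name it, so treat this as an illustrative sketch with invented rate constants, not the authors' model:

```python
import math

def streeter_phelps_deficit(d0, l0, kd, ka, t):
    """Dissolved-oxygen deficit D(t) in mg/L at travel time t (days):
    D(t) = kd*L0/(ka-kd) * (e^{-kd*t} - e^{-ka*t}) + D0 * e^{-ka*t},
    where L0 is the initial BOD, kd the deoxygenation rate, and
    ka the reaeration rate (both 1/day)."""
    if math.isclose(kd, ka):
        # degenerate case: kd == ka
        return (d0 + kd * l0 * t) * math.exp(-ka * t)
    bod_term = kd * l0 / (ka - kd) * (math.exp(-kd * t) - math.exp(-ka * t))
    return bod_term + d0 * math.exp(-ka * t)
```

Evaluating the deficit along a grid of travel times reproduces the familiar sag curve with nothing more than a desk calculator.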

2.
The main objective of this research is to model the uncertainty associated with GIS-based multi-criteria decision analysis (MCDA) for crop suitability assessment. To achieve this goal, an integrated approach coupling GIS-MCDA with Monte Carlo simulation (MCS) and global sensitivity analysis (GSA) was applied to saffron suitability mapping in East Azerbaijan Province, Iran. The results indicated that integrating MCDA with MCS and GSA can improve modeling precision by reducing output variance. Applying MCS with local training data makes it possible to compute the spatial correlation between criteria weights and the characteristics of the study area, while GSA yields the priority of the criteria, identifies the most important ones, and quantifies the variability of the outputs under uncertain model inputs. The findings showed that commonly used primary zoning methods, which ignore the interaction effects of variables, carry significant errors and uncertainty in the output of MCDA-based suitability models; these should be minimized by complementary sensitivity and uncertainty analysis.
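The MCS step described above can be sketched as perturbing the criteria weights and tracking the spread of the suitability score. The weighted-sum aggregation, Gaussian weight noise, and all numbers below are illustrative assumptions, not the paper's model:

```python
import random

def weighted_sum(weights, scores):
    # additive MCDA aggregation for a single map cell
    return sum(w * s for w, s in zip(weights, scores))

def mc_suitability(base_weights, scores, sd=0.05, n=2000, seed=42):
    """Monte Carlo over criteria weights: add Gaussian noise,
    renormalize so the weights sum to 1, and collect the resulting
    suitability scores; return their mean and variance."""
    rng = random.Random(seed)
    vals = []
    for _ in range(n):
        w = [max(0.0, bw + rng.gauss(0.0, sd)) for bw in base_weights]
        total = sum(w)
        w = [wi / total for wi in w]
        vals.append(weighted_sum(w, scores))
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / n
    return mean, var
```

The variance returned here is the quantity that the GSA step would then apportion among the individual criteria.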

3.
The Precautionary Principle is a legal mechanism for managing the environmental risk arising from incomplete scientific knowledge of a proposal's impacts. The Precautionary Principle is applied to actions that carry with them the potential for serious or irreversible environmental change. The model proposed in this paper draws on methods used in a range of disciplines for modeling (potentially highly nonlinear) interactions between multiple parts of a complex system. These methods have been drawn together under the common mathematical umbrella of Fitness Landscape Theory. It is argued that the model, called “Environmental Impact Fitness Landscapes,” allows statements about the sensitivity of the gross effect from a set of impacts to be made when the number of impacts in the set, and/or their degree of interaction, is varied. It is argued that this can be achieved through identification of “meta” or “emergent” properties of the set itself, without reference to the specific causal chains determining behavior in specific instances. While such properties are very general, they may at least allow for the parameterization of the effects of sets of impacts where interactions are highly uncertain and empirical data severely limited, i.e., situations that would typically invoke the Precautionary Principle.

4.
ABSTRACT: Gage-induced biases in monthly precipitation are estimated and removed at 1818 stations across the continental United States from 1950 through 1987. The deleterious effects of wind and of wetting losses on the interior walls of the gage were considered. The “corrected” estimates were obtained using site-specific information including wind speed, shelter-height air temperature, gage height, and sheltering. Wind speed and air temperature were interpolated at stations lacking these data using a spherically based nearest-neighbor interpolation procedure. Results indicate that, as expected, biases are greater in winter than in summer owing to the increased problems (particularly wind-induced) of measuring snowfall. In summer, errors range between 4 and 6 percent over nearly three-quarters of the United States, with slightly larger errors over the Rocky Mountains. By contrast, winter biases are highly correlated with snowfall totals, and percentage errors increase poleward, mimicking patterns of snowfall frequency. Since these biases are not trivial, they must be accounted for to obtain accurate and reliable time series. If they are not properly addressed, serious errors can be introduced into climate change, hydrologic modeling, and environmental impact research.
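A bias correction of the general kind described divides the measured total by a wind-dependent catch ratio and adds back a wetting loss. The linear catch-ratio form, its coefficients, and the 0.5 floor below are all illustrative assumptions, not the coefficients used in the study:

```python
def corrected_precip(measured_mm, wind_ms,
                     wetting_loss_mm=0.15, catch_slope=0.02):
    """Estimate 'true' monthly precipitation from a gage total:
    divide by a catch ratio that falls linearly with wind speed
    (clamped so it never drops below 0.5), then add back the
    wetting loss retained on the gage walls."""
    catch_ratio = max(0.5, 1.0 - catch_slope * wind_ms)
    return measured_mm / catch_ratio + wetting_loss_mm
```

With calm winds the correction reduces to the wetting loss alone; the wind term grows the adjustment, consistent with the larger winter biases reported above.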

5.
ABSTRACT: A major contaminant monitoring and modeling study is underway for Green Bay, Lake Michigan. Monitoring programs in support of contaminant modeling of large waterbodies such as Green Bay are expensive, and their extent is often limited by budgets, laboratory capacity, and logistical constraints. It is therefore imperative that available resources be used as efficiently as possible. This allocation of resources may be aided by applying readily available models in the planning stages of a project. To aid the planning effort for the Green Bay project, a workshop was held and studies were designed to guide the allocation of resources for a year-long intensive field study. Physical/chemical and food chain models were applied using historical data to aid project planning by identifying the processes having the greatest impact on the predictive capability of mass balance models. Studies were also conducted to estimate the errors in computed tributary loadings, in-Bay concentrations, and contaminant mass associated with different sampling strategies. The studies contributed to the overall project design, a collaborative effort with many participants involved in budgeting, field data collection, analysis, data processing, quality assurance, data management, and modeling activities.

6.
ABSTRACT: By employing a set of criteria for classifying the capabilities of time series models, recent developments in time series analysis are assessed and put into proper perspective. In particular, the inherent attributes of the wide variety of time series models and modeling procedures presented by the authors of the 18 papers in this volume are clearly pointed out. Additionally, it is explained how these models can address many of the time series problems encountered when modeling hydrologic, water quality, and other kinds of time series. For instance, families of time series models are now available for modeling series that may contain nonlinearities or follow non-Gaussian distributions. Based upon a sound physical understanding of a problem and results from exploratory data analyses, the most appropriate model to fit to a data set can be found during confirmatory data analyses by following the identification, estimation, and diagnostic check stages of model construction. Promising future research projects for developing flexible classes of time series models for use in water resources applications are suggested.

7.
Monitoring environmental systems requires determining how frequently variables should be remeasured over time. This article demonstrates methods based on regression modeling and spatio-temporal variability for determining the time interval at which to remeasure the ground and vegetation cover factor on permanent plots monitoring a soil erosion system. The spatio-temporal variability methods include the use of historical data to predict semivariograms, modeling of average temporal variability, and temporal interpolation by two-step kriging. The results show that, for the cover factor, the relative errors of prediction increase with the length of the time interval between remeasurements when the regression and semivariogram models are used. Given precision or accuracy requirements, appropriate time intervals can then be determined, although the remeasurement frequency also varies with the prediction interval. As an alternative, the range parameter of a semivariogram model can be used to quantify average temporal variability, which approximates the maximum time interval between remeasurements. This method is simpler than regression and semivariogram modeling, but it requires a long-term dataset based on permanent plots. Finally, temporal interpolation by two-step kriging can also be used to determine the time interval; this method is applicable when remeasurements in time are not sufficient. If spatial and temporal remeasurements are sufficient, it can be expanded to design spatial and temporal sampling simultaneously.
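The range-parameter method can be sketched by estimating an empirical temporal semivariogram from one plot's series and reading off the lag at which it levels out. The 95%-of-sill criterion for the practical range is a common geostatistical convention assumed here, not a rule taken from the article:

```python
def empirical_semivariogram(series, max_lag):
    """gamma(h) = half the mean squared difference of the series
    at each time lag h = 1..max_lag."""
    gammas = {}
    for h in range(1, max_lag + 1):
        diffs = [(series[i + h] - series[i]) ** 2
                 for i in range(len(series) - h)]
        gammas[h] = 0.5 * sum(diffs) / len(diffs)
    return gammas

def practical_range(gammas, sill_frac=0.95):
    """Smallest lag at which gamma reaches sill_frac of its maximum --
    a proxy for the longest defensible remeasurement interval."""
    sill = max(gammas.values())
    for h in sorted(gammas):
        if gammas[h] >= sill_frac * sill:
            return h
```

For a slowly drifting cover factor the semivariogram keeps rising, so the practical range (and hence the allowable interval) is long; for rapidly fluctuating plots it is reached at short lags.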

8.
The Great Lakes Basin Commission has initiated a Framework Study to assess the present and projected water- and related land-resource problems and demands in the Great Lakes Basin. Poorly defined objectives; incomplete and inconsistent data arrays; unknown air, biota, water, and sediment interactions; and multiple planning considerations for interconnected, large lake systems hinder objective planning. To incorporate mathematical modeling as a planning tool for the Great Lakes, a two-phase program, comprising a feasibility and design study followed by contracted and in-house modeling, data assembly, and plan development, has been initiated. The models will be used to identify sensitivities of the lakes to planning and management alternatives, insufficiencies in the database, and inadequately understood ecosystem interactions. For the first time, objective testing of resource-utilization plans to identify potential conflicts will provide a rational and cost-effective approach to Great Lakes management. Because disciplines will be interrelated, the long-term effects of planning alternatives and their impacts on neighboring lakes and states can be evaluated. The consequences of environmental accidents and increased pollution levels can be tested, and risks to the resource determined. Examples are cited to demonstrate the use of such planning tools.

9.
ABSTRACT: In geohydrology, three-dimensional surfaces are typically represented as a series of contours. Water levels, saturated thickness, precipitation, and geological formation boundaries are a few examples of this practice. These surfaces start as point measurements that are then analyzed to interpolate between the known point measurements. This first step typically creates a raster or a set of grid points. In modeling, subsequent processing uses these to represent the shape of a surface. For display, they are usually converted to contour lines. Unfortunately, in many field applications, the (x, y) location on the earth's surface is much less confidently known than the data in the z dimension. To test the influence of (x, y) locational accuracy on z-dimension point predictions and their resulting contours, a Monte Carlo study was performed on water level data from northwestern Kansas. Four levels of (x, y) uncertainty were tested, ranging in accuracy from one arc degree-minute (± 2384 feet in the x dimension and ± 3036 feet in the y dimension) to Global Positioning System (GPS) accuracy (± 20 feet for relatively low cost systems). These span the range of common levels of locational uncertainty in data available to hydrologists in the United States. This work examines the influence that locational uncertainty can have on both point predictions and contour lines. Results indicate that overall mean error exhibits little sensitivity to locational uncertainty. However, measures of spread and maximum errors in the z domain are greatly affected. In practical application, this implies that estimates over large regions should be asymptotically consistent, but local errors in z can be quite large and increase with (x, y) uncertainty.
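The Monte Carlo design can be sketched by jittering station coordinates before re-interpolating and measuring the spread of the predictions. Inverse-distance weighting stands in here for whatever interpolator the study actually used; it and the uniform error model are assumptions for illustration:

```python
import math
import random

def idw(x, y, stations, power=2):
    """Inverse-distance-weighted estimate of z at (x, y) from
    stations given as (sx, sy, sz) triples."""
    num = den = 0.0
    for sx, sy, sz in stations:
        d = math.hypot(x - sx, y - sy)
        if d < 1e-9:
            return sz  # exactly at a station
        w = d ** -power
        num += w * sz
        den += w
    return num / den

def locational_spread(x, y, stations, xy_err, n=300, seed=1):
    """Jitter every station's (x, y) by a uniform error of +/- xy_err,
    re-interpolate at (x, y) n times, and return the spread
    (max - min) of the predicted z values."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n):
        jittered = [(sx + rng.uniform(-xy_err, xy_err),
                     sy + rng.uniform(-xy_err, xy_err),
                     sz)
                    for sx, sy, sz in stations]
        preds.append(idw(x, y, jittered))
    return max(preds) - min(preds)
```

Running this at several error levels mirrors the paper's finding: the mean prediction moves little, but the spread grows with (x, y) uncertainty.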

10.
ABSTRACT: Potential evapotranspiration (PET) is an important index of hydrologic budgets at different spatial scales and a critical variable for understanding regional biological processes. It is often an important variable in estimating actual evapotranspiration (AET) in rainfall-runoff and ecosystem modeling. However, PET is defined in different ways in the literature, and quantitative estimation of PET with existing mathematical formulas produces inconsistent results. The objectives of this study are to contrast six commonly used PET methods and to quantify the long-term annual PET across a physiographic gradient of 36 forested watersheds in the southeastern United States. Three temperature-based (Thornthwaite, Hamon, and Hargreaves-Samani) and three radiation-based (Turc, Makkink, and Priestley-Taylor) PET methods are compared. Long-term water balances (precipitation, streamflow, and AET) for 36 forest-dominated watersheds from 0.25 to 8213 km2 in size were estimated using associated hydrometeorological and land use databases. The study found that PET values calculated from the six methods were highly correlated (Pearson correlation coefficients of 0.85 to 1.00). Multivariate statistical tests, however, showed that PET values from different methods were significantly different from each other. Greater differences were found among the temperature-based PET methods than the radiation-based methods. In general, the Priestley-Taylor, Turc, and Hamon methods performed better than the other PET methods. Based on the criteria of availability of input data and correlations with AET values, the Priestley-Taylor, Turc, and Hamon methods are recommended for regional applications in the southeastern United States.
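As an example of the temperature-based family, the Hamon method needs only air temperature and daylength. The form below, PET = 29.8 · D · es / (T + 273.2) with the Tetens saturation-vapor-pressure curve, is one common statement of it; the constants may differ from the exact variant used in the study:

```python
import math

def sat_vapor_pressure_kpa(t_c):
    # Tetens saturation vapor pressure (kPa) at air temperature t_c (deg C)
    return 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))

def hamon_pet_mm_day(t_c, daylight_hours):
    """Hamon potential evapotranspiration (mm/day):
    PET = 29.8 * D * es / (T + 273.2), with D in hours of daylight."""
    return 29.8 * daylight_hours * sat_vapor_pressure_kpa(t_c) / (t_c + 273.2)
```

At 20 °C with 12 hours of daylight this gives roughly 2.9 mm/day, a plausible warm-season value for the region.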

11.
Abstract: Data interpretation and visualization software tools with geostatistical capabilities were adapted, customized, and tested to assist the Chesapeake Bay Program in improving its water-quality modeling protocols. Tools were required to interpolate, map, and visualize three-dimensional (3D) water-quality data, with the capability to determine estimation errors. Components of the software, originally developed for ground-water modeling, were customized for application in estuaries. Additional software components were developed for data retrieval and for pre- and post-processing. The Chesapeake Bay Program uses the 3D mapped data as input to the Bay water-quality model that projects the future health of the Bay and its tidal tributary system. In determining water-quality attainment criteria, 3D kriging estimation errors are needed as a statistical measure of uncertainty. Furthermore, given the high cost of installing and operating new monitoring stations, geostatistical techniques can assist the Chesapeake Bay Program in identifying suitable data collection locations. Following the evaluation, selection, and development of the software components, 3D ordinary kriging with directional semivariograms to account for anisotropy was successfully demonstrated for mapping 3D fixed-station water-quality data, such as dissolved oxygen and salinity. Additionally, an improved delineation tool was implemented to simulate the upper and lower pycnocline boundary surfaces, allowing the interpolated 3D data to be segregated into three separate zones for better characterization of the pycnocline layer.

12.
Field surveys of biological responses can provide valuable information about environmental status and anthropogenic stress. However, it is quite usual for biological variables to differ between sites, or to change between two periods of time, even in the absence of an impact. There is thus an obvious risk that natural variation will be interpreted as environmental impact, or that relevant effects will be missed due to insufficient statistical power. Furthermore, statistical methods tend to focus on the risk of Type I errors, i.e., false positives. For environmental management, the risk of false negatives is (at least) equally important. The aim of the present study was to investigate how the probabilities of false positives and false negatives are affected by experimental setup (number of reference sites and samples per site), decision criteria (statistical method and α-level), and effect size. A model was constructed to simulate data from multiple reference sites, a negative control, and a positive control. The negative control was drawn from the same distribution as the reference sites; the positive control was just outside the normal range. Using the model, the probabilities of false positives and false negatives were calculated for a conventional statistical test, based on a null hypothesis of no difference, alongside alternative tests based on the normal range of natural variation: testing whether an investigated site is significantly inside the normal range (equivalence test) or significantly outside it (interval test). It was also tested how the risks of false positives and false negatives are affected by changes in α-level and effect size. The results show that the strategy that best balances the risks of false positives and false negatives is the equivalence test. Besides tests with tabulated p-values, estimates generated using a bootstrap routine were included; the simulations showed that the probability of management errors was smaller for the bootstrap than for the traditional test and the interval test.
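The two decision rules can be sketched as interval checks of a site's confidence interval against a normal range estimated from reference sites. The ±2-standard-deviation range and the fixed critical value of 2 are simplifying assumptions for illustration, not the study's simulation model:

```python
import statistics

def normal_range(reference_means, k=2.0):
    """Normal range of natural variation estimated from
    the means of multiple reference sites."""
    mu = statistics.mean(reference_means)
    sd = statistics.stdev(reference_means)
    return mu - k * sd, mu + k * sd

def classify_site(sample, rng, t_crit=2.0):
    """'inside' if the CI for the site mean lies wholly within the
    normal range (equivalence test), 'outside' if wholly outside it
    (interval test), 'inconclusive' otherwise."""
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / len(sample) ** 0.5
    lo, hi = m - t_crit * se, m + t_crit * se
    if lo > rng[0] and hi < rng[1]:
        return "inside"
    if hi < rng[0] or lo > rng[1]:
        return "outside"
    return "inconclusive"
```

Note how the burden of proof differs: a noisy sample tends to land in "inconclusive" rather than being declared unimpacted, which is what shifts risk away from false negatives.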

13.
ABSTRACT: Three methods of modeling acid mine drainage effects are discussed. A net alkalinity routing model is the simplest of these, but can be potentially misleading. It typically overestimates the effect of acid sources on pH by neglecting carbon dioxide transfer to the atmosphere. Inclusion of a simple carbon dioxide transfer function can substantially reduce errors in stream quality prediction. A plug flow reaeration equation, coupled with mass balancing at mixing points in a stream network, provides modeling results comparable to those of more complex computerized solutions of chemical equilibrium equations. None of the models accounts for carbonate dissolution or oxidation and hydrolysis of ferrous iron.
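The mass balancing at a mixing point reduces, in its simplest form, to a flow-weighted average of net alkalinity. This is a minimal sketch of that step alone; a fuller model would add the CO2 transfer term discussed above:

```python
def mix_net_alkalinity(streams):
    """Flow-weighted net alkalinity (alkalinity minus acidity,
    mg/L as CaCO3) just below a confluence; a negative result means
    the mixed water is net-acidic.
    streams: list of (flow, net_alkalinity) pairs."""
    total_q = sum(q for q, _ in streams)
    return sum(q * na for q, na in streams) / total_q
```

Routing a network is then a matter of applying this at each confluence in downstream order.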

14.
Despite the advances in catchment modeling in recent years, engineers still face major problems in estimating flood flows. Application of unit hydrograph and runoff routing models to five United Kingdom catchments shows that either can be tuned to predict, on a test event, the routing effects of a catchment with equal accuracy. The larger remaining problem is the prediction of losses from rainfall, and this study shows how alternative ways of describing the within-event distribution of these losses can be an important factor controlling the success of the overall model. Other problems include the risks of extrapolation to larger events, baseflow separation methods, rainfall patterns, and inevitable errors in the data.

15.
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature; in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for making the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches to data fusion and integration across sites and disciplines poses an important challenge for future work in integrating human and landscape components.

16.
Production possibility modeling has been applied to a variety of wildlife management issues. Although it has seen only limited use in modeling human-wildlife output decisions, the theory's use in this area can be expected to increase as human interactions with and impacts on wildlife become more frequent. At present, most models applying production possibility theory to wildlife production determine wildlife output quantities by physically quantifiable functions representing rivalrous resources. When the theory is applied to human-wildlife interactions, it may not be sufficient to model the production tradeoffs using only physical constraints. As wildlife are known to respond to human presence, human activity may appear in wildlife production functions as an externality. Behavioral externalities are revealed by an output's response to the presence of another output and can result in a loss of concavity of the production possibilities frontier (PPF). Ignoring a potential behavioral externality can result in an unexpected and inefficient output allocation that may compromise a wildlife population's well-being. Behavioral externalities can be included in PPF models in a number of ways, including the use of data or cumulative-effects modeling. While identifying that behavioral externalities exist and incorporating them into a model is important, correctly interpreting their implications will be critical to improving the efficiency of natural resource management. Behavioral externalities may cause a loss of concavity anywhere along a PPF, which may compel managerial decisions inconsistent with multiple-use doctrines. Convex PPFs may result when wildlife species are extremely sensitive to any level of human activity. It may be possible to improve the PPF's concavity by reducing the strength of the behavioral effect. Any change in the PPF that increases the concavity of the production set could offer natural resource managers additional opportunities to optimally provide multiple natural resource outputs. Techniques that minimize the effect could focus on the human outputs, the wildlife outputs, or both: either reducing the externality itself by changing the production of the offending output, or reducing its impact by changing the production of the affected output. Managers unfamiliar with PPF modeling can still employ PPF thinking by recognizing that every decision involves tradeoffs, and that sometimes these tradeoffs are unnecessary negative impacts that could be mitigated without compromising the resource.

17.
ABSTRACT: A relatively straightforward illustration of the potential uses of state estimation techniques in water resources modeling is given. Background theory for linear and extended Kalman filters is presented; application of the filter techniques to modeling BOD and oxygen deficit in a stream illustrates the importance of model conceptualization, model completeness, uncertainty in model dynamics, and the incorporation of measurements and measurement errors. Potential applications of state estimation techniques to measurement-system design; model building, assessment, and calibration; and data extension are explored.
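The linear filter at the heart of such an application can be sketched in scalar form as a predict-update loop over a single state (say, the oxygen deficit). The first-order dynamics and direct measurement model below are illustrative, not the paper's formulation:

```python
def kalman_1d(a, q, r, x0, p0, measurements):
    """Scalar linear Kalman filter for the system
    x_{k+1} = a * x_k + w,  z_k = x_k + v,
    with process-noise variance q and measurement-noise variance r.
    Returns the filtered state estimate after each measurement."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # predict
        x = a * x
        p = a * a * p + q
        # update with the Kalman gain
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates
```

With a near-zero measurement variance the filter simply tracks the data; as r grows, the estimate leans increasingly on the model dynamics, which is exactly the trade-off the abstract highlights between model completeness and measurement error.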

18.
ABSTRACT: The Nebraska Sand Hills have a unique hydrologic system with very little runoff and thick aquifers that constantly supply water to rivers, lakes, and wetlands. A ground water flow model was developed to determine the interactions between ground water and streamflow and to simulate the changes in the ground water system under reduced precipitation. The numerical modeling approach couples a water balance model for the vadose zone with MODFLOW for the saturated zone. The modeling results indicated that, between 1979 and 1990, 13 percent of the annual precipitation recharged the aquifer, and annual ground water loss by evapotranspiration (ET) was only about one-fourth of this recharge. Ground water discharge to rivers accounts for about 96 percent of the streamflow in the Dismal and Middle Loup rivers. When precipitation was halved relative to the 1979 to 1990 average, the average decline of the water table over the study area was 0.89 m, and streamflow was about 87 percent of the present rate. This decline of the water table results in significant reductions in ET directly from ground water, so a significant portion of the streamflow is maintained by capture of the salvaged ET.
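The vadose-zone water balance that feeds the saturated-zone model can be sketched as a simple residual computation. The annual totals in the usage note are chosen only to echo the 13 percent recharge fraction reported above; they are not the model's actual inputs:

```python
def vadose_water_balance(precip_mm, et_mm, runoff_mm, storage_change_mm=0.0):
    """Annual recharge (mm) as the residual of a vadose-zone balance:
    precipitation minus ET, runoff, and soil-moisture storage change,
    clamped at zero (no negative recharge)."""
    return max(0.0, precip_mm - et_mm - runoff_mm - storage_change_mm)
```

For example, 500 mm of precipitation with 420 mm ET and 15 mm runoff leaves 65 mm of recharge, i.e. 13 percent of precipitation.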

19.
Two perspectives in the analysis of pointing and mapping tasks as measures of representations of the large-scale environment are examined: (1) an individual-difference approach and (2) a cognitive-representational approach. Convergence between methods assessing the same geographical/spatial knowledge is necessary as evidence for the existence of unified cognitive-spatial representations of the environment. Three sets of analyses interrelate performance on pointing and mapping tasks. In the first analysis, a confirmatory factor-analytic model is applied to short tests of pointing and mapping accuracy to determine whether one or two factors are needed to account for covariation between the tests. In the second analysis, covariation among errors in pointing and mapping of specific locations is partitioned into general and specific method factors using the Schmid-Leiman procedure. In the third analysis, pointing errors for identical locations within the mapping and pointing tasks are directly compared on the basis of directional errors. The three analyses indicate that: (1) tests of pointing and mapping measure highly related abilities; (2) the targets used in pointing and mapping tasks are of differential importance in identifying general and specific method factors; and (3) there is little or no direct correspondence between directional errors made in pointing tasks and those occurring in mapping tasks for the same locations. When the results of the three analyses are examined against criteria for convergence of pointing and mapping tasks, little evidence is found to suggest that directional errors in these tasks arise from a unified mental representation of the geographical environment. However, substantial predictable individual differences are apparent for both tasks.

20.
Policy enabling tropical forests to approach their potential contribution to global-climate-change mitigation requires forecasts of land use and carbon storage on a large scale over long periods. In this paper, we present an integrated modeling methodology that addresses these needs. We model the dynamics of the human land-use system and of C pools contained in each ecosystem, as well as their interactions. The model is national scale, and is currently applied in a preliminary way to Costa Rica using data spanning a period of over 50 years. It combines an ecological process model, parameterized using field and other data, with an economic model, estimated using historical data to ensure a close link to actual behavior. These two models are linked so that ecological conditions affect land-use choices and vice versa. The integrated model predicts land use and its consequences for C storage for policy scenarios. These predictions can be used to create baselines, reward sequestration, and estimate the value in both environmental and economic terms of including C sequestration in tropical forests as part of the efforts to mitigate global climate change. The model can also be used to assess the benefits from costly activities to increase accuracy and thus reduce errors and their societal costs.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号