Similar Literature
A total of 20 similar documents were retrieved (search time: 222 ms).
1.
ABSTRACT: Two major challenges face today's water professionals. The first is finding solutions to increasingly complicated water resources problems. The second challenge is nontechnical: effective interaction with the public, recognizing both the public's increasingly elevated goals relative to water and its growing understanding of water science and technology. The traditional DAD approach, that is, decide-announce-defend, is no longer appropriate. The much more progressive and inclusive POP approach, that is, public-owns-project, is more likely to be effective given the changing nature of the public's expectations and knowledge. A water resources planning or design effort that fails to include a public interaction program plans to fail. Described in this paper are three suggested objectives of the POP approach: demonstrating awareness, gathering supplemental data and information, and building a base of support. Having established specific objectives for a particular water resources project, appropriate public interaction programs and events must be selected, scheduled, and implemented. Many and varied programs and events are described in the paper.

2.
Being able to determine in advance whether certain events will occur enables a decision maker to reduce the uncertainty of a two-action lottery, although the exact outcome of the lottery may still not be known with certainty. This paper studies the a priori value of information in such a decision-making environment. Of interest to the decision maker is comparing the value of information about different events in advance of gathering that information. Using buying price as the value of information, we show that when information about the occurrence of two different events is offered to a risk-neutral decision maker, the event with the greater contribution in absolute value to the expected value of the lottery has the higher value in terms of its buying price. For risk-averse decision makers, a preference condition needs to be imposed on the set difference of the two events to obtain a generic conclusion. We provide several examples that demonstrate the usefulness of these results.
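As a rough illustration of the risk-neutral case, the sketch below computes the buying price of information about an event in a two-action lottery as the difference between the expected value with and without the information. The payoffs and event probability are hypothetical, not taken from the paper.

```python
# A minimal sketch of the risk-neutral case: the buying price of information
# about an event E equals the expected value of the two-action lottery when the
# action can be chosen after learning whether E occurs, minus the expected
# value when one action must be chosen up front. Payoffs and the event
# probability below are hypothetical.

def voi_risk_neutral(p_event, payoffs_if_event, payoffs_if_not):
    """payoffs_if_event[a], payoffs_if_not[a]: payoff of action a (two actions)."""
    # Without information: commit to the single best action in expectation.
    ev_without = max(
        p_event * payoffs_if_event[a] + (1 - p_event) * payoffs_if_not[a]
        for a in range(2)
    )
    # With information: pick the best action separately in each state.
    ev_with = p_event * max(payoffs_if_event) + (1 - p_event) * max(payoffs_if_not)
    return ev_with - ev_without  # = buying price for a risk-neutral agent

# Hypothetical lottery: action 0 = "protect" (costly), action 1 = "do nothing".
print(voi_risk_neutral(0.3, payoffs_if_event=[-10, -100], payoffs_if_not=[-10, 0]))  # 7.0
```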

3.
ABSTRACT: As watersheds are urbanized, their surfaces are made less pervious and more channelized, which reduces infiltration and speeds the removal of excess runoff. Traditional storm water management seeks to remove runoff as quickly as possible, gathering excess runoff in detention basins for peak reduction where necessary. In contrast, more recently developed "low impact" alternatives manage rainfall where it falls, through a combination of enhancing the infiltration properties of pervious areas and rerouting impervious runoff across pervious areas to allow an opportunity for infiltration. In this paper, we investigate the potential for reducing the hydrologic impacts of urbanization by using infiltration-based, low impact storm water management. We describe a group of preliminary experiments using relatively simple engineering tools to compare three basic scenarios of development: an undeveloped landscape; a fully developed landscape using traditional, high impact storm water management; and a fully developed landscape using infiltration-based, low impact design. Based on these experiments, it appears that by manipulating the layout of urbanized landscapes, it is possible to reduce impacts on hydrology relative to traditional, fully connected storm water systems. However, the amount of reduction is sensitive to both rainfall event size and soil texture, with the greatest reductions possible for small, relatively frequent rainfall events and more pervious soil textures. Thus, low impact techniques appear to provide a valuable tool for reducing runoff for the events that see the greatest relative increases from urbanization: small, relatively frequent rainfall events that produce little or no runoff from pervious surfaces but do produce runoff from impervious areas. However, it is clear that measures must still be in place for flood management during larger, more intense, and relatively rarer storm events, which are capable of producing significant runoff even from undeveloped basins.
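To make the connected-versus-rerouted comparison concrete, here is a minimal sketch using the SCS curve-number method, one simple engineering tool of the kind mentioned but not necessarily the one used in the paper. The curve numbers, impervious fraction, and storm depths are illustrative assumptions.

```python
# A minimal sketch (not the authors' model) comparing storm runoff from a
# developed parcel when impervious runoff is piped directly to the outlet
# versus rerouted across pervious lawn, using SCS curve-number runoff.
# CN values, impervious fraction, and storm depths are illustrative.

def scs_runoff(P, CN):
    """Runoff depth (inches) from rainfall depth P (inches) for curve number CN."""
    S = 1000.0 / CN - 10.0          # potential maximum retention
    Ia = 0.2 * S                    # initial abstraction
    return 0.0 if P <= Ia else (P - Ia) ** 2 / (P + 0.8 * S)

def parcel_runoff(P, frac_imperv, cn_perv, connected=True, cn_imperv=98):
    q_imp = scs_runoff(P, cn_imperv)
    if connected:
        # Impervious runoff drains straight to the storm sewer.
        return frac_imperv * q_imp + (1 - frac_imperv) * scs_runoff(P, cn_perv)
    # Disconnected: impervious runoff is spread onto the pervious area,
    # which then sees rainfall plus the run-on depth.
    run_on = P + q_imp * frac_imperv / (1 - frac_imperv)
    return (1 - frac_imperv) * scs_runoff(run_on, cn_perv)

for P in (0.5, 1.0, 3.0):           # small, moderate, large storms (inches)
    print(P,
          round(parcel_runoff(P, 0.4, cn_perv=70, connected=True), 3),
          round(parcel_runoff(P, 0.4, cn_perv=70, connected=False), 3))
```

For the small 0.5-inch storm in this sketch, the rerouted layout produces no runoff at all while the connected layout still does, which mirrors the abstract's point that the largest relative reductions occur for small, frequent events.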

4.
Automated monitoring devices are useful technologies for communities seeking to document and solve environmental problems. However, without deeper scrutiny of their design and deployment, there is a risk that they will fail to have the impact that many of their promoters intend. We develop a rubric for analysing how different kinds of monitoring devices help environmental advocates influence public debates. We apply this rubric in a study of environmental organizations in Pennsylvania that are choosing between recruiting volunteer citizen scientists and using automated sensor-based devices to gather water quality data in streams threatened by hydraulic fracturing for natural gas. Many organizations rely on volunteers using simple monitoring tools because they are affordable and produce easily managed data sets. An argument for this method of monitoring is that volunteering in the field also fosters citizen engagement in environmental debates. By comparison, we find the increased use of automated devices tends to reinforce hierarchies of expertise and constrains the agendas of nonprofessionals who participate in monitoring projects. We argue that these findings suggest that automated technologies, however effective they may be in gathering data on environmental quality, are not well designed to support broad public participation in environmental science and politics.

5.
ABSTRACT: Watersheds are widely accepted as a useful geography for organizing natural resource management in Australia and the United States. It is assumed that effective action needs to be underpinned by an understanding of the interactions between people and the environment. While there has been some social research as part of watershed planning, there have been few attempts to integrate socio-economic and biophysical data to improve the efficacy of watershed management. This paper explores that topic. With limited resources for social research, watershed partners in Australia chose to focus on gathering spatially referenced socio-economic data through a mail survey of private landholders, data that would enable them to identify and refine priority issues, develop and improve communication with private landholders, choose policy options to accomplish watershed targets, and evaluate the achievement of intermediate watershed plan objectives. Experience with seven large watershed projects provides considerable insight into the needs of watershed planners, how to effectively engage them, and how to collect and integrate social data as part of watershed management.

6.
The White House Conference on Environmental Technology, held December 12–14, 1994, is the most recent of many events building toward a national environmental technology strategy, which President Clinton will announce on April 22, 1995, the 25th anniversary of Earth Day. Promoting innovation and eliminating barriers to new environmental technologies are important issues in developing this strategy. Anticipating these developments, EPA launched its own Technology Innovation Strategy in early 1994. EPA's strategy explicitly calls for strengthening incentives for technology innovation within regulatory, policy, and enforcement programs. In this light, it is worthwhile to look at a recent case study showing how regulations impact innovative environmental technologies, particularly because there appears to be a gathering political sentiment for deeper regulatory reform.

7.
The modeling of dissolved oxygen in streams is a widely used technique, upon which a great deal of money has been spent. This paper concludes that the standard methods of DO modeling by computer are unnecessarily complex, and that for some purposes they can be replaced without loss of accuracy by desktop BOD models. Taking as an example a set of data used in DO modeling, it is shown (a) that the data are grossly inconsistent, (b) that simultaneous gathering of data introduces errors in streams of long travel time, (c) that much more data on pollutant concentrations should have been obtained, and (d) that 24-hour DO data could have been dispensed with.
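For reference, the kind of simple desk-top calculation the paper has in mind can be illustrated with the classic Streeter-Phelps oxygen-sag formula. The sketch below is a generic textbook form with illustrative parameter values, not the specific model evaluated in the paper.

```python
# A minimal sketch of the classic Streeter-Phelps dissolved-oxygen sag curve,
# the kind of simple "desk-top" model the paper argues can replace complex
# computer DO models for some purposes. Parameter values are illustrative.
import math

def do_deficit(t, L0, D0, kd, ka):
    """DO deficit (mg/L) at travel time t (days) downstream of a BOD load.
    L0: ultimate BOD at the discharge (mg/L); D0: initial deficit (mg/L);
    kd: deoxygenation rate (1/day); ka: reaeration rate (1/day)."""
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
           + D0 * math.exp(-ka * t)

DO_SAT = 9.1                         # assumed saturation DO, mg/L
for t in (0.0, 0.5, 1.0, 2.0, 4.0):  # days of travel time
    print(t, round(DO_SAT - do_deficit(t, L0=20.0, D0=1.0, kd=0.35, ka=0.7), 2))
```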

8.
A probability model for predicting the occurrence and magnitude of thunderstorm rainfall, developed in the southwestern United States, was tested in the metropolitan Chicago area with reasonable success, especially for moderate to extreme runoff-producing events. The model requires the estimation of two parameters: the mean number of events per year and the conditional probability of rain given that an event has occurred. To tie in the data from more than one gage in an area, an event can be defined in several ways, such as the areal mean rainfall exceeding 0.50 inch and at least one gage receiving more than 1.0 inch. This type of definition allows both of the model parameters to be obtained from daily warm-season rainfall records. Regardless of the definition used, a Poisson distribution adequately described the number of events per season. A negative binomial distribution was derived as the frequency density function for rainfall where several gages are employed in defining a storm. Chicago data fit both distributions very well for events with relatively high return periods. The results indicate the possibility of using the model on a regional basis, where a limited amount of data may be used to estimate parameters for extensive areas.
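A minimal sketch of how such a two-parameter event model translates into exceedance probabilities is given below. The thinning-of-a-Poisson-process form and the parameter values are illustrative assumptions, not the paper's exact formulation.

```python
# A minimal sketch (not the original model's exact formulation): event counts
# per warm season are Poisson with mean lam, and each event independently
# exceeds a design magnitude with probability p_exceed. The seasonal
# exceedance probability and return period then follow directly.
import math

def poisson_pmf(k, lam):
    """Probability of exactly k events in a season."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def seasonal_exceedance(lam, p_exceed):
    # P(at least one exceeding event in a season). With Poisson counts this
    # reduces to 1 - exp(-lam * p_exceed) (thinning of a Poisson process).
    return 1.0 - math.exp(-lam * p_exceed)

lam, p = 6.0, 0.04                    # illustrative parameter values
prob = seasonal_exceedance(lam, p)
print("P(exceedance in a season) =", round(prob, 3),
      " return period ~", round(1.0 / prob, 1), "seasons")
print("P(exactly 5 events in a season) =", round(poisson_pmf(5, lam), 3))
```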

9.
The Seymour aquifer region of Texas has been identified as containing elevated levels of nitrate in ground water. Various state and federal agencies are currently studying policy options for the region by gathering more site-specific information. However, because of a lack of sufficient information, cause-and-effect relationships between water quality and agricultural practices have not been well established for the region. Some recently available biophysical simulation models have impressive capabilities for generating large amounts of data on environmental pollution resulting from agricultural production practices. In this study, the data generated by a biophysical simulation model were used to estimate nitrate percolation response functions for the Seymour aquifer region. Interestingly, nitrate percolation values obtained from simulation models often comprise a censored sample, because non-zero percolation values are observed only under certain climatic events and input levels. It has been shown in the econometric literature that the use of Ordinary Least Squares (OLS) on censored sample data produces biased and inconsistent parameter estimates. Thus, a sample selection model was used in this study to estimate the response functions for nitrate percolation. The study provides some insight into the relationship between nitrate percolation and agricultural production practices. In particular, the study demonstrates the potential of selected design standards in minimizing agricultural nonpoint-source (NPS) pollution for the study area.
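To illustrate why OLS is problematic here, the sketch below fits a Tobit-style censored regression to synthetic left-censored data and compares it with OLS. The paper itself uses a sample selection (Heckman-type) model, which treats the censoring mechanism differently, so this is a neighboring technique shown only for intuition; all data and coefficients are invented.

```python
# A minimal sketch of a Tobit-style censored regression on synthetic,
# left-censored "percolation" data, contrasted with biased OLS estimates.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.uniform(0, 200, n)])     # intercept, input level
y_latent = X @ np.array([-5.0, 0.08]) + rng.normal(0, 4, n)    # latent percolation
y = np.clip(y_latent, 0, None)                                 # observed, censored at 0

def neg_loglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    ll = np.where(
        y > 0,
        stats.norm.logpdf(y, loc=xb, scale=sigma),   # uncensored observations
        stats.norm.logcdf(-xb / sigma),               # probability mass piled at zero
    )
    return -ll.sum()

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0, 0.0], method="BFGS")
beta_hat, sigma_hat = res.x[:-1], np.exp(res.x[-1])
print("Tobit estimates:", np.round(beta_hat, 3), "sigma:", round(sigma_hat, 2))
print("OLS on censored y (biased):",
      np.round(np.linalg.lstsq(X, y, rcond=None)[0], 3))
```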

10.
This article presents an integrated treatment technology for hydrogen sulfide pollution in oilfield production systems, addressing both primary H2S originating in the wellbore and secondary H2S generated in the surface gathering and transportation system. The research developed a technique in which exogenous microorganisms suppress sulfate-reducing bacteria in the gathering system, achieving biological inhibition of secondary hydrogen sulfide. The technology can reduce hydrogen sulfide at oil wells to 5 mg/m³ and secondary hydrogen sulfide in the gathering system to below 25 mg/m³, improving the working environment at crude oil gathering and transportation facilities and eliminating the safety hazard that hydrogen sulfide gas poses to human health.

11.
The Colorado River system exhibits the characteristics of a heavily over-allocated or 'closing water system'. In such systems, development of mechanisms to allow resource users to acknowledge interdependence and to engage in negotiations and agreements becomes necessary. Recently, after a decade of deliberations and environmental assessments, the Glen Canyon Dam Adaptive Management Program (GCDAMP) was established to monitor and analyze the effects of dam operations on the Grand Canyon ecosystem and recommend adjustments intended to preserve and enhance downstream physical, cultural and environmental values. The Glen Canyon Dam effectively separates the Colorado into its lower and upper basins. Dam operations and adaptive management decisions are strongly influenced by variations in regional climate. This paper focuses on the management of extreme climatic events within the Glen and Grand Canyon Region of the Colorado River. It illustrates how past events (both societal and physical) condition management flexibility and receptivity to new information. The types of climatic information and their appropriate entry points in the annual cycle of information gathering and decision-making (the 'hydro-climatic decision calendar') for dam operations and the adaptive management program are identified. The study then describes how the recently implemented program, lessons from past events, and new climate information on the Colorado River Basin facilitated responses during the major El Niño-Southern Oscillation (ENSO) event of 1997-1998. Recommendations are made for engaging researchers and practitioners in the effective use of climatic information in similar settings where the decision stakes are complex and the system uncertainty is large.

12.
ABSTRACT: In most studies, quantile estimates of extreme 24-hour rainfall are given as annual probabilities. The probability of experiencing an excessive storm event, however, differs throughout the year. As a result, this paper explored the differences between heavy rainfall distributions by season in Louisiana. Using the Kruskal-Wallis and Mann-Whitney tests, it was concluded that the distribution of heavy rainfall events differs significantly between particular seasons at the sites near the Gulf Coast. Furthermore, seasonal frequency curves varied dramatically at the four sites examined. Mixed distributions within these data were not found to be problematic, but the mechanisms that produced the events were found to change seasonally. Extreme heavy rainfall events in winter and spring were primarily generated by frontal weather systems, while summer and fall events had high proportions of events produced by tropical disturbances and airmass (free-convective) conditions.
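The seasonal comparison can be reproduced in outline with standard nonparametric tests, as in the sketch below; the rainfall series are synthetic placeholders standing in for the observed Louisiana records.

```python
# A minimal sketch of the seasonal comparison: Kruskal-Wallis across all four
# seasons and a Mann-Whitney U test for one seasonal pair. The 24-hour
# heavy-rainfall depths below are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
winter = rng.gamma(shape=2.0, scale=1.5, size=40)   # depths (inches)
spring = rng.gamma(shape=2.0, scale=1.6, size=40)
summer = rng.gamma(shape=2.0, scale=2.2, size=40)
fall   = rng.gamma(shape=2.0, scale=2.0, size=40)

H, p_kw = stats.kruskal(winter, spring, summer, fall)
U, p_mw = stats.mannwhitneyu(winter, summer, alternative="two-sided")
print(f"Kruskal-Wallis H={H:.2f}, p={p_kw:.3f}")
print(f"Mann-Whitney U={U:.1f}, p={p_mw:.3f} (winter vs. summer)")
```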

13.
Protected areas represent a global heritage. Assessing conservation achievements in protected areas is of crucial importance with respect to the on-time delivery of international biodiversity conservation targets. However, monitoring data from publicly accessible databases for comparative studies of conservation achievements in the protected areas of the world are very scarce, if not non-existent. At first glance this is surprising because, with regard to protected areas, at least according to well-established protected area management guidelines and widely accepted public mandates, a great deal of monitoring work and data gathering is supposed to be conducted. This would imply that data on changes in biodiversity in protected areas could be expected to exist, and the constant progress in information technologies and Web tools engenders hope that some of it might even be available online to the global public. This review article presents the results of an extensive online search and review of existing monitoring data from freely accessible online databases for use in an assessment of conservation achievements in a larger sample of protected areas. Results show two contrary sides to the status quo of accessible data from the World Wide Web for conservation science: data overkill, and data scarcity with poor metadata provision. While ever more research is, in fact, based on open-access online data, such as extrapolations of species ranges used in conservation management and planning, it remains almost impossible to obtain a basic set of information for an assessment of conservation achievements within a larger number of protected areas. This awareness has triggered a detailed discussion about the discrepancies in sharing data at the level of protected areas; mismatched relationships between expected activities in protected areas and the capacity for delivering these requirements are certainly among the main challenges. In addition, the fear of data misuse potentially resulting in harm to nature, careers, and competencies still seems to be a critical barrier strictly controlling the willingness to share data. Various initiatives aimed at tackling technical and cultural obstacles are introduced and discussed, with the goal of modern resource management based on adaptive management, using the digital opportunities of the new millennium for a sustainable global village.

14.
ABSTRACT: While federal water resources laws and regulations require social analysis, no single workable formula exists for integrating it into water resources planning. Two primary problems in integrating social analysis into planning are examined: making trade-offs between policy acceptability and theoretical competence, and managing social analysis in planning. For illustration, the article builds on emerging trends within the U.S. Army Corps of Engineers. It concludes by observing that creative application of social theory to policy problems, along with innovative data-gathering techniques, is the primary route to managing these problems.

15.
ABSTRACT: Electronic instruments are increasingly being used to gather water quality data. Quality assurance protocols are needed that provide adequate documentation of the procedures followed in the calibration, collection, and validation of electronically acquired data. The precision of many data loggers exceeds that of the technology commonly used to make field measurements. Overcoming this problem involves using laboratory-quality equipment in the field or enhanced quality control at the time of instrument servicing. Time-control procedures for data loggers are needed to allow direct comparisons of data between instruments. Electronic instruments provide a mechanism to study transient events in great detail, but without time controls, multiple loggers produce data that contain artifacts due to timing errors. Individual sensors deployed with data loggers are subject to different degrees of drift over time. Certain quantities can be measured with defined precision and accuracy for long periods of time, while other sensors are subject to loss of both precision and accuracy with increasing time of use. Adequate quality assurance requires that the levels of precision and accuracy be documented, particularly those that vary with increasing deployment time.
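As one concrete example of the kind of time control and drift documentation discussed, the sketch below applies a linear drift correction anchored to an end-of-deployment calibration check and shifts logger timestamps by a measured clock offset. The readings, drift magnitude, and offset are invented for illustration and are not procedures prescribed by the paper.

```python
# A minimal sketch of two routine QA steps implied above: removing linearly
# accumulating sensor drift measured at the end-of-deployment calibration
# check, and aligning logger timestamps to a reference clock.
import numpy as np

def drift_correct(values, t_hours, drift_at_end):
    """Subtract drift assumed to accumulate linearly from zero to drift_at_end."""
    t_hours = np.asarray(t_hours, dtype=float)
    correction = drift_at_end * t_hours / t_hours[-1]
    return np.asarray(values, dtype=float) - correction

def shift_clock(t_hours, offset_seconds):
    """Apply the logger-versus-reference clock offset so records can be compared."""
    return np.asarray(t_hours, dtype=float) + offset_seconds / 3600.0

obs = [8.42, 8.40, 8.51, 8.63, 8.70]        # e.g., DO readings (mg/L), invented
t   = [0, 6, 12, 18, 24]                    # hours since calibration
print(drift_correct(obs, t, drift_at_end=0.12).round(3))
print(shift_clock(t, offset_seconds=-45))
```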

16.
ABSTRACT: The rainfall-runoff response of the Tygarts Creek Catchment in eastern Kentucky is studied using TOPMODEL, a hydrologic model that simulates runoff at the catchment outlet based on the concepts of saturation excess overland flow and subsurface flow. Unlike the traditional application of this model to continuous rainfall-runoff data, the use of TOPMODEL in single-event runoff modeling, specifically floods, is explored here. TOPMODEL utilizes a topographic index as an indicator of the likely spatial distribution of rainfall excess generation in the catchment. The topographic index values within the catchment are determined using digital terrain analysis procedures in conjunction with digital elevation model (DEM) data. Select parameters in TOPMODEL are calibrated using an iterative procedure to obtain the best-fit runoff hydrograph. The calibrated parameters are the surface transmissivity, T0, the transmissivity decay parameter, m, and the initial moisture deficit in the root zone, Sr0. These parameters are calibrated using three storm events and verified using three additional storm events. Overall, the calibration results obtained in this study are in general agreement with results documented in previous studies using TOPMODEL. However, the parameter values did not perform well during the verification phase of this study.
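For orientation, the sketch below shows one common textbook form of TOPMODEL's core relations: the topographic index, the mapping from catchment-average saturation deficit to local deficit via the decay parameter m, and the exponential baseflow term involving T0. Exact formulations differ between TOPMODEL versions, and the numbers are illustrative rather than calibrated values from this study.

```python
# A minimal sketch of TOPMODEL-style relations (one common textbook form).
import numpy as np

def topographic_index(upslope_area_per_width, slope_rad):
    """lambda_i = ln(a / tan(beta)) for each grid cell."""
    return np.log(upslope_area_per_width / np.tan(slope_rad))

def local_deficit(mean_deficit, m, lam, lam_mean):
    """S_i = S_bar + m * (lambda_bar - lambda_i); cells with S_i <= 0 are saturated."""
    return mean_deficit + m * (lam_mean - lam)

def baseflow(mean_deficit, m, T0, lam_mean):
    """q_b = T0 * exp(-lambda_bar) * exp(-S_bar / m), per unit catchment area."""
    return T0 * np.exp(-lam_mean) * np.exp(-mean_deficit / m)

# Illustrative values (not calibrated to Tygarts Creek):
a = np.array([50.0, 200.0, 1500.0, 8000.0])      # upslope area per unit contour (m)
beta = np.radians([12.0, 8.0, 4.0, 1.5])         # local slope
lam = topographic_index(a, beta)
S_i = local_deficit(mean_deficit=0.05, m=0.02, lam=lam, lam_mean=lam.mean())
print("topographic index:", np.round(lam, 2))
print("saturated cells:", S_i <= 0)
print("baseflow:", baseflow(0.05, 0.02, T0=5.0, lam_mean=lam.mean()))
```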

17.
The effects of human trampling and firewood gathering on eight backcountry campsites in the Great Smoky Mountains were surveyed. Sample plots were classified as site center, transition, firewood-gathering area, and control. The canopy in the center of the sites tended to be more open than that of control plots, with the greatest openings occurring at shelter sites in spruce-fir forest. Intensive human trampling in the center of the sites inhibited reproduction of tree species, whereas firewood gathering alone did not. In some cases where canopy opening had occurred, there was an increase in shrub and tree reproduction around the edge of the site. Reduction in the basal area of standing deadwood varied with the type of site; older-growth stands were less depleted. Injuries to trees increased tenfold from control areas to the center of the campsites. Smaller fuels were more strongly impacted by trampling and little impacted by firewood gathering. Woody fuels in the 2.5- to 7.6-cm size class were preferred for firewood. A previously constructed carbon cycling model was modified to incorporate removal of firewood and litter on campsites. The model suggested that after extended removal of leaf litter, soil carbon takes 12 to 50 years to recover, but this hypothesis remains to be tested in the field.
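The cited recovery times can be rationalized with a much simpler model than the one used in the paper: a single first-order soil-carbon pool whose litter input resumes after a period of removal. The sketch below uses this generic formulation with invented parameter values solely to show how decade-scale recovery times emerge from the pool's turnover rate; it is not the authors' carbon cycling model.

```python
# A minimal sketch: one first-order carbon pool with dC/dt = I - k*C recovering
# toward its equilibrium I/k after litter inputs resume. Values are invented.
import numpy as np

def recover(C0, I, k, years, dt=0.1):
    t = np.arange(0, years + dt, dt)
    C = C0 + (I / k - C0) * (1 - np.exp(-k * t))   # analytic solution of dC/dt = I - k*C
    return t, C

t, C = recover(C0=2.0, I=0.3, k=0.1, years=60)      # kg C/m^2, kg C/m^2/yr, 1/yr
equil = 0.3 / 0.1
t95 = t[np.argmax(C >= 0.95 * equil)]
print(f"time to reach 95% of equilibrium ~ {t95:.0f} years")
```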

18.
This study contributes a bathtub-style inundation prediction model with abstractions of coastal processes (i.e., storm surge and wave runup) for flood forecasting at medium-range (weekly to monthly) timescales along the coastlines of large lakes. Uncertainties from multiple data sources are propagated through the model to establish probabilistic bounds of inundation, providing a conservative measure of risk. The model is developed in a case study of the New York Lake Ontario shoreline, which has experienced two record-setting floods over the course of three years (2017–2019). Predictions are developed at the parcel level and are validated using inundation accounts from an online survey and flyover imagery taken during the recent flood events. Model predictions are compared against a baseline, deterministic model that accounts for the same processes but does not propagate forward data uncertainties. Results suggest that a probabilistic approach helps capture observed instances of inundation that would otherwise be missed by a deterministic inundation model. However, downward biases are still present in probabilistic predictions, especially for parcels impacted by wave runup. The goal of the tool is to provide community planners and property owners with a conservative, parcel-level assessment of flood risk to help inform short-term emergency response and better prepare for future flood events.
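The probabilistic bathtub idea can be illustrated with a simple Monte Carlo sketch: sample the uncertain water-level components, sum them, and count how often the total exceeds a parcel's threshold elevation. The distributions, datum, and parcel elevation below are invented for illustration and are not the study's calibrated inputs.

```python
# A minimal Monte Carlo sketch of probabilistic bathtub inundation: uncertain
# lake level, surge, and wave runup are sampled and summed, and the exceedance
# frequency at a hypothetical parcel elevation is reported.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000
lake_level = rng.normal(75.6, 0.15, N)    # forecast static lake level (m, assumed datum)
surge      = rng.normal(0.20, 0.10, N)    # wind-driven storm surge (m)
runup      = rng.gamma(2.0, 0.08, N)      # wave runup at the shoreline (m)
total = lake_level + np.clip(surge, 0, None) + runup

parcel_elev = 76.1                         # hypothetical parcel threshold elevation (m)
prob_flood = np.mean(total > parcel_elev)
print(f"P(inundation) = {prob_flood:.2%}")
print("5th-95th percentile water level:", np.percentile(total, [5, 95]).round(2))
```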

19.
The necessity to tailor information becomes increasingly urgent as the information revolution continues to generate ever-increasing flows of data and so-called information. Drawing on European experiences, a new approach to monitoring system design is suggested in this paper. In this approach, careful and detailed specification of information needs is a major contributor to the effectiveness of information products. To develop better specifications for information products, the process of collecting and transforming data into useful information requires careful thought and guidance. A dialogue between information users on one hand and information producers on the other is essential. This dialogue can be based on the information cycle, which describes the continuous process running from the specification of information needs for water management and a strategy for collecting information, through data collection and data analysis, to the utilization of information in water management. By following the respective steps in the information cycle, the process of information gathering can be completed. The cyclic character provides a quantitative means of connecting monitoring system design and operations with the information expectations and/or products required by management.

20.
Abstract: Alluvial fans are continuously being developed for residential, industrial, commercial, and agricultural uses in southern California. Development and alteration of alluvial fans need to consider the possibility of mud and debris flows from upstream mountain watersheds affected by fires. Accurate prediction of sediment yield (or hyper-concentrated sediment yield) is essential for the design, operation, and maintenance of debris basins to properly safeguard the general populace. This paper presents a model for the prediction of sediment yields that result from a combination of fire and subsequent storm events. The watersheds used in this analysis are located in the foothills of the San Gabriel Mountains in southern California. A multiple regression analysis is first utilized to establish a fundamental statistical relationship for sediment yield as a function of relief ratio, drainage area, maximum 1-h rainfall intensity, and fire factor, using 45 years of data (1938-1983). In addition, a method for multi-sequence sediment yield prediction under fire conditions was developed and calibrated using 17 years of sediment yield, fire, and precipitation data for the period 1984-2000. After calibration, this model was verified by applying it to predict the sediment yields for the 2001-2002 fire events in southern California. The findings indicate a strong correlation between the estimated and measured sediment yields. The proposed method for multi-sequence sediment yield prediction following fire events can be a useful tool for scheduling cleanout operations for debris basins and for developing an emergency response strategy for the southern California region, where plentiful sediment supplies exist and frequent fires occur.
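The first stage of the approach can be sketched as an ordinary least-squares fit of log sediment yield on the four listed predictors. The code below does this on synthetic data purely to show the structure of such a regression, not to reproduce the paper's coefficients.

```python
# A minimal sketch of a log-linear multiple regression of sediment yield on
# relief ratio, drainage area, peak 1-h rainfall intensity, and a fire factor.
# All data are synthetic placeholders; the study used 45 years of observations.
import numpy as np

rng = np.random.default_rng(7)
n = 45
relief = rng.uniform(0.05, 0.4, n)        # relief ratio (-)
area   = rng.uniform(0.5, 15.0, n)        # drainage area (km^2)
i1h    = rng.uniform(5.0, 40.0, n)        # max 1-h rainfall intensity (mm/h)
fire   = rng.uniform(0.1, 1.0, n)         # fire factor (recency/extent of burn)

# Synthetic "true" log-linear relationship plus noise, for illustration only.
log_yield = (1.0 + 0.8 * np.log(relief) + 0.9 * np.log(area)
             + 1.5 * np.log(i1h) + 1.2 * np.log(fire) + rng.normal(0, 0.3, n))

X = np.column_stack([np.ones(n), np.log(relief), np.log(area),
                     np.log(i1h), np.log(fire)])
coef, *_ = np.linalg.lstsq(X, log_yield, rcond=None)
print("fitted exponents (intercept, relief, area, intensity, fire):",
      np.round(coef, 2))
```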

