Similar Documents
20 similar documents retrieved.
1.
ABSTRACT: The Palmer Drought Severity Index (PDSI) has been calculated for about 30 years as a means of providing a single measure of meteorological drought severity. It was intended to look retrospectively at wet and dry conditions using water balance techniques. The Standardized Precipitation Index (SPI) is a probability index that was developed to give a better representation of abnormal wetness and dryness than the Palmer indices. Before the user community will accept the SPI as an alternative to the Palmer indices, a standard method must be developed for computing the index. Standardization is necessary so that all users of the index will have a common basis for both spatial and temporal comparison of index values. If different probability distributions and models are used to describe an observed series of precipitation, then different SPI values may be obtained. This article describes the effect of different probability models on the computed SPI values, as well as the effects on dry event characteristics. It is concluded that the Pearson Type III distribution is the “best” universal model, and that the reliability of the SPI is sample size dependent. It is also concluded that, because of data limitations, SPIs with time scales longer than 24 months may be unreliable. An internet link is provided that allows users to access Fortran 77 source code for calculating the SPI.
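The abstract describes the SPI as an equiprobability transform of a fitted precipitation distribution onto the standard normal. The sketch below illustrates that calculation on synthetic data using a two-parameter gamma fit, a common SPI choice; the paper itself recommends the Pearson Type III distribution, and a production implementation would also handle months with zero precipitation. Function and variable names are illustrative and are not taken from the referenced Fortran 77 code.

```python
# Minimal SPI sketch: aggregate precipitation over a time scale, fit a
# distribution, and map each cumulative probability to a standard-normal
# quantile (the SPI value). Synthetic data; gamma used instead of Pearson III.
import numpy as np
from scipy import stats

def spi(precip, scale=3):
    """precip: 1-D array of monthly precipitation totals."""
    agg = np.convolve(precip, np.ones(scale), mode="valid")   # running sums
    shape, loc, scl = stats.gamma.fit(agg, floc=0)            # two-parameter gamma
    cdf = stats.gamma.cdf(agg, shape, loc=loc, scale=scl)     # non-exceedance probability
    return stats.norm.ppf(cdf)                                # equiprobability transform

rng = np.random.default_rng(0)
monthly = rng.gamma(shape=2.0, scale=40.0, size=600)          # synthetic 50-year record
print(spi(monthly, scale=3)[:5])
```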

2.
ABSTRACT: The Peace River at Arcadia, Florida, is a municipal water supply supplement for southwestern Florida. Consequently, probabilities of encountering low flows during the dry season are of critical importance. Since the association between Pacific Ocean sea surface temperatures (SSTs) and seasonal streamflow variability in the southeastern United States is well documented, it is reasonable to generate forecasts based on this information. Here, employing historic records of minimum, mean, and maximum flows during winter (JFM) and spring (AMJ), upper and lower terciles define “above normal,” “normal,” and “below normal” levels of each variable. A probability distribution model describes the likelihood of these seasonal variables conditioned upon Pacific SSTs from the previous summer (JAS). Model calibration is based upon 40 (of 50) years of record employing stratified random sampling to ensure equal representation from each decade. The model is validated against the remaining 10 samples and the process repeated 100 times. Each conditional probability distribution yields varying probabilities of observing flow variables within defined categories. Generally, a warm (cold) Pacific is associated with higher (lower) flows. To test model skill, the forecast is constrained to be the most probable category in each calibration year, with significance tested by chi-square frequency tables. For all variables, the tables indicate high levels of association between forecast and observed terciles and forecast skill, particularly during winter. During spring the pattern is less clear, possibly due to the variable starting date of the summer rainy season. This simple technique suggests that Pacific SSTs provide a good forecast of low flows.
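A minimal sketch of the tercile cross-tabulation and chi-square skill test described above, run on synthetic data. The SST index, the flow proxy, and the strength of their relationship are invented for illustration; the variable names are not the authors'.

```python
# Bin prior-summer SST and winter flow into terciles, cross-tabulate them,
# and test the association with a chi-square frequency table.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 50
sst = rng.normal(size=n)                              # prior-summer (JAS) SST index
flow = 0.6 * sst + rng.normal(scale=0.8, size=n)      # winter (JFM) low-flow proxy

def terciles(x):
    lo, hi = np.quantile(x, [1 / 3, 2 / 3])
    return np.digitize(x, [lo, hi])                   # 0 = below normal, 1 = normal, 2 = above

table = np.zeros((3, 3), dtype=int)
for i, j in zip(terciles(sst), terciles(flow)):
    table[i, j] += 1                                  # rows: SST category, cols: flow category

chi2, p, dof, _ = stats.chi2_contingency(table)
print(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")              # small p suggests forecast skill
```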

3.
A probability model for predicting the occurrence and magnitude of thunderstorm rainfall, developed in the southwestern United States, was tested in the metropolitan Chicago area with reasonable success, especially for moderate to extreme runoff-producing events. The model requires the estimation of two parameters, the mean number of events per year and the conditional probability of rain given that an event has occurred. To tie in the data from more than one gage in an area, an event can be defined in several ways, such as the areal mean rainfall exceeding 0.50 inch and at least one gage receiving more than 1.0 inch. This type of definition allows both of the model parameters to be obtained from daily warm-season rainfall records. Regardless of the definition used, a Poisson distribution adequately described the number of events per season. A negative binomial distribution was derived as the frequency density function for rainfall where several gages are employed in defining a storm. Chicago data fit both distributions very well for events with relatively high return periods. The results indicate the possibility of using the model on a regional basis, where a limited amount of data may be used to estimate parameters for extensive areas.
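The sketch below illustrates the two count distributions named in this abstract on an invented record of warm-season event counts: a Poisson model parameterized by the mean number of events per season, and a negative binomial fitted by moment matching. It is an illustration of the distributions only, not a reconstruction of the authors' derivation.

```python
# Poisson and negative binomial sketches for seasonal storm-event counts.
# The counts below are hypothetical; their variance exceeds their mean, which
# is the condition under which the moment-matched negative binomial is valid.
import numpy as np
from scipy import stats

events_per_season = np.array([1, 7, 2, 9, 3, 1, 8, 2, 6, 1])
lam = events_per_season.mean()                  # mean events per season

# Poisson probability of at least k qualifying events in a season.
k = 6
print(f"Poisson P(>= {k} events) = {stats.poisson.sf(k - 1, lam):.3f}")

# Negative binomial by moment matching (scipy parameterization: r, p).
mean, var = events_per_season.mean(), events_per_season.var(ddof=1)
p = mean / var
r = mean * p / (1 - p)
print("NB P(X = 0..8):", np.round(stats.nbinom.pmf(np.arange(9), r, p), 3))
```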

4.
ABSTRACT: The traditional solution to stormwater runoff from housing developments has been stormwater sewer systems. A newer and increasingly popular solution is some sort of impoundment or “lake” within the development, which is thought to be cheaper, to provide recreation, to improve the aesthetics of the environment, and to increase property values. Little is known of the acceptability of these lakes to public officials, developers, or potential residents, or of their policy implications. Two such developments in Mississauga, Ontario, were studied in terms of the perceptions and opinions of a random sample of residents and of officials who had been involved in their planning and management. The areas have attracted a relatively young group, just beginning their child-bearing years, with relatively high income and education. The lakes appear to be popular and relatively successful, especially the one that provides more recreational opportunities and has had fewer maintenance problems. The major problems concern visual quality and safety. The City, and to some extent the developers, are seen as the appropriate groups to manage and maintain the lakes. Some suggestions, based on residents' and officials' responses, are presented for future designs and policy formulation.

5.
ABSTRACT: During the last 27 years of independence, a large number of inter-state water disputes have cropped up over the use of rivers. Surprisingly enough, more disputes developed in this short period than in the earlier 200 years of irrigation development, and so far none of them has been permanently resolved. The major rivers of India are all inter-state rivers, and this is one of the more important reasons why some of them are not yet fully developed for irrigation or power production. The Union Government has so far set up only three tribunals to adjudicate inter-state disputes. But the problems do not end simply with the setting up of tribunals; in practice, adjudication has also proved a dilatory process. None of the tribunals has been successful in settling any dispute in the long years of their existence. There is no codified law prescribing rights, and the notion of “equity” has come to prevail, restraining the upper states from drawing such quantities of water as would injure the interests of the lower states. Though the general principle of equitable apportionment has been advocated many times, in practice each contending state has given this principle an interpretation that suited it. The basic principle would be to harness the rivers not for the benefit of a particular state but for the maximization of agricultural, industrial, and navigational potential in the areas served by the rivers.

6.
ABSTRACT: A general model of the policy implementation process is utilized to facilitate a discussion of the way Section 208 of PL 92-500 is being carried out on an areawide basis. A study of four “208 areas” in the “New York-Philadelphia corridor” highlights the operation of several variables used in the model. The varying political and socioeconomic conditions in geographic areas which have similar water quality problems are leading to the evolution of vastly different implementing structures, or institutional arrangements. The analysis suggests that these differences may have important implications for the success of the program in each of these areas. A major underlying theme is that such problems are characteristic of the 208 process nationwide and reflect general difficulties associated with managing water quality in a federal system.

7.
In March 2012, a brownfield site in Cologne was transformed into “a green garden on red clay” when a community garden called NeuLand (new land) was created. This paper investigates to what extent NeuLand is typical of a new form of political engagement 2.0, focused on local problems at people's doorstep rather than global critiques of political systems, which finds its expression in direct actions typical of the networked society, e.g. “green flash mobs.” Its potential to provide a blueprint for imagining and enacting alternative futures and new ways for citizens to claim their “right to the city” is assessed. NeuLand provides an experiment with new forms of (urban) commons and possibly a (re)turn to the “liveable city” to replace the current neoliberal ideal of the “entrepreneurial city” [Harvey, D., 1989. From managerialism to entrepreneurialism: the transformation in urban governance in late capitalism. Geografiska Annaler, Series B, Human Geography, 71 (1), 3–17], developing new solutions to problems of urban management and city development that extend beyond the voluntary engagement of citizens within the logic of the neoliberal “big society.” Extending the scope beyond the analysis of urban gardening projects as examples of sustainable food production, or as vehicles for fostering community cohesion, integration, or social capital, the NeuLand experiment is linked to wider debates on alternative and more sustainable socio-ecological futures than those currently practised in the newly “neoliberalizing cities” of Germany.

8.
Decades of research have sought to understand how disaster preparedness decisions are made. We believe one understudied factor is the impact of near-miss events. A near-miss occurs when an event (such as a hurricane or terrorist attack) has some non-trivial probability of ending in disaster (loss of life, property damage), but the negative outcome is avoided largely by chance (e.g., at the last minute, the storm dissipates or the bomb fails to detonate). In the first of two experiments, we study reactions to a hurricane threat when participants are told about prior near-miss events. We find that people with information about a prior near-miss event that had no negative consequences are less likely to take protective measures than those with either no information or information about a prior near-miss event with salient negative information. Similar results have been shown in prior research, but we seek to understand people's reasoning for the different reactions. We examine the role of an individual's risk propensity and general level of optimism as possible explanatory variables for the “near-miss” effect. We find risk propensity to be stable across conditions, whereas general optimism is influenced by the type of prior near-miss information, so that optimism mediates how near-miss information affects protective decisions. People who experience a potentially hazardous near-miss but escape without obvious cues of damage will feel more optimistic and take less protective action. In the second study, we test messages about the hazard's risk and examine whether such messages can offset the influence of near-misses. We end by discussing the implications of near-misses for risk communication.

9.
ABSTRACT: Samples from 107 piñon pines (Pinus edulis) at four sites were used to develop a proxy record of annual (June to June) precipitation spanning the 1226 to 2001 AD interval for the Uinta Basin Watershed of northeastern Utah. The reconstruction reveals significant precipitation variability at interannual to decadal scales. Single-year dry events before the instrumental period tended to be more severe than those after 1900. In general, decadal-scale dry events were longer and more severe prior to 1900. In particular, dry events in the late 13th, 16th, and 18th Centuries surpass the magnitude and duration of droughts seen in the Uinta Basin after 1900. The last four decades of the 20th Century also represent one of the wettest periods in the reconstruction. The proxy record indicates that the instrumental record (approximately 1900 to the present) underestimates the potential frequency and severity of severe, sustained droughts in this area, while overrepresenting the prominence of wet episodes. In the longer record, the empirical probability of any decadal-scale drought exceeding the duration of the 1954 through 1964 drought is 94 percent, while the probability of any wet event exceeding the duration of the 1965 through 1999 wet spell is only 1 percent. Hence, estimates of future water availability in the Uinta Basin and forecasts for exports to the Colorado River, based on the 1961 to 1990 and 1971 to 2000 “normal” periods, may be overly optimistic.
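A minimal sketch of how an empirical exceedance probability for dry-spell duration can be tabulated from a long reconstructed series. The synthetic data, decadal smoothing window, below-median threshold, and 11-year comparison spell are all illustrative assumptions, not the authors' procedure.

```python
# Tabulate dry-spell durations in a (synthetic) reconstruction and estimate
# the empirical probability that a dry spell lasts at least `threshold` years.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1226, 2002)
precip = rng.gamma(shape=4.0, scale=100.0, size=years.size)    # stand-in reconstruction

smoothed = np.convolve(precip, np.ones(10) / 10, mode="valid") # decadal smoothing
dry = smoothed < np.median(smoothed)                           # below-median = "dry"

runs, count = [], 0                                            # consecutive dry-year runs
for flag in dry:
    if flag:
        count += 1
    elif count:
        runs.append(count)
        count = 0
if count:
    runs.append(count)

runs = np.array(runs)
threshold = 11                                                 # e.g., an 11-year drought
print(f"Empirical P(dry spell >= {threshold} yr) = {(runs >= threshold).mean():.2f}")
```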

10.
The value of information (VOI) can be used to determine what kind of spatial information may be relevant and useful for groundwater sustainability decisions. In this paper, the unique challenges of applying VOI to spatial information from geophysical data are described. The uncertainty regarding the spatial structure or continuity of subsurface properties can be described with geostatistical sample models. Using these models, one can quantify the prior value given our present state of uncertainty and a set of decision alternatives and outcomes. Because geophysical techniques are a form of remote sensing, assuming “perfect” information is not realistic, since the techniques usually sample the aquifer properties only indirectly. Therefore, the focus of this paper is describing how the data reliability (the measure of imperfectness) can be quantified. One of the foremost considerations is the non-unique relationship between geological parameters (which determine groundwater flow) and geophysical observables (which determine the response of the technique). Another is to have the information in a form that is useful for spatial decisions. This will often require inversion and interpretation of the geophysical data. Inversion reconstructs an image of the subsurface from the raw geophysical data. How closely the image reproduces the true subsurface structure or property of interest depends on the particular technique's resolution, depth of investigation, and sensor locations. Lastly, in some cases, interpretation of the geophysical data or inversion will be necessary to link the data to the variables that determine the outcome of the decision. Three examples are provided that illustrate different approaches and methods for addressing these challenges. In the examples, time-domain electromagnetic and electrical resistivity techniques are evaluated for their ability to assist in spatial decisions for aquifer management. The examples address three situations: aquifer vulnerability to surface-borne contaminants, managed aquifer recharge, and CO2/brine leakage (related to CO2 geologic sequestration activities). The methods presented here are transferable to other subsurface sciences and decisions that involve risk. Recent work has applied them to geothermal well siting using electromagnetic techniques. These approaches can also be applied to oil and mining spatial decisions, and they offer advantages over previous VOI work done for oil applications: they explicitly include geologic uncertainty modeling and simulate the physics of the considered geophysical technique.
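A minimal numerical sketch of the value-of-information idea with imperfect data, in the spirit of the framework described above: compare the best expected payoff with no data to the expected payoff when the decision follows the Bayesian posterior implied by an imperfect geophysical interpretation. The prior, payoff matrix, and reliability matrix are invented for illustration and are not values from the paper; by construction the resulting VOI is non-negative and shrinks toward zero as the interpretation approaches pure noise.

```python
# Value of imperfect information for a two-state, two-action decision.
import numpy as np

prior = np.array([0.3, 0.7])           # P(aquifer vulnerable), P(not vulnerable)
# Payoff[action, state]: rows = {protect, do nothing}, columns = states above.
payoff = np.array([[-2.0, -2.0],
                   [-10.0, 0.0]])

# Prior value: best expected payoff when no survey is collected.
v_prior = np.max(payoff @ prior)

# Reliability of the interpretation: P(interpreted state | true state).
reliability = np.array([[0.8, 0.1],    # rows: interpreted state, cols: true state
                        [0.2, 0.9]])

# Value with imperfect information: average over interpretations of the best
# expected payoff under the Bayesian posterior for each interpretation.
v_imperfect = 0.0
for i in range(2):
    joint = reliability[i] * prior     # P(interpretation = i, state)
    p_interp = joint.sum()
    posterior = joint / p_interp
    v_imperfect += p_interp * np.max(payoff @ posterior)

print(f"VOI of the imperfect survey: {v_imperfect - v_prior:.3f}")
```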

11.
12.
ABSTRACT: Increasing awareness about the problems brought on by urban sprawl has led to proactive measures to guide future development. Such efforts have largely been grouped under the term “Smart Growth.” Although not widely recognized as such, the “smart” in Smart Growth implies an optimization of some quantity or objective while undertaking new forms of urban development. In this study, we define Smart Growth as that development plan that leads to the optimal value of a precisely defined measure identified by a stakeholder or stakeholders. To illustrate a formal, quantitative framework for Smart Growth, this study develops definitions of optimal development from the perspectives of four different types of stakeholders: a government planner, a land developer, a hydrologist, and a conservationist subject to certain development constraints. Four different objective functions are posed that are consistent with each of these stakeholders' perspectives. We illustrate the differences in consequences on future development given these different objective functions in a stylized representation for Montgomery County, Maryland. Solutions to Smart Growth from the individual perspectives vary considerably. Tradeoff tables are presented that illustrate the consequences experienced by each stakeholder depending on the viewpoint that has been optimized. Although couched in the context of an illustrative example, this study emphasizes the need to apply rigorous, quantitative tools in a meaningful framework to address Smart Growth. The result is a tool that a range of parties can use to plan future development in ways that are environmentally and fiscally responsible and economically viable.
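As a toy illustration of the optimization framing described above, the sketch below solves one tiny land-allocation problem under three different stakeholder objectives and prints the resulting plans. All coefficients, constraints, and stakeholder labels are invented; they are not the study's objective functions or Montgomery County data.

```python
# Allocate 100 developable acres between dense and low-density housing, then
# optimize the same feasible region under different stakeholder objectives.
from scipy.optimize import linprog

A_ub = [[1, 1],      # total developable land: x_dense + x_low <= 100 acres
        [-1, -1]]    # minimum housing requirement: x_dense + x_low >= 60 acres
b_ub = [100, -60]
bounds = [(0, 80), (0, 80)]                      # zoning caps on each type

objectives = {                                   # linprog minimizes c @ x
    "developer (profit)": [-5.0, -3.0],          # maximize profit -> negate
    "hydrologist (runoff)": [2.0, 4.0],          # minimize a runoff index
    "planner (housing units)": [-12.0, -4.0],    # maximize units accommodated
}

for name, c in objectives.items():
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    print(f"{name:26s} plan (acres) = {res.x.round(1)}  objective = {res.fun:.1f}")
```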

13.
Debates on the role of biotechnology in food production are beset with notorious ambiguities. This already applies to the term “biotechnology” itself. Does it refer to the use and modification of living organisms in general, or rather to a specific set of technologies developed quite recently in the form of bioengineering and genetic modification? No less ambiguous are discussions concerning the question to what extent biotechnology must be regarded as “unnatural.” In this article it will be argued that, in order to disentangle some of the ambiguities involved, we have to broaden the temporal horizon of the debate. Ideas about biotechniques and naturalness have evolved in various socio-historical contexts, and their historical origins will determine to a considerable extent their actual meaning and use in contemporary deliberations. For this purpose, a comprehensive timetable is developed, beginning with the Neolithic revolution ~10,000 years ago (resulting in the emergence of agriculture and the Common Human Pattern) up to the biotech revolution as it has evolved from the 1970s onwards (sometimes referred to as a second “Genesis”). The concept of nature that emerged in the context of the “Common Human Pattern” differs considerably from traditional philosophical concepts of nature (such as the one coined by Aristotle), as well as from the scientific view of nature conveyed by the contemporary life sciences. A clarification of these different historical backdrops will allow us to understand and elucidate the conceptual ambiguities that are at work in contemporary debates on biotechnology and the place of human beings in nature.

14.
Over the last few years, commentators on all sides of the environmental debate have (with a few exceptions) joined hands to pillory the traditional model of environmental regulation in this country. The catch phrase “command and control” has become emblematic of everything that was seen as being wrong with the old system. The current push for “reinvention” of environmental regulation reinforces the concept that although traditional methods have produced progress, they have outlived their usefulness. At the same time, ISO 14000 has emerged as one of the hottest topics in the environmental field: a form of reinvention of environmental management that has been embraced by many as the solution to an array of problems. While it is still early in the game, initial results indicate that when used effectively, ISO 14000 can be a powerful tool for the environmental manager. However, misuse of ISO 14000 could represent a throwback to command and control rather than a management tool for the new millennium. This article examines how this new tool fits into the evolving picture of environmental regulation and management.

15.
ABSTRACT: In spite of the rather large volume of literature suggesting that non-efficiency objectives ought to be incorporated into water resource planning frameworks, little has been done to date. A partial explanation is that when goals are in conflict, we have no “objective” criteria upon which to make the trade-offs. Also, there are problems of measuring the degree to which various policy actions lead to achievement of various goals. Nevertheless, this paper argues that, given the magnitude of the possible gains from incorporating these considerations, considerable effort to overcome these problems is justified. Accordingly, we outline some procedures for making these trade-offs and suggest an alternative (practical) planning framework.

16.
Discussions about “disruptive” food controversies abound in popular and academic literatures, particularly with respect to meat production and consumption, yet there is little scholarship examining what makes an event disruptive in the first instance. Filling this gap will improve our understanding of how food controversies unfold and why certain issues may be more likely to linger in the public consciousness as opposed to others. I address these questions by using focus groups and in-depth interviews to analyze five potentially upsetting topics: dietary warnings about meat consumption, meat safety recalls, eating meat directly from the skull of the animal, the morality of killing animals for food, and the “pink slime” debate. Findings suggest that disruptive events involve negative affective reactions to safety hazards, disgust-provoking sensory cues, and/or ethical dilemmas. When these cues exist in isolation from one another, consumers' reactions are quite often short-lived, while the simultaneous presence of multiple disruptive elements in the context of a single issue or event can trigger a far stronger reaction.

17.
A set of simulation and optimization tools capable of analyzing the development and operation of a complex, multi-basin, interconnected water resource system is explained. These models provide valuable information regarding three important questions: (1) “When should new projects be built?” (2) “How big should they be?” and (3) “How should the system be operated?” Since these tools were developed by and for practicing engineers, their applicability to real-world problems is mandatory. To assure this, testing was done on an actual proposed project, the Texas Water System.

18.
“Fair and equitable benefit-sharing” is one of the objectives of the UN Convention on Biological Diversity and the FAO International Treaty on Plant Genetic Resources for Food and Agriculture. In essence, benefit-sharing holds that countries, farmers, and indigenous communities that grant access to their plant genetic resources and/or traditional knowledge should share in the benefits that users derive from these resources. But what exactly is understood by “fair” and “equitable” in this context? Neither term is defined in the international treaties. A complicating factor, furthermore, is that different motivations and perspectives exist with respect to the notion of benefit-sharing itself. This paper looks at six different approaches to benefit-sharing that can be extracted from the current debates on “Access and Benefit-Sharing.” These approaches form the basis of a philosophical reflection in which the different connotations of “fair and equitable” are considered, by analyzing the main principles of justice involved. Finally, the various principles are brought together in order to draw some conclusions as to how a fair and equitable benefit-sharing mechanism might best be realized. This results in several recommendations for policymakers.

19.
ABSTRACT: In this paper, we review recent experience with drought in south Florida and report some results of a study of the likely agricultural economic impacts of drought. Our conclusions can be summarized as follows. (1) Whether a period of low rainfall becomes a “drought” in south Florida is determined largely by institutional factors. (2) The impacts of a drought event depend on the rules the Water Management District uses to manage the event. If the rules involve effective reductions in irrigation supply, the financial impacts may be large, but they are sensitive to the way in which cutbacks are imposed. (3) Current drought management regulations do not appear to minimize the short-run cost of drought. (4) Current policies which seek to minimize the short-run cost of drought are inconsistent with dynamically optimal policies.

20.
ABSTRACT: The Ecosystem Management (EM) process belongs to the category of Multi-Criteria Decision Making (MCDM) problems. It requires appropriate decision support systems (DSS) in which “all interested people” would be involved in the decision making process. Environmental values critical to EM, such as biological diversity, health, productivity, and sustainability, have to be studied and play an important role in modeling ecosystem functions; human values and preferences also influence decision making. Public participation in decision and policy making is one of the elements that differentiate EM from traditional methods of management. Here, a methodology is presented for quantifying human preferences in EM decision making. The case study of the National Park of River Nestos Delta and Lakes Vistonida and Ismarida in Greece, presented as an application of this methodology, shows that the direct involvement of the public, the quantification of its preferences, and the decision maker's attitude provide a strong tool for the EM decision making process. Public preferences have been given certain weights, and three MCDM methods, namely the Expected Utility Method, Compromise Programming, and the Analytic Hierarchy Process, have been used to select alternative management solutions that lead to the best configuration of the ecosystem and are also socially acceptable.
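As an illustration of two of the MCDM building blocks named above, the sketch below derives criterion weights from an AHP-style pairwise comparison matrix (principal eigenvector) and then ranks alternatives by a simplified compromise programming distance to the ideal point. The comparison matrix, criteria, and alternative scores are invented; they are not the case-study data or the authors' weighting scheme.

```python
# AHP weights from a pairwise comparison matrix, then a compromise-programming
# style ranking of management alternatives (weighted L2 distance to the ideal).
import numpy as np

# Pairwise comparisons of three criteria (e.g., biodiversity, productivity, cost).
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                                  # AHP criterion weights (sum to 1)

# Alternative scores on each criterion, scaled to [0, 1] with 1 = best.
scores = np.array([[0.9, 0.4, 0.3],              # management alternative 1
                   [0.6, 0.8, 0.5],              # alternative 2
                   [0.3, 0.6, 0.9]])             # alternative 3

# Weighted L2 distance from the ideal point (all criteria at 1).
distance = np.sqrt(((w * (1.0 - scores)) ** 2).sum(axis=1))
print("weights:", w.round(3))
print("ranking (best first):", np.argsort(distance) + 1)
```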
