Similar Literature (20 items)
1.
ABSTRACT: A generalized skew map for Louisiana streams was developed using data from Louisiana, Mississippi, Arkansas, and Texas with 20 or more years of annual flood records. A comparison between the newly developed Louisiana Generalized Skew Map (LGSM) and the generalized skew map recommended by the U.S. Water Resources Council (WRCGSM) was performed. The mean square error for the LGSM was 16 percent less than that of WRCGSM in direct application of the two maps. Performance of the new map was compared with the WRCGSM and with a regional analysis procedure through its application to the Log Pearson Type 3 (LP3) distribution. Two-thirds of the stations tested had lower standardized root mean square deviations (SRMSD) by a narrow margin using the skew coefficients obtained from LGSM instead of WRCGSM. The regional analysis also performed as well as the LGSM in terms of SRMSD. Thus, it was concluded that both LGSM and the regional analysis provide a more reliable tool for flood frequency analysis for Louisiana streams with short annual flood records.
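
The comparison criterion here is a mean square error between map-derived and at-station skew coefficients. A minimal sketch of that criterion, with invented numbers (the study's data are not reproduced):

```python
import numpy as np

# Hypothetical at-station skews and the values read off each generalized
# skew map at the same station locations.
station_skew = np.array([0.10, -0.25, 0.40, 0.05, -0.10])
lgsm_skew    = np.array([0.15, -0.20, 0.30, 0.00, -0.05])
wrcgsm_skew  = np.array([0.30, -0.40, 0.10, 0.20, -0.30])

def mse(map_skew, observed):
    """Mean square error of map-derived skews against at-station skews."""
    return np.mean((map_skew - observed) ** 2)

print("MSE LGSM  :", mse(lgsm_skew, station_skew))
print("MSE WRCGSM:", mse(wrcgsm_skew, station_skew))
```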

2.
Regional procedures to estimate flood magnitudes for ungaged watersheds typically ignore available site-specific historic flood information such as high water marks and the corresponding flow estimates, otherwise referred to as limited site-specific historic (LSSH) flood data. A procedure to construct flood frequency curves on the basis of LSSH flood observations is presented. Simple inverse variance weighting is employed to systematically combine flood estimates obtained from the LSSH database with those from a regional procedure to obtain improved estimates of flood peaks on the ungaged watershed. For the region studied, the variance weighted estimates of flow had a lower logarithmic standard error than either the regional or the LSSH flow estimates, when compared to the estimates determined by three standard distributions for the gaged watersheds investigated in the development of the methodology. Use of the simple inverse variance weighting procedure is recommended when “reliable” estimates of LSSH floods for the ungaged site are available.
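
The combination step is standard inverse variance weighting: the estimate with the smaller variance receives the larger weight. A minimal sketch with hypothetical flow estimates and variances (names and numbers are illustrative, not from the paper):

```python
import numpy as np

def inverse_variance_weight(q_regional, var_regional, q_lssh, var_lssh):
    """Combine two flood-peak estimates by simple inverse variance weighting."""
    w_reg = 1.0 / var_regional
    w_lssh = 1.0 / var_lssh
    q_combined = (w_reg * q_regional + w_lssh * q_lssh) / (w_reg + w_lssh)
    var_combined = 1.0 / (w_reg + w_lssh)   # variance of the weighted estimate
    return q_combined, var_combined

# Hypothetical 100-year peak estimates (m^3/s) and their log-space variances
q, v = inverse_variance_weight(q_regional=850.0, var_regional=0.04,
                               q_lssh=720.0, var_lssh=0.02)
print(f"weighted estimate: {q:.0f} m^3/s, variance: {v:.4f}")
```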

3.
ABSTRACT: Sliding polynomials differ from other piecewise interpolation and smoothing methods in their functional continuity at the nodes. This functional continuity was used to establish optional spacing of nodes and optional boundary controls in data smoothing while still maintaining mathematically continuous rates or gradients. Cyclic as well as noncyclic data can be smoothed. Variance of the individual nodal values, derived through least-squares optimization, can be calculated using the rigorously determined weighting coefficients between data points and nodes. Such nodal variances are estimates of localized uncertainty in the data, which complement the localization of smoothing through use of piecewise functions. Choice of controls in smoothing and calculation of variance have been incorporated in a computer program for user convenience.
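
The sliding-polynomial formulation itself is not reproduced here; a least-squares spline with user-chosen interior knots illustrates the same ingredients under that substitution: piecewise polynomial smoothing, analyst-selected node spacing, and continuous gradients at the nodes.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)   # noisy synthetic data

# User-chosen interior nodes (knots); the spacing is at the analyst's option,
# mirroring the paper's "optional spacing of nodes".
nodes = np.array([2.0, 4.0, 6.0, 8.0])
spline = LSQUnivariateSpline(x, y, t=nodes, k=3)     # cubic pieces, C2-continuous

print("smoothed value at x=5   :", spline(5.0))
print("continuous gradient at 5:", spline.derivative()(5.0))
```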

4.
Traditionally, identification of the Muskingum routing coefficients has been based on observations of the linearity of a loop formed by graphically plotting a forward and a reverse path. This graphical procedure is time-consuming and may not minimize the error of estimation. A procedure was developed to overcome the drawbacks of the graphical method. This procedure calls for (a) the use of least-squares regression on the forward and reverse paths to determine their respective slopes, and (b) the use of a statistical t-test to evaluate the hypothesis that these two slopes are equal. The computational procedure is repeated, using incremental values of the flow weighting coefficient, x. A graph of the computed t-value versus x can be constructed. The optimal value of x, as read from the graph, occurs at the minimum computed t-value. The procedure proved superior to the graphical method in three illustrative examples, reducing the sum of squared errors by factors ranging from 5 to 6.
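
A sketch of the described procedure under stated assumptions (storage recovered from continuity by trapezoidal integration, limbs split at maximum storage, hydrograph data invented):

```python
import numpy as np
from scipy.stats import linregress

def best_muskingum_x(inflow, outflow, dt=1.0, xs=np.linspace(0.0, 0.5, 51)):
    """For each trial weighting coefficient x, regress storage on the weighted
    flow x*I + (1-x)*O separately over the forward (rising) and reverse
    (falling) paths, then compute a t-statistic for equality of the two
    slopes.  The optimal x minimizes that t-value; the slope estimates K."""
    # Storage from continuity, dS/dt = I - O (trapezoidal integration)
    net = (inflow[1:] + inflow[:-1]) - (outflow[1:] + outflow[:-1])
    storage = np.concatenate(([0.0], np.cumsum(0.5 * dt * net)))
    split = np.argmax(storage)                 # divide forward/reverse paths
    best = None
    for x in xs:
        w = x * inflow + (1.0 - x) * outflow
        fwd = linregress(w[:split + 1], storage[:split + 1])
        rev = linregress(w[split:], storage[split:])
        t = abs(fwd.slope - rev.slope) / np.hypot(fwd.stderr, rev.stderr)
        if best is None or t < best[1]:
            best = (x, t, 0.5 * (fwd.slope + rev.slope))
    return best

# Hypothetical hourly inflow/outflow hydrographs (m^3/s)
I = np.array([10, 30, 60, 45, 30, 20, 14, 11, 10], dtype=float)
O = np.array([10, 14, 28, 44, 40, 30, 22, 15, 11], dtype=float)
x_opt, t_min, K = best_muskingum_x(I, O)
print(f"x = {x_opt:.2f}, min t = {t_min:.2f}, K ~ {K:.2f} h")
```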

5.
In environmental impact assessment of policies and product design, results need to be presented in a comprehensible way to make alternatives easily comparable. One way of doing this is to aggregate results to a manageable set by using weighting methods. Valuing the environmental impacts can be a challenging task that can also be quite time-consuming. To aid practitioners, several weighting sets with readily available weights have been developed over the last decade. The scope and coverage of these sets vary, and it is important to be aware of the implications of using different valuation methods and weighting sets. The aim of this paper is to map valuation and weighting techniques and indicate the methods that are suitable to use, depending on the purpose of the analysis. Furthermore, we give an overview of sets of generic values or weights and their properties, and give an illustration of how different sets may influence the results. It is very useful to apply several weighting sets and discuss the results thoroughly. It is often an interesting and fruitful exercise to see if and how the results differ, why they differ, and which one seems to be the best alternative to base any recommendation on. The example provided in this article demonstrates that looking at aggregate results is not enough. Since many weighting sets are not sufficiently transparent as to how they are constructed and what their impact categories actually include, a general recommendation is to provide weighting sets with a declaration of content, giving a clear picture of what is included and what is not, together with a recommendation of suitable uses of the weighting set.
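
As a toy illustration of why the choice of weighting set matters, the sketch below scores two hypothetical alternatives with two hypothetical weighting sets and obtains opposite rankings (all numbers invented):

```python
import numpy as np

# Characterized impact scores for two hypothetical design alternatives
# (rows: global warming, acidification, eutrophication; assumed already
# normalized to common reference values so they can be weighted and summed).
impacts = {"design_A": np.array([1.2, 0.8, 0.5]),
           "design_B": np.array([0.9, 1.1, 0.9])}

# Two hypothetical weighting sets; real published sets differ in scope and
# construction, which is exactly the paper's point.
weight_sets = {"set_1": np.array([0.6, 0.3, 0.1]),
               "set_2": np.array([0.2, 0.3, 0.5])}

for ws_name, w in weight_sets.items():
    scores = {alt: float(w @ v) for alt, v in impacts.items()}
    ranking = sorted(scores, key=scores.get)       # lower impact is better
    print(ws_name, scores, "-> preferred:", ranking[0])
```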

6.
ABSTRACT: It was found that the conventional weighting factor application to hyetograph ordinates results in artificially attenuated storm patterns. A modified weighting procedure is suggested which allows adjustments in the storm timing, peak intensity, and volume but conserves the storm pattern observed at the raingage nearest to the watershed point of interest. The systematic underestimation of peak flood flows, which results from conventional hyetograph weighting, can be avoided by conserving the hyetograph shape from the raingage nearest to any subarea of a modeled watershed and merely applying weighting factors to the rainfall volumes and temporal centers of gravity of several hyetographs.
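
A sketch of the modified weighting idea (the paper gives no algorithm listing, so the helper below is a hypothetical reading): keep the nearest-gage pattern, rescale its depth to the weighted volume, and shift its timing to the weighted temporal center of gravity.

```python
import numpy as np

def adjust_hyetograph(nearest, volumes, centroids, weights, dt=1.0):
    """Conserve the nearest-gage storm shape; weight only volume and timing.
    Shifting by interpolation conserves volume as long as the pattern stays
    inside the time window."""
    t = np.arange(nearest.size) * dt
    target_volume = np.dot(weights, volumes)        # weighted storm volume
    target_centroid = np.dot(weights, centroids)    # weighted center of gravity
    scaled = nearest * target_volume / (nearest.sum() * dt)
    shift = target_centroid - np.sum(t * scaled) / scaled.sum()
    return np.interp(t - shift, t, scaled, left=0.0, right=0.0)

# Hypothetical: nearest-gage ordinates (mm/h) plus volumes (mm) and centroid
# times (h) from three gages with areal weights.
pattern = np.array([0.0, 2.0, 8.0, 4.0, 1.0, 0.0])
adj = adjust_hyetograph(pattern, volumes=np.array([14.0, 10.0, 12.0]),
                        centroids=np.array([2.1, 2.6, 2.4]),
                        weights=np.array([0.5, 0.3, 0.2]))
print(adj.round(2))   # same peaked shape, adjusted volume and timing
```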

7.
Predictive models of wildlife-habitat relationships often have been developed without being tested. The apparent classification accuracy of such models can be optimistically biased and misleading. Data resampling methods exist that yield a more realistic estimate of model classification accuracy. These methods are simple and require no new sample data. We illustrate these methods (cross-validation, jackknife resampling, and bootstrap resampling) with computer simulation to demonstrate the increase in precision of the estimate. The bootstrap method is then applied to field data as a technique for model comparison. We recommend that biologists use some resampling procedure to evaluate wildlife habitat models prior to field evaluation.
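
These resampling estimators are standard; a minimal sketch with synthetic data and scikit-learn (the model and data are stand-ins, not the paper's), contrasting apparent accuracy with cross-validated and bootstrap estimates:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical habitat data: rows are sites, columns are habitat variables,
# labels are species presence/absence.
X, y = make_classification(n_samples=120, n_features=5, random_state=1)
model = LogisticRegression(max_iter=1000)

# Apparent accuracy: fit and test on the same data (optimistically biased).
apparent = model.fit(X, y).score(X, y)

# Cross-validation: accuracy on held-out folds is more realistic.
cv_acc = cross_val_score(model, X, y, cv=10).mean()

# Bootstrap: refit on resamples, test on the cases left out of each resample.
rng = np.random.default_rng(1)
accs = []
for _ in range(200):
    idx = rng.integers(0, len(y), len(y))
    oob = np.setdiff1d(np.arange(len(y)), idx)      # out-of-bag test cases
    accs.append(model.fit(X[idx], y[idx]).score(X[oob], y[oob]))

print(f"apparent={apparent:.2f}  cross-val={cv_acc:.2f}  "
      f"bootstrap={np.mean(accs):.2f}")
```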

8.
Standard procedures for evaluating environmental impact involve comparison between before-and-after conditions or scenarios, or between treatment and control site pairs. In many cases, however, endogenous directional change (natural succession) is expected to occur at a significant rate over the period of concern, particularly for man-made systems such as impoundments. Static evaluations do not provide an adequate approach to such problems. A new evaluation frame is proposed. Nominal system behavior over time is characterized by a stochastic envelope around a nominal trajectory. We show that both the state variance and the sampling variance can change over time. In this context, environmental regulations can be framed as constraints, targets, or conformance to ideal trajectories. Statistical tests for determining noncompliance are explored relative to process variance, sample error, and sample size. Criteria are elucidated for choosing properties to monitor, sample size, and sampling interval.
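
One plausible reading of such a test, sketched with hypothetical numbers: compare an observed sample mean at one time point to the nominal (successional) trajectory, with a standard error that combines state (process) variance and sampling variance, both of which may change over time.

```python
import numpy as np
from scipy.stats import norm

def noncompliance_test(obs_mean, n, nominal, state_var, sampling_var,
                       alpha=0.05):
    """Flag noncompliance if the observed mean falls outside the stochastic
    envelope around the nominal trajectory at this time point."""
    se = np.sqrt(state_var + sampling_var / n)
    z = (obs_mean - nominal) / se
    return abs(z) > norm.ppf(1.0 - alpha / 2.0), z

# Hypothetical year-5 nominal chlorophyll level vs. monitoring data
flag, z = noncompliance_test(obs_mean=18.0, n=12, nominal=12.0,
                             state_var=4.0, sampling_var=9.0)
print("noncompliant:", flag, " z =", round(z, 2))
```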

9.
ABSTRACT: Customarily, it has been assumed that hydraulic conductivity is a stationary, homogeneous stochastic process with a finite variance for stochastic analysis of solute transport in the subsurface. Field data analyses, however, have suggested that the distribution of hydraulic conductivity may exhibit fractal behavior with long-range correlations. This motivates us to further investigate how the fractal behavior of the permeability distribution impacts solute transport in porous media. This study provides longitudinal and transverse macrodispersivity coefficients and the variance of the solute concentration. Longitudinal and transverse macrodispersivity coefficients are found to depend strongly on the fractal dimension (D) of logarithmic hydraulic conductivity (logK). Both macrodispersivity coefficients are highest when D = 1, and the values decrease monotonically to zero at D = 2. Both coefficients scale with the characteristic length of the logK distribution and are thus scale-dependent parameters. The ratio of the transverse to the longitudinal macrodispersivity coefficient is on the order of 10⁻¹ to 10⁻⁴. Concentration variance also decreases with the fractal dimension of logK. There is no spatial spreading of solute for D = 2, and the concentration variance reaches zero for this case.

10.
ABSTRACT: Many hydrologic models have input data requirements that are difficult to satisfy for all but a few well-instrumented, experimental watersheds. In this study, point soil moisture in a mountain watershed with various types of vegetative cover was modeled using a generalized regression model. Information on surficial characteristics of the watershed was obtained by applying fuzzy set theory to a database consisting only of satellite imagery and a digital elevation model (DEM). The fuzzy-c algorithm separated the watershed into distinguishable classes and provided regression coefficients for each ground pixel. The regression model used the coefficients to estimate distributed soil moisture over the entire watershed. A soil moisture accounting model was used to resolve temporal differences between measurements at prototypical measurement sites and validation sites. The results were reasonably accurate for all classes in the watershed. The spatial distribution of soil moisture estimates corresponded closely with soil moisture measurements at validation sites on the watershed. It was concluded that use of the regression model to distribute soil moisture from a specified number of points can be combined with satellite and DEM information to provide a reasonable estimate of the spatial distribution of soil moisture for a watershed.
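
The fuzzy-c algorithm is fuzzy c-means clustering; a compact sketch on invented pixel features (not the paper's data or exact implementation), where each pixel's membership grades could then feed class-specific regression coefficients:

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: every pixel gets a membership grade in each
    class; memberships over classes sum to one."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)
    for _ in range(iters):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        # u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        U = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0)),
                         axis=1)
    return centers, U

# Hypothetical pixel features: [NDVI from satellite, elevation, slope from DEM]
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(mu, 0.1, (50, 3)) for mu in (0.2, 0.5, 0.8)])
centers, U = fuzzy_c_means(X)
print("class memberships of first pixel:", U[:, 0].round(2))
```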

11.
Pesticide residue detection in food has long been a difficult challenge: interference from impurities is extensive, residue levels are low, and traditional extraction methods often compromise detection results because they cannot separate out the impurities. Because foods contain many kinds of impurities, pesticides are numerous and vary widely in physicochemical properties, and new pesticides keep emerging, ever higher demands are placed on sample pretreatment. In recent years, QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) has emerged as a new extraction method; owing to the characteristics its name describes, it has become a sample pretreatment technique widely adopted in China and abroad and has been applied extensively in the gas and liquid chromatographic analysis of many pesticides, pharmaceuticals, and veterinary drugs. This paper reviews domestic and international applications and methodological improvements of the QuEChERS method for pesticide residue detection in various foods and other agricultural products of plant and animal origin, and discusses the application prospects and future directions of the QuEChERS method in pesticide residue detection.

12.
ABSTRACT: The Pearson type 3 (P3) and log Pearson type 3 (LP3) distributions are very frequently used in flood frequency analysis. Existing methods for constructing confidence intervals for quantiles (Xp) of these two distributions are very crude. Most of these methods are based on the idea of adjusting confidence intervals for quantiles Yp of the normal distribution to obtain approximate confidence intervals for quantiles Xp of the P3/LP3 distribution. Since there is no theoretical reason why this “base” distribution, Y, should be taken to be normal, we search in the present study for the best possible base distribution for producing confidence intervals for P3/LP3 quantiles. We consider a group of base distributions such as the normal, log normal, Weibull, Gumbel, and exponential. We first assume the skew coefficient γ of X to be known, and develop a method for adjusting confidence intervals for Yp to produce approximate confidence intervals for Xp. We then compare this method (Method A) with another method (Method B) introduced by Stedinger. Simulation shows that the performance of each of these two methods depends on the base distribution Y being used, but as a whole, the normal distribution appears to be the best base distribution for producing confidence intervals for P3/LP3 quantiles when γ is assumed to be known. We then extend our method (Method A) to the more important case of unknown coefficient of skewness. It is shown that by taking Y to be Weibull, fairly accurate confidence intervals for P3/LP3 quantiles can be obtained for quite a wide range of sample sizes and coefficients of skewness commonly found in hydrology. The case of the P3 distribution with negative skewness needs further research.
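
For the normal base distribution, the exact confidence interval for a quantile Yp uses the noncentral t distribution; the sketch below shows only that base step on hypothetical log flows. The paper's contribution, the adjustment from the base quantile Yp to the P3/LP3 quantile Xp, is not reproduced here.

```python
import numpy as np
from scipy.stats import norm, nct

def normal_base_ci(logQ, p=0.99, conf=0.90):
    """Confidence interval for the p-quantile of a NORMAL variable:
    bounds = mean + (s/sqrt(n)) * quantiles of a noncentral t with
    df = n-1 and noncentrality z_p*sqrt(n).  Applied to log flows this is
    only the normal-base starting point, exact for LP3 only if skew = 0."""
    n, ybar, s = len(logQ), np.mean(logQ), np.std(logQ, ddof=1)
    ncp = norm.ppf(p) * np.sqrt(n)
    lo = ybar + s / np.sqrt(n) * nct.ppf((1 - conf) / 2, df=n - 1, nc=ncp)
    hi = ybar + s / np.sqrt(n) * nct.ppf((1 + conf) / 2, df=n - 1, nc=ncp)
    return 10 ** lo, 10 ** hi     # back-transform to flow units

# Hypothetical annual peak flows (m^3/s)
flows = np.array([320, 450, 280, 610, 390, 520, 300, 700, 410, 360,
                  480, 550, 330, 640, 420, 370, 500, 590, 310, 460])
print(normal_base_ci(np.log10(flows), p=0.99, conf=0.90))
```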

13.
A landscape-level approach was applied to eight rural watersheds to assess the role that wetlands play in reducing phosphorus loading to surface waters in the Lake Champlain Basin. Variables summarizing various characteristics of wetlands within a watershed were calculated using a geographic information system and then compared to measured phosphorus loading through multiple regression analyses. The inclusion of a variable based on the area of riparian wetlands located along low- and medium-order streams in conjunction with the area of agricultural and nonwetland forested lands explained 88% of the variance in phosphorus loading to surface waters. The best fit model coefficients (Pload = 0.86Ag + 0.64For – 30Ripwet + 160) suggest that a hectare of riparian wetland may be many times more important in reducing phosphorus than an agricultural hectare is in producing phosphorus. These results provide additional support for the concept that protection of riparian wetlands is an important management strategy for controlling stream water quality in multiuse landscapes.
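
A worked reading of the reported best-fit model. Units are an assumption (areas in hectares, loading in kg/yr; the abstract does not state them), but the coefficient comparison is what the text emphasizes: one riparian-wetland hectare (-30) offsets roughly 35 agricultural hectares (+0.86 each).

```python
# Best-fit model from the abstract; units assumed, coefficients as published.
def p_load(ag_ha, forest_ha, riparian_wetland_ha):
    return 0.86 * ag_ha + 0.64 * forest_ha - 30.0 * riparian_wetland_ha + 160.0

# Hypothetical watershed: 2,000 ha agriculture, 3,000 ha nonwetland forest
print(p_load(2000, 3000, riparian_wetland_ha=0))    # 3800.0
print(p_load(2000, 3000, riparian_wetland_ha=20))   # 3200.0 with 20 wetland ha
```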

14.
Computer display technology is currently in a state of transition, as the traditional technology of cathode ray tubes is being replaced by liquid crystal display flat-panel technology. Technology substitution and process innovation require the evaluation of the trade-offs among environmental impact, cost, and engineering performance attributes. General impact assessment methodologies, decision analysis and management tools, and optimization methods commonly used in engineering cannot efficiently address the issues needed for such evaluation. The conventional Life Cycle Assessment (LCA) process often generates results that can be subject to multiple interpretations, although the advantages of the LCA concept and framework have gained wide recognition. In the present work, the LCA concept is integrated with Quality Function Deployment (QFD), a popular industrial quality management tool, which is used as the framework for the development of our integrated model. The problem of weighting is addressed by using pairwise comparison of stakeholder preferences. Thus, this paper presents a new integrated analytical approach, Integrated Industrial Ecology Function Deployment (I2-EFD), to assess the environmental behavior of alternative technologies in correlation with their performance and economic characteristics. Computer display technology is used as the case study to further develop our methodology through the modification and integration of various quality management tools (e.g., process mapping, prioritization matrix) and statistical methods (e.g., multi-attribute analysis, cluster analysis). Life cycle thinking provides the foundation for our methodology, as we utilize a published LCA report, which stopped at the characterization step, as our starting point. Further, we evaluate the validity and feasibility of our methodology by considering uncertainty and conducting sensitivity analysis.
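
Pairwise-comparison weighting is commonly solved with the principal-eigenvector (AHP-style) method; a sketch with an invented three-criterion comparison matrix (the paper's elicited matrices are not reproduced):

```python
import numpy as np

# Entry [i, j] states how much more important criterion i is than criterion j;
# hypothetical criteria: environmental impact, cost, engineering performance.
A = np.array([[1.0, 3.0, 2.0],
              [1/3, 1.0, 1/2],
              [1/2, 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w /= w.sum()                        # normalized priority weights
print("weights (env, cost, perf):", w.round(3))   # roughly [0.54, 0.16, 0.30]
```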

15.
ABSTRACT: In an earlier paper [1], the invariant imbedding concept was applied to the dynamic modeling of stream quality. In this approach, a set of weighting functions is introduced, and the initial conditions for these weighting functions must be estimated. It has been found that these initial conditions influence the convergence rate tremendously. In many water quality control situations, the number of experimental data points is limited. In order to obtain the best estimates with limited experimental data, the best convergence rate should be used. In this work, the least squares criterion combined with various optimization techniques is used to obtain the optimal initial conditions for the weighting functions. It is shown that the proposed schemes greatly improve the convergence rate.
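
The invariant-imbedding weighting functions themselves are not reproduced here; the sketch below shows only the generic idea on a stand-in first-order decay model: treat the initial condition as a decision variable and minimize a least-squares criterion against limited data.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

# Hypothetical stream-quality observations (e.g., a DO-deficit proxy)
t_obs = np.array([0.5, 1.0, 2.0, 3.0])
y_obs = np.array([7.8, 6.1, 3.9, 2.4])

def sse(y0):
    """Least-squares criterion as a function of the initial condition."""
    sol = solve_ivp(lambda t, y: -0.45 * y, (0.0, 3.0), y0, t_eval=t_obs)
    return np.sum((sol.y[0] - y_obs) ** 2)

res = minimize(sse, x0=[5.0], method="Nelder-Mead")
print("optimal initial condition:", round(float(res.x[0]), 2),
      " SSE:", round(float(res.fun), 3))
```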

16.
This study examines cointegration and Granger causality among global oil prices, precious metal (gold, platinum, and silver) prices, and the Indian Rupee–US Dollar exchange rate using daily data spanning from 2nd January 2009 to 30th December 2011. ARDL bounds tests indicate that the series are cointegrated. The Toda–Yamamoto version of Granger causality has been employed to establish the causation amongst the variables. The study also examines generalized error variance decomposition of the variables due to various shocks in the system. Such information provides insight into the transmission links between the global oil market and the Indian precious metals and foreign exchange markets. These findings have potentially significant implications for further research, portfolio management, and central bank policy design.
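
A sketch of the Toda–Yamamoto step on simulated stand-in series (the actual price data are not reproduced): fit a lag-augmented equation in levels by OLS, then Wald-test only the first p lags of the candidate cause, letting the extra d_max lag absorb possible unit roots.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def toda_yamamoto(y, x, p=2, d_max=1):
    """Wald test of 'x does not Granger-cause y' in a VAR equation of order
    p + d_max, restricting only the first p lags of x."""
    df = pd.DataFrame({"y": y, "x": x})
    for k in range(1, p + d_max + 1):
        df[f"y_l{k}"] = df["y"].shift(k)
        df[f"x_l{k}"] = df["x"].shift(k)
    df = df.dropna()
    exog = sm.add_constant(df.drop(columns=["y", "x"]))
    res = sm.OLS(df["y"], exog).fit()
    hypothesis = ", ".join(f"x_l{k} = 0" for k in range(1, p + 1))
    return res.wald_test(hypothesis, use_f=True)

# Hypothetical daily series standing in for oil prices and the INR/USD rate
rng = np.random.default_rng(0)
oil = np.cumsum(rng.normal(size=750))
fx = np.zeros_like(oil)
for t in range(1, len(oil)):
    fx[t] = 0.9 * fx[t - 1] + 0.1 * oil[t - 1] + rng.normal()
print(toda_yamamoto(fx, oil, p=2, d_max=1))   # does oil drive the rate?
```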

17.
ABSTRACT: The Chubb/Bauman (Ch/B) method for making quantitative estimates of recreation potential for rivers is based on the 1968/69 Leopold method for quantitative assessment of the scenic beauty of rivers. Both use classifications of environmental variables as the database. Unlike the Leopold method, the classifications used in the Ch/B method consistently reflect human preferences. The Ch/B method collects information on 67 variables, and uses a computer program to produce estimates of potential for 16 common recreation activities. This critique evaluates selected concepts and procedures of the Ch/B method, partly by comparison with other available methods of recreation resource inventory. It considers the validity and utility of numerical weighting of variables, the use of numbers derived from place in a classification, and the transformation process. The quantitative techniques of the method exhibit serious flaws. Much of the data produced by the method appears to be quantitative but in fact is not, and the method does not produce truly quantitative estimates of recreation potential. Classifications of generalized geographic or environmental variables are shown to have serious defects as a basis for evaluation of recreational potential.

18.
Elevated nitrate concentrations in streamwater are a major environmental management problem. While land use exerts a large control on stream nitrate, hydrology often plays an equally important role. To date, predictions of low-flow nitrate in ungauged watersheds have been poor because of the difficulty of describing the uniqueness of watershed hydrology over large areas. Clearly, hydrologic response varies depending on the states and stocks of water, flow pathways, and residence times. How to capture the dominant hydrological controls that combine with land use to define streamwater nitrate concentration is a major research challenge. This paper tests the new Hydrologic Landscape Regions (HLRs) watershed classification scheme of Wolock and others (Environmental Management 34:S71-S88, 2004) to address the question: Can HLRs be used as a way to predict low-flow nitrate? We also test a number of other indexes, including inverse-distance weighting of land use and the well-known topographic index (TI), to address the question: How do other terrain and land use measures compare to HLRs in their ability to predict low-flow nitrate concentration? We test this for 76 watersheds in western Oregon using data from the U.S. Environmental Protection Agency’s Environmental Monitoring and Assessment Program and Regional Environmental Monitoring and Assessment Program. We found that HLRs did not significantly improve nitrate predictions beyond the standard TI and land-use metrics. Using TI and inverse-distance weighting did not improve nitrate predictions; the best models combined percentage land use with elevation. We did, however, see an improvement in chloride predictions using HLRs, TI, and inverse-distance weighting; adding HLRs and TI significantly improved model predictions, and the best models used inverse-distance weighting and elevation. One interesting result of this study is that elevation consistently predicted nitrate better than TI or the hydrologic classification scheme.
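
Inverse-distance weighting of land use can be read as down-weighting each cell's contribution to a watershed land-use metric by its distance to the stream; a minimal sketch with invented cells:

```python
import numpy as np

def idw_landuse_fraction(is_target, dist_to_stream, power=1.0):
    """Distance-weighted land-use fraction: near-stream cells count more than
    a plain areal percentage would give them."""
    w = 1.0 / np.maximum(dist_to_stream, 1.0) ** power   # avoid divide-by-zero
    return np.sum(w * is_target) / np.sum(w)

# Hypothetical grid cells: 1 = agriculture, distances (m) to the stream
landuse = np.array([1, 1, 0, 0, 1, 0, 0, 0])
dist = np.array([30.0, 60.0, 90.0, 120.0, 150.0, 200.0, 400.0, 800.0])
print("plain fraction    :", landuse.mean())                     # 0.375
print("distance-weighted :", round(idw_landuse_fraction(landuse, dist), 3))
```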

19.
Abstract: This study investigates the regional analysis of the annual maximum flood series of 48 stream gauging stations in the basins of the West Mediterranean Region in Turkey. The region is divided into three homogeneous subregions according to both the Student's t-test and the Dalrymple homogeneity test. Regional relationships are obtained between mean annual flood per unit area and drainage area, and between the coefficient of skew and the coefficient of variation: two statistically meaningful relationships of mean flood per unit area versus drainage area exist, along with a unique relationship between the skewness and variation coefficients. Results show that the index-flood method may be applicable to each homogeneous subregion to estimate flood quantiles in the study area.
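
A sketch of the index-flood estimate for an ungauged site, assuming a Gumbel regional growth curve (the paper does not specify its growth-curve form; the parameters here are invented):

```python
from scipy.stats import gumbel_r

def index_flood_quantile(mean_annual_flood, growth_params, T):
    """Index-flood method: a dimensionless regional growth factor for return
    period T, multiplied by the site's index flood (its mean annual flood)."""
    loc, scale = growth_params
    growth_factor = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    return mean_annual_flood * growth_factor

# Hypothetical dimensionless growth-curve parameters for one homogeneous
# subregion; the index flood for an ungauged site would come from the
# mean-annual-flood-per-unit-area vs. drainage-area relationship.
params = (0.85, 0.35)
for T in (10, 50, 100):
    print(T, "yr:", round(index_flood_quantile(120.0, params, T), 1), "m^3/s")
```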

20.
Due to increasing empirical information on farm animal welfare since the 1960s, the prospects for sound decision making concerning welfare have improved. This paper describes a strategy to develop a decision-making aid, a decision support system, for assessment of farm-animal welfare based on available scientific knowledge. Such a decision support system allows many factors to be taken into account. It is to be developed according to the Evolutionary Prototyping Method, in which an initial prototype is improved in reiterative updating cycles. This initial prototype has been constructed. It uses hierarchical representations to analyse scientific statements and statements describing the housing system. Welfare is assessed from what is known about the biological needs of the animals, using a welfare model in the form of a tree that contains these needs as welfare components. Each state of need is assessed using welfare-relevant attributes of the housing system and weighting factors. Attributes are measurable properties of the housing system. Weighting factors are assigned according to heuristic rules based on the principle of weighting all components (attributes and needs) equally, unless there are strong reasons to do otherwise. Preliminary tests of the prototype indicate that it may be possible to perform assessment of farm-animal welfare in an explicit way and based on empirical findings. The procedure needs to be refined, but its prospects are promising.
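
A minimal sketch of the equal-weighting heuristic on an invented needs tree (structure, attribute names, and scores are all hypothetical, not the prototype's):

```python
# Tree of biological needs, each scored from housing-system attributes.
welfare_tree = {
    "feeding":  {"trough_space_ok": 0.8, "feed_frequency_ok": 0.6},
    "resting":  {"lying_area_ok": 0.9, "floor_quality_ok": 0.7},
    "movement": {"space_per_animal_ok": 0.4},
}

def assess(tree, need_weights=None):
    """Score each need as the equally weighted mean of its attribute scores,
    then combine needs with equal weights by default, per the heuristic of
    weighting all components equally unless there are strong reasons not to."""
    need_scores = {need: sum(attrs.values()) / len(attrs)
                   for need, attrs in tree.items()}
    w = need_weights or {need: 1.0 / len(tree) for need in tree}
    return need_scores, sum(w[n] * s for n, s in need_scores.items())

scores, overall = assess(welfare_tree)
print(scores, "-> overall welfare score:", round(overall, 2))
```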
