A total of 732 results were found; entries 181–190 are shown below.
181.
ABSTRACT: In geohydrology, three-dimensional surfaces are typically represented as a series of contours. Water levels, saturated thickness, precipitation, and geological formation boundaries are a few examples of this practice. These surfaces start as point measurements, which are then interpolated between the known points. This first step typically creates a raster or a set of grid points. In modeling, subsequent processing uses these to represent the shape of a surface. For display, they are usually converted to contour lines. Unfortunately, in many field applications, the (x, y) location on the earth's surface is known much less confidently than the data in the z dimension. To test the influence of (x, y) locational accuracy on z-dimension point predictions and their resulting contours, a Monte Carlo study was performed on water level data from northwestern Kansas. Four levels of (x, y) uncertainty were tested, ranging in accuracy from one arc-minute (± 2384 feet in the x dimension and ± 3036 feet in the y dimension) to Global Positioning System (GPS) accuracy (± 20 feet for relatively low-cost systems). These span the range of common levels of locational uncertainty in data available to hydrologists in the United States. This work examines the influence that locational uncertainty can have on both point predictions and contour lines. Results indicate that overall mean error exhibits little sensitivity to locational uncertainty. However, measures of spread and maximum errors in the z domain are greatly affected. In practical application, this implies that estimates over large regions should be asymptotically consistent; however, local errors in z can be quite large and increase with (x, y) uncertainty.
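To make the procedure concrete, the following is a minimal sketch (not the study's code) of propagating (x, y) locational uncertainty through a gridded interpolation by Monte Carlo. The station data, grid, and uniform error model are assumptions, and only the ± one-arc-minute level is shown.

    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(0)

    # Hypothetical water-level stations: x, y in feet, z = head in feet.
    x = rng.uniform(0, 50_000, 200)
    y = rng.uniform(0, 50_000, 200)
    z = 3000 - 0.01 * x + 0.005 * y + rng.normal(0, 2, 200)

    # Regular grid on which the surface is estimated.
    gx, gy = np.meshgrid(np.linspace(5_000, 45_000, 50),
                         np.linspace(5_000, 45_000, 50))

    def interpolate(xs, ys):
        """Interpolate z onto the grid from (possibly perturbed) locations."""
        return griddata((xs, ys), z, (gx, gy), method="linear")

    baseline = interpolate(x, y)

    # One uncertainty level from the abstract: +/- 2384 ft in x, +/- 3036 ft in y.
    dx, dy, n_trials = 2384.0, 3036.0, 200
    stack = np.empty((n_trials,) + baseline.shape)
    for i in range(n_trials):
        xp = x + rng.uniform(-dx, dx, x.size)
        yp = y + rng.uniform(-dy, dy, y.size)
        stack[i] = interpolate(xp, yp)

    err = stack - baseline
    print("mean z error (ft):   ", np.nanmean(err))        # small overall bias
    print("std of z error (ft): ", np.nanstd(err))          # spread grows with dx, dy
    print("max |z error| (ft):  ", np.nanmax(np.abs(err)))  # local errors can be large

Repeating this for each of the four uncertainty levels reproduces the kind of spread-versus-accuracy comparison the abstract describes.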
182.
ABSTRACT: Mass balance models have been common tools in lake quality management for some years. However, verification for use on reservoirs, especially in the Western United States, has been seriously lacking. In this study, such a verification is attempted using data from the U.S. EPA National Eutrophication Survey. Several models from the literature are compared for accuracy in application to the western reservoir data. Model standard error and correlation between estimated and observed reservoir phosphorus concentrations are the criteria used for comparison. Standard errors are further used to calculate uncertainty of trophic state classification based on estimated phosphorus concentration. The model proposed by Dillon and Rigler (1974) proved most accurate, with a correlation coefficient of 0.86 and a standard error of 0.2, based on logarithmically transformed values. Deficiencies in the other models appear to stem from coefficients fit to lake data and from inappropriate model formulation.
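For context, one common statement of the Dillon and Rigler (1974) mass-balance model is sketched below; the input values and the single log residual computed at the end are illustrative only and are not taken from the Eutrophication Survey data.

    import math

    def dillon_rigler_p(loading, retention, mean_depth, flushing_rate):
        """Predicted in-lake phosphorus concentration (g/m^3, i.e. mg/L).

        loading        -- areal P loading L (g P per m^2 per year)
        retention      -- retention coefficient R (fraction of load retained)
        mean_depth     -- mean depth z-bar (m)
        flushing_rate  -- flushing rate rho (per year)
        """
        return loading * (1.0 - retention) / (mean_depth * flushing_rate)

    # Illustrative reservoir: L = 1.2 g/m^2/yr, R = 0.4, z = 8 m, rho = 2.5 /yr.
    p_pred = dillon_rigler_p(1.2, 0.4, 8.0, 2.5)
    print(f"predicted P: {p_pred * 1000:.0f} ug/L")

    # The comparison criterion in the abstract is based on log10-transformed
    # concentrations, i.e. residuals of the following form:
    p_obs = 0.030                            # observed concentration, mg/L (made up)
    log_residual = math.log10(p_pred) - math.log10(p_obs)
    print(f"log10 residual: {log_residual:.2f}")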
183.
ABSTRACT: Sliding polynomials differ from other piecewise interpolation and smoothing methods in their functional continuity at the nodes. This functional continuity was used to establish optional spacing of nodes and optional boundary controls in data smoothing while still maintaining mathematically continuous rates or gradients. Cyclic as well as noncyclic data can be smoothed. Variance of the individual nodal values, derived through least-squares optimization, can be calculated using the rigorously determined weighting coefficients between data points and nodes. Such nodal variances are estimates of localized uncertainty in the data, which complement the localization of smoothing through use of piecewise functions. Choice of controls in smoothing and calculation of variance have been incorporated in a computer program for user convenience.
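The sliding-polynomial formulation itself is not reproduced here, but the core ideas — nodal values fitted by least squares and nodal variances derived from the weighting of data onto nodes — can be illustrated with a simpler piecewise-linear (hat-function) smoother. The data, node spacing, and noise level below are made up.

    import numpy as np

    rng = np.random.default_rng(4)
    x = np.sort(rng.uniform(0, 10, 120))
    y = np.sin(x) + rng.normal(0, 0.2, x.size)         # noisy observations

    nodes = np.linspace(0, 10, 9)                      # user-chosen node spacing

    def hat_basis(x, nodes):
        """Design matrix of piecewise-linear 'hat' functions centred on the nodes."""
        A = np.zeros((x.size, nodes.size))
        for j, nj in enumerate(nodes):
            if j > 0:                                  # rising limb from the left node
                left = nodes[j - 1]
                m = (x >= left) & (x <= nj)
                A[m, j] = (x[m] - left) / (nj - left)
            if j < nodes.size - 1:                     # falling limb to the right node
                right = nodes[j + 1]
                m = (x > nj) & (x <= right)
                A[m, j] = (right - x[m]) / (right - nj)
        A[x <= nodes[0], 0] = 1.0                      # flat extension beyond the ends
        A[x >= nodes[-1], -1] = 1.0
        return A

    A = hat_basis(x, nodes)
    coef, res, *_ = np.linalg.lstsq(A, y, rcond=None)  # fitted nodal values
    sigma2 = res[0] / (x.size - nodes.size)            # residual variance
    cov = sigma2 * np.linalg.inv(A.T @ A)              # covariance of nodal values

    for nj, cj, vj in zip(nodes, coef, np.diag(cov)):
        print(f"node {nj:4.1f}: value {cj:6.3f}, std {np.sqrt(vj):.3f}")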
184.
ABSTRACT: Recent advances in water quality modeling have pointed out the need for stochastic models to simulate the probabilistic nature of water quality. However, often all that is needed is an estimate of the uncertainty in predicting water quality variables. First-order analysis is a simple method of providing an estimate of the uncertainty in a deterministic model due to uncertain parameters. The method is applied to the simplified Streeter-Phelps equations for DO and BOD; a more complete Monte Carlo simulation is used to check the accuracy of the results. The first-order analysis is found to give accurate estimates of means and variances of DO and BOD up to travel times exceeding the critical time. Uncertainties in travel time and the BOD decay constant are found to be most important for small travel times; uncertainty in the reaeration coefficient dominates near the critical time. Uncertainty in temperature was found to be a negligible source of uncertainty in DO for all travel times.
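A minimal sketch of first-order (first-order second-moment) uncertainty propagation through the Streeter-Phelps deficit equation, checked against Monte Carlo, is given below; the parameter means and standard deviations are assumed values, not those used in the paper.

    import numpy as np

    def do_deficit(t, kd, ka, L0, D0):
        """Streeter-Phelps dissolved-oxygen deficit (mg/L) at travel time t (days)."""
        return (kd * L0 / (ka - kd) * (np.exp(-kd * t) - np.exp(-ka * t))
                + D0 * np.exp(-ka * t))

    t = 2.0                                             # travel time, days
    means = dict(kd=0.3, ka=0.7, L0=20.0, D0=2.0)       # decay, reaeration, BOD, deficit
    stds  = dict(kd=0.03, ka=0.07, L0=2.0, D0=0.3)      # assumed parameter uncertainty

    # First-order variance: sum of (dD/dtheta)^2 * Var(theta), with the partial
    # derivatives taken by central differences.
    var_fo = 0.0
    for name in means:
        h = 1e-4 * means[name]
        hi = dict(means); hi[name] += h
        lo = dict(means); lo[name] -= h
        dD_dtheta = (do_deficit(t, **hi) - do_deficit(t, **lo)) / (2 * h)
        var_fo += (dD_dtheta * stds[name]) ** 2

    # Monte Carlo check with independent normally distributed parameters.
    rng = np.random.default_rng(1)
    samples = {k: rng.normal(means[k], stds[k], 50_000) for k in means}
    D_mc = do_deficit(t, **samples)

    print(f"first-order std of deficit: {np.sqrt(var_fo):.3f} mg/L")
    print(f"Monte Carlo std of deficit: {D_mc.std():.3f} mg/L")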
185.
A procedure is outlined that allows consideration of both objective and subjective indicators to establish priorities for plan implementation in water resource development. The objective procedure utilizes stepwise multiple discriminant analysis to predict community performance regarding planned project implementation, based on previous project implementation in the Northeast. The subjective procedure incorporates prior probabilities developed by the planner, based on observation and experience gained through the planning process. The proposed analysis could eliminate waste through better allocation of planning funds to implementation studies exhibiting a higher probability of early implementation.
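The sketch below illustrates only the flavor of that combination: a (non-stepwise) linear discriminant classifier predicts implementation performance, and the planner's subjective prior probabilities are imposed on it. The predictors, labels, and prior values are hypothetical.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(5)

    # Hypothetical historical communities: [income index, project cost index].
    X_implemented = rng.normal([1.2, 0.8], 0.3, size=(30, 2))
    X_stalled = rng.normal([0.8, 1.2], 0.3, size=(30, 2))
    X = np.vstack([X_implemented, X_stalled])
    y = np.array([1] * 30 + [0] * 30)      # 1 = implemented early, 0 = stalled

    # Planner's subjective priors: say 40% of proposed projects implemented early.
    lda = LinearDiscriminantAnalysis(priors=[0.6, 0.4])
    lda.fit(X, y)

    # Score a new candidate community.
    candidate = np.array([[1.1, 0.9]])
    p_implement = lda.predict_proba(candidate)[0, 1]
    print(f"estimated probability of early implementation: {p_implement:.2f}")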
186.
Legislation on the protection of biodiversity (e.g., the European Union Habitats and Birds Directives) increasingly requires ecological impact assessment of human activities. However, knowledge and understanding of relevant ecological processes and species' responses to different types of impact are often incomplete. In this paper we demonstrate with a case study how impact assessment can be carried out for situations where data are scarce but some expert knowledge is available. The case study involves two amphibian species, the great crested newt (Triturus cristatus) and the natterjack toad (Bufo calamita), in the Meinweg nature reserve in the Netherlands, where plans are being developed to reopen an old railway track called the Iron Rhine. We assess the effects of this railway track and its proposed alternatives (scenarios) on the metapopulation extinction time and the occupancy times of the patches for both species using a discrete-time stochastic metapopulation model. We quantify the model parameters using expert knowledge and extrapolated data. Because of our uncertainty about these parameter values, we perform a Monte Carlo uncertainty analysis. This yields an estimate of the probability distribution of the model predictions and insight into the contribution of each distinguished source of uncertainty to this probability distribution. We show that with a simple metapopulation model and an extensive uncertainty analysis it is possible to detect the least harmful scenario. The ranking of the different scenarios is consistent. Thus, uncertainty analysis can enhance the role of ecological impact assessment in decision making by making explicit to what extent incomplete knowledge affects predictions.
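A generic discrete-time stochastic patch-occupancy simulation with a Monte Carlo uncertainty analysis, in the spirit of the abstract but not the authors' model, might look as follows; the patch layout, extinction and colonization parameters, and their uncertainty ranges are all assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    n_patches = 8
    coords = rng.uniform(0, 10, (n_patches, 2))           # patch locations, km
    dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    kernel = np.exp(-0.5 * dist)                          # distance-decay connectivity

    def extinction_time(p_ext, colon_scale, max_years=200):
        """Years until all patches are empty in one stochastic run (censored)."""
        occupied = np.ones(n_patches, dtype=bool)         # start fully occupied
        for year in range(1, max_years + 1):
            connectivity = kernel @ occupied - occupied   # contribution of other patches
            p_col = 1.0 - np.exp(-colon_scale * connectivity)
            survive = occupied & (rng.random(n_patches) > p_ext)
            colonize = ~occupied & (rng.random(n_patches) < p_col)
            occupied = survive | colonize
            if not occupied.any():
                return year
        return max_years                                  # still extant: censored value

    # Monte Carlo uncertainty analysis: draw the uncertain parameters, then
    # average over demographic stochasticity for each draw.
    times = []
    for _ in range(100):
        p_ext = rng.uniform(0.2, 0.5)                     # uncertain local extinction prob.
        colon_scale = rng.uniform(0.05, 0.3)              # uncertain colonization strength
        times.append(np.mean([extinction_time(p_ext, colon_scale) for _ in range(10)]))

    print("median predicted extinction time (yr):", np.median(times))
    print("90% interval (yr):", np.percentile(times, [5, 95]))

Comparing such distributions across scenarios (e.g., with different extinction or colonization parameters per scenario) is what allows the least harmful alternative to be identified despite parameter uncertainty.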
187.
ABSTRACT: Methods of computing probabilities of extreme events that affect the design of major engineering structures have been developed for most failure causes, but not for design floods such as the probable maximum flood (PMF). Probabilities for PMF estimates would be useful for economic studies and risk assessments. Reasons for the reluctance of some hydrologists to assign a probability to a PMF are discussed, and alternative methods of assigning a probability are reviewed. Currently, the extrapolation of a frequency curve appears to be the most practical alternative. Using 46 stations in the Mid-Atlantic region, the log-gamma, log-normal, and log-Gumbel distributions were used to estimate PMF probabilities. A 600,000-year return period appears to be reasonable for PMFs in the Mid-Atlantic region. The coefficient of skew accounts for much of the variation in computed probabilities.
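As an illustration of the frequency-curve extrapolation the abstract favors, the sketch below fits a log-normal distribution to a synthetic annual-peak series and reads off the return period of a PMF-scale discharge; neither the series nor the PMF value comes from the 46-station data set.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Synthetic annual peak discharges (cfs), roughly log-normal.
    log_peaks = rng.normal(np.log(20_000), 0.5, 60)

    # Fit a log-normal frequency curve by the method of moments in log space.
    mu, sigma = log_peaks.mean(), log_peaks.std(ddof=1)

    # Probable-maximum-flood estimate (an illustrative value far above the record).
    pmf = 200_000.0

    # Annual exceedance probability and return period read off the fitted curve.
    z = (np.log(pmf) - mu) / sigma
    aep = stats.norm.sf(z)                  # P(annual peak > PMF)
    print(f"annual exceedance probability: {aep:.2e}")
    print(f"return period: {1.0 / aep:,.0f} years")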
188.
Uncertainty evaluation for the determination of total nitrogen in water by ultraviolet spectrophotometry (cited 2 times: 0 self-citations, 2 by others)
Using the basic methods and procedures of measurement uncertainty evaluation, the factors affecting the uncertainty of total nitrogen determination by ultraviolet spectrophotometry are analyzed, a mathematical model is established, and the combined uncertainty of the total nitrogen result is calculated.
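A generic sketch of the final combination step — root-sum-of-squares of the standard uncertainty components, then expansion with k = 2 — is shown below; the component values are placeholders, not the paper's uncertainty budget.

    import math

    concentration = 2.45              # measured total nitrogen, mg/L (illustrative)

    # Relative standard uncertainties of the main components (all assumed values):
    components = {
        "calibration curve": 0.012,
        "standard solution": 0.005,
        "volumetric glassware": 0.004,
        "sample volume": 0.003,
        "repeatability": 0.008,
    }

    u_rel = math.sqrt(sum(u ** 2 for u in components.values()))
    u_c = concentration * u_rel       # combined standard uncertainty, mg/L
    U = 2 * u_c                       # expanded uncertainty, k = 2 (~95% coverage)

    print(f"combined relative uncertainty: {u_rel:.1%}")
    print(f"result: {concentration:.2f} mg/L +/- {U:.2f} mg/L (k = 2)")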
189.
The precision and total uncertainty of NH3-N results measured in real water samples were examined to compare the rapid Kjeldahl method with Nessler's reagent spectrophotometry at a 95% confidence level. Two-sided Dixon tests, together with precision and uncertainty checks, showed no significant difference between the results of the two methods, which are therefore consistent.
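The following sketch shows the kind of checks described: a two-sided Dixon Q test on each method's replicates, then a paired t-test for agreement at the 95% confidence level. The replicate values and the tabulated critical value are illustrative, and this is not the paper's exact procedure.

    import numpy as np
    from scipy import stats

    # Replicate NH3-N results (mg/L) for one sample by the two methods (made up).
    kjeldahl = np.array([1.52, 1.49, 1.55, 1.51, 1.53, 1.50])
    nessler = np.array([1.50, 1.48, 1.54, 1.52, 1.51, 1.49])

    def dixon_q(values):
        """Q statistics for the lowest and highest observation (r10 form)."""
        v = np.sort(values)
        spread = v[-1] - v[0]
        return (v[1] - v[0]) / spread, (v[-1] - v[-2]) / spread

    q_crit = 0.625                    # commonly tabulated 95% value for n = 6
    for name, data in (("Kjeldahl", kjeldahl), ("Nessler", nessler)):
        q_low, q_high = dixon_q(data)
        print(f"{name}: Q_low={q_low:.3f}, Q_high={q_high:.3f}, "
              f"outlier={max(q_low, q_high) > q_crit}")

    # Paired comparison of the two methods at the 95% confidence level.
    t_stat, p_val = stats.ttest_rel(kjeldahl, nessler)
    verdict = "no significant difference" if p_val > 0.05 else "significant difference"
    print(f"paired t-test: t={t_stat:.2f}, p={p_val:.3f} -> {verdict}")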
190.
Uncertainty analysis for the determination of mercury in water by atomic fluorescence spectrometry (cited 2 times: 0 self-citations, 2 by others)
The uncertainty in determining mercury in water by atomic fluorescence spectrometry was analyzed: the factors affecting the uncertainty were identified, and the measurement uncertainty was calculated and evaluated. The results show that the dominant contributor is the uncertainty associated with the atomic fluorescence intensity reading; the other factors are secondary.