301.
ABSTRACT: The Great Plains of the United States, drained primarily by the Missouri River, are very sensitive to shifts in climate. The six main stem dams on the Missouri River control more than one-half of the nearly 1.5 million square kilometer basin and can store three times the annual inflow from upstream. The dams are operated by the U.S. Army Corps of Engineers using a Master Manual that describes system priorities and benefits. The complex operational rules were incorporated into the Soil and Water Assessment Tool computer model (SWAT). SWAT is a distributed parameter rainfall-runoff model capable of simulating the transpiration suppression effects of CO2 enrichment. The new reservoir algorithms were calibrated using a 25-year historical record of basin climate and discharge. Results demonstrate that it is possible to incorporate the operation of a highly regulated river system into a complex rainfall-runoff model. The algorithms were then tested using extreme climate scenarios indicative of a prolonged drought, a short drought, and a ten percent increase in basin-wide precipitation. It is apparent that the rules for operating the reservoirs will likely require modification if, for example, upper-basin precipitation were to increase by only ten percent under changed climate conditions.
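The reservoir-operation logic described above can be illustrated with a minimal sketch of a rule-based release calculation, assuming a single storage with an operating target and a bounded release; the function name, thresholds, and data below are illustrative placeholders, not SWAT's actual algorithms or the Master Manual's rules.

```python
# Minimal sketch of a rule-based reservoir release: draw storage down toward
# an operating target within a minimum/maximum release band, and spill any
# volume above capacity. All values are illustrative.

def simulate_reservoir(inflows, capacity, target, release_min, release_max):
    """Route an inflow series (volume per step) through one reservoir."""
    storage, releases = 0.6 * capacity, []
    for q_in in inflows:
        storage += q_in
        # Release toward the target, bounded by the allowed band and by storage.
        release = min(max(storage - target, release_min), release_max, storage)
        storage -= release
        if storage > capacity:            # uncontrolled spill above capacity
            release += storage - capacity
            storage = capacity
        releases.append(release)
    return releases

if __name__ == "__main__":
    print(simulate_reservoir([5, 8, 2, 1, 0.5], capacity=30, target=18,
                             release_min=1, release_max=6))
```

A full implementation would add the seasonal priorities of the Master Manual and route the releases to downstream reaches.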
302.
We present a logistic regression approach for forecasting the probability of future groundwater levels declining below, or remaining below, specified groundwater-level thresholds. We tested our approach on 102 groundwater wells in different climatic regions and aquifers of the United States that are part of the U.S. Geological Survey Groundwater Climate Response Network. We evaluated the importance of current groundwater levels, precipitation, streamflow, seasonal variability, the Palmer Drought Severity Index, and atmosphere/ocean indices for developing the logistic regression equations. Several diagnostics of model fit were used to evaluate the regression equations, including testing of autocorrelation of residuals, goodness-of-fit metrics, and bootstrap validation testing. The probabilistic predictions were most successful at wells with high persistence (low month-to-month variability) in their groundwater records and at wells where the groundwater level remained below the defined low threshold for sustained periods (generally three months or longer). The model fit was weakest at wells with strong seasonal variability in levels and with shorter-duration low-threshold events. We identified challenges in deriving probabilistic-forecasting models and possible approaches for addressing those challenges.
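As a rough illustration of the approach, the sketch below fits a logistic regression that predicts whether a groundwater level falls below a low threshold from the current level, a precipitation index, and seasonal terms; the predictor set and synthetic data are assumptions for the example, not the published model.

```python
# Minimal sketch of the threshold-probability idea: logistic regression on
# current level, precipitation, and seasonal harmonics. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 240                                    # 20 years of monthly records
level = rng.normal(0, 1, n)                # standardized current level
precip = rng.gamma(2.0, 1.0, n)            # recent precipitation index
month = np.arange(n) % 12
X = np.column_stack([level, precip,
                     np.sin(2 * np.pi * month / 12),   # seasonal terms
                     np.cos(2 * np.pi * month / 12)])
# Outcome: next month's level below the low threshold (synthetic rule).
y = (level - 0.3 * precip + rng.normal(0, 0.5, n)) < -0.8

model = LogisticRegression().fit(X, y)
print("P(below threshold), first 5 months:",
      model.predict_proba(X[:5])[:, 1].round(2))
```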
303.
304.
To establish a monitoring method for tracking long-term changes in the amount of anthropogenic contamination in a district of Bavaria (Germany), a biomonitoring campaign with honey bees was performed in spring 2002. Expected anomalies from industry or residential areas in the sampled district could not be detected. An anomaly over a considerable part of the sampling area, correlating with other phenomena, led to the hypothesis of a prehistoric cosmic impact. Moreover, a principal component analysis of the data showed evidence for a biogenic, an anthropogenic, and an unknown component hypothetically related to a possible cosmic impact.
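A minimal sketch of the principal component analysis step is shown below, assuming a matrix of element concentrations measured at hive sites; the element list and data are synthetic placeholders, not the campaign's measurements.

```python
# Minimal PCA sketch: standardize element concentrations per site and extract
# a few components whose loadings can be interpreted (e.g., biogenic vs.
# anthropogenic). Elements and data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
elements = ["Pb", "Cd", "Zn", "Fe", "Ni", "Cr"]
X = rng.lognormal(mean=0.0, sigma=0.5, size=(50, len(elements)))  # 50 hive sites

pca = PCA(n_components=3)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
print("component 1 loadings:", dict(zip(elements, pca.components_[0].round(2))))
print("first site scores:", scores[0].round(2))
```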
305.
In the 2011 measurement inter-comparison of the radio-frequency composite field-strength project of the national radiation environment monitoring network, robust statistical methods were used to analyze 78 comparison results from 36 participating laboratories. The results show that, across the national monitoring institutions, 87.3% of the results were satisfactory, 7.7% were questionable, and 5% were outliers. Several factors that may affect the monitoring results, such as instrument model, were also examined. The comparison reflects the current measurement capability of the national radiation environment monitoring network for radio-frequency electromagnetic fields and provides a scientific basis for further quality-assurance work.
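A minimal sketch of the robust screening idea is given below, using the median and scaled MAD as robust location and scale and the common proficiency-testing convention (|z| ≤ 2 satisfactory, 2 < |z| < 3 questionable, |z| ≥ 3 outlier); the exact robust algorithm and data used in the comparison may differ.

```python
# Robust z-score screening of inter-comparison results using median and
# scaled MAD; classification thresholds follow the usual proficiency-testing
# convention. Readings are illustrative.
import numpy as np

def robust_z_scores(results):
    x = np.asarray(results, dtype=float)
    center = np.median(x)
    scale = 1.4826 * np.median(np.abs(x - center))   # MAD scaled to sigma
    return (x - center) / scale

def classify(z):
    if abs(z) <= 2:
        return "satisfactory"
    return "questionable" if abs(z) < 3 else "outlier"

readings = [5.1, 5.3, 4.9, 5.0, 5.2, 7.8, 5.1, 4.8]   # field strength (V/m)
for r, z in zip(readings, robust_z_scores(readings)):
    print(f"{r:4.1f}  z={z:+5.2f}  {classify(z)}")
```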
306.
Taking waste-gas pollution accounting in the coal-fired power generation industry as an example, this paper introduces the design principles of a quasi-expert system for environmental statistics. By combining the domain knowledge of environmental statistics experts with modern computing technology, such a quasi-expert system can assist grassroots environmental statistics staff and improve the quality of environmental statistics data. Grassroots environmental statistics departments in China urgently need an easy-to-use and reasonably accurate tool of this kind; the quasi-expert system explores a low-cost, easily popularized path toward improving the quality of environmental statistics data under current national conditions.
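The rule-based accounting idea behind such a quasi-expert system might look like the sketch below, which estimates SO2 emissions for a coal-fired unit from activity data and flags implausible reported figures; all coefficients, tolerances, and names are illustrative placeholders, not official emission factors or the system's actual rules.

```python
# Illustrative rule-based check for reported SO2 emissions from a coal-fired
# unit. Factors and tolerances are placeholders, not official coefficients.

def so2_emission_tons(coal_tons, sulfur_pct, desulfurized, removal_eff=0.9):
    """SO2 ~ coal burned * S fraction * 2 (S -> SO2 mass) * (1 - control)."""
    raw = coal_tons * (sulfur_pct / 100.0) * 2.0
    return raw * (1.0 - removal_eff) if desulfurized else raw

def check_report(reported, estimated, tolerance=0.25):
    """Flag reported figures that deviate too far from the rule-based estimate."""
    if estimated == 0:
        return "cannot check"
    return "plausible" if abs(reported - estimated) / estimated <= tolerance else "review"

est = so2_emission_tons(coal_tons=1_200_000, sulfur_pct=0.8, desulfurized=True)
print(f"estimated SO2: {est:.0f} t, reported 2300 t -> {check_report(2300, est)}")
```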
307.
Based on the 2015 on-site comparison and verification results for 53 field ozone analyzers in nine cities, the applications of robust statistics and conventional statistics to evaluating the accuracy and precision of automatic ozone monitoring data in the national monitoring network were compared. The study shows that robust statistics can reduce the influence of outliers on the evaluation of ozone data quality without excluding abnormal data, and is therefore suitable for evaluating on-site comparison results. Using Huber's method for the robust analysis, the 95% confidence interval of the relative bias of routine ozone concentration readings in the national network in 2015 was about -0.1% to 4.5%, the 95% prediction interval was -14.0% to 18.3%, and the coefficient of variation was about 9.5%, indicating that the data quality still has room for improvement.
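A minimal sketch of Huber-type robust estimation (in the spirit of ISO 13528 Algorithm A) is shown below: values beyond the center ± 1.5 × scale are winsorized and the mean and standard deviation are recomputed iteratively; the study's exact implementation and data may differ.

```python
# Huber-type robust mean and standard deviation via iterative winsorization.
# Constants 1.5, 1.483, and 1.134 follow the Algorithm A convention.
import numpy as np

def huber_robust(x, tol=1e-6, max_iter=100):
    x = np.asarray(x, dtype=float)
    center = np.median(x)
    scale = 1.483 * np.median(np.abs(x - center))
    for _ in range(max_iter):
        lo, hi = center - 1.5 * scale, center + 1.5 * scale
        w = np.clip(x, lo, hi)                       # winsorize the tails
        new_center, new_scale = w.mean(), 1.134 * w.std(ddof=1)
        if abs(new_center - center) < tol and abs(new_scale - scale) < tol:
            break
        center, scale = new_center, new_scale
    return center, scale

biases = [0.5, 1.2, -0.3, 2.1, 0.8, 15.0, 1.1, 0.2]   # relative bias (%) per analyzer
print("robust mean %.2f%%, robust sd %.2f%%" % huber_robust(biases))
```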
308.
Habitat loss is considered one of the primary causes of species extinction, especially for species that also suffer from an epidemic disease. Little attention has been paid to the combined effect of habitat loss and epidemic transmission on the spatiotemporal dynamics of species. Here, a spatial model of the parasite–host/prey–predator eco-epidemiological system with habitat loss was studied. Habitat patches in the model, instead of undergoing a random loss, were spatially clustered to different degrees. Not only the quantity of habitat loss but also its clustering degree was shown to affect the equilibrium of the system. The infection rate and the probability of successful predation were key to determining the spatial patterns of the species. The epidemic disease is more likely to break out if only a small fraction of suitable patches is lost. Counter-intuitively, infected prey are more sensitive to habitat loss than predators if the lost patches are highly clustered. This result is new to eco-epidemiology and implies the possibility of using the spatial arrangement of suitable (or unsuitable) patches to control the spread of epidemics in ecological systems.
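The habitat-loss setup can be sketched as follows: a fixed fraction of lattice cells is removed either at random or as clusters grown from a few seeds, so the clustering degree of the loss can be varied; the grid size and the cluster-growth rule are illustrative assumptions, not the published model.

```python
# Remove a fraction of lattice cells either at random or as spatially
# clustered blocks grown from random seeds (fewer seeds = higher clustering).
import numpy as np

def lose_habitat(size=50, loss_frac=0.2, clustered=True, n_seeds=5, seed=0):
    rng = np.random.default_rng(seed)
    habitat = np.ones((size, size), dtype=bool)        # True = suitable patch
    n_lost = int(loss_frac * size * size)
    if not clustered:
        idx = rng.choice(size * size, n_lost, replace=False)
        habitat.flat[idx] = False
        return habitat
    frontier = [tuple(rng.integers(0, size, 2)) for _ in range(n_seeds)]
    lost = set()
    while len(lost) < n_lost and frontier:
        i, j = frontier.pop(rng.integers(len(frontier)))
        if (i, j) in lost:
            continue
        lost.add((i, j))
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # grow to neighbors
            ni, nj = (i + di) % size, (j + dj) % size
            if (ni, nj) not in lost:
                frontier.append((ni, nj))
    for i, j in lost:
        habitat[i, j] = False
    return habitat

print("remaining suitable cells:", lose_habitat(clustered=True).sum())
```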
309.
The analysis of large data sets concerning fires in various forested areas of the world has shown that burned areas can often be described by different power-law distributions for small, medium, and large fires, and that a scaling law for the time intervals separating successive fires holds. Attempts to derive such statistical laws from purely theoretical arguments have not been fully successful so far, most likely because important physical and/or biological factors controlling forest fires were not taken into account. By contrast, the two-layer spatially extended forest model we propose in this paper encapsulates the main characteristics of vegetational growth and fire ignition and propagation, and supports the empirically discovered statistical laws. Since the model is fully deterministic and spatially homogeneous, the emergence of the power and scaling laws does not seem to require meteorological randomness or geophysical heterogeneity, although these factors certainly amplify the chaotic character of the fires. Moreover, the analysis suggests that the existence of different power laws for fires of various scales might be due to the two-layer structure of the forest, which allows the formation of different kinds of fires, i.e., surface, crown, and mixed fires.
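Checking for a power-law tail in burned-area data can be sketched with the standard maximum-likelihood estimate of the exponent above a lower cutoff, alpha = 1 + n / Σ ln(x_i / x_min); the data below are synthetic and the estimator is a generic one, not the paper's analysis.

```python
# Maximum-likelihood estimate of a power-law exponent for the tail of a
# burned-area distribution above a chosen cutoff. Data are synthetic.
import numpy as np

def powerlaw_alpha(areas, xmin):
    x = np.asarray(areas, dtype=float)
    x = x[x >= xmin]
    return 1.0 + len(x) / np.sum(np.log(x / xmin))

rng = np.random.default_rng(2)
# Synthetic burned areas drawn from a Pareto tail with exponent 2.4.
areas = (1.0 - rng.random(5000)) ** (-1.0 / (2.4 - 1.0))
print("estimated exponent:", round(powerlaw_alpha(areas, xmin=1.0), 2))
```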
310.
Dry atmospheric deposition contributes a significant amount of phosphorus to the Everglades of South Florida. Measurement of this deposition is problematic, because samples often are contaminated to varying degrees by bird droppings and other foreign materials. This study attempted to detect and remove outliers in phosphorus (P) flux rates measured from dry deposition samples. Visual inspection of the samples, recorded in field notes, found that 30.1% of the samples contained animal droppings and frogs. Some of the samples with droppings and frogs (2.3%) had P values greater than 884 μg P m−2 d−1 (a value twice the standard deviation of the raw data mean) and were removed from further analysis. Outlier detection statistics based on a linear regression were then used for additional data screening. Eight of the 19 stations in the network were removed because high contamination precluded the use of the regression model. Of the remaining samples, 15.7% were identified through the regression procedure as contaminated and were removed. The 11-station mean for P dry deposition was 85.8±79.0 μg P m−2 d−1 prior to the regression analysis and 74.8±75.1 μg P m−2 d−1 after removal.
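The regression-based screening step can be sketched as follows: regress a station's P flux on a reference series and flag samples whose standardized residuals exceed a cutoff; the choice of predictor, the cutoff of 2, and the data below are illustrative assumptions, not the study's exact procedure.

```python
# Regression-based outlier screening: flag samples whose standardized
# residuals from an OLS fit exceed a cutoff. Data are synthetic.
import numpy as np

def flag_outliers(y, x, cutoff=2.0):
    slope, intercept = np.polyfit(x, y, 1)          # ordinary least squares
    resid = y - (slope * x + intercept)
    z = resid / resid.std(ddof=2)                   # standardized residuals
    return np.abs(z) > cutoff

rng = np.random.default_rng(3)
reference = rng.gamma(3.0, 25.0, 60)                # e.g., network-median flux
station = 0.9 * reference + rng.normal(0, 10, 60)   # one station's flux series
station[[7, 41]] += 400                             # contaminated samples
print("flagged sample indices:", np.where(flag_outliers(station, reference))[0])
```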