91.
This study aimed to identify the distribution of metals and the factors influencing metal concentrations in incineration residues. Bottom ash and fly ash were sampled from 19 stoker and seven fluidized bed incinerators, selected to cover a range of furnace capacities, furnace temperatures, and input wastes. The results showed that shredded bulky waste in the input increased the concentrations of some metals, such as Cd and Pb, an effect confirmed by analysis of the shredded bulky waste itself. During MSW incineration, lithophilic metals such as Fe, Cu, Cr, and Al remained mainly in the bottom ash, while Cd volatilized from the furnace and condensed onto the fly ash. About two thirds of the Pb and Zn were found in the bottom ash despite their high volatility. Finally, based on these results, the amount of metal in the incineration residues of MSW was calculated and the loss of metal was estimated in terms of both mass and monetary value. A considerable amount of metal was found to be lost as waste material through landfilling of incineration residues.
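A minimal mass-balance sketch of the kind of metal-loss estimate the abstract describes; it is not the study's actual procedure, and every concentration, ash yield, and price below is an illustrative placeholder.

```python
# Hypothetical mass-balance sketch: estimate metal mass and value lost to landfill
# via incineration residues. All numbers are illustrative placeholders, not study data.

waste_incinerated_t = 1_000_000                       # t MSW per year (assumed)
ash_yield = {"bottom_ash": 0.10, "fly_ash": 0.03}     # t ash per t MSW (assumed)

# Assumed metal concentrations in each residue, mg per kg of ash
conc_mg_per_kg = {
    "Cu": {"bottom_ash": 2500, "fly_ash": 600},
    "Zn": {"bottom_ash": 4000, "fly_ash": 9000},
    "Pb": {"bottom_ash": 1200, "fly_ash": 3000},
}
price_usd_per_t = {"Cu": 8000, "Zn": 2500, "Pb": 2000}  # assumed market prices

for metal, by_residue in conc_mg_per_kg.items():
    mass_t = sum(
        waste_incinerated_t * ash_yield[res] * c * 1e-6   # mg/kg of ash -> t of metal
        for res, c in by_residue.items()
    )
    value_usd = mass_t * price_usd_per_t[metal]
    print(f"{metal}: {mass_t:,.0f} t lost to residues, ~${value_usd:,.0f}")
```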
92.
ABSTRACT: An important international Niagara River management issue concerns the allocation of the river's average discharge of 202,000 cubic feet per second between hydroelectric power and scenic purposes. Major water diversions from Niagara Falls are necessary for power production. Flow is allocated under the 1950 Niagara Treaty, which is intended to maximize power benefits while preserving and enhancing the scenic falls spectacle. This paper examines the extent to which the Treaty objectives have been achieved. Based on analyses of government documents, engineering data, and falls-viewing patterns, it is concluded that the 1950 Treaty led to enhancement of the falls spectacle and increased power generation. Significant additional power diversions, however, are probably attainable without adverse effect on the existing falls spectacle. Reducing daytime summer Horseshoe Falls flow and scheduling spring and autumn flows according to viewing patterns are possible means of increasing power diversions. Existing generating facilities could use considerably more water, and the value of additional Niagara hydroelectricity is very high in terms of generation-cost savings over alternative power sources. Because of the cultural importance of the falls, Treaty modifications to permit increased power diversions are not recommended without prior public opinion sampling and on-site viewing experiments. These findings highlight the need for more careful study before long-term international agreements are concluded and illustrate the need for more flexible treaty arrangements that permit periodic adjustments to changing conditions.
93.
Background: The use of natural gas has increased in recent years. In the future, its import supply and transport structure will diversify (longer distances, a higher share of LNG (liquefied natural gas), new pipelines). The process chain and the GHG emissions of production, processing, transport, and distribution might therefore change. Simultaneously, the injection of bio methane into the natural gas grid is becoming more important. Although its combustion is regarded as climate neutral, GHG emissions are caused during the production of bio methane. The GHG emissions occurring along the process chain of energy fuels are relevant to the climate policy discussion and to decision-making processes. They are becoming even more important in view of the new EU Fuel Quality Directive (Dec. 2008), which aims at controlling the emissions of fuel process chains.
Aim: Against this background, the aim is to determine the future development of gas supply for Germany and the resulting changes in GHG emissions over the whole process chain of natural gas and bio methane. With the help of two gas consumption scenarios and an LCA of bio methane, the amount of future emissions and the emission paths until 2030 can be assessed and used to guide decision processes in energy policy.
Results and discussion: The process chain of bio methane and its future technical development are outlined and the related emissions calculated. The analysis is based on an accompanying research study on the injection of bio methane into the German gas grid. Two types of biogas plants have been considered, of which the "optimised technology" is assumed to dominate the future market; this is the plant type that widely exploits the process-optimisation potential of the current "state of the art" plant. The specific GHG emissions of the process chain can thus be nearly halved, from currently 27.8 t CO2-eq./TJ to 14.8 t CO2-eq./TJ in 2030. The GHG emissions of the natural gas process chain have been analysed in detail in a previous article. Significant modifications and a decrease in specific emissions are possible, depending on the level of investment in the modernisation of the gas infrastructure and on process improvements. These mitigation options might offset the emission increase resulting from longer transport distances and more energy-intensive processes. In the last section, two scenarios (low and high consumption) illustrate the possible development of the German gas supply until 2030, given an overall bio methane share of 8–12 %. Using the dynamic emission factors calculated in the former sections, the overall gas emissions and the average specific emissions of the German gas supply can be given. The current emissions of 215.4 million t CO2-eq. are reduced by 25 % in the low-consumption scenario (162 million t CO2-eq.), where consumption is reduced by 17 %. Assuming a consumption that is increased by 17 % in 2030, emissions are around 7 % higher (230.9 million t CO2-eq.) than today.
Conclusions: Gaseous fuels will still play a significant role in the German energy supply over the next two decades. The GHG emissions depend mainly on the amount of gas used; energy efficiency will therefore be a key issue in the climate and energy policy discussion. A higher share of bio methane and high investments in mitigation and best available technologies can significantly reduce the emissions of the process chain. The combustion of bio methane is climate neutral, compared to 56 t CO2/TJ caused by the direct combustion of natural gas (or 111 t CO2/TJ emitted by lignite). The advantage of gaseous energy carriers, which have the lowest GHG emissions among fossil fuels, remains. This holds true for fossil natural gas alone as well as for the expected future blend with bio methane.
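The scenario figures above combine an emission factor for the gas mix with total consumption. The sketch below illustrates that arithmetic only; the emission factors quoted are taken from the abstract, while the consumption level, the bio methane share in the example call, and the fossil-chain factor including upstream emissions are assumed placeholders, not study data.

```python
# Illustrative scenario arithmetic for blended gas-supply emissions.

EF_NATURAL_GAS_COMBUSTION = 56.0   # t CO2/TJ, direct combustion (from abstract)
EF_BIOMETHANE_CHAIN_2030 = 14.8    # t CO2-eq./TJ, bio methane process chain 2030 (from abstract)
EF_LIGNITE = 111.0                 # t CO2/TJ, for comparison (from abstract)

def supply_emissions(consumption_pj, biomethane_share, ef_fossil_chain):
    """Total GHG emissions (Mt CO2-eq.) of a gas supply mix.

    consumption_pj   -- annual gas consumption in PJ (assumed input)
    biomethane_share -- fraction of bio methane in the mix (0..1)
    ef_fossil_chain  -- combustion + upstream emission factor for fossil
                        natural gas, t CO2-eq./TJ (assumed input)
    Bio methane combustion is treated as climate neutral, so only its
    process-chain emissions are counted.
    """
    tj = consumption_pj * 1_000
    ef_mix = (1 - biomethane_share) * ef_fossil_chain + biomethane_share * EF_BIOMETHANE_CHAIN_2030
    return tj * ef_mix / 1e6   # t -> Mt

# Hypothetical example: 3,000 PJ consumption, 10 % bio methane, 65 t CO2-eq./TJ fossil chain.
print(f"{supply_emissions(3000, 0.10, 65.0):.1f} Mt CO2-eq.")
```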
94.
Although networks of environmental monitors are constantly improving through advances in technology and management, instances of missing data still occur. Many methods of imputing values for missing data are available, but they are often difficult to use or produce unsatisfactory results. I-Bot (short for "Imputation Robot") is a context-intensive approach to the imputation of missing data in data sets from networks of environmental monitors. I-Bot is easy to use and routinely produces imputed values that are highly reliable. I-Bot is described and demonstrated using more than 10 years of California data for daily maximum 8-hr ozone, 24-hr PM2.5 (particulate matter with an aerodynamic diameter <2.5 μm), mid-day average surface temperature, and mid-day average wind speed. I-Bot performance is evaluated by imputing values for observed data as if they were missing, and then comparing the imputed values with the observed values. In many cases, I-Bot is able to impute values for long periods with missing data, such as a week, a month, a year, or even longer. Qualitative visual methods and standard quantitative metrics demonstrate the effectiveness of the I-Bot methodology.
Implications: Many resources are expended every year to analyze and interpret data sets from networks of environmental monitors. A large fraction of those resources is used to cope with difficulties due to the presence of missing data. The I-Bot method of imputing values for such missing data may help convert incomplete data sets into virtually complete data sets that facilitate the analysis and reliable interpretation of vital environmental data.
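The mask-and-compare evaluation described above (treating observed values as missing, imputing them, and scoring against the withheld truth) can be sketched as follows. I-Bot itself is not specified in the abstract, so plain time interpolation on synthetic data stands in for the imputation step; this illustrates the evaluation strategy, not the I-Bot method.

```python
import numpy as np
import pandas as pd

# Synthetic "observed" daily series standing in for a monitor record.
rng = np.random.default_rng(0)
days = pd.date_range("2015-01-01", periods=365, freq="D")
observed = pd.Series(40 + 15 * np.sin(2 * np.pi * np.arange(365) / 365)
                     + rng.normal(0, 5, 365), index=days)

# Pretend a contiguous block is missing, as in the evaluation described above.
masked = observed.copy()
gap = slice("2015-06-01", "2015-06-30")
masked.loc[gap] = np.nan

# Stand-in imputation (time interpolation); the paper's I-Bot method differs.
imputed = masked.interpolate(method="time")

# Compare imputed values against the withheld observations.
truth, estimate = observed.loc[gap], imputed.loc[gap]
rmse = float(np.sqrt(((estimate - truth) ** 2).mean()))
bias = float((estimate - truth).mean())
corr = float(estimate.corr(truth))
print(f"RMSE={rmse:.2f}  bias={bias:.2f}  r={corr:.2f}")
```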
95.
Tillage has been and will always be integral to crop production, yet it can result in the degradation of soil, water, and air quality; of all farm management practices, tillage may have the greatest impact on the environment. A wide variety of tillage equipment, practices, and systems is available to farmers, providing opportunities to enhance environmental performance. These opportunities have made tillage a popular focus of environmental policies and programs, such as environmental indicators for agriculture. This paper provides a brief examination of the role of tillage in crop production, its effect on biophysical processes, and, therefore, its impact on the environment. Models of biophysical processes are briefly examined to demonstrate the importance of tillage relative to other farm management practices and the level of detail of tillage data that these models can demand. The focus of the paper is an examination of the use of tillage information in Canada's agri-environmental indicators initiative, the National Agri-environmental Health Analysis and Reporting Program (NAHARP). Tillage information is required for several of the NAHARP indicators; the type of data used, its sources, and its quality are discussed. Recommendations regarding the collection of tillage data and the use of tillage information are presented.
96.
ABSTRACT: A grid-based daily hydrologic model for a watershed with paddy fields was developed to predict stream discharge. ASCII-formatted elevation, soil, and land use data supported by the GRASS Geographic Information System are used to generate distributed results such as surface runoff and subsurface flow, soil water content, and evapotranspiration. The model uses a single flow path algorithm and simulates a water balance at each grid element; a linear reservoir assumption is used to predict the subsurface runoff components. The model was applied to a 75.6 km2 watershed located in the middle of South Korea, and observed stream flow hydrographs from 1995 and 1996 were compared with model predictions. The stream flow predictions for 1995 and 1996 generally agreed with the observed flow, yielding Nash-Sutcliffe efficiencies (R2) of 0.60 and 0.62, respectively. The hydraulic conductivity governing percolation through the saturated layer affected baseflow generation, and the levee height of the paddy influenced the timing and magnitude of surface runoff, depending on irrigation management. The model will be used for low flow management decisions by evaluating the contribution of each land use to stream flow, especially where paddy area decreases through gradual urbanization of a watershed.
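The Nash-Sutcliffe efficiency quoted above and the linear-reservoir assumption for subsurface flow can both be written in a few lines. The sketch below is illustrative only: it is not the grid-based model itself, and the flow values and storage constant are made up.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((o - s)^2) / sum((o - mean(o))^2)."""
    o, s = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((o - s) ** 2) / np.sum((o - o.mean()) ** 2)

def linear_reservoir(recharge, k, s0=0.0):
    """Daily outflow from a linear reservoir: Q = S / k (k = storage constant, days).
    Illustrates the linear-reservoir assumption mentioned in the abstract."""
    s, outflow = s0, []
    for r in recharge:        # r: daily recharge to the reservoir (same units as storage)
        s += r
        q = s / k
        s -= q
        outflow.append(q)
    return np.array(outflow)

# Toy usage with made-up daily flows (m3/s); not data from the study.
obs = np.array([2.1, 3.4, 8.0, 6.2, 4.0, 3.1])
sim = np.array([2.0, 3.8, 7.1, 6.5, 4.4, 3.0])
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
print(linear_reservoir([5.0, 0.0, 0.0, 2.0], k=3.0))
```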
97.
Summary The growth in the application of computers is one of the major developments of the second half of the 20th century. There have already been substantial changes in society because of the computer, but even greater changes lie ahead. This paper defines some of the characteristics and applications of computers, as well as some of their limitations. It closes with comments on the implications of the development of ‘a new class of illiterates’: those who are unfamiliar with, or even afraid of, the computer as an aid in measurement, analysis, record keeping, communication, and education. Robert C. Baron has over 25 years of experience in the computer industry, as an engineer and as an executive. He was program manager for the Mariner II (Venus) and Mariner IV (Mars) on-board space computers and worldwide systems manager for Honeywell's minicomputer business. In 1972 he founded Prime Computer and was its first president. He currently works as a writer, lecturer, and consultant on the development and application of computer and communication technology. Mr. Baron is the author of, or a contributor to, six books and has written over 40 papers and speeches.
98.
ABSTRACT: The probability distributions of annual peak flows used in flood risk analysis quantify the risk that a design flood will be exceeded. But the parameters of these distributions are themselves uncertain to a degree, and this uncertainty increases the risk that the flood protection provided will in fact prove inadequate. The increase in flood risk due to parameter uncertainty is small when a fairly long record of data is available and the annual flood peaks are serially independent, which is the standard assumption in flood frequency analysis. But standard tests for serial independence are insensitive to the grouping of high and low values in a time series, which is measured by the Hurst coefficient, and this grouping increases the parameter uncertainty considerably. A study of 49 annual peak flow series for Canadian rivers shows that many have a high Hurst coefficient. The corresponding increase in flood risk due to parameter uncertainty is shown to be substantial even for rivers with a long record, and therefore should not be neglected. The paper presents a method of rationally combining parameter uncertainty due to serial correlation and the stochastic variability of peak flows in a single risk assessment. In addition, a relatively simple time series model capable of reproducing the observed serial correlation of flood peaks is presented.
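The Hurst coefficient mentioned above measures the long-term grouping of high and low values in a series. A textbook rescaled-range (R/S) estimate is sketched below; it is not necessarily the estimator used in the paper, and the gamma-distributed "annual peaks" are synthetic.

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic of one series segment."""
    x = np.asarray(x, float)
    dev = np.cumsum(x - x.mean())          # mean-adjusted cumulative deviations
    r = dev.max() - dev.min()              # range of the cumulative deviations
    s = x.std(ddof=1)                      # sample standard deviation
    return r / s if s > 0 else np.nan

def hurst_rs(series, min_len=8):
    """Estimate the Hurst coefficient as the slope of log(R/S) vs. log(segment length)."""
    series = np.asarray(series, float)
    sizes, rs_vals = [], []
    n = min_len
    while n <= len(series) // 2:
        segments = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        sizes.append(n)
        rs_vals.append(np.nanmean([rescaled_range(seg) for seg in segments]))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

# Toy usage on a synthetic, serially independent "annual peak" series; H should be
# close to 0.5 (the small-sample R/S estimator is biased slightly upward).
rng = np.random.default_rng(1)
peaks = rng.gamma(shape=3.0, scale=100.0, size=256)
print(f"Estimated Hurst coefficient: {hurst_rs(peaks):.2f}")
```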
99.
Disturbance regime is a critical organizing feature of stream communities and ecosystems. The position of a given reach in the river basin and the sediment type within that reach are two key determinants of the frequency and intensity of flow-induced disturbances. We distinguish between predictable and unpredictable events and suggest that predictable discharge events are not disturbances. We relate the dynamics of recovery from disturbance (i.e., resilience) to the disturbance regime (i.e., the disturbance history of the site); the most frequently and predictably disturbed sites can be expected to show the highest resilience. Spatial scale is an important dimension of community structure, dynamics, and recovery from disturbance. We compare effects on small patches (⩽1 m2) with effects on large reaches at the river basin level. At small scales, sediment movement and scour are major factors affecting the distribution of populations of aquatic insects and algae. At larger scales, channel formation, bank erosion, and interactions with the riparian zone affect all taxa and processes. Our understanding of stream ecosystem recovery rests on our grasp of the historical, spatial, and temporal background of contemporary disturbance events.
100.
Solid waste collection services in Ilorin, Nigeria, are shown to be unsatisfactory. The poor service is related to rapid population growth, insufficient data, and inconsistent government policies. Surveys have been conducted to assess the quantities and types of solid waste and to show how these vary, in part, according to the nature of the land use and the properties being served. Suggestions for an improved service are offered.