Full-text access type
Paid full text | 1904 articles |
Free | 51 articles |
Free (domestic) | 20 articles |
Subject category
Safety science | 129 articles |
Waste treatment | 82 articles |
Environmental management | 445 articles |
General | 212 articles |
Basic theory | 490 articles |
Environmental theory | 2 articles |
Pollution and control | 387 articles |
Assessment and monitoring | 124 articles |
Society and environment | 72 articles |
Disasters and prevention | 32 articles |
Publication year
2023 | 14 articles |
2022 | 23 articles |
2021 | 22 articles |
2020 | 26 articles |
2019 | 24 articles |
2018 | 47 articles |
2017 | 61 articles |
2016 | 67 articles |
2015 | 64 articles |
2014 | 60 articles |
2013 | 131 articles |
2012 | 86 articles |
2011 | 155 articles |
2010 | 100 articles |
2009 | 89 articles |
2008 | 114 articles |
2007 | 126 articles |
2006 | 120 articles |
2005 | 75 articles |
2004 | 71 articles |
2003 | 71 articles |
2002 | 59 articles |
2001 | 40 articles |
2000 | 28 articles |
1999 | 30 articles |
1998 | 28 articles |
1997 | 15 articles |
1996 | 25 articles |
1995 | 16 articles |
1994 | 20 articles |
1993 | 17 articles |
1992 | 17 articles |
1991 | 9 articles |
1990 | 11 articles |
1989 | 5 articles |
1988 | 10 articles |
1987 | 15 articles |
1986 | 9 articles |
1985 | 5 articles |
1984 | 9 articles |
1983 | 6 articles |
1982 | 10 articles |
1981 | 4 articles |
1980 | 5 articles |
1979 | 6 articles |
1969 | 3 articles |
1967 | 3 articles |
1936 | 2 articles |
1935 | 3 articles |
1926 | 2 articles |
Sort by: 1,975 results found, search took 31 ms
81.
Susan Hodgson Fu-Meng Khaw Mark S. Pearce Tanja Pless-Mulloli 《Atmospheric environment (Oxford, England : 1994)》2009,43(21):3356-3363
Background: In the UK, air quality has been monitored systematically since 1914, providing valuable data for studies of long-term trends in air pollution and, potentially, for studies of the health effects of air pollutants. There are, however, challenges in interpreting these data due to changes over time in the number and location of monitoring sites, and in monitoring techniques. Particulate matter was measured as deposited matter (DM) using deposit gauge monitors until the 1950s, when black smoke (BS) filters were introduced. Estimating long-term exposure to particulates using data from both deposit gauge and BS monitors requires an understanding of the relationships between DM, SO2 and BS. Aims: To explore whether DM and/or SO2, along with seasonal and location-specific variables, can be used to predict BS levels. Methods: Air quality data were abstracted from hard copies of the monthly Atmospheric Pollution Bulletins for the period April 1956–March 1961 for any sites with co-located DM, SO2 and BS data for three or more consecutive years. The relationships between DM, SO2 and BS were assessed using mixed models. Results: There were 34 eligible sites, giving 1521 triplets of data. There was a consistent correlation between SO2 and BS at all sites, but the association between DM and BS was less clear and varied by location. Mixed modelling allowing for repeat measurements at each site revealed that SO2, year, rainfall and season of measurement explained 72% of the variability in BS levels. Conclusions: SO2 can be used as a surrogate measure for BS in all monitoring locations. The surrogate can be improved upon by consideration of site-specific characteristics, seasonal effects, rainfall and year of measurement. These findings will help in estimating historic, long-term exposure to particulates where BS or other measures are not available.
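The regression step in the abstract above can be sketched as follows. This is a minimal illustration using ordinary least squares in place of the study's mixed models (no site random effects), and the synthetic data, coefficients, and variable names are invented for the example, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly data: BS assumed to rise with SO2 and fall with rainfall
# (hypothetical coefficients, illustrative units).
n = 200
so2 = rng.uniform(50, 300, n)                       # SO2 concentration
rain = rng.uniform(0, 150, n)                       # monthly rainfall
bs = 0.6 * so2 - 0.1 * rain + rng.normal(0, 5, n)   # "black smoke" response

# Design matrix with an intercept; lstsq returns the least-squares fit.
X = np.column_stack([np.ones(n), so2, rain])
coef, *_ = np.linalg.lstsq(X, bs, rcond=None)

# Proportion of BS variability explained (analogous to the 72% figure).
pred = X @ coef
r2 = 1 - np.sum((bs - pred) ** 2) / np.sum((bs - bs.mean()) ** 2)
print(round(r2, 2))
```

A full reproduction of the analysis would use a mixed model (e.g. a random intercept per monitoring site) rather than pooled OLS.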
82.
Viachaslau Filimonau Janet Dickinson Derek Robbins Mark A.J. Huijbregts 《Journal of Cleaner Production》2011,19(17-18):1917-1930
This study discusses the potential for Life Cycle Assessment (LCA) to be utilized for the environmental assessment of tourism accommodation facilities and their contribution to the global carbon footprint. To demonstrate the viability of employing LCA in the hotel sector, its simplified derivative, Life Cycle Energy Analysis (LCEA), is applied to two tourism accommodation facilities in Poole, Dorset (UK) to quantify their CO2 emissions. The results indicate that the reviewed hotels are less energy- and carbon-intensive than the tourism accommodation establishments reported in the literature, which may indirectly reflect continued improvement in hotel energy efficiency over time. The implications of the current energy use practices in the reviewed hotels are discussed, and suggestions are made on how to further improve energy performance and thereby cut the carbon footprint. Recommendations for hotel management and policy-making are developed to reduce the energy and carbon intensity of the hotel industry. A method for energy and carbon footprint analysis of outsourced laundry and breakfast services is also proposed.
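The basic arithmetic behind a hotel carbon-footprint figure of the kind discussed above is energy use multiplied by an emission factor, normalized per guest-night. All numbers below are illustrative assumptions, not values from the study.

```python
# Hypothetical annual figures for one hotel (illustrative assumptions).
electricity_kwh = 450_000   # annual electricity use
gas_kwh = 300_000           # annual natural-gas use
guest_nights = 25_000       # annual occupied guest-nights

# Illustrative emission factors (kg CO2 per kWh); real factors depend on
# the local grid mix and fuel.
EF_ELECTRICITY = 0.5
EF_GAS = 0.2

total_kg_co2 = electricity_kwh * EF_ELECTRICITY + gas_kwh * EF_GAS
per_guest_night = total_kg_co2 / guest_nights
print(total_kg_co2, round(per_guest_night, 1))
```

Carbon intensity per guest-night is the usual benchmarking unit, since it lets hotels of different sizes be compared directly.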
86.
M. Graziano Ceddia Mark Bartlett Charles Perrings 《Agriculture, ecosystems & environment》2009,129(1-3):65-72
The development of genetically modified (GM) crops has led the European Union (EU) to put forward the concept of ‘coexistence’ to give farmers the freedom to plant both conventional and GM varieties. Should a premium for non-GM varieties emerge in the market, ‘contamination’ by GM pollen would generate a negative externality to conventional growers. It is therefore important to assess the effect of different ‘policy variables’ on the magnitude of the externality, to identify suitable policies to manage coexistence. In this paper, taking GM herbicide-tolerant oilseed rape as a model crop, we start from the model developed in Ceddia et al. [Ceddia, M.G., Bartlett, M., Perrings, C., 2007. Landscape gene flow, coexistence and threshold effect: the case of genetically modified herbicide tolerant oilseed rape (Brassica napus). Ecol. Modell. 205, pp. 169–180], use a Monte Carlo experiment to generate data, and then estimate the effect of the number of GM and conventional fields, the width of buffer areas, and the degree of spatial aggregation (i.e. the ‘policy variables’) on the magnitude of the externality at the landscape level. To represent realistic conditions in agricultural production, we assume that detection of GM material in conventional produce might occur at the field level (no grain mixing occurs) or at the silos level (where grain from different fields in the landscape is mixed). In the former case, the magnitude of the externality will depend on the number of conventional fields with average transgenic presence above a certain threshold. In the latter case, the magnitude of the externality will depend on whether the average transgenic presence across all conventional fields exceeds the threshold. In order to quantify the effect of the relevant ‘policy variables’, we compute the marginal effects and the elasticities.
Our results show that, when relying on marginal effects to assess the impact of the different ‘policy variables’, spatial aggregation is far more important when transgenic material is detected at the field level, corroborating previous research. However, when elasticity is used, the effectiveness of spatial aggregation in reducing the externality is almost identical whether detection occurs at the field level or at the silos level. Our results also show that the area planted with GM is the most important ‘policy variable’ affecting the externality to conventional growers, and that buffer areas on conventional fields are more effective than those on GM fields. The implications of the results for coexistence policies in the EU are discussed.
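The distinction drawn above between a marginal effect and an elasticity can be made concrete with a toy externality function of GM-planted area. The functional form and numbers are purely illustrative, not the fitted model from the paper.

```python
# Toy externality E(a) as a function of GM-planted area a (hypothetical form).
def externality(a):
    return 0.02 * a ** 1.5

def marginal_effect(f, a, h=1e-6):
    # Central finite difference: dE/da, the absolute change in the
    # externality per unit change in area.
    return (f(a + h) - f(a - h)) / (2 * h)

def elasticity(f, a):
    # Percentage change in E per percentage change in a:
    # (dE/da) * (a / E).
    return marginal_effect(f, a) * a / f(a)

a = 100.0
me = marginal_effect(externality, a)
el = elasticity(externality, a)
print(round(me, 3), round(el, 2))
```

Because the elasticity is unit-free, it allows policy levers measured in different units (field counts, buffer widths, aggregation indices) to be ranked on a common scale, which is why the two measures can rank the same levers differently.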
87.
Mark E. Churchwell Robert L. Livingston Donald L. Sgontz Jerry D. Messman 《Environment international》1987,13(6)
The current U.S. Environmental Protection Agency (U.S. EPA) protocols for mercury determinations in aqueous and solid waste samples (SW-846 Methods 7470 and 7471) using recirculating cold-vapor atomic absorption spectrometry (CV-AAS) have been evaluated. The U.S. EPA methods are not sufficiently flexible to permit special quality control (QC) measures, have limited detectability for low-level mercury concentrations, and are plagued by spectral interferences caused by the nonspecific absorption of primary mercury radiation by volatile organic vapors. The U.S. EPA protocols have been modified in a single-laboratory study to facilitate additional QC measures, to enhance detectability for low-level mercury concentrations, and to eliminate nonspecific vapor absorption interferences. Volumetric manipulations for additional QC measures, if required, are facilitated by performing the sample digestions in Erlenmeyer flasks rather than in the current Biochemical Oxygen Demand (BOD) reduction-aeration bottles. Typical manipulations for additional QC measures that are now feasible include dilution of concentrated samples and multiple aliquot sampling for post-digestion spike and replicate analyses. Instrument detectability is improved 10-fold by using a gas sparging bottle as a dedicated reduction-aeration vessel and a silver wool-amalgamation CV-AAS system operated in an open configuration. The on-line amalgamation/thermal desorption process of the modified CV-AAS system eliminates interfering water and organic matrix vapors prior to the mercury absorption measurement. Good accuracy and precision have been obtained with the amalgamation CV-AAS system for the analyses of four reference sediment materials. The amalgamation CV-AAS measurements on the reference sediment digests have been successfully performed at absolute mercury concentration levels that are only 1 to 4 times above the instrumental detection limit of the U.S. EPA recirculating CV-AAS method.
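A 10-fold detectability improvement of the kind reported above is usually expressed through the conventional 3σ instrumental detection limit: three times the standard deviation of replicate blank readings, divided by the calibration slope. The blank readings and slope below are invented for illustration, not data from the study.

```python
import statistics

# Hypothetical replicate blank absorbance readings.
blanks = [0.0012, 0.0015, 0.0011, 0.0014, 0.0013, 0.0012, 0.0016]
slope = 0.05  # assumed calibration slope: absorbance per ng Hg

# Conventional 3-sigma instrumental detection limit, in ng Hg.
sd_blank = statistics.stdev(blanks)
dl_ng = 3 * sd_blank / slope
print(round(dl_ng, 3))
```

Under this definition, a 10-fold improvement means either a 10-fold drop in blank noise or a 10-fold steeper calibration slope (e.g. from preconcentrating mercury on the silver amalgam before desorption).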
88.
Wendy A. Williams Mark E. Jensen J. Chris Winne Roland L. Redmond 《Environmental monitoring and assessment》2000,64(1):105-114
Accurate delineation and characterization of valley-bottom settings is crucial to the assessment of the biological and geomorphological components of riverine systems; yet, to date, most valley-bottom mapping endeavors have been done manually. To improve this situation, we developed automated techniques in a Geographic Information System (GIS) for delineating and characterizing valley-bottom settings in river basins ranging in size from approximately 1,000 to 10,000 km². All procedures were developed with ARC/INFO GIS software and fully automated in Arc Macro Language (AML). The GRID module is required for valley-bottom delineation and slope calculations, whereas characterization (i.e., measuring the width of the valley-bottom zone) requires Coordinate Geometry (COGO) in the ARCEDIT module. The process requires three inputs: a polygon coverage of the analysis area, an arc coverage of its hydrography, and a grid representing its digital elevation. The AML is designed to operate within a wide range of computer memory/disk space options, and it allows users to customize several procedures to match the scale and complexity of a given analysis area with available computer hardware.
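The slope-based delineation idea behind the GRID step above can be sketched in Python with numpy standing in for ARC/INFO: compute slope from a digital elevation grid, then flag low-slope cells as candidate valley bottom. The tiny DEM, cell size, and slope threshold are all illustrative assumptions, not the paper's procedure.

```python
import numpy as np

# Tiny synthetic DEM (elevation in metres): a flat valley floor at 100 m
# flanked by steep sides rising to 160 m.
dem = np.array([
    [160, 140, 100, 100, 100, 100, 140, 160],
    [160, 140, 100, 100, 100, 100, 140, 160],
    [160, 140, 100, 100, 100, 100, 140, 160],
], dtype=float)
cell_size = 30.0  # metres per cell (assumed)

# Slope magnitude from finite differences (rise over run).
dzdy, dzdx = np.gradient(dem, cell_size)
slope = np.sqrt(dzdx ** 2 + dzdy ** 2)

# Flag low-slope cells as candidate valley bottom (threshold assumed).
valley_bottom = slope < 0.2

# Characterization analogue: valley-bottom width along one cross-section,
# in cells (multiply by cell_size for metres).
width_cells = valley_bottom[1].sum()
print(width_cells)
```

The full workflow additionally constrains the low-slope cells to those connected to the hydrography coverage, which is what distinguishes valley bottoms from other flat terrain such as plateaus.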