691.
To preserve and improve environmental quality in a prosperous industrialized nation like the United States, we must use efficient control technology to reduce the pollution which would otherwise accompany our growth. The need for control is especially great in our use of energy. In the near term our country must depend increasingly on coal to meet our energy needs. In his 1977 energy message President Carter declared that it would be this Administration's policy to require the use of best available control technology for all new coal burning plants. EPA is implementing this policy by adopting a rule that will require such controls on new coal-fired power plants.
692.
Receptor models are used to identify and quantify source contributions to particulate matter and volatile organic compounds based on measurements of many chemical components at receptor sites. These components are selected based on their consistent appearance in some source types and their absence in others. UNMIX, positive matrix factorization (PMF), and effective variance are different solutions to the chemical mass balance (CMB) receptor model equations and are implemented in available software. In their more general form, the CMB equations allow spatial, temporal, transport, and particle size profiles to be combined with chemical source profiles for improved source resolution. Although UNMIX and PMF do not use source profiles explicitly as input data, they still require measured profiles to justify their derived source factors. The U.S. Supersites Program provided advanced datasets for applying these CMB solutions in different urban areas. Still lacking are better characterization of source emissions, new methods to estimate profile changes between source and receptor, and systematic sensitivity tests of deviations from receptor model assumptions.
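The core CMB balance referenced above, in which each measured species concentration at the receptor is a linear combination of source-profile contributions, can be sketched with an ordinary least-squares solution; the effective-variance method additionally weights each species by the uncertainties of both the ambient and profile measurements. The species, profiles, and contributions below are invented purely for illustration.

```python
import numpy as np

# Chemical mass balance: the measured concentration of each species at the
# receptor, c (n species), is modeled as c = F @ s, where F (n x m) holds
# source profiles (mass fraction of each species in each source type) and
# s holds the m unknown source contributions. All numbers are hypothetical.
F = np.array([
    [0.05, 0.30],   # species A: mass fraction in source 1, source 2
    [0.20, 0.02],   # species B
    [0.10, 0.10],   # species C
])
s_true = np.array([8.0, 4.0])        # "true" contributions, ug/m3
c = F @ s_true                       # synthetic receptor measurements

# Ordinary least squares recovers the contributions from c and F alone.
s_est, *_ = np.linalg.lstsq(F, c, rcond=None)
print(np.round(s_est, 3))            # recovers [8. 4.] for this exact system
```

With real data the system is overdetermined and noisy, which is why the effective-variance weighting, and the profile-free factorizations UNMIX and PMF, exist at all.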
693.
The prevalence of toxicopathic liver lesions in demersal fish on the San Pedro Shelf, California was determined for a 15-year period (1988–2003). Fish livers were sampled at fixed locations as part of the Orange County Sanitation District's (OCSD) ocean monitoring program. Histopathological examination of selected fish liver tissues was used to determine whether the wastewater discharge had affected fish health. The prevalence of the toxicopathic lesion classes neoplasms (NEO), preneoplastic foci of cellular alteration (FCA), and hydropic vacuolation (HYDVAC) varied among species and locations. For all species sampled, severe lesions occurred in 6.2% of the fish examined (n = 7,694). HYDVAC (4.1%) was the most common toxicopathic lesion type, followed by FCA (1.4%) and NEO (0.7%). HYDVAC occurred only in white croaker (Genyonemus lineatus), accounting for 84.8% of the toxicopathic lesions for this species. Prevalence of HYDVAC, NEO, and FCA in white croaker was 15.2, 2.0, and 0.7%, respectively. The prevalence of HYDVAC and NEO in white croaker increased with age and size, but there was no difference between sexes. A linear regression model was used for hypothesis testing to account for significant differences in fish size (and age for croakers) at the different sampling locations. This analysis showed that for HYDVAC there was no spatial or location effect on lesion rate or size/age of onset. For NEO, the model predicted that white croaker near the wastewater outfall may acquire these lesions at a smaller size/younger age, and at a higher rate, than at other sites. However, this result may be biased owing to the unequal size frequency distributions and the low prevalence of NEO in white croaker at the different sampling sites. Bigmouth sole (Hippoglossina stomata) had a prevalence of FCA and NEO of 1.3 and 0.35%, respectively, but the lesions were too few for statistical testing.
There was no difference in lesion prevalence between sexes in hornyhead turbot (Pleuronichthys verticalis), and the prevalence of FCA and NEO was 3.4 and 0.37%, respectively. FCA prevalence increased with size in hornyhead turbot, and there were no significant spatial differences in lesion rates or fish size at lesion onset. Overall, consistent spatial differences in lesion prevalence were not demonstrated, highlighting the analytical difficulty of detecting a possible point-source impact when the effect is rare, correlated with the size/age structure of the population, and possibly caused by exposure to multiple unknown sources. Thus, liver histopathology is most useful as a point-source monitoring tool where the spatial scale of the impact generally exceeds the home range of the target species.
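The size-adjusted site comparison described above can be illustrated with a simple linear probability model, regressing a lesion indicator on fish size and a site indicator; the study's actual model specification is not reproduced here, and all data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
size = rng.uniform(10, 30, n)          # fish length, cm (synthetic)
outfall = rng.integers(0, 2, n)        # 1 = caught near the outfall (synthetic)

# Synthetic lesion probability that rises with size and is slightly
# elevated near the outfall.
p = 0.01 * (size - 10) + 0.05 * outfall
lesion = (rng.random(n) < p).astype(float)

# Linear probability model: lesion ~ 1 + size + outfall. The outfall
# coefficient is the prevalence difference after adjusting for size,
# which is the kind of size-adjusted contrast the study tested.
X = np.column_stack([np.ones(n), size, outfall])
beta, *_ = np.linalg.lstsq(X, lesion, rcond=None)
print(np.round(beta, 4))   # [intercept, size effect, outfall effect]
```

The unequal size distributions across sites mentioned in the abstract are exactly what such an adjustment guards against, though with rare lesions the estimates remain noisy.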
694.
The disaster clearinghouse concept originated with the earthquake community as an effort to coordinate research and data collection activities. Though prior earthquake clearinghouses were small in comparison to what was needed in response to Hurricane Katrina, these seminal structures are germane to the establishment of our current model. On 3 September 2005, five days after Katrina wrought cataclysmic destruction along the Gulf Coast, FEMA and Louisiana State University personnel met to establish the LSU GIS Clearinghouse Cooperative (LGCC), a resource for centralization and dissemination of geospatial information related to Hurricane Katrina. Since its inception, the LGCC has developed into a working model for organization, dissemination, archiving, and research regarding geospatial information in a disaster. This article outlines the formation of the LGCC, issues of data organization, and methods of data dissemination and archiving, with an eye towards implementing the clearinghouse model as a standard resource for addressing geospatial data needs in disaster research and management.
695.
We monitored two Seattle school buses to quantify the buses' self-pollution using the dual-tracer (DT), lead-vehicle (LV), and chemical mass balance (CMB) methods. Each bus drove along a residential route simulating stops, with windows closed or open. Particulate matter (PM) and its constituents were monitored in the bus and from a LV. We collected source samples from the tailpipe and crankcase emissions using an on-board dilution tunnel. Concentrations of PM1, ultrafine particle counts, and elemental and organic carbon (EC/OC) were higher on the bus than in the LV. The DT method estimated that the tailpipe and the crankcase emissions contributed 1.1 and 6.8 μg m−3 of PM2.5 inside the bus, respectively, with significantly higher crankcase self-pollution (SP) when windows were closed. Approximately two-thirds of in-cabin PM2.5 originated from background sources. Using the LV approach, SP estimates from the EC and the active personal DataRAM (pDR) measurements correlated well with the DT estimates for tailpipe and crankcase emissions, respectively, although both measurements need further calibration for accurate quantification. CMB results overestimated SP relative to the DT method but confirmed crankcase emissions as the major SP source. We confirmed buses' SP using three independent methods and quantified crankcase emissions as the dominant contributor.
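The apportionment quoted above reduces to a mass balance on in-cabin PM2.5: cabin concentration equals background plus tailpipe and crankcase self-pollution. A minimal sketch, using the dual-tracer SP estimates from the text and an assumed cabin total concentration (the measured cabin concentration is not restated in the abstract):

```python
# In-cabin PM2.5 mass balance. The SP terms come from the dual-tracer
# estimates quoted in the text; the cabin total is an assumed value
# chosen only to illustrate the arithmetic.
tailpipe_sp = 1.1      # ug/m3, tailpipe self-pollution (from text)
crankcase_sp = 6.8     # ug/m3, crankcase self-pollution (from text)
cabin_total = 24.0     # ug/m3, assumed in-cabin PM2.5 for illustration

background = cabin_total - tailpipe_sp - crankcase_sp
bg_fraction = background / cabin_total
print(f"background = {background:.1f} ug/m3 ({bg_fraction:.0%} of cabin PM2.5)")
```

With this assumed cabin total the background share comes out near two-thirds, consistent with the abstract's apportionment.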
697.
An extensive site-characterization project was conducted at a large chlorinated-solvent contaminated Superfund site in Tucson, AZ. The project consisted of several components, including traditional site-characterization activities, tracer tests, laboratory experiments conducted with core material collected from the site, and mathematical modeling. The primary focus of the work presented herein is the analysis of induced-gradient contaminant elution tests conducted in a source zone at the site, investigation of the potential occurrence of immiscible liquid in the saturated zone, characterization of the relationship between mass flux reduction and mass removal, and evaluation of the impact of source-zone management on site remediation. The results of the present study, along with those of prior work, indicate that immiscible liquid is likely present in the saturated zone within the site's source zones. Extensive tailing and rebound were observed in the contaminant-elution tests, indicating nonideal transport and mass-transfer behavior. The elution data were analyzed with a source-zone-scale mathematical model, and the results indicated that nonideal immiscible-liquid dissolution was the primary cause of the observed behavior. The time-continuous relationship between mass flux reduction and mass removal associated with the plume-scale pump-and-treat operation exhibited an initial large drop in mass flux with minimal mass removed, followed by a period of minimal mass flux reduction and a second period of large reduction. This behavior reflects the impact of both source-zone and aqueous-plume mass removal dynamics. Ultimately, a greater than 90% reduction in mass flux was achieved for a mass removal of approximately 50%. The influence of source-zone management on site remediation was evaluated by conducting two predictive simulations, one for which the source zones were controlled and one for which they were not.
A plume-scale model was used to simulate the composite contaminant concentrations associated with groundwater extracted with the pump-and-treat system, which were compared to measured data. The information generated from this study was used to enhance the site conceptual model, help optimize operation of the pump-and-treat system, and evaluate the utility of source-zone remediation.
698.
Compound-specific isotope analysis (CSIA) was used to assess biodegradation of MTBE and TBA during an ethanol release study at Vandenberg Air Force Base. Two continuous side-by-side field releases were conducted within a preexisting MTBE plume to form two lanes. The first involved the continuous injection of site groundwater amended with benzene, toluene, and o-xylene ("No ethanol lane"), while the other involved the continuous injection of site groundwater amended with benzene, toluene, o-xylene, and ethanol ("With ethanol lane"). The δ13C of MTBE for all wells in the "No ethanol lane" remained constant during the experiment, with a mean value of −31.3 ± 0.5‰ (n = 40), suggesting the absence of any substantial MTBE biodegradation in this lane. In contrast, substantial enrichment in 13C of MTBE, by 40.6‰, was measured in the "With ethanol lane", consistent with the effects of biodegradation. A substantial amount of TBA (up to 1200 μg/L) was produced by the biodegradation of MTBE in the "With ethanol lane". The mean δ13C of TBA in groundwater samples in the "With ethanol lane" was −26.0 ± 1.0‰ (n = 32). Uniform δ13C values for TBA through space and time in this lane suggest that substantial anaerobic biodegradation of TBA did not occur during the experiment. Using the reported range of isotopic enrichment factors for MTBE of −9.2‰ to −15.6‰, and the δ13C values of MTBE in groundwater samples, MTBE first-order biodegradation rates in the "With ethanol lane" were 12.0 to 20.3 yr−1 (n = 18). The isotope-derived rate constants are in good agreement with the previously published rate constant of 16.8 yr−1 calculated using contaminant mass discharge for the "With ethanol lane".
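The isotope-derived rates quoted above follow from the Rayleigh model, which links the observed δ13C shift to the fraction of contaminant remaining and hence to a first-order rate. A sketch using the δ13C values and enrichment-factor range from the text, with an assumed groundwater travel time t (the actual travel times are not given here):

```python
import math

# Rayleigh-model sketch of an isotope-based biodegradation rate estimate.
# delta_0 and the 40.6 per-mil enrichment come from the text; epsilon spans
# the quoted literature range; t is an assumed value, not from the study.
delta_0 = -31.3           # per mil, initial delta13C of MTBE
delta_t = delta_0 + 40.6  # per mil, after the observed enrichment
t = 0.25                  # years of transport (assumption for illustration)

for eps in (-9.2, -15.6):  # per mil, isotopic enrichment factor
    # Rayleigh: fraction remaining f = ((delta_t+1000)/(delta_0+1000))**(1000/eps)
    f = ((delta_t + 1000.0) / (delta_0 + 1000.0)) ** (1000.0 / eps)
    k = -math.log(f) / t   # first-order rate constant, 1/year
    print(f"eps = {eps:5.1f} per mil -> f = {f:.3f}, k = {k:.1f} per year")
```

Smaller-magnitude enrichment factors imply more degradation for the same δ13C shift, hence a higher rate constant, which is why the study reports a range rather than a single value.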
699.
The hunting handicap: costly signaling in human foraging strategies
Humans sometimes forage or distribute the products of foraging in ways that do not maximize individual energetic return rates. As an alternative to hypotheses that rely on reciprocal altruism to counter the costs of inefficiency, we suggest that the cost itself could be recouped through signal benefit. Costly signaling theory predicts that signals can provide fitness benefits when costs are honestly linked to signaler quality, and this information is broadcast to potential mates and competitors. Here, we test some predictions of costly signaling theory against empirical data on human food acquisition and sharing patterns. We show that at least two types of marine foraging, turtle hunting and spearfishing, as practiced among the Meriam (a Melanesian people of Torres Strait, Australia) meet key criteria for costly signaling: signal traits are (1) differentially costly or beneficial in ways that are (2) honestly linked to signaler quality, and (3) designed to effectively broadcast the signal. We conclude that relatively inefficient hunting or sharing choices may be maintained in a population if they serve as costly and reliable signals designed to reveal the signaler's qualities to observers.