Sorted results: 443 matches found (search time: 15 ms)
81.
Declines in many native fish populations have led to reassessments of management goals and shifted priorities from consumptive uses to species preservation. As management has shifted, relevant environmental characteristics have evolved from traditional metrics that described local habitat quality to characterizations of habitat size and connectivity. Despite the implications this shift has for how habitats may be prioritized for conservation, the relative importance of these habitat components has rarely been assessed. We used an information-theoretic approach to select the best models from sets of logistic regressions that linked habitat quality, size, and connectivity to the occurrence of Chinook salmon (Oncorhynchus tshawytscha) nests. Spawning distributions were censused annually from 1995 to 2004, and the data were complemented with field measurements describing habitat quality in 43 suitable spawning patches across a stream network draining 1,150 km2 in central Idaho. Results indicated that the most plausible models were dominated by measures of habitat size and connectivity, whereas habitat quality was of minor importance. Connectivity was the strongest predictor of nest occurrence, but it interacted with habitat size, which became relatively more important when populations were reduced. Comparison of observed nest distributions with null-model predictions confirmed that the habitat-size association was driven by a biological mechanism when populations were small, but this association may have been an area-related sampling artifact at higher abundances. The implication for habitat management is that the size and connectivity of existing habitat networks should be maintained whenever possible. Where habitat restoration is occurring, expanding existing areas or creating new habitats in key areas that increase connectivity may be beneficial. Information about habitat size and connectivity could also be used to strategically prioritize areas for improvement of local habitat quality, with areas not meeting minimum thresholds deemed inappropriate for restoration activities.
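The information-theoretic approach mentioned above typically ranks candidate models by AIC and converts score differences into Akaike weights. A minimal sketch in Python; the model names, log-likelihoods, and parameter counts are illustrative inventions, not values from the study:

```python
import math

def aic(log_lik, k):
    """Akaike Information Criterion: AIC = 2k - 2*ln(L)."""
    return 2 * k - 2 * log_lik

def akaike_weights(aics):
    """Relative support for each model, given a list of AIC scores."""
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical maximized log-likelihoods and parameter counts for three
# logistic-regression models of nest occurrence (illustrative numbers only):
models = {
    "quality":           (-120.4, 4),
    "size+connectivity": (-101.7, 3),
    "full":              (-100.9, 6),
}
scores = {name: aic(ll, k) for name, (ll, k) in models.items()}
weights = akaike_weights(list(scores.values()))
```

The weights sum to 1 and can be read directly as relative model plausibility; here the parsimonious size-plus-connectivity model wins despite the full model's slightly higher likelihood.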
82.
Run sizes of spring chinook salmon in the South Umpqua River in Oregon have declined dramatically since the early part of this century. Habitat degradation is thought to be an important factor contributing to the decline of this stock, and qualitative assessment suggests the stock is at moderate risk of extinction. We use data from this and similar stocks to develop an age-structured, density-dependent model of the population dynamics that incorporates both demographic and environmental stochasticity. Under the assumption of no further habitat destruction, the population is predicted to have a greater than 95% probability of persistence for 200 years. However, sensitivity analysis for the density-dependence estimated from historical run-return data shows that substantially lower predicted viabilities are also statistically consistent with the data. A model that simulates continued habitat degradation results in almost certain extinction within 100 years.
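A viability analysis of this kind can be sketched as a Monte Carlo simulation of a density-dependent population with environmental and demographic noise. The sketch below uses a scalar Ricker model rather than the paper's age-structured formulation, and every parameter value is an illustrative assumption:

```python
import math
import random

def simulate_persistence(n0=500, years=200, runs=500, r=0.5, K=1000.0,
                         env_sd=0.3, quasi_ext=10, seed=42):
    """Monte Carlo estimate of the probability that a density-dependent
    (Ricker) population stays above a quasi-extinction threshold.
    All parameter values are illustrative, not fitted to any stock."""
    rng = random.Random(seed)
    persisted = 0
    for _ in range(runs):
        n = float(n0)
        alive = True
        for _ in range(years):
            # Ricker density dependence with lognormal environmental noise
            mean = n * math.exp(r * (1.0 - n / K) + rng.gauss(0.0, env_sd))
            # Demographic stochasticity: normal approximation to a Poisson
            # draw (variance equal to the mean), rounded to whole fish
            n = max(round(mean + rng.gauss(0.0, math.sqrt(mean))), 0) if mean > 0 else 0
            if n < quasi_ext:
                alive = False
                break
        if alive:
            persisted += 1
    return persisted / runs
```

With the benign default parameters the estimated persistence probability is close to 1, in the spirit of the abstract's greater-than-95% baseline; raising `env_sd` or lowering `K` plays the role of the continued-degradation scenario and drives persistence down.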
83.
Fine particulate matter (PM2.5) mass was determined on a continuous basis at the Salt Lake City Environmental Protection Agency Environmental Monitoring for Public Awareness and Community Tracking monitoring site in Salt Lake City, UT, using three different monitoring techniques. Hourly averaged PM2.5 mass data were collected during two sampling periods (summer 2000 and winter 2002) using a real-time total ambient mass sampler (RAMS), a sample equilibration system (SES)-tapered element oscillating microbalance (TEOM), and a conventional TEOM monitor. This paper compares the results obtained from the various monitoring systems, which differ in their treatment of semivolatile material (SVM; particle-bound water, semivolatile ammonium nitrate, and semivolatile organic compounds). PM2.5 mass results obtained by the RAMS were consistently higher than those obtained by the SES-TEOM and conventional TEOM monitors because of the RAMS's ability to measure semivolatile ammonium nitrate and semivolatile organic material but not particle-bound water. The SES-TEOM monitoring system was able to account for an average of 28% of the SVM, whereas the conventional TEOM monitor lost essentially all of the SVM from the single filter during sampling. Occasional mass readings by the various TEOM monitors that were higher than RAMS results may reflect particle-bound water, which, under some conditions, is measured by the TEOM but not the RAMS.
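The 28% figure can be understood as simple mass closure: if the RAMS captures nonvolatile mass plus all SVM and the conventional TEOM loses essentially all SVM, the SVM fraction retained by the SES-TEOM is a ratio of monitor differences. A sketch, with invented readings chosen only to make the arithmetic visible:

```python
def svm_fraction_recovered(rams, ses_teom, teom):
    """Fraction of semivolatile material (SVM) retained by the SES-TEOM,
    assuming RAMS measures nonvolatile mass plus all SVM and the
    conventional TEOM loses essentially all SVM:
        SVM       = RAMS - TEOM
        recovered = (SES_TEOM - TEOM) / SVM
    Inputs are PM2.5 mass concentrations in ug/m3 (illustrative)."""
    svm = rams - teom
    if svm <= 0:
        raise ValueError("RAMS reading must exceed the conventional TEOM")
    return (ses_teom - teom) / svm

# Illustrative hour: RAMS 40, SES-TEOM 31.36, conventional TEOM 28 ug/m3
# -> SVM = 12 ug/m3, of which the SES-TEOM retains 28%
frac = svm_fraction_recovered(40.0, 31.36, 28.0)
```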
84.
In siting a monitor to measure compliance with U.S. National Ambient Air Quality Standards (NAAQS) for particulate matter (PM), there is a need to characterize variations in PM concentration within a neighborhood-scale region to achieve monitor siting objectives. A simple methodology is provided here for the selection of a neighborhood-scale site for meeting either of the two objectives identified for PM monitoring. This methodology is based on analyzing middle-scale (from 100 to 500 m) data from within the area of interest. The required data can be obtained from widely available dispersion models and emissions databases. The performance of the siting methodology was evaluated in a neighborhood-scale field study conducted in Hudson County, NJ, to characterize the area's inhalable particulate (PM10) concentrations. Air monitors were located within a 2- by 2-km area in the vicinity of the Lincoln Tunnel entrance in Hudson County. Results indicate the siting methodology performed well, providing a positive relationship between the predicted concentration rank at each site and the actual rank experienced during the field study. Also discussed are factors that adversely affected the predictive capabilities of the model.
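A "positive relationship between predicted and actual concentration rank" is naturally quantified by a rank correlation. Below is a self-contained Spearman implementation; the abstract does not name the statistic it used, so this is an assumed choice for illustration:

```python
import math

def rankdata(values):
    """1-based ranks, with ties assigned their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0        # average rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)
```

Feeding it modeled and measured PM10 concentrations at each candidate site yields +1 for a perfectly preserved ranking and values near 0 when the model has no ordering skill.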
85.
The chemical mass balance (CMB) model was applied for source apportionment of PM2.5 in Atlanta to explore the levels and causes of uncertainty in source contribution estimates. Monte Carlo analysis with Latin hypercube sampling (MC-LHS) was performed to evaluate the source impact uncertainties and to quantify how uncertainties in ambient measurement and source profile data affect the results. In general, uncertainties in the source profile data contribute more to the final uncertainties in source apportionment results than do those in the ambient measurement data. Uncertainty contribution estimates suggest that non-linear interactions among source profiles also affect the final uncertainties, although their influence is typically smaller than that of the source profile uncertainties themselves.
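Latin hypercube sampling stratifies each input dimension so that n samples cover all n equal-probability strata, which is what makes MC-LHS more efficient than plain Monte Carlo for propagating uncertainty. A minimal sketch, with a toy two-source mass balance standing in for the CMB model; the source names, contributions, and uncertainty levels are invented for illustration, not Atlanta values:

```python
import random

def latin_hypercube(n_samples, n_dims, rng=None):
    """Latin hypercube sample on the unit hypercube: each dimension is
    split into n_samples equal strata, and exactly one point falls in
    each stratum of each dimension."""
    rng = rng or random.Random(0)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i in range(n_samples):
            samples[i][d] = (strata[i] + rng.random()) / n_samples
    return samples

# Propagate source-profile uncertainty through a toy two-source balance:
base = {"mobile": 12.0, "biomass": 8.0}     # ug/m3 nominal contributions
rel_sd = {"mobile": 0.20, "biomass": 0.35}  # relative profile uncertainty
pts = latin_hypercube(1000, 2)
totals = []
for u1, u2 in pts:
    # map uniform (0, 1) draws to +/- one relative-sd perturbations
    m = base["mobile"] * (1 + rel_sd["mobile"] * (2 * u1 - 1))
    b = base["biomass"] * (1 + rel_sd["biomass"] * (2 * u2 - 1))
    totals.append(m + b)
```

The spread of `totals` is the propagated uncertainty in the apportioned mass; a real MC-LHS run would perturb every species in every source profile and re-solve the CMB fit per sample.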
86.
A three-dimensional Eulerian photochemical model is used to follow the dynamics of ozone, NOx, and CO over the Athens area for 25 May 1990, the day considered in the APSIS project. A unique aspect of this work lies in the study of the impacts of wind field preparation methods on the concentrations predicted by the model. Three sets of wind fields are developed. The first is derived from a prognostic meteorological model. The second is calculated from available wind observations using objective methods. For these two cases, a previous day is simulated under the same conditions to develop preconditioned initial conditions for the following day. For the third simulation, again two days are simulated, this time using the observed winds for each of the two days modeled. The predictions using the prognostically derived and the objective-analysis wind fields are significantly different, particularly for the primary pollutants. Comparing predictions with the observations did not favor any particular method of wind field preparation. When using the prognostically derived field, the simulations are very sensitive to boundary conditions. In contrast, when using the wind fields constructed by objective methods, the simulations become most sensitive to emissions and initial conditions. This follows directly from the different residence times in the domain, which are governed by the wind speed.
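An objective-analysis wind field is built by interpolating scattered station observations onto the model grid. Inverse-distance weighting is one of the simplest such schemes; the sketch below is a generic stand-in, not the specific method used in the study:

```python
def idw_wind(stations, x, y, power=2.0):
    """Inverse-distance-weighted estimate of the (u, v) wind components
    at grid point (x, y) from station observations. A minimal stand-in
    for objective wind-field analysis.
    stations: list of (sx, sy, u, v) tuples in consistent units."""
    wu = wv = wsum = 0.0
    for sx, sy, u, v in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return u, v            # exactly at a station: use it directly
        w = 1.0 / d2 ** (power / 2.0)
        wu += w * u
        wv += w * v
        wsum += w
    return wu / wsum, wv / wsum
```

Applied at every grid node, this yields a diagnostic wind field; operational objective analyses add steps this sketch omits, such as a first-guess field and mass-consistency (divergence) adjustment.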
87.
An on-site wastewater treatment project with two separate drip fields was operated for 6 years and received no maintenance. The two drip fields (with different design configurations) contained pressure-compensating (PC) and non-pressure-compensating (NPC) emitters, respectively, and received wastewater with an average 5-day biochemical oxygen demand concentration of 23 mg/L. The average flowrate of the PC emitters declined from a rated 3.50 L/h to 1.00 L/h, and the average flowrate of the NPC emitters declined from 2.00 L/h to 1.53 L/h. The statistical uniformities were 48 and 71%, and the uniformity coefficients were 70 and 86%, for the PC and NPC emitters, respectively. Significant, but incomplete, recovery was achieved with field flushing and consecutive shock-chlorination treatments of 500 and 1000 mg/L.
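The two uniformity measures quoted are standard irrigation statistics: statistical uniformity is based on the coefficient of variation of emitter flowrates, and Christiansen's uniformity coefficient on the mean absolute deviation. Both computed from a list of measured flowrates:

```python
import math

def statistical_uniformity(flows):
    """Statistical uniformity (%): SU = 100 * (1 - CV), where CV is the
    sample coefficient of variation of emitter flowrates."""
    n = len(flows)
    mean = sum(flows) / n
    sd = math.sqrt(sum((q - mean) ** 2 for q in flows) / (n - 1))
    return 100.0 * (1.0 - sd / mean)

def christiansen_cu(flows):
    """Christiansen uniformity coefficient (%):
    CU = 100 * (1 - mean absolute deviation / mean)."""
    n = len(flows)
    mean = sum(flows) / n
    mad = sum(abs(q - mean) for q in flows) / n
    return 100.0 * (1.0 - mad / mean)
```

Because CU penalizes deviations linearly while SU penalizes them quadratically, SU is always the stricter of the two for non-uniform fields, which is why the reported SU values (48 and 71%) sit below the corresponding CU values (70 and 86%).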
88.
A microanalytical method suitable for the quantitative determination of the sugar anhydride levoglucosan in low-volume samples of atmospheric fine particulate matter (PM) has been developed and validated. The method incorporates two sugar anhydrides as quality control standards. The recovery standard sedoheptulosan (2,7-anhydro-β-D-altro-heptulopyranose) in 20 μL of solvent is added onto samples of the atmospheric fine PM and aged for 1 hr before ultrasonic extraction with ethyl acetate/triethylamine. The extract is reduced in volume, an internal standard (1,5-anhydro-D-mannitol) is added, and a portion of the extract is derivatized with 10% by volume N-trimethylsilylimidazole. The derivatized extract is analyzed by gas chromatography/mass spectrometry (GC/MS). The recovery of levoglucosan using this procedure was 69 ± 6% from five filters amended with 2 μg levoglucosan, and the reproducibility of the assay is 9%. The limit of detection is approximately 0.1 μg/mL, which is equivalent to approximately 3.5 ng/m3 for a 10 L/min sampler or approximately 8.7 ng/m3 for a 4 L/min personal sampler (assuming 24-hr integrated samples). We demonstrated that levoglucosan concentrations in collocated samples (expressed as ng/m3) were identical irrespective of whether samples were collected by PM2.5 (aerodynamic diameter ≤ 2.5 μm) or PM10 (aerodynamic diameter ≤ 10 μm) impactors. It was also demonstrated that X-ray fluorescence analysis of samples of atmospheric PM before levoglucosan determination did not alter the levels of levoglucosan.
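The quoted air-concentration detection limits follow from the analytical LOD via the extract volume and the sampled air volume. The sketch below reproduces the ~3.5 and ~8.7 ng/m3 figures if a final extract volume of 0.5 mL is assumed; that volume is not stated in the abstract and is back-calculated here purely for illustration:

```python
def lod_air_concentration(lod_ug_per_ml, extract_ml, flow_l_min, hours=24.0):
    """Convert an analytical limit of detection (ug/mL in the final
    extract) into an equivalent air concentration (ng/m3) for a sampler
    running at flow_l_min for the given number of hours.
    NOTE: the 0.5 mL extract volume used in the examples below is an
    assumption, not a value from the paper."""
    mass_ng = lod_ug_per_ml * extract_ml * 1000.0   # ug -> ng at the LOD
    air_m3 = flow_l_min * 60.0 * hours / 1000.0     # L sampled -> m3
    return mass_ng / air_m3

# With a 0.5 mL extract and 24-hr samples:
lod_10 = lod_air_concentration(0.1, 0.5, 10.0)  # ~3.5 ng/m3 at 10 L/min
lod_4 = lod_air_concentration(0.1, 0.5, 4.0)    # ~8.7 ng/m3 at 4 L/min
```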
89.
This study presents a large-eddy simulation (LES) study of the convective boundary layer on August 1, 1999 over Philadelphia, PA, during a summer ozone episode. The study is an evaluation of Colorado State University's Regional Atmospheric Modeling System Version 4.3 (RAMS4.3) with the LES option, using Northeast Oxidant and Particulate Study (NE-OPS) data. Simulations were performed with different imposed sensible heat fluxes at the ground surface. The model was initialized with the atmospheric sounding data collected at Philadelphia at 1230 UTC, and model integrations continued until 2130 UTC. The resulting mean profiles of temperature and humidity obtained from the LES model were compared with atmospheric sounding, tethered balloon, and aircraft data collected during the NE-OPS 1999 field campaign. The model-derived vertical profiles of virtual temperature were also compared with NE-OPS Radio Acoustic Sounding System (RASS) data, while the humidity profiles were compared with NE-OPS lidar data. The comparison of the radiosonde data with the LES model predictions suggests that the growth of the mixing layer is reasonably well simulated, and overall the agreement of the temperature predictions with the radiosonde observations is good. The model appears to underestimate humidity for the case of the higher imposed sensible heat flux; however, the humidity values in the mixing layer agree quite well with radiosonde observations for the lower imposed sensible heat flux. The model-predicted temperature and humidity profiles are in reasonable agreement with the tethered balloon data, except for a small overestimation of temperature in the lower layers and some underestimation of humidity; the humidity profiles agree quite well with the tethered balloon data for the lower imposed sensible heat flux. The model-predicted virtual temperature profile is also in better agreement with the RASS data for the lower imposed sensible heat flux, and the temperature profile agrees quite well with the aircraft data for that case. However, the relative humidity values predicted by the model are lower than the aircraft data, and the model-predicted humidity profiles are only in partial agreement with the lidar data. The results suggest that the explicitly resolved energetic eddies provide the forcing necessary to produce good agreement with observations for an imposed surface sensible heat flux of 0.1 K m s-1.
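Comparing LES output with RASS data requires converting the model's temperature and humidity into virtual temperature, since RASS profilers report virtual temperature directly. The standard first-order approximation:

```python
def virtual_temperature(t_kelvin, mixing_ratio):
    """Virtual temperature Tv = T * (1 + 0.61 * r), where r is the water
    vapor mixing ratio in kg/kg. This is the temperature a dry air
    parcel would need to have the same density as the moist parcel, and
    it is the quantity a RASS profiler measures."""
    return t_kelvin * (1.0 + 0.61 * mixing_ratio)

# A parcel at 300 K with 10 g/kg of water vapor reads about 1.8 K warmer
# in virtual temperature:
tv = virtual_temperature(300.0, 0.010)
```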
90.
Technically, forestry projects have the potential to contribute significantly to the mitigation of global warming, but many such projects may not be economically attractive at current estimates of carbon (C) prices. Forest C is, in a sense, a new commodity that must be measured to acceptable standards for the commodity to exist. This will require that credible C measuring and monitoring procedures be in place. The amount of sequestered C that can be claimed by a project is normally estimated based on sampling a number of small plots, and the precision of this estimate depends on the number of plots sampled and on the spatial variability of the site. Measuring C can be expensive, and hence it is important to select an efficient C-monitoring strategy to make projects competitive in the C market. This paper presents a method to determine whether a forestry project will benefit from C trading, and to find the optimal management strategy in terms of forest cycle length and C-monitoring strategy. A model of an Acacia mangium plantation in southern Sumatra, Indonesia is used to show that forestry projects can be economically attractive under a range of conditions, provided that the project is large enough to absorb fixed costs. Modeling results indicate that between 15 and 38 Mg of Certified Emission Reductions (CERs) per hectare can be captured by the simulated plantation under optimal management, with optimality defined as maximizing the present value of profits obtained from timber and C. The optimal cycle length ranged from 12 to 16 years, and the optimal number of sample plots ranged from 0 to 30. Costs of C monitoring (in present-value terms) were estimated at between 0.45 and 2.11 per Mg C, depending on the spatial variability of biomass, the variable costs of C monitoring, and the discount rate.
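The joint optimization over cycle length and number of monitoring plots can be sketched as a grid search maximizing per-hectare net present value. Everything below, including the prices, costs, and the uncertainty deduction for small plot counts, is an invented stand-in for the paper's model, intended only to show the structure of the search:

```python
import math

def optimal_strategy(cer_price=5.0, timber_value=3000.0, fixed_cost=2000.0,
                     plot_cost=50.0, discount=0.08, cer_per_ha=30.0,
                     cv_biomass=0.4, cycles=range(8, 21),
                     plot_counts=range(0, 31, 5)):
    """Grid search for the cycle length (years) and number of C-monitoring
    plots that maximize per-hectare NPV of timber plus creditable carbon.
    All parameter values are illustrative, not taken from the paper.
    Creditable CERs shrink with sampling uncertainty: with n plots, the
    claimable fraction is reduced by 1.645 * cv / sqrt(n) (a crude
    stand-in for a conservative confidence-interval deduction), and zero
    plots means no carbon credits at all."""
    best = None
    for t in cycles:
        for n in plot_counts:
            if n == 0:
                claimable = 0.0
            else:
                claimable = cer_per_ha * max(
                    0.0, 1.0 - 1.645 * cv_biomass / math.sqrt(n))
            # revenue realized at harvest, discounted back to the present
            revenue = (timber_value + claimable * cer_price) / (1 + discount) ** t
            # crudely amortized establishment cost plus monitoring cost
            cost = fixed_cost / t + plot_cost * n
            npv = revenue - cost
            if best is None or npv > best[0]:
                best = (npv, t, n)
    return best
```

Because the zero-plot option is always available, adding a positive carbon price can never lower the optimum, which mirrors the paper's finding that C trading helps only when the project can absorb the fixed and monitoring costs.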