31.
Stream-subsurface exchange results from a complex ensemble of transport mechanisms that require different modeling approaches. Field and laboratory experiments show that advective exchange through the underlying sediments is an important mechanism of solute transport and storage in riverine systems. Here, Transient Storage Model parameters are obtained for reactive solute exchange driven by bedform-induced advection. Consideration of exchange induced by this single mechanism allows specific relationships between model parameters and system properties, such as solute reactivity, to be identified. This work shows that when a simplified model like the Transient Storage Model is applied to analyze metal storage in river sediments, particular attention must be devoted to the choice of modeling parameters.
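For reference, the standard one-zone Transient Storage Model couples an advection-dispersion equation for the channel with first-order exchange into a storage zone; a sketch with an added first-order reaction term (the symbols and the reaction form here are generic textbook choices, not taken from the study above):

```latex
\frac{\partial C}{\partial t}
  = -\frac{Q}{A}\,\frac{\partial C}{\partial x}
    + \frac{1}{A}\,\frac{\partial}{\partial x}\!\left(A D \,\frac{\partial C}{\partial x}\right)
    + \alpha\,(C_s - C) - \lambda\, C
\qquad
\frac{\partial C_s}{\partial t}
  = \alpha\,\frac{A}{A_s}\,(C - C_s) - \lambda_s\, C_s
```

where \(C\) and \(C_s\) are the channel and storage-zone concentrations, \(Q\) the discharge, \(A\) and \(A_s\) the channel and storage-zone cross-sectional areas, \(D\) the dispersion coefficient, \(\alpha\) the exchange coefficient, and \(\lambda, \lambda_s\) first-order reaction rates for a reactive solute.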
32.
Effective water quality management depends on enactment of appropriately designed monitoring programs to reveal current and forecasted conditions. Because water quality conditions are influenced by numerous factors, commonly measured attributes such as total phosphorus (TP) can be highly temporally varying. For highly varying processes, monitoring programs should be long-term, and periodic quantitative analyses are needed so that temporal trends can be distinguished from stochastic variation, which can yield insights into potential modifications to the program. Using generalized additive mixed modeling, we assessed temporal (yearly and monthly) trends and quantified other sources of variation (daily and subsampling) in TP concentrations from a multidecadal depth-specific monitoring program on Big Platte Lake, Michigan. Yearly TP concentrations decreased from the late 1980s to late 1990s before rebounding through the early 2000s. At depths of 2.29 to 13.72 m, TP concentrations have cycled around stationary points since the early 2000s, while at the surface and at depths ≥ 18.29 m concentrations have continued declining. Summer and fall peaks in TP concentrations were observed at most depths, with the fall peak at deeper depths occurring 1 month earlier than at shallower depths. Daily sampling variation (i.e., variation within a given month and year) was greatest at the shallowest and deepest depths. Variation in subsamples collected from depth-specific water samples constituted a small fraction of total variation. Based on model results, cost-saving measures to consider for the monitoring program include reducing subsampling of depth-specific concentrations and reducing the number of sampling depths given observed consistencies across the program period.
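The daily-versus-subsampling variance partition described above can be illustrated with a small simulation (all numbers are hypothetical, and this naive method-of-moments sketch stands in for, and is much cruder than, the generalized additive mixed model used in the study):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sampling design echoing the one above: several sampling
# days per month, two lab subsamples per water sample.
years = np.arange(2000, 2010)
months = np.arange(1, 13)
days_per_month = 3
subsamples = 2

sd_daily, sd_sub = 4.0, 1.0  # assumed standard deviations (ug/L)

records = []
for y in years:
    for m in months:
        seasonal = 8.0 + 3.0 * np.sin(2 * np.pi * m / 12)  # crude seasonal cycle
        for d in range(days_per_month):
            day_mean = seasonal + rng.normal(0, sd_daily)   # day-to-day variation
            for s in range(subsamples):
                records.append(day_mean + rng.normal(0, sd_sub))  # lab subsampling

tp = np.array(records).reshape(len(years), len(months), days_per_month, subsamples)

# Subsampling variance: spread between replicates of the same water sample.
var_sub = tp.var(axis=3, ddof=1).mean()
# Daily variance: spread between day means within a given month and year.
var_daily = tp.mean(axis=3).var(axis=2, ddof=1).mean()

print(f"subsampling variance ~ {var_sub:.1f}, daily variance ~ {var_daily:.1f}")
```

With these assumed inputs the recovered subsampling component is a small fraction of the daily component, mirroring the paper's conclusion that cutting subsamples is a low-risk cost saving.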
33.
Forty-two communities in rural Alaska are considered unserved or underserved with water and sewer infrastructure. Many challenges exist to provide centralized piped water and sewer infrastructure to the homes, and they are exacerbated by decreasing capital funding. Unserved communities in rural Alaska experience higher rates of disease, supporting the recommendation that sanitation infrastructure should be provided. Organizations are pursuing alternative solutions to conventional piped water and sewer in order to maximize water use and reuse for public health. This paper reviews initiatives led by the State of Alaska, the Alaska Native Tribal Health Consortium, and the Yukon Kuskokwim Health Corporation to identify and develop potential long-term solutions appropriate and acceptable to rural communities. Future developments will likely evolve based on the lessons learned from the initiatives. Recommendations include Alaska-specific research needs, increased end-user participation in the design process, and integrated monitoring, evaluation, and information dissemination in future efforts.
34.
Non-compliance and the quota price in an ITQ fishery
This paper examines the effects of non-compliance on quota demands and the equilibrium quota price in an ITQ fishery. I show that whereas lower quota prices are implied unambiguously by expected penalties which are a function of the absolute violation size, the expectation of penalties based upon relative violations of quota demands can, under certain conditions, produce higher quota prices than in a compliant quota market. If there are both compliant and non-compliant firms in the fishery, the result would then be a shift in quota demand from compliant to non-compliant firms, rather than the reverse. The findings are generally applicable to quota markets in other industries, including pollution permit markets.
35.
The tiger shark (Galeocerdo cuvier Peron and Lesueur 1822) is a widely distributed predator with a broad diet and the potential to affect marine community structure, yet information on local patterns of abundance for this species is lacking. Tiger shark catch data were gathered over 7 years of tag and release research fishing (1991–2000, 2002–2004) in Shark Bay, Western Australia (25°45′S, 113°44′E). Sharks were caught using drumlines deployed in six permanent zones (~3 km2 in area). Fishing effort was standardized across days and months, and catch rates on hooks were expressed as the number of sharks caught h−1. A total of 449 individual tiger sharks was captured; 29 were recaptured. Tiger shark catch rate showed seasonal periodicity, being higher during the warm season (Sep–May) than during the cold season (Jun–Aug), and was marked by inter-annual variability. The most striking feature of the catch data was a consistent pattern of slow, continuous variation within each year from a peak during the height of the warm season (February) to a trough in the cold season (July). Annual growth rates of recaptured individuals were generally consistent with estimates from other regions, but exceeded those for populations elsewhere for sharks >275 cm fork length (FL), perhaps because mature sharks in the study area rely heavily on large prey. The data suggest that (1) the threat of predation faced by animals consumed by tiger sharks fluctuates dramatically within and between years, and (2) efforts to monitor large shark abundance should be extensive enough to detect inter-annual variation and sufficiently intensive to account for intra-annual trends.
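A toy illustration of the catch-per-unit-effort standardization described above, comparing warm-season (Sep–May) and cold-season (Jun–Aug) rates; the daily records and effort figures below are invented for the example:

```python
from statistics import mean

# Hypothetical daily drumline records: (month, sharks_caught, hook_hours).
records = [
    (2, 3, 60.0), (3, 2, 55.0), (10, 4, 58.0),   # warm-season days
    (6, 0, 60.0), (7, 1, 62.0), (8, 0, 57.0),    # cold-season days
]

WARM_MONTHS = {9, 10, 11, 12, 1, 2, 3, 4, 5}     # Sep-May, per the abstract

def cpue(sharks: int, hook_hours: float) -> float:
    """Catch per unit effort: sharks caught per hook-hour of soak time."""
    return sharks / hook_hours

warm_rate = mean(cpue(s, hh) for m, s, hh in records if m in WARM_MONTHS)
cold_rate = mean(cpue(s, hh) for m, s, hh in records if m not in WARM_MONTHS)
print(f"warm {warm_rate:.3f} vs cold {cold_rate:.3f} sharks per hook-hour")
```

Standardizing by effort in this way is what lets catch rates from days with different numbers of hooks or soak times be compared across seasons and years.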
36.
Although forest conservation activities, particularly in the tropics, offer significant potential for mitigating carbon (C) emissions, these types of activities have faced obstacles in the policy arena caused by the difficulty in determining key elements of the project cycle, particularly the baseline. A baseline for forest conservation has two main components: the projected land-use change and the corresponding carbon stocks in applicable pools in vegetation and soil, with land-use change being the most difficult to address analytically. In this paper we focus on developing and comparing three models, ranging from relatively simple extrapolations of past trends in land use based on simple drivers such as population growth to more complex extrapolations of past trends using spatially explicit models of land-use change driven by biophysical and socioeconomic factors. The three models used for making baseline projections of tropical deforestation at the regional scale are: the Forest Area Change (FAC) model, the Land Use and Carbon Sequestration (LUCS) model, and the Geographical Modeling (GEOMOD) model. The models were used to project deforestation in six tropical regions that featured different ecological and socioeconomic conditions, population dynamics, and uses of the land: (1) northern Belize; (2) Santa Cruz State, Bolivia; (3) Paraná State, Brazil; (4) Campeche, Mexico; (5) Chiapas, Mexico; and (6) Michoacán, Mexico. A comparison of all model outputs across all six regions shows that each model produced quite different deforestation baselines. In general, the simplest FAC model, applied at the national administrative-unit scale, projected the highest amount of forest loss (four out of six regions) and the LUCS model the least amount of loss (four out of five regions). 
Based on simulations with GEOMOD, we found that readily observable physical and biological factors, as well as distance to areas of past disturbance, were each about twice as important as either sociological/demographic or economic/infrastructure factors (which are less observable) in explaining empirical land-use patterns. From the lessons learned, we propose a methodology comprising three main steps and six tasks that can be used to begin developing credible baselines. We also propose that baselines be projected over a 10-year period because, although projections beyond 10 years are feasible, they are likely to be unrealistic for policy purposes. In the first step, a historic land-use change and deforestation estimate is made by determining the analytic domain (the size of the region relative to the size of the proposed project), obtaining historic data, analyzing candidate baseline drivers, and identifying three to four major drivers. In the second step, a baseline map of where deforestation is likely to occur (a potential land-use change, or PLUC, map) is produced using a spatial model such as GEOMOD driven by the key drivers from step one, and rates of deforestation are then projected over a 10-year baseline period using one of the three models. Using the PLUC maps, projected rates of deforestation, and carbon stock estimates, baseline projections are developed that can be used for project GHG accounting and crediting purposes. The final step proposes that, at an agreed interval (e.g., about 10 years), the assumptions about baseline drivers be reassessed; this step reviews the viability of the 10-year baseline in light of changes in one or more key drivers (e.g., new roads, new communities, or new protected areas).
The potential land-use change map and the estimates of deforestation rates could be redone at the agreed interval, allowing the deforestation rates and changes in spatial drivers to be incorporated into a defense of the existing baseline or the derivation of a new baseline projection.
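A minimal numerical sketch of the simplest kind of baseline projection discussed above: extrapolating a constant annual loss rate over a 10-year baseline window. The region size and rate are hypothetical, and this compound-loss simplification is illustrative only, not the published FAC model:

```python
def project_forest_area(area_km2: float, annual_loss_rate: float, years: int = 10):
    """Extrapolate forest area under a constant fractional annual loss.

    Illustrative simplification: area shrinks by a fixed fraction each
    year over the baseline period, with no spatial component.
    """
    trajectory = [area_km2]
    for _ in range(years):
        trajectory.append(trajectory[-1] * (1.0 - annual_loss_rate))
    return trajectory

# Hypothetical region: 10,000 km2 of forest losing 2% per year.
baseline = project_forest_area(10_000.0, 0.02)
loss = baseline[0] - baseline[-1]
print(f"projected 10-year forest loss: {loss:.0f} km2")
```

The projected loss, combined with per-hectare carbon stock estimates, is what would feed the GHG accounting step; spatially explicit models like GEOMOD additionally say *where* that loss is expected to fall.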
37.
Twenty-two pesticides and metabolites, selected on the basis of a regional priority list, were surveyed in surface river waters by high performance liquid chromatography coupled in tandem with UV diode array detection and mass spectrometry, after an off-line pre-concentration step. Pesticide concentrations ranged from 0.07 to 4.8 µg/L, depending on the compound and sampling period. Analytical results were linked to the environmental risk of pesticides, evaluated by their system investigation of risk by integration of score (SIRIS) rank.
38.
This research focused on the use of sonication to destroy surfactants in industrial wastewaters, whose surface-tension properties interfere with traditional water treatment processes. We have investigated the sonochemical destruction of surfactants and a chelating agent to understand the release of metals from surfactants during sonication. In addition, the effects of the physical properties of surfactants and the effect of ultrasonic frequency were investigated to gain an understanding of the factors affecting degradation. Sonochemical degradation of surfactants was observed to be more effective than that of nonsurfactant compounds. In addition, as the concentration is increased, the degradation rate constant does not decrease as significantly as with nonsurfactant compounds in the near-field acoustical processor reactor. The degradation of metal complexes is not as effective as in the absence of the metal. However, this is likely an artifact of the model complexing agent used. Surfactant metal complexes are expected to degrade faster, as they will accumulate at the hot bubble interface, significantly increasing ligand exchange kinetics and thus degradation of the complex.
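Degradation rate constants like those compared above are commonly estimated by assuming pseudo-first-order kinetics, C(t) = C0·exp(−kt), and fitting ln C against time; a self-contained sketch with invented concentration data:

```python
import math

# Hypothetical sonication time series: surfactant concentration (mM) vs minutes.
times = [0, 10, 20, 30, 40]
conc = [1.00, 0.61, 0.37, 0.22, 0.14]

# Least-squares slope of ln C versus t gives -k for pseudo-first-order decay.
n = len(times)
t_mean = sum(times) / n
log_c = [math.log(c) for c in conc]
y_mean = sum(log_c) / n
k = -sum((t - t_mean) * (y - y_mean) for t, y in zip(times, log_c)) / \
    sum((t - t_mean) ** 2 for t in times)

print(f"estimated rate constant k ~ {k:.3f} per minute")
```

Comparing such fitted k values across initial concentrations is one way to see the effect the abstract notes: for surfactants, k falls off less sharply as concentration rises than it does for nonsurfactant compounds.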
39.
Objective: Despite advances in vehicle safety systems, motor vehicle crashes continue to cause ankle fractures. This study attempts to provide insight into the mechanisms of injury and to identify the at-risk population groups.

Methods: A study was made of ankle fracture patients treated at an urban level 1 trauma center following motor vehicle crashes, with a concurrent analysis of a nationally representative crash data set. The national data set focused on ankle fractures in drivers involved in frontal crashes. Statistical analysis was applied to the national data set to identify factors associated with fracture risk.

Results: Malleolar fractures occurred most frequently in the driver's right foot due to pedal interaction. The majority of complex/open fractures occurred in the left foot due to interaction with the vehicle floor. These fractures occurred in association with a femoral fracture, but their broad injury pattern suggests a range of fracture causation mechanisms. The statistical analysis indicated that the risk of fracture increased with increasing driver body mass index (BMI) and age.

Conclusions: Efforts to reduce the risk of driver ankle injury should focus on right foot and pedal interaction. The range of injury patterns identified here suggests that efforts to minimize driver ankle fracture risk will likely need to consider injury tolerances for flexion, pronation/supination, and axial loading in order to capture the full range of injury mechanisms. In the clinical environment, physicians examining drivers after a frontal crash should consider those who are older or obese or who have severe femoral injury without concurrent head injury as highly suspicious for an ankle injury.

40.
Not as much abatement as has been presumed. Smog check programs aim to curb tailpipe emissions from in-use vehicles by requiring repairs whenever emissions, measured at regular time intervals, exceed a certain threshold. Using data from California, we estimate that on average 41% of the initial emissions abatement from repairs is lost by the time of the subsequent inspection, normally two years later. Our estimates imply that the cost per pound of pollution avoided is an order of magnitude greater for smog check repairs than alternative policies such as new-vehicle standards or emissions trading among industrial point sources.
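A back-of-the-envelope sketch of how a decay figure like the 41% above feeds into a cost-per-pound comparison. The repair cost and initial abatement below are hypothetical, and the linear-decay assumption is an illustrative choice, not the paper's estimation method:

```python
def cost_per_pound(repair_cost: float, initial_abatement_lb_per_yr: float,
                   fraction_lost: float = 0.41, cycle_years: float = 2.0) -> float:
    """Cost per pound of pollution actually avoided over one inspection cycle.

    Assumes (hypothetically) that abatement decays linearly from its initial
    level to (1 - fraction_lost) of that level at the next inspection, so the
    cycle-average abatement is the midpoint of the two endpoints.
    """
    avg_abatement = initial_abatement_lb_per_yr * (1.0 - fraction_lost / 2.0)
    return repair_cost / (avg_abatement * cycle_years)

# Hypothetical repair: $300, initially avoiding 50 lb/yr, re-inspected after 2 years.
print(f"${cost_per_pound(300.0, 50.0):.2f} per pound avoided")
```

Ignoring the decay (fraction_lost = 0) would understate the cost per pound, which is the direction of the paper's correction to naive abatement accounting.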
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号