111.
ABSTRACT: Baseflow, or water that enters a stream from slowly varying sources such as ground water, can be critical to humans and ecosystems. We evaluate a simple method for estimating baseflow parameters at ungaged sites. The method uses one or more baseflow discharge measurements at the ungaged site and long-term streamflow data from a nearby gaged site. A given baseflow parameter, such as the median, is estimated as the product of the corresponding gage-site parameter and the geometric mean of the ratios of the measured baseflow discharges to the concurrent discharges at the gage site. If baseflows at gaged and ungaged sites have a bivariate lognormal distribution with high correlation and nearly equal log variances, the estimated baseflow parameters are very accurate. We tested the proposed method using long-term streamflow data from two watershed pairs in the Driftless Area of southwestern Wisconsin. For one watershed pair, the theoretical assumptions are well met; for the other, the log variances are substantially different. In the first case, the method performs well for estimating both annual and long-term baseflow parameters. In the second, the method performs remarkably well for estimating annual mean and annual median baseflow discharge, but less well for estimating the annual lower decile and the long-term mean, median, and lower decile. In general, the use of four measurements in a year is not substantially better than the use of two.
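The ratio-based estimator this abstract describes can be sketched in a few lines of Python. This is an illustrative sketch with hypothetical discharge values, not the authors' implementation; the function and variable names are invented for the example:

```python
import math

def estimate_baseflow_param(gage_param, ungaged_measurements, concurrent_gage_flows):
    """Estimate a baseflow parameter (e.g., the median) at an ungaged site.

    The estimate is the gage-site parameter multiplied by the geometric
    mean of the ratios of measured ungaged-site baseflow discharges to
    the concurrent discharges at the gaged site.
    """
    ratios = [u / g for u, g in zip(ungaged_measurements, concurrent_gage_flows)]
    geo_mean = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
    return gage_param * geo_mean

# Gage-site median of 10.0 units; two baseflow measurements at the
# ungaged site paired with the concurrent gage-site discharges.
print(estimate_baseflow_param(10.0, [2.0, 3.0], [8.0, 12.0]))  # 2.5
```

Using the geometric rather than arithmetic mean of the ratios is what makes the estimator exact under the bivariate lognormal assumption noted in the abstract.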
112.
The Pittsburgh Research Laboratory (PRL) of the National Institute for Occupational Safety and Health (NIOSH) and the Mine Safety and Health Administration (MSHA) conducted joint research on dust explosions by studying post-explosion dust samples. The samples were collected after full-scale explosions at the PRL Lake Lynn Experimental Mine (LLEM), and after laboratory explosions in the PRL 20-L chamber and the Fike 1-m³ chamber. The dusts studied included both high- and low-volatile bituminous coals. Low-temperature ashing for 24 h at 515 °C was used to measure the incombustible content of the dust before and after the explosions. The data showed that the post-explosion incombustible content was always as high as, or higher than, the initial incombustible content. The MSHA alcohol coking test was used to determine the amount of coked dust in the post-explosion samples. The results showed that almost all coal dust that was suspended within the explosion flame produced significant amounts of coke. Measurements of floor dust concentrations after LLEM explosions were compared with the initial dust loadings to determine the transport distance of dust during an explosion. All these data will be useful in future forensic investigations of accidental dust explosions in coal mines, or elsewhere.
113.
OBJECTIVES: Zero tolerance (ZT) laws have been effective in reducing alcohol-related crashes among underage drivers. However, enforcement in some states has not been rigorous, and ZT offenses may not be viewed as serious offenses. On July 1, 1994, the state of Washington implemented a ZT law that allowed police to request a test for alcohol on suspicion of either a ZT or driving-under-the-influence (DUI) offense. The present study examined effects of the ZT law on arrests and case dispositions among underage offenders as a function of blood alcohol concentration (BAC) and post-law patterns of recidivism. METHODS: Times-series analyses examined the effects of the ZT law on trends in arrests of underage drivers between 1991 and 1999. Based on arrest records matched with driver's license records, the effects of the law on dispositions of alcohol-related offenses among underage drivers were examined, and rates of recidivism among underage offenders were examined for the period following the ZT law. RESULTS: There was a substantial increase in arrests of underage drivers beginning immediately after implementation of the ZT law, especially among drivers with low BACs. The types of court or administrative dispositions received by underage offenders changed markedly after the ZT law was implemented. Underage offenders with lower BACs became far more likely to receive alcohol-related convictions and/or license suspensions. However, the percentage of underage offenders with higher BACs receiving DUI convictions declined as some of these offenders received the lesser ZT disposition. After the ZT law, underage offenders with BACs of 0.10 g/dL or higher were more likely to recidivate than those with lower BACs, but appreciable proportions of drivers were re-arrested for another alcohol offense, whatever the BAC and however they were penalized. 
CONCLUSIONS: Implementation of Washington's law indicates that a ZT law can increase the likelihood that an underage person will be sanctioned for drinking and driving. However, recidivism remains an issue, as more than one in four underage drivers arrested with low BACs subsequently were re-arrested.
114.
Historically, many watershed studies have been based on using the streamflow flux, typically from a single gauge at the basin's outlet, to support calibration. In this setting, there is great potential for equifinality of parameters during the optimization process, especially for parameters that are not directly related to streamflow. Therefore, some of the optimal parameter values achieved during the autocalibration process may be physically unrealistic. In recent decades, a vast array of data from land surface models and remote sensing platforms has become available that can help to constrain hydrologic fluxes such as evapotranspiration (ET). While the spatial resolution of these ancillary datasets varies, the continuous spatial coverage of these gridded datasets provides flux measurements across the entire basin, in stark contrast to point-based streamflow data. This study uses Global Land Evaporation Amsterdam Model (GLEAM) data to constrain Soil and Water Assessment Tool (SWAT) parameter values associated with ET to a more physically realistic range. The study area is the Little Washita River Experimental Watershed in southern Oklahoma. Traditional objective metrics such as the Nash-Sutcliffe efficiency show no performance improvement after application of this method. However, there is a dramatic increase in the number of days with receding flow where simulations match observed streamflow.
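The Nash-Sutcliffe efficiency this abstract uses as an objective metric is a standard goodness-of-fit measure for hydrologic simulations: one minus the ratio of residual variance to the variance of the observations. A minimal sketch with made-up series, purely for illustration:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency (NSE).

    1.0 is a perfect fit; 0.0 means the simulation is no better than
    predicting the mean of the observations; negative values are worse.
    """
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# A perfect simulation scores 1.0; a flat simulation at the observed
# mean scores 0.0.
print(nash_sutcliffe([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 1.0
print(nash_sutcliffe([1.0, 2.0, 3.0], [2.0, 2.0, 2.0]))  # 0.0
```

Because NSE is dominated by errors on high flows, it can stay flat even when low-flow (recession) behavior improves, which is consistent with the abstract's finding.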
115.
Demand for new environmental services from forests requires improved monitoring of these services at three scales: project-, regional-, and national-level. Most forest management activities are organized at the project scale, while the Framework Convention on Climate Change (FCCC) recognizes the nation as the party to the agreement. Hence, measurement and monitoring issues are emerging at the intersections of the project and national scales, referred to here as monitoring-domain edge effects. The following actions are necessary to improve existing monitoring capabilities and to help resolve project/national edge effects: (1) consensus on standard methods and protocols for monitoring mitigation activities, their off-site greenhouse gas (GHG) impacts, the fate of forest products and their relation to national GHG inventories (baselines); (2) a global program for collecting land use, land cover, biomass burning, and other data essential for national baselines; (3) the development of new nested-monitoring-domain methods that allow projects to be identified in national GHG inventories (baselines), and permit tracking of leakage of GHGs and wood product flows outside project boundary and over time; and (4) presentation of a set of credible, carefully designed, and well-documented forest mitigation activities that resolve most of the current issues.
116.
In the coming century, modern bioenergy crops have the potential to play a crucial role in the global energy mix, especially under policies to reduce carbon dioxide emissions as proposed by many in the international community. Previous studies have not fully addressed many of the dynamic interactions and effects of a policy-induced expansion of bioenergy crop production, particularly on crop yields and human food demand. This study combines an updated agriculture and land use (AgLU) model with a well-developed energy-economic model to provide an analysis of the effects of bioenergy crops on energy, agricultural, and land use systems. The results indicate that carbon dioxide mitigation policies can stimulate large-scale production of bioenergy crops, depending on the stringency of the policy. This production of bioenergy crops can lead to several impacts on the agriculture and land use system: decreases in forestland and unmanaged land, decreases in the average yield of food crops, increases in the prices of food crops, and decreases in human demand for calories.
Corresponding author: Steven J. Smith.
117.
Maternal serum free beta human chorionic gonadotropin (free β-hCG) levels are elevated (median 2.20 MoM) in the first trimester of pregnancy in 38 Down syndrome cases as compared with appropriate controls. This observation may form the basis for its use as a marker in screening for Down syndrome in the first trimester. Altered levels of the free beta analyte are also observed in pregnancy conditions or complications other than Down syndrome.
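The multiple-of-the-median (MoM) statistic reported above is simply an observed analyte level divided by the median level in unaffected pregnancies at the same gestational age. A minimal sketch with hypothetical values (the numbers are invented for illustration, not drawn from the study):

```python
def multiple_of_median(observed_level, unaffected_median):
    """Express an analyte level as a multiple of the unaffected-population
    median, normalizing away gestational-age dependence of the raw units."""
    return observed_level / unaffected_median

# A free beta-hCG level twice the gestational-age-matched unaffected
# median corresponds to 2.0 MoM.
print(multiple_of_median(50.0, 25.0))  # 2.0
```

Normalizing to MoM is what allows a single cutoff to be used in screening even though raw analyte concentrations change across gestational ages.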
118.
Exotic species invasion is widely considered to affect ecosystem structure and function. Yet, few contemporary approaches can assess the effects of exotic species invasion at such an inclusive level. Our research presents one of the first attempts to examine the effects of an exotic species at the ecosystem level in a quantifiable manner. We used ecological network analysis (ENA) and a social network analysis (SNA) method called cohesion analysis to examine the effect of zebra mussel (Dreissena polymorpha) invasion on the Oneida Lake, New York, USA, food web. We used ENA to quantify ecosystem function through an analysis of food web carbon transfer that explicitly incorporated flow over all food web paths (direct and indirect). The cohesion analysis assessed ecosystem structure through an organization of food web members into subgroups of strongly interacting predators and prey. Our analysis detected effects of zebra mussel invasion throughout the entire Oneida Lake food web, including changes in trophic flow efficiency (i.e., carbon flow among trophic levels) and alterations of food web organization (i.e., paths of carbon flow) and ecosystem activity (i.e., total carbon flow). ENA indicated that zebra mussels altered food web function by shunting carbon from pelagic to benthic pathways, increasing dissipative flow loss, and decreasing ecosystem activity. SNA revealed the strength of zebra mussel perturbation as evidenced by a reorganization of food web subgroup structure, with a decrease in importance of pelagic pathways, a concomitant rise of benthic pathways, and a reorganization of interactions between top predator fish. Together, these analyses allowed for a holistic understanding of the effects of zebra mussel invasion on the Oneida Lake food web.
119.
This paper examines the dynamic mechanisms of sediment exchange in Mwache Bay, Kenya, a shallow intertidal mangrove wetland. The bay has a semidiurnal tide, with spring and neap tidal ranges of .2 m and 1.4 m respectively, and a water surface area of 17 km² at spring high water.
120.
Although forest conservation activities, particularly in the tropics, offer significant potential for mitigating carbon (C) emissions, these types of activities have faced obstacles in the policy arena caused by the difficulty in determining key elements of the project cycle, particularly the baseline. A baseline for forest conservation has two main components: the projected land-use change and the corresponding carbon stocks in applicable pools in vegetation and soil, with land-use change being the most difficult to address analytically. In this paper we focus on developing and comparing three models, ranging from relatively simple extrapolations of past trends in land use based on simple drivers such as population growth, to more complex extrapolations of past trends using spatially explicit models of land-use change driven by biophysical and socioeconomic factors. The three models used for making baseline projections of tropical deforestation at the regional scale are: the Forest Area Change (FAC) model, the Land Use and Carbon Sequestration (LUCS) model, and the Geographical Modeling (GEOMOD) model. The models were used to project deforestation in six tropical regions that featured different ecological and socioeconomic conditions, population dynamics, and uses of the land: (1) northern Belize; (2) Santa Cruz State, Bolivia; (3) Paraná State, Brazil; (4) Campeche, Mexico; (5) Chiapas, Mexico; and (6) Michoacán, Mexico. A comparison of all model outputs across all six regions shows that each model produced quite different deforestation baselines. In general, the simplest FAC model, applied at the national administrative-unit scale, projected the highest amount of forest loss (four out of six regions) and the LUCS model the least amount of loss (four out of five regions).
Based on simulations of GEOMOD, we found that readily observable physical and biological factors, as well as distance to areas of past disturbance, were each about twice as important as either sociological/demographic or economic/infrastructure factors (which are less observable) in explaining empirical land-use patterns. From the lessons learned, we propose a methodology, comprising three main steps and six tasks, that can be used to begin developing credible baselines. We also propose that the baselines be projected over a 10-year period because, although projections beyond 10 years are feasible, they are likely to be unrealistic for policy purposes. In the first step, an historic land-use change and deforestation estimate is made by determining the analytic domain (size of the region relative to the size of the proposed project), obtaining historic data, analyzing candidate baseline drivers, and identifying three to four major drivers. In the second step, a baseline of where deforestation is likely to occur, a potential land-use change (PLUC) map, is produced using a spatial model such as GEOMOD that uses the key drivers from step one. Then rates of deforestation are projected over a 10-year baseline period based on one of the three models. Using the PLUC maps, projected rates of deforestation, and carbon stock estimates, baseline projections are developed that can be used for project GHG accounting and crediting purposes. The final step proposes that, at an agreed interval (e.g., about 10 years), the assumptions about the baseline drivers be re-assessed. This step reviews the viability of the 10-year baseline in light of changes in one or more key baseline drivers (e.g., new roads, new communities, new protected areas).
The potential land-use change map and estimates of rates of deforestation could be re-done at the agreed interval, allowing the deforestation rates and changes in spatial drivers to be incorporated into a defense of the existing baseline, or the derivation of a new baseline projection.
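A baseline of the simple-extrapolation kind, in the spirit of the FAC model's constant-rate projection (a sketch of the general idea, not the model's actual implementation; the rate and area values are hypothetical):

```python
def project_forest_area(initial_area, annual_loss_rate, years):
    """Project forest area forward under a constant annual fractional
    loss rate -- the simplest form of deforestation baseline, returning
    one value per year from year 0 through the final year."""
    return [initial_area * (1.0 - annual_loss_rate) ** t for t in range(years + 1)]

# 1000 km2 of forest losing 2% per year over a 10-year baseline period.
baseline = project_forest_area(1000.0, 0.02, 10)
print(round(baseline[-1], 1))  # 817.1
```

The spatially explicit models (LUCS, GEOMOD) differ from this sketch in allocating the projected loss across the landscape using driver maps, rather than treating the region as a single aggregate area.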