11.
Information on the distribution and relative abundance of species is integral to sustainable management, especially if they are to be harvested for subsistence or commerce. In northern Australia, natural landscapes are vast, centers of population few, access is difficult, and Aboriginal resource centers and communities have limited funds and infrastructure. Consequently, defining distribution and relative abundance by comprehensive ground survey is difficult and expensive. This highlights the need for simple, cheap, automated methodologies to predict the distribution of species in use, or with potential for use, in commercial enterprise. The technique applied here uses a Geographic Information System (GIS) to predict probability of occurrence with an inductive modeling technique based on Bayes' theorem. The study area is in the Maningrida region, central Arnhem Land, in the Northern Territory, Australia. The species examined, Cycas arnhemica and Brachychiton diversifolius, are currently being 'wild harvested' in commercial trials involving the sale of decorative plants and use as carving wood, respectively. This study involved limited and relatively simple ground surveys requiring approximately 7 days of effort for each species. Overall model performance was evaluated using Cohen's kappa statistic. The predictive ability of the model for C. arnhemica was classified as moderate and for B. diversifolius as fair. The difference in model performance can be attributed to the pattern of distribution of these species. C. arnhemica tends to occur in a clumped distribution due to relatively short-distance dispersal of its large seeds and vegetative growth from long-lived rhizomes, while B. diversifolius seeds are smaller and more widely dispersed across the landscape.
The output from the analysis predicts trends in species distribution that are consistent with independent on-site sampling for each species and therefore should prove useful in gauging the extent of resource availability. However, some caution is needed: the models tend to overpredict presence, which is a function both of distribution patterns and of other landscape variables, such as fire history, that were not included in the model owing to limited data availability.
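The kappa-based evaluation above can be sketched as follows. The confusion-matrix counts are hypothetical, not the study's survey data; kappa measures predicted-versus-observed agreement corrected for chance.

```python
def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa for a 2x2 predicted-vs-observed presence/absence table."""
    n = tp + fp + fn + tn
    p_observed = (tp + tn) / n  # raw agreement
    # agreement expected by chance alone, from the marginal totals
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical counts: tp = predicted present & observed present, etc.
kappa = cohens_kappa(tp=30, fp=10, fn=8, tn=52)
```

The verbal labels used in the abstract correspond to the conventional Landis and Koch bands, where roughly 0.21–0.40 is "fair" and 0.41–0.60 is "moderate" agreement.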
12.
Work teams are being utilized more frequently to give organizations access to the broader knowledge and skill base of employees, as well as to provide for adaptive, efficient decision‐making. In teams, we argue that constructive confrontation norms are an important contingency variable in the relationship between mental model similarity and decision quality. Mental model similarity helps team members understand one another's perspectives and reduces the likelihood of conflict. Accordingly, mental model similarity improves decision quality. When strong norms of constructive confrontation are in place, however, teams are in a better position to reap the benefits of conflict (greater diversity of inputs) without experiencing its negative consequences. Thus, when constructive confrontation norms are strong, less mental model similarity (i.e., more diversity of perspectives) is likely to improve decision quality. Copyright © 2007 John Wiley & Sons, Ltd.
13.
If global warming is accelerating, then one might expect temperatures at most stations to be accelerating and perhaps variability to be increasing. In this study, we examine 57 New Zealand temperature time series for evidence of non-linearity and changing variability. These correspond to time series of annual minima, annual means and annual maxima for 19 stations. Estimation is by an extended least-squares method. We find a surprising diversity of behaviour across these series – presumably reflecting their different geographic factors as well as series length. We give evidence of regions where temperatures are decreasing. For series where a linear trend is significant, it is downwards in about one third of the cases. This proportion was higher in the South Island, especially for series of minima. Where a non-linear trend is significant, temperatures are decelerating in about one half of the cases. The ratio of downward to upward trends is highest among annual maxima and South Island minima and smallest among annual means. Where a linear trend in the variability is significant, it is decreasing in 13 cases and increasing in 5, although this may be partly due to poorer-quality data from the last century. Where a non-linear trend in the variability is significant, variability is decelerating in about two thirds of the cases. The results are used to project upper and lower return levels of minima, means and maxima for each of the series to the year 2010.
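The paper's extended least-squares method is not reproduced here. As a rough sketch of the underlying idea, a quadratic fit to an annual series separates the linear trend (direction) from the curvature (acceleration or deceleration); the series below is synthetic, with made-up coefficients and noise.

```python
import numpy as np

# Synthetic annual mean temperatures (degrees C): a slightly decelerating
# series plus noise. These numbers are illustrative, not station data.
years = np.arange(1950, 2001)
x = years - 1950
temps = 12.0 + 0.01 * x - 0.0004 * x ** 2 \
        + np.random.default_rng(0).normal(0, 0.05, years.size)

# Center the time axis, then fit trend + curvature by least squares
t = years - years.mean()
b2, b1, b0 = np.polyfit(t, temps, deg=2)  # quadratic, linear, intercept

trend_direction = "upward" if b1 > 0 else "downward"
curvature = "accelerating" if b2 > 0 else "decelerating"
```

A significance test on `b1` and `b2` (against their standard errors) would then decide whether each series counts toward the "linear trend significant" or "non-linear trend significant" tallies in the abstract.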
14.
Organochlorine chemical residues and elemental concentrations were measured in piscivorous and benthivorous fish at 111 sites from large U.S. river basins. Potential contaminant sources such as urban and agricultural runoff, industrial discharges, mine drainage, and irrigation varied among the sampling sites. Our objectives were to provide summary statistics for chemical contaminants and to determine if contaminant concentrations in the fish were a risk to wildlife that forage at these sites. Concentrations of dieldrin, total DDT, total PCBs, toxaphene, TCDD-EQ, cadmium, chromium, mercury, lead, selenium, and zinc exceeded toxicity thresholds to protect fish and piscivorous wildlife in samples from at least one site; most exceedances were for total PCBs, mercury, and zinc. Chemical concentrations in fish from the Mississippi River Basin exceeded the greatest number of toxicity thresholds. Screening-level wildlife risk analysis models were developed for bald eagle and mink using no-adverse-effect levels (NOAELs), which were derived from adult dietary exposure or tissue concentration studies and based primarily on reproductive endpoints. No-effect hazard concentrations (NEHC) were calculated by comparing the NOAEL to the food ingestion rate (dietary-based NOAEL) or biomagnification factor (tissue-based NOAEL) of each receptor. Piscivorous wildlife may be at risk from a contaminant if the measured concentration in fish exceeds the NEHC. Concentrations of most organochlorine residues and elemental contaminants represented no to low risk to bald eagle and mink at most sites. The risk associated with pentachloroanisole, aldrin, Dacthal, methoxychlor, mirex, and toxaphene was unknown because NOAELs for these contaminants were not available for bald eagle or mink. Risk differed among modeled species and sites. Our screening-level analysis indicates that the greatest risk to piscivorous wildlife was from total DDT, total PCBs, TCDD-EQ, mercury, and selenium. Bald eagles were at greater risk from total DDT and total PCBs than mink, whereas risks of TCDD-EQ, mercury, and selenium were greater for mink than for bald eagles.
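The dietary-based screening rule can be sketched as below. The NOAEL, body weight, and food ingestion rate are illustrative placeholders, not the study's values; only the comparison rule (risk flagged when measured fish concentration exceeds the NEHC) comes from the abstract.

```python
def nehc_dietary(noael_mg_per_kg_bw_day, body_weight_kg, ingestion_kg_per_day):
    """No-effect hazard concentration (mg/kg fish, wet weight) implied by a
    dietary NOAEL: the fish concentration at which the receptor's daily dose
    just reaches the NOAEL."""
    return noael_mg_per_kg_bw_day * body_weight_kg / ingestion_kg_per_day

def at_risk(measured_fish_conc, nehc):
    """Screening rule: flag risk when the measured concentration in fish
    exceeds the NEHC."""
    return measured_fish_conc > nehc

# Hypothetical mink-like receptor parameters (placeholders)
nehc = nehc_dietary(noael_mg_per_kg_bw_day=0.1,
                    body_weight_kg=1.0,
                    ingestion_kg_per_day=0.18)
flag = at_risk(measured_fish_conc=0.7, nehc=nehc)
```

The tissue-based variant in the abstract divides the tissue NOAEL by a biomagnification factor instead of scaling by ingestion rate.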
15.
Elizabeth Anderson’s “pluralist–expressivist” value theory, an alternative to the understanding of value and rationality underlying the “rational actor” model of human behavior, provides rich resources for addressing questions of environmental and animal ethics. It is particularly well-suited to help us think about the ethics of commodification, as I demonstrate in this critique of the pet trade. I argue that Anderson’s approach identifies the proper grounds for criticizing the commodification of animals, and directs our attention to the importance of maintaining social practices and institutions that respect the social meanings of animals. Her theory alone, however, does not adequately address the role of the state in this project. Drawing on social contract theory to fill this gap, I conclude that the state’s role in regulating the pet trade should be limited to ensuring the welfare of animals in the stream of commerce, not prohibiting their mass marketing altogether.
16.
ABSTRACT: The consumptive loss from man-made snowmaking at six Colorado ski areas is calculated. The focus of the procedures in this investigation is on the consumptive loss that occurs to man-made snow particles during the period they reside on or in the snowpack until spring snowmelt (termed the watershed loss). Calculated watershed losses under a variety of precipitation and temperature conditions at six ski areas varied from 7 to 33 percent. These calculations were made using the calibrated Subalpine Water Balance Simulation Model (Leaf and Brink, 1973a, 1973b). The watershed loss of 7 to 33 percent indicates the range of likely watershed losses that can be expected at Colorado ski areas. A previous paper by the authors (Eisel et al., 1988) provided estimates of the mean consumptive loss during the snowmaking process (termed initial loss) for conditions existing at Colorado ski areas to be 6 percent of water applied. Therefore, based on the mean initial loss, the total consumptive loss from man-made snowmaking under conditions found at Colorado ski areas could be expected to range from 13 to 37 percent. These results demonstrate the range of total consumptive losses that could be expected in various years and for various watershed conditions. These total percentage losses cannot be extrapolated directly to other specific sites because the total consumptive loss is dependent on temperature during actual snowmaking, temperature and precipitation throughout the winter at the specific ski area, and watershed conditions at the ski area. Consumptive losses to man-made snow for a specific ski area should be estimated using the handbook procedures developed especially for this purpose (Colorado Ski Country USA, 1986b).
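The reported 13 to 37 percent total range is consistent with combining the two losses sequentially, with the watershed loss acting on the water remaining after the 6 percent initial loss. A minimal sketch, assuming that multiplicative combination:

```python
def total_loss(initial, watershed):
    """Combine sequential fractional losses: the watershed loss applies
    only to the water that survives the initial snowmaking loss."""
    return 1 - (1 - initial) * (1 - watershed)

low = total_loss(0.06, 0.07)   # lower end of the watershed-loss range
high = total_loss(0.06, 0.33)  # upper end of the watershed-loss range
```

Rounded to whole percentages these give roughly 13 and 37 percent, matching the stated totals.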
17.
Traditional bioremediation approaches have been used to treat petroleum source contamination in readily accessible soils and sludges. Contamination under existing structures is a greater challenge. Options to deal with this problem have usually been extreme (i.e., dismantle the facility and excavate to an acceptable regulated residual, or pump and treat for an inordinately long period of time). The excavated material must be further remediated, and clean fill must be added to close the excavation. If site assessments were too conservative or incomplete, new contamination may adulterate the fill soils, resulting in additional excavation at some later date. Innovative, cost-efficient technologies must be developed to remove preexisting wastes under structures and to reduce future remediation episodes. An innovative soil bioremediation treatment method was developed and evaluated in petroleum hydrocarbon-contaminated (PHC) soils at compressor stations of a natural gas pipeline running through Louisiana. The in-situ protocol was developed for remediating significant acreage subjected to contamination by petroleum-based lubricants and other PHC products resulting from chronic leakage of lubricating oil used to maintain the pipeline itself. Initial total petroleum hydrocarbon (TPH) measurements revealed values of up to 12,000 mg/kg soil dry weight. The aim of the remediation project was to reduce TPH concentration in the contaminated soils to a level of <200 mg/kg soil dry weight, a level negotiated to be acceptable to state and federal regulators. After monitoring the system for 122 days, all sites showed greater than 99-percent reduction in TPH concentration.
18.
This paper tests the use of a spatial analysis technique, based on the calculation of local spatial autocorrelation, as a possible approach for modelling and quantifying structure in northern Australian savanna landscapes. Unlike many landscapes in the world, northern Australian savanna landscapes appear on the surface to be intact. They have not experienced the same large-scale land clearance and intensive land management as other landscapes across Australia. Despite this, natural resource managers are beginning to notice that processes are breaking down and declines in species are becoming more evident. With future declines of species looking more imminent, it is particularly important that models are available that can help to assess landscape health and quantify any structural change that takes place. GIS and landscape ecology provide a useful way of describing landscapes both spatially and temporally and have proved particularly useful for understanding vegetation structure or pattern in landscapes across the world. There are many measures that examine spatial structure in the landscape, and most of these are now available in a GIS environment (e.g. FRAGSTATS*ARC, r.le, and Patch Analyst). All these methods depend on a landscape described in terms of patches, corridors and matrix. However, since landscapes in northern Australia appear to be relatively intact, they tend to exist as surfaces of continuous variation rather than in clearly defined homogeneous units. As a result they cannot be easily described using entity-based models requiring patches and other essentially cartographic approaches. This means that more appropriate methods need to be developed and explored. The approach examined in this paper enables clustering and local pattern in the data to be identified and forms a generic method for conceptualising landscape structure where patches are not obvious and where boundaries between landscape features are difficult to determine.
Two sites are examined using this approach. They have been exposed to different degrees of disturbance by fire and grazing. The results show that savanna landscapes are very complex and that, even where there is a high degree of disturbance, the landscape is still relatively heterogeneous. This means that treating savanna landscapes as being made up of homogeneous units can limit analysis of pattern, as it can oversimplify the structure present, and that methods such as the autocorrelation approach are useful tools for quantifying the variable nature of these landscapes.
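The abstract does not name its exact local autocorrelation statistic; local Moran's I is one common choice for this kind of continuous-surface analysis, and a minimal sketch of it on a toy one-dimensional "landscape" follows. Positive values indicate a cell sitting in a cluster of similar values.

```python
import numpy as np

def local_morans_i(values, weights):
    """Local Moran's I for each cell: I_i = z_i * sum_j(w_ij * z_j) / m2,
    where z are deviations from the mean and m2 is their mean square."""
    z = values - values.mean()
    m2 = (z ** 2).mean()
    return z * (weights @ z) / m2

# Toy example: 4 cells in a row, rook (adjacent) neighbours.
# Cells 0-1 form a low-value cluster, cells 2-3 a high-value cluster.
vals = np.array([1.0, 1.2, 5.0, 5.1])
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W = W / W.sum(axis=1, keepdims=True)  # row-standardise the weights

I = local_morans_i(vals, W)
```

Cells inside a cluster score high, while the two cells at the cluster boundary (each averaging one like and one unlike neighbour) score near zero, which is how the statistic exposes local structure without any prior patch delineation.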
19.
The aim of this paper was to investigate the effects of nitrogen (N) deposition on tree N cycling and to identify potential biomarkers for N deposition. Between April and October 2002, extensive fieldwork was undertaken at Mardley Heath in Hertfordshire. This woodland, located adjacent to the A1(M) motorway, is exposed to high levels of atmospheric nitrogen oxides from traffic. Measurements of 15N, in vivo nitrate reductase (NR) activity, tissue, xylem and surface nitrate concentrations, as well as N concentration and growth, were made along a 700-m transect at 90° to the motorway. The 15N data show that oxidised N from road traffic is taken up by nearby trees and incorporated into plant tissues. Our measurements of NR activities suggest elevated rates close to the motorway. However, xylem sap, leaf tissue and leaf surface nitrate concentrations showed no differences between the roadside location and the sampling point most distant from the motorway. Taken together, the 15N and nitrate reductase data suggest uptake and assimilation of N through the foliage. We conclude that for this lowland deciduous woodland, tissue, xylem and surface measurements of nitrate are unreliable biomarkers for N deposition, whereas 15N, growth measurements and integrated seasonal NR might be useful. The results also point to the benefit of roadside tree planting to screen pollution from motor vehicles.
20.
Copper Chemical Mechanical Planarization (Cu-CMP) is a critical step in integrated circuit (IC) device manufacturing. CMP and post-CMP cleaning processes are projected to account for 30-40% of the water consumed by IC manufacturers in 2003. CMP wastewater is expected to contain increasing amounts of copper as the industry switches from Al-CMP to Cu-CMP, causing some IC manufacturers to run the risk of violating discharge regulations. A variety of treatment schemes are currently available for the removal of heavy metals from CMP wastewater; however, many introduce additional chemicals to the wastewater, have large space requirements, or are expensive. This work explores the use of microorganisms for waste treatment. A Staphylococcus sp. was isolated and studied to determine the feasibility of using it to remove copper from Cu-CMP wastewater. A model Cu-CMP wastewater was developed and tested, as were actual Cu-CMP wastes. Continuous-flow packed-column experiments were performed to obtain adsorption data and demonstrate copper recovery from the waste. A predictive, empirical model was used to accurately describe Cu removal. Additionally, the immobilized cells were regenerated, allowing for the concentration and potential recovery of copper from the wastewater.
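The abstract does not name its empirical column model. The Thomas model is a common choice for describing packed-column biosorption breakthrough curves and is sketched here under that assumption, with made-up operating parameters rather than the study's fitted values.

```python
import numpy as np

def thomas_breakthrough(t, k_th, q0, m, c0, Q):
    """Effluent/influent ratio C/C0 under the Thomas model for a packed column.
    k_th: rate constant (L/mg/h), q0: adsorption capacity (mg/g),
    m: sorbent mass (g), c0: influent Cu concentration (mg/L), Q: flow (L/h).
    All parameter values used below are illustrative assumptions."""
    return 1.0 / (1.0 + np.exp(k_th * q0 * m / Q - k_th * c0 * t))

t = np.linspace(0, 50, 6)  # hours of column operation
ratio = thomas_breakthrough(t, k_th=0.01, q0=15.0, m=5.0, c0=20.0, Q=0.5)
```

Fitting `k_th` and `q0` to measured breakthrough data (for instance by nonlinear least squares) is how such a model would be made predictive for the real wastewater.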