  Paid full text: 359 articles
  Free: 1 article
  Free domestic: 1 article
Safety science: 30 articles
Waste treatment: 14 articles
Environmental management: 100 articles
General: 28 articles
Basic theory: 83 articles
Environmental theory: 1 article
Pollution and its control: 67 articles
Assessment and monitoring: 27 articles
Society and environment: 9 articles
Disasters and their prevention: 2 articles
  2023: 3 articles
  2022: 3 articles
  2021: 4 articles
  2019: 2 articles
  2018: 6 articles
  2017: 4 articles
  2016: 6 articles
  2015: 6 articles
  2014: 8 articles
  2013: 40 articles
  2012: 11 articles
  2011: 22 articles
  2010: 12 articles
  2009: 16 articles
  2008: 13 articles
  2007: 12 articles
  2006: 20 articles
  2005: 10 articles
  2004: 18 articles
  2003: 11 articles
  2002: 15 articles
  2001: 4 articles
  2000: 9 articles
  1999: 5 articles
  1998: 4 articles
  1997: 5 articles
  1996: 4 articles
  1995: 2 articles
  1994: 6 articles
  1993: 8 articles
  1992: 7 articles
  1991: 4 articles
  1990: 5 articles
  1989: 2 articles
  1988: 7 articles
  1987: 6 articles
  1986: 3 articles
  1985: 3 articles
  1984: 4 articles
  1983: 4 articles
  1982: 6 articles
  1980: 4 articles
  1978: 3 articles
  1973: 1 article
  1972: 2 articles
  1970: 1 article
  1969: 1 article
  1968: 1 article
  1967: 1 article
  1965: 1 article
Sort order: 361 results found (search time: 703 ms)
21.
The basic theories and fundamental assumptions usually employed in the solution of unsteady groundwater flow problems are reviewed critically. The best-known method of analysis for such problems is based on the Dupuit-Forchheimer approximation and leads to a nonlinear parabolic differential equation which is generally solved by linearization or numerical methods. The accuracy of the solution can be improved by a different approach which does not employ the Dupuit-Forchheimer assumption but rather is based on a semi-numerical solution of the Laplace equation for quasi-steady conditions. The actual unsteady process is replaced by a sequence of steady-state conditions, and it is assumed that the actual unsteady flow characteristics during a short time interval can be approximated by those associated with "average" steady-state flow. The Laplace equation is solved by a semi-discretization method in which the horizontal coordinate is divided into subintervals while the vertical coordinate is kept continuous. The proposed method is applied to a typical tile drainage problem and, based on a comparison of calculated results with experimental data, the method is evaluated and practical conclusions regarding its applicability are advanced.
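The quasi-steady scheme described above can be sketched compactly. The snippet below is an illustrative simplification, not the paper's method: a full finite-difference grid stands in for the semi-discretization (the paper keeps the vertical coordinate continuous), the unsteady drawdown is replaced by a sequence of independent steady Laplace solutions, and the grid size and boundary heads are hypothetical.

```python
def solve_laplace(h, tol=1e-9, max_iter=50000):
    """Gauss-Seidel relaxation of the 2-D Laplace equation for head h.
    Edge cells are treated as fixed-head (Dirichlet) boundaries."""
    ny, nx = len(h), len(h[0])
    for _ in range(max_iter):
        delta = 0.0
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                new = 0.25 * (h[i-1][j] + h[i+1][j] + h[i][j-1] + h[i][j+1])
                delta = max(delta, abs(new - h[i][j]))
                h[i][j] = new
        if delta < tol:
            break
    return h

def quasi_steady_sequence(drain_heads, ny=5, nx=5):
    """Replace an unsteady drawdown by a sequence of steady states:
    at each step the drain-side boundary head is lowered and a fresh
    steady Laplace solution is computed."""
    snapshots = []
    for hd in drain_heads:
        # Linear initial/boundary field between upstream head 1.0 and drain head hd.
        h = [[1.0 + (hd - 1.0) * j / (nx - 1) for j in range(nx)] for _ in range(ny)]
        snapshots.append(solve_laplace(h))
    return snapshots
```

Each snapshot plays the role of the "average" steady-state flow over a short time interval; a real tile-drainage model would also update the free surface between steps.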
22.
A method previously used for determination of 2,3,7,8-substituted polychlorinated dibenzo-p-dioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs) has been modified for quantitative analysis of "dioxin-like" polychlorinated biphenyls (PCBs) in environmental samples from the steel industry. The existing sample clean-up procedure, involving liquid chromatography on multi-layered silica and Florisil columns, has been extended to include a third chromatography stage on a basic alumina stationary phase. The additional clean-up stage is required for PCB analysis in order to eliminate interferences from relatively large concentrations of saturated cyclic and aliphatic hydrocarbons. Samples were analysed for the WHO-12 congeners using high-resolution gas chromatography/high-resolution mass spectrometry (HRGC/HRMS) and the standard solutions of US EPA Method 1668A. Replicate analysis of method blanks revealed background contamination for PCBs 118, 105, and 77, which are generally abundant in ambient air; these contaminants were accounted for using a subtraction method. The entire procedure was validated by replicate analysis (N = 3) of a certified reference sediment: the RSD for each WHO-12 congener was below 15%, and 13C12-labelled PCB internal standard recoveries were in the range 70-95%. A waste dust sample collected in the electrostatic precipitator of a UK sinter plant was analysed for PCDD/Fs and WHO-12 PCBs and exhibited a PCDD/F I-TEQ of 148.5 +/- 21.2 ng kg(-1) and a WHO-TEQ of 7.2 +/- 1.5 ng kg(-1). The WHO-12 congeners contributed only 4.6% of the overall TEQ, and PCB 126 was the major congener contributing to the WHO-TEQ (96%). The overall TEQ of the waste dust sample was mainly attributable to PCDFs, followed by PCDDs, which accounted for 86.6% and 8.7% of the overall TEQ, respectively.
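For reference, a TEQ of the kind quoted above is a toxicity-weighted sum: each congener concentration is multiplied by its toxic equivalency factor (TEF) and the products are summed. A minimal sketch, using WHO-1998 TEF values for a few congeners and hypothetical concentrations (not the paper's data):

```python
# Illustrative WHO-1998 TEFs for a few dioxin-like PCBs
# (consult the current WHO tables before any real use).
TEF = {"PCB 126": 0.1, "PCB 169": 0.01, "PCB 77": 0.0001,
       "PCB 105": 0.0001, "PCB 118": 0.0001}

def who_teq(concentrations_ng_per_kg):
    """TEQ = sum over congeners of concentration x toxic equivalency factor."""
    return sum(c * TEF[name] for name, c in concentrations_ng_per_kg.items())
```

With illustrative numbers such as 70 ng/kg of PCB 126 alongside much larger amounts of PCBs 118 and 105, PCB 126 still dominates the WHO-TEQ, mirroring the pattern reported for the sinter plant dust.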
23.
Graduated Driver Licensing (GDL) inserts, between the learner permit and full licensure, an intermediate or "provisional" license that allows novices to drive unsupervised but subject to provisions intended to reduce the risks that accompany entry into highway traffic. Introduction of GDL has been followed by lowered accident rates, resulting both from limiting novices' exposure to unsafe situations and from helping them deal with those situations more safely. Sources of safer driving include extended learning, early intervention, contingent advancement, and multistage instruction. To extend the learning process, most GDL systems lengthen the duration of the learner phase and require a specified amount of adult-supervised driving. Results indicate that extended learning can reduce accidents substantially if well structured and highly controlled. Early intervention with novice traffic violators has shown both a general deterrent effect upon novice violators facing suspension and a specific effect upon those who have experienced it. Making advancement to full licensure contingent upon a violation-free record while driving on the provisional license has also been shown to reduce accidents and violations during that phase of licensure. Multistage instruction attempts to develop advanced skills only after novices have had a chance to master more basic skills. Although this element of GDL has yet to be evaluated, research indicates crash reduction is possible in situations where it does not increase exposure to risk. While the various elements of GDL have demonstrated potential benefit in enhancing the safety of novice drivers, considerable improvement in the nature and enforcement of GDL requirements is needed to realize that potential.
24.
The first commercial supercritical water oxidation sludge processing plant   (Total citations: 20; self-citations: 0; citations by others: 20)
Final disposal of sludge continues to be one of the more pressing problems for the wastewater treatment industry. Present regulations for municipal sludge have favored beneficial use, primarily in land application; however, several agencies and entities have warned of potential health risks associated with these methods. Hydrothermal oxidation (HTO) provides an alternative that addresses the health concerns associated with sludge disposal by completely converting all organic matter in the sludge to carbon dioxide, water, and other innocuous materials. An HTO system using HydroProcessing, L.L.C.'s HydroSolids process has been installed at Harlingen, Texas to process up to 9.8 dry tons per day of sludge. Based on a literature review, this system is the largest hydrothermal oxidation system in the world, and the only one built specifically to process sludge. Start-up of Unit 1 of the two-unit HTO system began in April 2001, and early results have indicated COD conversion rates in excess of 99.9%. Harlingen Waterworks System estimates that the HydroSolids system will cost less than alternatives such as autothermal thermophilic aerobic digestion and more traditional forms of digestion that still require dewatering and final disposal. The Waterworks intends to generate income from the sale of energy in the form of hot water and from the use of carbon dioxide from the HydroSolids process to neutralize high-pH industrial effluent, and it also expects to generate income from the treatment of septage and grease trap wastes.
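A COD conversion rate like the 99.9% figure above is simply the fraction of influent chemical oxygen demand destroyed. A minimal sketch with hypothetical influent and effluent values (not plant data):

```python
def cod_conversion(cod_in_mg_per_l, cod_out_mg_per_l):
    """Fraction of influent COD destroyed: (COD_in - COD_out) / COD_in."""
    return (cod_in_mg_per_l - cod_out_mg_per_l) / cod_in_mg_per_l
```

For example, a hypothetical influent COD of 50,000 mg/L reduced to 40 mg/L in the effluent corresponds to a conversion of 0.9992, i.e. in excess of 99.9%.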
25.
Peng H, Brooks BW, Chan R, Chyan O, La Point TW. Chemosphere, 2002, 46(7): 1141-1146
Silver thiosulfate, often a waste product of photoprocessing, is less bioavailable and less toxic to aquatic organisms than ionic silver. We conducted duplicate 48-h Ceriodaphnia dubia tests in reconstituted laboratory water using treatments of 92.7 nM Ag+ with various concentrations of thiosulfate. Expected Ag+ concentrations were generated for the thiosulfate treatment levels using MINEQL+ chemical equilibrium modeling, and Ag+ concentrations in the treatments were determined using a novel silicon-based sensor. Based on predicted Ag+ and published 48-h LC50 values for C. dubia, we did not expect to observe adverse effects. Yet 100% mortality was observed at the low thiosulfate treatments, whereas >85% and >95% survival was observed at the two higher thiosulfate treatment levels, respectively. Our results indicate that biotic responses match the sensor-based Ag+ concentrations. However, there is a discrepancy between these empirical results and the responses expected from the Ag+ concentrations predicted by MINEQL+ chemical modeling. By correlating silicon sensor data with toxicity results obtained in our laboratory, our work clearly relates a specific chemical form (Ag+) to toxicity.
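The kind of equilibrium prediction referred to above can be illustrated with a single-complex mass balance. This is a deliberately simplified stand-in for MINEQL+ (which handles many species simultaneously): it assumes a single 1:2 silver-thiosulfate complex with an illustrative, assumed formation constant, and solves the coupled mass balances for free Ag+ by bisection.

```python
from math import sqrt

# Assumed cumulative formation constant for Ag(S2O3)2^3- (illustrative only;
# real modeling should use critically evaluated stability constants).
BETA2 = 1.0e13

def free_silver(ag_total, s2o3_total, beta=BETA2, iters=200):
    """Bisect on free [Ag+] (mol/L) so that both the silver and thiosulfate
    mass balances close for a single 1:2 Ag-thiosulfate complex."""
    def residual(a):
        # Free ligand L from the thiosulfate balance: 2*beta*a*L^2 + L - S_T = 0
        L = (-1.0 + sqrt(1.0 + 8.0 * beta * a * s2o3_total)) / (4.0 * beta * a)
        # Silver balance: free + complexed - total (monotone increasing in a)
        return a + beta * a * L * L - ag_total
    lo, hi = 0.0, ag_total
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if residual(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

As the abstract's design anticipates, raising total thiosulfate drives the predicted free Ag+ down sharply; the paper's point is that observed toxicity tracked sensor-measured Ag+ rather than such model predictions.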
26.
Because they come from a common source, lead and CO values in the atmosphere tend to behave in a similar manner; diurnal variations in these two pollutants thus show a pattern related to motor vehicle traffic flow. Exposure to both varies by orders of magnitude, with the highest exposure occurring on the road (in the car), thus setting up special dosage situations. Community sources appear to affect background levels, judging at least from the fall-off of concentrations with distance. There may be a relatively wide exposure of the general population to lead and CO: while lead levels may not be increasing in the downtown portion of the central city proper, the central-city levels typical of several years ago may now be more diffuse and spread out, occurring over increasingly large portions of the community. Similarly, the population's exposure to CO may widen as levels become more nearly uniformly high over a larger area. In addition, there may be problems of shorter-term exposure to high CO levels in commuter traffic, which may be of consequence to selected types of drivers or passengers. Finally, it should be noted that during air pollution episodes CO levels appear to rise, with no data currently available on concomitant changes in ambient lead levels.
27.
In the last 5 yr, the capabilities of earth-observing satellites and the technological tools to share and use satellite data have advanced sufficiently to consider using satellite imagery in conjunction with ground-based data for urban-scale air quality monitoring. Satellite data can add synoptic and geospatial information to ground-based air quality data and modeling. An assessment of the integrated use of ground-based and satellite data for air quality monitoring, including several short case studies, was conducted. Findings identified current U.S. satellites with potential for air quality applications, with others available internationally and several more to be launched within the next 5 yr; several of these sensors are described in this paper as illustrations. However, use of these data for air quality applications has been hindered by historical lack of collaboration between air quality and satellite scientists, difficulty accessing and understanding new data, limited resources and agency priorities to develop new techniques, ill-defined needs, and poor understanding of the potential and limitations of the data. Specialization in organizations and funding sources has limited the resources for cross-disciplinary projects. To successfully use these new data sets requires increased collaboration between organizations, streamlined access to data, and resources for project implementation.
28.
Agricultural technologies are non-neutral, and ethical challenges are posed by these technologies themselves. The technologies we use or endorse are embedded with values and norms and reflect the shape of our moral character; they can literally make us better or worse consumers and/or people. Looking back, when the world's developed nations welcomed and steadily embraced industrialization as the dominant paradigm for agriculture a half century or so ago, they inadvertently championed a philosophy of technology that promotes an insular human-centrism, despite its laudable intent to ensure food security and advance human flourishing. The dominant philosophy of technology has also seeded particular ethical consequences that plague the well-being of human beings, the planet, and farmed animals. After revisiting some fundamental questions regarding the complex ways in which technology as agent shapes our lives and choices and relegates food and farmed constituents to technological artifacts or commodities, I argue that we should accord an environmental virtue ethic of care (understood as caretaking) a central place in developing a more conscientious philosophy of technology that aims at sustainability, fairness, and humaneness in animal agriculture. While technology shapes society, it is also socially shaped, and an environmental virtue ethic of care (EVEC), as an alternative design philosophy, has the tools to help us take a much-overdue inventory of ourselves and our relationships with the nonhuman world. It can help us expose the ways in which technology hinders critical reflection on its capacity to alter communities and values, come to terms with why we may be, in general, disengaged from critical ethical analysis of contemporary agriculture, and consider the moral shape, trajectory, and sustainability of our food production systems going into the future. I end by outlining particular virtues associated with the ethic of care discussed here and consider some likely implications for consumers and industry technocrats as they relate to farming animals.
29.
Climate change has recently become a major focus for industry and government agencies, and recent work has reported the use of pinch analysis techniques for carbon-constrained energy planning problems. This paper discusses a new application of a graphical technique based on pinch analysis for company-level visualization and analysis of carbon footprint improvement. The technique is based on the decomposition of the total carbon footprint into material- and energy-based components, or alternatively into internal and external components; the decomposition facilitates the evaluation and screening of process improvement alternatives. Two industrial case studies, on the production of phytochemical extracts and of bulk chemicals, are used to illustrate the new extension.
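The decomposition idea can be made concrete without the graphical pinch representation itself. A minimal sketch, with hypothetical stream data and option names: the total footprint is split into material- and energy-based parts, and candidate improvements are ranked by the reduction they deliver against the baseline.

```python
def decompose(streams):
    """Split a total carbon footprint (t CO2) into material- and energy-based parts."""
    material = sum(s["co2_t"] for s in streams if s["kind"] == "material")
    energy = sum(s["co2_t"] for s in streams if s["kind"] == "energy")
    return {"material": material, "energy": energy, "total": material + energy}

def screen(baseline, alternatives):
    """Rank improvement options by total footprint reduction vs. the baseline."""
    base = decompose(baseline)["total"]
    return sorted(((base - decompose(alt)["total"], name)
                   for name, alt in alternatives.items()), reverse=True)
```

With a hypothetical baseline of 40 t material-based and 60 t energy-based CO2, a fuel switch that cuts the energy component to 45 t outranks a feedstock change that cuts the material component to 35 t.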
30.
Originally prenatal diagnosis was confined to the diagnosis of metabolic disorders and depended on assaying enzyme levels in amniotic fluid. With the development of recombinant DNA technology, molecular diagnosis became possible for some genetic conditions late in the 1970s. Here we briefly review the history of molecular prenatal diagnostic testing, using Duchenne muscular dystrophy as an example, and describe how over the last 30 years we have moved from offering testing to a few affected individuals using techniques, such as Southern blotting to identify deletions, to more rapid and accurate PCR-based testing which identifies the precise change in dystrophin for a greater number of families. We discuss the potential for safer, earlier prenatal genetic diagnosis using cell free fetal DNA in maternal blood before concluding by speculating on how more recent techniques, such as next generation sequencing, might further impact on the potential for molecular prenatal testing. Progress is not without its challenges, and as cytogenetics and molecular genetics begin to unite into one, we foresee the main challenge will not be in identifying the genetic change, but rather in interpreting its significance, particularly in the prenatal setting where we frequently have no phenotype on which to base interpretation. Copyright © 2010 John Wiley & Sons, Ltd.
Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)