371.
ABSTRACT: Urban water-quality managers need load estimates of storm-runoff pollutants to design effective remedial programs. Estimates are commonly made using published models calibrated to large regions of the country. This paper presents statistical methods, termed model-adjustment procedures (MAPs), which use a combination of local data and published regional models to improve estimates of urban-runoff quality. Each MAP is a form of regression analysis that uses a local data base as a calibration data set to adjust the regional model, in effect increasing the size of the local data base without additional, expensive data collection. The adjusted regional model can then be used to estimate storm-runoff quality at unmonitored sites and storms in the locality. The four MAPs presented in this study are (1) single-factor regression against the regional model prediction, Pu; (2) least-squares regression against Pu; (3) least-squares regression against Pu and additional local variables; and (4) weighted combination of Pu and a local-regression prediction. Identification of the statistically most valid method among these four depends upon characteristics of the local data base. A MAP-selection scheme based on statistical analysis of the calibration data set is presented and tested.
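A minimal sketch of the first two MAPs, under invented calibration data (the paper's regional models and selection scheme are not reproduced here): MAP 1 fits a single adjustment factor against the regional prediction Pu, and MAP 2 fits an ordinary least-squares line against Pu.

```python
# Hedged sketch of MAPs 1 and 2; all data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)

# Calibration data set: regional-model predictions Pu and local observed loads L.
Pu = rng.lognormal(mean=2.0, sigma=0.5, size=30)   # regional-model predictions
L = 1.3 * Pu * rng.lognormal(0.0, 0.2, size=30)    # local observed loads

# MAP 1: single-factor regression (no intercept) against Pu.
beta = (Pu @ L) / (Pu @ Pu)
map1 = lambda p: beta * p

# MAP 2: ordinary least-squares regression (with intercept) against Pu.
X = np.column_stack([np.ones_like(Pu), Pu])
b0, b1 = np.linalg.lstsq(X, L, rcond=None)[0]
map2 = lambda p: b0 + b1 * p

# Adjusted estimates for an unmonitored storm with regional prediction 10.0.
print(map1(10.0), map2(10.0))
```

MAPs 3 and 4 extend the same idea with additional local explanatory variables and a variance-weighted blend of regional and local predictions, respectively.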
372.
ABSTRACT: A conceptual framework and the systematic collection of reliable information for application within the framework are the cornerstones of effective water planning. The ideal of strengthening these cornerstones was a driving force behind formation of the Water Resources Council and Council efforts, during its life, to develop the Principles and Standards and to complete two National Water Assessments. The Assessments contained voluminous data but never really became an integral component of the national water planning process. Before being disbanded in 1982, the Council solicited several appraisals of its assessment process. This paper reports one made by the university community, in which experiences and opinions were obtained from 108 water research administrators and water policy experts.
373.
ABSTRACT: Stream water during fair weather (base flow) is largely ground water discharge, which has been in contact with minerals of the underlying aquifer. Base flow water quality should therefore reflect aquifer mineralogy as well as upstream land use. Three upstream mining categories (unmined lands, abandoned coal mines, and reclaimed coal mines) differed in pH, specific conductance, sulfate, iron, aluminum, and alkalinity for 122 streams in eastern Ohio. Aquifer rock type influenced pH, specific conductance, sulfate, iron, and alkalinity. Reclamation returned many components of acid mine drainage to near unmined levels, although sulfate and specific conductance were not improved. Acid mine drainage problems were less severe in watersheds underlain by the calcareous Monongahela Formation. These results should apply to other Appalachian coal regions having similar rock units. The water quality data distributions were neither consistently normal nor lognormal. Statistical tests utilizing ranks of the water quality data, instead of the data themselves, proved useful in analyzing the influences of mining category and rock type.
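The abstract names only "tests utilizing ranks," so the specific test is not given; a Kruskal-Wallis comparison of one constituent across the three mining categories is one plausible such test. The sketch below uses synthetic sulfate values, not the Ohio data.

```python
# Hedged sketch: rank-based comparison of sulfate across three mining
# categories via Kruskal-Wallis (an assumption; the paper's exact test
# is not named in the abstract). Data are synthetic.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(1)
unmined = rng.lognormal(3.0, 0.4, 40)      # hypothetical sulfate, mg/L
abandoned = rng.lognormal(4.5, 0.6, 40)
reclaimed = rng.lognormal(3.3, 0.5, 42)

H, p = kruskal(unmined, abandoned, reclaimed)
print(f"Kruskal-Wallis H = {H:.2f}, p = {p:.4f}")
```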
374.
A method is presented to assist policy makers in determining the combination of number of sampling stations and number of years of sampling necessary to state with a given probability that a step reduction in atmospheric deposition rates of a given magnitude has occurred at a pre-specified time. This pre-specified time would typically be the time at which a sulfate emission control program took effect, and the given magnitude of reduction is some percentage change in deposition rate one might expect to occur as a result of the emission control. In order to determine this probability of detection, a stochastic model of sulfate deposition rates is developed, based on New York State bulk collection network data. The model considers the effect of variation in precipitation, seasonal variations, serial correlation, and site-to-site (cross) correlation. A nonparametric statistical test which is well suited to detection of step changes in such multi-site data sets is developed. It is related to the Mann-Whitney Rank-Sum test. The test is used in Monte Carlo simulations along with the stochastic model to derive statistical power functions. These power functions describe the probability of detecting (α=0.05) a step trend in deposition rate as a function of the size of the step-trend, record length before and after the step-trend, and the number of stations sampled. The results show that, for an area the size of New York State, very little power is gained by increasing the number of stations beyond about eight. The results allow policy makers to determine the tradeoff between the cost of monitoring and time required to detect a step-trend of a given magnitude with a given probability.
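A single-site sketch of the Monte Carlo power calculation described above: simulate serially correlated deposition with a step change, apply a rank-sum test at α = 0.05, and count detections. The AR(1) noise model and all parameter values are illustrative assumptions, not the paper's fitted multi-site New York State model.

```python
# Power of a Mann-Whitney rank-sum test for a step reduction, estimated
# by Monte Carlo. Single-site AR(1) stand-in for the paper's model.
import numpy as np
from scipy.stats import mannwhitneyu

def power(step, n_before, n_after, rho=0.3, sigma=0.25, n_sim=2000, seed=2):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        # AR(1) series of log deposition rates with a step change.
        e = rng.normal(0, sigma, n_before + n_after)
        x = np.empty_like(e)
        x[0] = e[0]
        for t in range(1, len(e)):
            x[t] = rho * x[t - 1] + e[t]
        x[n_before:] -= step                      # impose the step reduction
        _, p = mannwhitneyu(x[:n_before], x[n_before:], alternative="greater")
        hits += p < 0.05
    return hits / n_sim

# e.g. 5 years of monthly records before and after the control program
print(power(step=0.3, n_before=60, n_after=60))
```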
375.
ABSTRACT: Water quality data collected at inflows to Everglades National Park (ENP) are analyzed for trends using the seasonal Kendall test (Hirsch et al., 1982; Hirsch and Slack, 1984). The period of record is 1977–1989 for inflows to Shark River Slough and 1983–1989 for inflows to Taylor Slough and ENP's Coastal Basin. The analysis considers 20 water quality components, including nutrients, field measurements, inorganic species, and optical properties. Significant (p<0.10) increasing trends in total phosphorus concentration are indicated at eight out of nine stations examined. When the data are adjusted to account for variations in antecedent rainfall and water surface elevation, increasing trends are indicated at seven out of nine stations. Phosphorus trend magnitudes range from 4 percent/year to 21 percent/year. Decreasing trends in the Total N/P ratio are detected at seven out of nine stations. N/P trend magnitudes range from -7 percent/year to -15 percent/year. Trends in water quality components other than nutrients are observed less frequently and are of less importance from a water-quality-management perspective. The apparent nutrient trends are not explained by variations in marsh water elevation, antecedent rainfall, flow, or season.
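A minimal sketch of the seasonal Kendall test cited above: Kendall's S is computed within each season (here, each month) and the seasonal statistics are summed, so that seasonality cannot masquerade as trend. The serial and cross correlation corrections of Hirsch and Slack (1984) are omitted, and the total phosphorus series is synthetic.

```python
# Seasonal Kendall test (no-ties variance, no serial-correlation correction).
import numpy as np
from scipy.stats import norm

def seasonal_kendall(x, seasons):
    """x: 1-D series in time order; seasons: same-length season labels."""
    S, var = 0.0, 0.0
    for s in np.unique(seasons):
        xs = x[seasons == s]
        n = len(xs)
        # Kendall's S within this season: sign of every pairwise difference.
        for i in range(n - 1):
            S += np.sum(np.sign(xs[i + 1:] - xs[i]))
        var += n * (n - 1) * (2 * n + 5) / 18.0   # no-ties variance term
    z = (S - np.sign(S)) / np.sqrt(var)           # continuity correction
    return S, 2 * norm.sf(abs(z))                 # two-sided p-value

rng = np.random.default_rng(3)
years = np.repeat(np.arange(1977, 1990), 12)
months = np.tile(np.arange(12), 13)
tp = 0.02 * (years - 1977) + rng.normal(0, 0.1, len(years))  # synthetic TP
print(seasonal_kendall(tp, months))
```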
376.
ABSTRACT: An assumption of scale is inherent in any environmental monitoring exercise. The temporal or spatial scale of interest defines the statistical model which would be most appropriate for a given system and thus affects both sampling design and data analysis. Two monitoring objectives which are strongly tied to scale are the estimation of average conditions and the evaluation of trends. For both of these objectives, the time or spatial scale of interest strongly influences whether a given set of observations should be regarded as independent or serially correlated, and affects the importance of serial correlation in choosing statistical methods. In particular, serial correlation has a much different effect on the estimation of long-term means than it does on the estimation of specific-period means. For estimating trends, the distinction between serial correlation and trend is scale dependent. An explicit consideration of scale in monitoring system design and data analysis is, therefore, most important for producing meaningful statistical information.
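One concrete way to see the long-term-mean point: under positive serial correlation, n observations carry less information than n independent ones. The sketch below uses the standard large-n approximation for an AR(1) process, n_eff ≈ n(1−ρ)/(1+ρ); the sample sizes and ρ values are illustrative.

```python
# Effective sample size of a serially correlated record (AR(1) approximation).
def effective_n(n, rho):
    # Large-n approximation: n_eff ~ n * (1 - rho) / (1 + rho).
    return n * (1 - rho) / (1 + rho)

for rho in (0.0, 0.3, 0.6):
    print(f"rho={rho:.1f}: 120 monthly samples ~ {effective_n(120, rho):.0f} "
          "independent samples for estimating a long-term mean")
```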
377.
ABSTRACT: The risks associated with a traditional wasteload allocation (WLA) analysis were quantified with data from a recent study of the Upper Trinity River (Texas). Risk is defined here as the probability of failing to meet an established in-stream water quality standard. The QUAL-TX dissolved oxygen (DO) water quality model was modified to a Monte Carlo framework. Flow augmentation coding was also modified to allow an exact match to be computed between the predicted and an established DO concentration standard, thereby providing an avenue for linking input parameter uncertainty to the assignment of a wasteload permit (allowable mass loading rate). Monte Carlo simulation techniques were employed to propagate input parameter uncertainties, typically encountered during WLA analysis, to the computed effluent five-day carbonaceous biochemical oxygen demand requirements for a single major wastewater treatment plant (WWTP). The risk of failing to meet an established in-stream DO criterion may be as high as 96 percent. The uncertainty associated with estimation of the future total Kjeldahl nitrogen concentration for a single tributary was found to have the greatest impact on the determination of allowable WWTP loadings.
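A hedged sketch of the risk framing: draw uncertain inputs, run each draw through a DO model, and report the fraction of draws violating the standard. QUAL-TX is not reproduced here, so a classical Streeter-Phelps sag equation stands in for the river model, and every parameter range below is invented.

```python
# Monte Carlo risk of violating a DO standard, with Streeter-Phelps as a
# stand-in for QUAL-TX. All distributions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
kd = rng.lognormal(np.log(0.3), 0.3, n)   # deoxygenation rate, 1/day
ka = rng.lognormal(np.log(0.6), 0.3, n)   # reaeration rate, 1/day
L0 = rng.normal(15.0, 3.0, n)             # ultimate BOD at discharge, mg/L
D0 = rng.normal(1.0, 0.3, n)              # initial DO deficit, mg/L
t = 2.0                                   # travel time to critical reach, days
DO_sat, DO_std = 8.0, 5.0                 # saturation and standard, mg/L

# Streeter-Phelps DO deficit at travel time t.
deficit = (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
          + D0 * np.exp(-ka * t)
risk = np.mean(DO_sat - deficit < DO_std)
print(f"risk of violating the {DO_std} mg/L DO standard: {risk:.1%}")
```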
378.
Background, Aim and Scope: At present, large-scale paper manufacture involves delignification and bleaching by elemental chlorine free (ECF) or totally chlorine free (TCF) processes. The wastewater is purified by secondary treatment (mechanical, chemical, and biological), which removes most of the toxic substances from the discharge. However, we found residual toxicity in the high molecular weight (> 1000 D) matter (HMWM) of the discharge using the RET (reverse electron transfer) inhibition test. This fraction consists mainly of polydisperse lignin (LIG) and carbohydrate (CH) macromolecules. Structural units in these molecules were studied by pyrolysis gas chromatography / mass spectrometry (Py-GC/MS). In the present work, our aim was to identify the structural units that could explain the RET toxicity of LIG or CH molecules. We statistically compared RET toxicity values of HMWM samples from treated wastewaters of pilot pulping experiments with the intensity variation of the pyrolysis-product gas chromatograms of these samples. This application is a novel study procedure.

Methods: Pyrolysis products (Py-GC/MS results) and inhibition of RET, expressed as TU50 and TU20 of HMWM (Mw > 1000 D), were compared by multivariate statistics. The samples were from laboratory pilot stages of TCF and ECF manufacture of softwood pulp. Py-GC/MS was done with and without addition of TMAH (tetramethylammonium hydroxide). The name and structure of each abundant fragment compound were identified from its retention time and mass spectrum, compared against authentic reference compounds or the literature. Four sets of Toxicity Units (TUs) and GC peak areas of the pyrolysis fragments were obtained. The data were normalized by division by LIG (the lignin content of each sample). TU values were the dependent variables and the fragment values the independent (explanatory) variables in statistical treatments with the SPSS system. Separate analyses of correlations, principal components (PCA), and stepwise multiple linear regression (SMLR) were performed on the four sample sets: TCF and ECF, with and without TMAH.

Results and Discussion: Among the CH fragments, 2-furfural in TCF, and among the LIG fragments, styrene in ECF, showed the highest probabilities of originating from source structures of toxicity. Other possible compounds of concern were the CH fragment 2-methyl-2-cyclopenten-1-one in ECF and the LIG fragments 2-methoxy-4-methylphenol, 4,5-dimethoxy-2-methylphenol, and 2-methylphenol in TCF.
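A sketch of the statistical workflow described in the Methods: normalize GC peak areas by lignin content, run PCA, and use forward stepwise selection to pick fragments that explain the toxicity units. The data below are random placeholders (the real study used SPSS on measured chromatograms), and the simple R²-gain selection is a stand-in for SPSS's SMLR.

```python
# PCA plus forward stepwise regression on LIG-normalized peak areas.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n_samples, n_frag = 20, 15
peaks = rng.lognormal(0, 1, (n_samples, n_frag))    # GC peak areas
lig = rng.uniform(0.5, 1.5, n_samples)              # lignin content per sample
X = peaks / lig[:, None]                            # LIG-normalized areas
tu = X[:, 2] * 0.8 + rng.normal(0, 0.3, n_samples)  # synthetic toxicity units

print(PCA(n_components=3).fit(X).explained_variance_ratio_)

# Forward stepwise selection by in-sample R^2 gain.
selected, remaining = [], list(range(n_frag))
for _ in range(3):
    best = max(remaining, key=lambda j: LinearRegression()
               .fit(X[:, selected + [j]], tu).score(X[:, selected + [j]], tu))
    selected.append(best)
    remaining.remove(best)
print("selected fragments:", selected)
```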
379.
Bayesian entropy for spatial sampling design of environmental data
We develop a spatial statistical methodology to design national air pollution monitoring networks with good predictive capabilities while minimizing the cost of monitoring. The underlying complexity of atmospheric processes and the urgent need to give credible assessments of environmental risk create problems requiring new statistical methodologies to meet these challenges. In this work, we present a new method of ranking various subnetworks, taking both the environmental cost and the statistical information into account. A Bayesian algorithm is introduced to obtain an optimal subnetwork using an entropy framework. The final network and the accuracy of the spatial predictions are heavily dependent on the underlying model of spatial correlation. Usually the simplifying assumption of stationarity, in the sense that the spatial dependency structure does not change with location, is made for spatial prediction. However, it is not uncommon to find spatial data that show strong signs of nonstationary behavior. We build upon an existing approach that creates a nonstationary covariance as a mixture of a family of stationary processes, and we propose a Bayesian method of estimating the associated parameters using the technique of Reversible Jump Markov Chain Monte Carlo. We apply these methods for spatial prediction and network design to ambient ozone data from a monitoring network in the eastern US.
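A minimal sketch of the entropy criterion: under a multivariate Gaussian model, a subnetwork's information can be measured by the log-determinant of its covariance submatrix, and candidate subnetworks ranked (here greedily) subject to a station budget. The stationary exponential covariance below is a deliberate simplification of the paper's nonstationary mixture model, and the sites are random.

```python
# Greedy entropy-based subnetwork selection under a Gaussian field model.
import numpy as np

rng = np.random.default_rng(6)
coords = rng.uniform(0, 100, (25, 2))              # candidate station sites
d = np.linalg.norm(coords[:, None] - coords[None], axis=-1)
Sigma = np.exp(-d / 30.0)                          # exponential covariance

def entropy(idx):
    # Gaussian entropy up to additive constants: 0.5 * log det of sub-cov.
    sign, logdet = np.linalg.slogdet(Sigma[np.ix_(idx, idx)])
    return 0.5 * logdet

chosen, pool = [], list(range(len(coords)))
for _ in range(8):                                 # budget: 8 stations
    best = max(pool, key=lambda j: entropy(chosen + [j]))
    chosen.append(best)
    pool.remove(best)
print("greedy entropy-optimal subnetwork:", chosen)
```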
380.
Efficient and reliable unexploded ordnance (UXO) site characterization is needed for decisions regarding future land use. There are several types of data available at UXO sites, and geophysical signal maps are one of the most valuable sources of information. Incorporation of such information into site characterization requires a flexible and reliable methodology. Geostatistics allows one to account for exhaustive secondary information (i.e., known at every location within the field) in many different ways. Kriging and logistic regression were combined to map the probability of occurrence of at least one geophysical anomaly of interest, such as UXO, from a limited number of indicator data. Logistic regression is used to derive the trend from a geophysical signal map, and kriged residuals are added to the trend to estimate the probabilities of the presence of UXO at unsampled locations (simple kriging with varying local means, or SKlm). Each location is identified for further remedial action if the estimated probability is greater than a given threshold. The technique is illustrated using a hypothetical UXO site generated by a UXO simulator, and a corresponding geophysical signal map. Indicator data are collected along two transects located within the site. Classification performances are then assessed by computing the proportions of correct classifications, false positives, and false negatives, and kappa statistics. Two common approaches were used for comparison purposes: ordinary indicator kriging, which does not take any secondary information into account, and collocated cokriging, a variant of common cokriging. Results indicate that accounting for exhaustive secondary information improves the overall characterization of UXO sites if an appropriate methodology, SKlm in this case, is used.
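A hedged sketch of the SKlm idea described above: a logistic regression on the geophysical signal supplies the varying local mean, and simple kriging of the indicator residuals adds the spatial correction at an unsampled location. The variogram model, nugget stabilizer, and all data below are invented for illustration.

```python
# Simple kriging with varying local means (SKlm) on synthetic UXO data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
xy = rng.uniform(0, 100, (60, 2))                  # sampled transect points
signal = rng.normal(0, 1, 60)                      # geophysical signal there
ind = (rng.random(60) < 1 / (1 + np.exp(-signal))).astype(int)  # UXO indicator

# 1) Trend: probability of UXO given the exhaustive signal map.
lr = LogisticRegression().fit(signal[:, None], ind)
trend = lr.predict_proba(signal[:, None])[:, 1]
resid = ind - trend

# 2) Simple kriging of residuals at a new location x0 (exponential covariance).
def sk_residual(x0, corr_range=20.0, sill=0.25):
    c0 = sill * np.exp(-np.linalg.norm(xy - x0, axis=1) / corr_range)
    C = sill * np.exp(-np.linalg.norm(xy[:, None] - xy[None], axis=-1)
                      / corr_range)
    w = np.linalg.solve(C + 1e-6 * np.eye(len(xy)), c0)  # kriging weights
    return w @ resid

x0, s0 = np.array([50.0, 50.0]), 0.4               # new site and its signal
p = lr.predict_proba([[s0]])[0, 1] + sk_residual(x0)
print(f"estimated P(UXO) at x0: {np.clip(p, 0, 1):.2f}")
```

Locations where the clipped probability exceeds the decision threshold would be flagged for further remedial action, as in the abstract.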