Results by access type:
  Paid full text: 173
  Free: 28
  Free (domestic): 47
Results by subject area:
  Safety science: 43
  Waste treatment: 6
  Environmental management: 34
  General/interdisciplinary: 80
  Basic theory: 55
  Pollution and its control: 16
  Assessment and monitoring: 6
  Society and environment: 7
  Disasters and their prevention: 1
Results by year:
  2024: 3; 2023: 8; 2022: 11; 2021: 8; 2020: 4; 2019: 10; 2018: 9; 2017: 7; 2016: 10; 2015: 12
  2014: 11; 2013: 8; 2012: 16; 2011: 17; 2010: 10; 2009: 19; 2008: 12; 2007: 17; 2006: 5; 2005: 6
  2004: 8; 2003: 6; 2002: 6; 2001: 3; 2000: 6; 1999: 2; 1998: 1; 1997: 2; 1995: 3; 1993: 2
  1991: 2; 1989: 2; 1979: 1; 1975: 1
248 matching results (search time: 31 ms).
81.
Deterministic and probabilistic human health risk assessment methods were used to derive screening values for surface contaminants on dicofol production equipment under a general industrial exposure scenario and a dismantling/clean-up exposure scenario. The results show that, based on the deterministic risk assessment, the equipment-surface screening values for p,p'-DDT, p,p'-DDD and p,p'-DDE under the general industrial exposure scenario are 0.224 mg/m2, 0.214 mg/m2 and 0.151 mg/m2, respectively; under the dismantling...
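As a rough, hedged illustration of how a probabilistic screening value of this kind can be derived, the sketch below inverts a generic surface-contact exposure equation under Monte Carlo sampling. The exposure equation, the reference dose `rfd`, and every distribution are assumptions made purely for demonstration; they are not the parameters or scenarios used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative (assumed) parameters for a worker contacting contaminated
# equipment surfaces; none of these are the study's values.
rfd = 5e-4                                   # reference dose, mg/(kg*d)
bw = rng.normal(65, 10, n).clip(40, 100)     # body weight, kg
area = rng.triangular(0.05, 0.10, 0.20, n)   # contacted surface area, m2/d
transfer = rng.uniform(0.05, 0.20, n)        # surface-to-skin transfer fraction
absorbed = rng.uniform(0.01, 0.10, n)        # dermal absorption fraction
ef = 250.0 / 365.0                           # exposure frequency fraction

# Invert HQ = (C * area * transfer * absorbed * ef / bw) / rfd = 1
# to obtain a surface screening level C (mg/m2) for each realisation.
screen = rfd * bw / (area * transfer * absorbed * ef)

point = rfd * 65 / (0.10 * 0.12 * 0.05 * ef)   # deterministic-style point estimate
print(f"deterministic point estimate: {point:.2f} mg/m2")
print(f"probabilistic 5th percentile: {np.percentile(screen, 5):.2f} mg/m2")
```

The probabilistic screening value is typically reported as a low percentile of the Monte Carlo distribution, which is why it can differ from the single deterministic estimate.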
82.
To identify the key factors affecting the failure probability of corroded pipelines and their sensitivity behaviour, the failure probability of a corroded pipeline in China was calculated and analysed with reliability theory based on the FITNET FFS model. The corrosion growth rate was calculated with a whole-life method, yielding a time-dependent damage-probability model for the corroded pipeline, which was solved by Monte Carlo simulation to obtain the failure probability at different ages; a coefficient-of-variation method was used for parameter sensitivity analysis of the influencing factors. The results show that the scatter of pipe diameter, wall thickness and radial corrosion rate perturbs the failure probability in both directions: the scatter of the random variables and the corrosion rate jointly drive the fluctuation of the failure probability, with the scatter of the random variables dominating at first, the two reaching a balance as the failure probability approaches 50%, and the corrosion rate dominating thereafter. In addition, the tensile strength of the pipe material has a greater effect on the failure probability of corroded pipelines than the yield strength, so a strength model that considers only yield strength is of limited validity in reliability analysis; it is recommended that the tensile strength of the pipe material also be taken into account.
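The following Python sketch shows the general shape of such a time-dependent Monte Carlo reliability calculation. It uses a simplified corrosion burst-pressure limit state in place of the FITNET FFS procedure, and all pipe dimensions, strengths, corrosion-rate distributions and the operating pressure are illustrative assumptions rather than the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def failure_probability(years):
    """Monte Carlo estimate of P(failure) for a corroding pipeline at a given age.

    Limit state: burst pressure of the corroded section (a simplified
    B31G-style expression) drops below the operating pressure. All
    distributions below are illustrative assumptions, not the paper's data.
    """
    D = rng.normal(0.610, 0.006, n)             # outer diameter, m
    t = rng.normal(0.0095, 0.0005, n)           # wall thickness, m
    sigma = rng.normal(530e6, 25e6, n)          # flow stress (tensile-based), Pa
    rate = rng.lognormal(np.log(3e-4), 0.4, n)  # radial corrosion rate, m/yr
    d = np.minimum(rate * years, 0.85 * t)      # defect depth, capped at 85% of wall
    L = 0.20                                    # defect length, m (fixed here)
    M = np.sqrt(1 + 0.6275 * L**2 / (D * t) - 0.003375 * L**4 / (D * t) ** 2)
    p_burst = (2 * t * sigma / D) * (1 - 0.85 * d / t) / (1 - 0.85 * d / (t * M))
    p_operating = 10e6                          # operating pressure, Pa
    return float(np.mean(p_burst < p_operating))

for yr in (10, 20, 30, 40):
    print(f"{yr:2d} years: failure probability = {failure_probability(yr):.4f}")
```

Sampling the limit state at several ages traces the rising failure-probability curve; a sensitivity study would repeat the calculation while varying one input distribution at a time.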
83.
This research analyses the application of spatially explicit sensitivity and uncertainty analysis for GIS (Geographic Information System) multicriteria decision analysis (MCDA) within a multi-dimensional vulnerability assessment regarding flooding in the Salzach river catchment in Austria. The research methodology is based on a spatially explicit sensitivity and uncertainty analysis of GIS-MCDA for an assessment of the social, economic, and environmental dimensions of vulnerability. The main objective of this research is to demonstrate how a unified approach of uncertainty and sensitivity analysis can be applied to minimise the associated uncertainty within each dimension of the vulnerability assessment. The methodology proposed for achieving this objective is composed of four main steps. The first step is computing criteria weights using the analytic hierarchy process (AHP). In the second step, Monte Carlo simulation is applied to calculate the uncertainties associated with the AHP weights. In the third step, global sensitivity analysis (GSA) is employed in the form of a model-independent method of output variance decomposition, in which the variability of the different vulnerability assessments is apportioned to every criterion weight, generating one first-order (S) and one total-effect (ST) sensitivity index map per criterion weight. Finally, in the fourth step, an ordered weighted averaging method is applied to model the final vulnerability maps. The results of this research demonstrate the robustness of spatially explicit GSA for minimising the uncertainty associated with GIS-MCDA models. Based on these results, we conclude that applying variance-based GSA enables assessment of the importance of each input factor for the results of the GIS-MCDA method, both spatially and statistically, thus allowing us to introduce and recommend GIS-based GSA as a useful methodology for minimising the uncertainty of GIS-MCDA.
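A minimal sketch of the first two steps (AHP weights plus Monte Carlo weight uncertainty) is given below. The pairwise comparison matrix and the size of the judgment perturbations are invented for illustration; the study's criteria, expert judgments and the subsequent GSA/OWA steps are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative pairwise comparison matrix for three vulnerability criteria
# (assumed values, not the study's expert judgments).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

def ahp_weights(M):
    """Principal-eigenvector AHP weights, normalised to sum to 1."""
    vals, vecs = np.linalg.eig(M)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

print("point-estimate weights:", ahp_weights(A).round(3))

# Monte Carlo: perturb each upper-triangular judgment and rebuild the
# reciprocal matrix to propagate judgment uncertainty into the weights.
samples = []
for _ in range(5_000):
    B = A.copy()
    for i in range(3):
        for j in range(i + 1, 3):
            B[i, j] = A[i, j] * rng.lognormal(0.0, 0.15)
            B[j, i] = 1.0 / B[i, j]
    samples.append(ahp_weights(B))
samples = np.array(samples)
print("weight 5th-95th percentiles:\n",
      np.percentile(samples, [5, 95], axis=0).round(3))
```

The resulting weight samples are what a variance-based GSA would then feed into the spatial MCDA model to produce first-order and total-effect sensitivity maps.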
84.
Age-stratified weights for the non-carcinogenic risk of cadmium exposure via drinking water among Chinese residents   (Cited: 2; self-citations: 2; citations by others: 0)
Drinking water is an important route of human cadmium (Cd) exposure. To quantitatively characterise the drinking-water Cd exposure risk of Chinese residents, Cd concentration data for the three main types of drinking water in China were collected through a literature survey. Regression models were used to obtain the distributions of drinking-water exposure parameters for different age groups, and a probabilistic approach was then used to evaluate the non-carcinogenic risk from drinking-water Cd exposure for different water types and population groups. The results show significant differences in Cd concentrations among the three water types. Tap water, untreated groundwater and surface-water sources...
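A hedged sketch of a probabilistic hazard-quotient calculation of this type is shown below; the concentration, intake and body-weight distributions and the cadmium reference dose are placeholders chosen for illustration, not the distributions fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Illustrative (assumed) inputs, not the paper's fitted distributions.
rfd = 5e-4                                   # oral reference dose for Cd, mg/(kg*d)
conc = rng.lognormal(np.log(1e-4), 0.8, n)   # tap-water Cd concentration, mg/L
intake = rng.lognormal(np.log(1.5), 0.3, n)  # drinking-water intake, L/d
bw = rng.normal(60, 10, n).clip(30, 110)     # body weight, kg

# Non-carcinogenic hazard quotient: HQ = (C * IR / BW) / RfD
hq = conc * intake / bw / rfd
print("median HQ:          ", np.median(hq).round(4))
print("95th percentile HQ: ", np.percentile(hq, 95).round(4))
print("P(HQ > 1):          ", float(np.mean(hq > 1)))
```

Repeating the calculation with age-specific intake and body-weight distributions gives the age-stratified risk comparison described in the abstract.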
85.
Lake Toolibin, an ephemeral lake in the agricultural zone of Western Australia, is under threat from secondary salinity due to land clearance throughout the catchment. The lake is extensively covered with native vegetation and is a Ramsar listed wetland, being one of the few remaining significant migratory bird habitats in the region. Currently, inflow with salinity greater than 1000 mg/L TDS is diverted from the lake in an effort to protect sensitive lakebed vegetation. However, this conservative threshold compromises the frequency and extent of lake inundation, which is essential for bird breeding. It is speculated that relaxing the threshold to 5000 mg/L may pose negligible additional risk to the condition of lakebed vegetation. To characterise the magnitude of improvement in the provision of bird breeding habitat that might be generated by relaxing the threshold, a dynamic water and salt balance model of the lake was developed and implemented using Monte Carlo simulation. Results from best estimate model inputs indicate that relaxation of the threshold increases the likelihood of satisfying habitat requirements by a factor of 9.7. A second-order Monte Carlo analysis incorporating incertitude generated plausible bounds of [2.6, 37.5] around the best estimate for the relative likelihood of satisfying habitat requirements. Parameter-specific sensitivity analyses suggest the availability of habitat is most sensitive to pan evaporation, lower than expected inflow volume, and higher than expected inflow salt concentration. The characterisation of uncertainty associated with environmental variation and incertitude allows managers to make informed risk-weighted decisions.
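The sketch below illustrates, in heavily simplified form, how a Monte Carlo annual water balance with a salinity-based inflow-diversion rule can be used to compare habitat likelihoods under the 1000 mg/L and 5000 mg/L rules. The lake area, evaporation depth, inflow and salinity distributions and the 0.5 m habitat criterion are assumptions for demonstration, not the Lake Toolibin model inputs.

```python
import numpy as np

rng = np.random.default_rng(3)

def habitat_probability(threshold_mg_l, n_runs=2_000, n_years=50):
    """Fraction of simulated lake-years with enough depth for bird breeding.

    A deliberately minimal annual water balance with an inflow-diversion
    salinity rule; all numbers are illustrative assumptions only.
    """
    area = 3.0e6                                         # lake bed area, m2
    good_years = 0
    for _ in range(n_runs):
        volume = 0.0                                     # stored water, m3
        for _ in range(n_years):
            inflow = rng.lognormal(np.log(1.0e6), 0.9)        # m3 per year
            inflow_tds = rng.lognormal(np.log(2000.0), 0.5)   # mg/L
            if inflow_tds <= threshold_mg_l:             # diversion rule
                volume += inflow
            volume = max(volume - 1.8 * area, 0.0)       # 1.8 m/yr evaporation
            if volume / area > 0.5:                      # breeding needs > 0.5 m depth
                good_years += 1
            volume *= 0.2                                # drainage/seepage carryover
    return good_years / (n_runs * n_years)

p_strict, p_relaxed = habitat_probability(1000), habitat_probability(5000)
print(f"P(habitat) at 1000 mg/L: {p_strict:.4f}")
print(f"P(habitat) at 5000 mg/L: {p_relaxed:.4f}  (ratio {p_relaxed / max(p_strict, 1e-9):.1f})")
```

A second-order analysis of the kind reported in the paper would wrap an outer loop over uncertain (incertitude) parameters around this inner variability loop.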
86.
Environmental integrated assessments are often carried out via the aggregation of a set of environmental indicators. Aggregated indices derived from the same data set can differ substantially depending upon how the indicators are weighted and aggregated, which is often a subjective matter. This article presents a method of generating aggregated environmental indices in an objective manner via Monte Carlo simulation. Rankings derived from the aggregated indices within and between three Monte Carlo simulations were used to evaluate the overall environmental condition of the study area. Other insights, such as the distribution of good or bad indicator values at the watershed and/or subregion level, were also observed in the study.
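A compact illustration of this idea, aggregating normalised indicators under randomly drawn weights and examining the stability of the resulting rankings, is sketched below with made-up data; the indicator matrix and the choice of a Dirichlet weight distribution are assumptions, not the article's design.

```python
import numpy as np

rng = np.random.default_rng(11)

# Rows = subregions, columns = normalised environmental indicators in [0, 1]
# (higher = better). Values are synthetic, purely to show the mechanics.
indicators = rng.uniform(0.0, 1.0, size=(6, 5))

n_sims = 10_000
ranks = np.zeros((n_sims, indicators.shape[0]), dtype=int)
for k in range(n_sims):
    # Draw a random weight vector on the simplex instead of fixing weights
    # subjectively, then aggregate with a weighted sum.
    w = rng.dirichlet(np.ones(indicators.shape[1]))
    index = indicators @ w
    # Rank 1 = best aggregated condition.
    ranks[k] = (-index).argsort().argsort() + 1

# How stable is each subregion's rank across the weight ensemble?
print("median rank per subregion:", np.median(ranks, axis=0))
print("P(rank 1) per subregion:  ", (ranks == 1).mean(axis=0).round(3))
```

Rank distributions that stay tight across the weight ensemble indicate conclusions that do not depend on any single subjective weighting.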
87.
After Hurricane Katrina passed through the US Gulf Coast in August 2005, floodwaters covering New Orleans were pumped into Lake Pontchartrain as part of the rehabilitation process in order to make the city habitable again. The long-term consequences of this environmentally critical decision were difficult to assess at the time and were left to observation. In the aftermath of such natural disasters, and in cases of emergency, the proactive use of screening-level models may prove to be an important factor in making appropriate decisions to identify cost-effective and environmentally friendly mitigation solutions. In this paper, we propose such a model and demonstrate its use through the application of several hypothetical scenarios to examine the likely response of Lake Pontchartrain to the contaminant loadings that were possibly present in the New Orleans floodwaters. For this purpose, an unsteady-state fugacity model was developed in order to examine the environmental effects of contaminants with different physicochemical characteristics on Lake Pontchartrain. The three representative contaminants selected for this purpose are benzene, atrazine, and polychlorinated biphenyls (PCBs). The proposed approach yields continuous fugacity values for contaminants in the water, air, and sediment compartments of the lake system which are analogous to concentrations. Since contaminant data for the floodwaters are limited, an uncertainty analysis was also performed in this study. The effects of uncertainty in the model parameters were investigated through Monte Carlo analysis. Results indicate that the acceptable recovery of Lake Pontchartrain will require a long period of time. The computed time range for the levels of the three contaminants considered in this study to decrease to maximum contaminant levels (MCLs) is about 1 year to 68 years. The model can be implemented to assess the possible extent of damage inflicted by any storm event on the natural water resources of Southern Louisiana or similar environments elsewhere. Furthermore, the model developed can be used as a useful decision-making tool for planning and remediation in similar emergency situations by examining various potential contamination scenarios and their consequences.
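The sketch below is not a multimedia fugacity model; it only illustrates how Monte Carlo uncertainty in a lumped first-order loss rate translates into a distribution of times to reach the MCL. The initial concentrations and half-lives are assumptions, and the MCL values used are the familiar US drinking-water limits rather than figures taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

def years_to_mcl(c0, mcl, half_life_years):
    """Time for a well-mixed lake concentration to decay to the MCL, assuming a
    single lumped first-order loss process (volatilisation, degradation,
    sedimentation and flushing combined)."""
    k = np.log(2) / half_life_years
    return np.log(c0 / mcl) / k

# Illustrative (assumed) initial concentrations, MCLs and uncertain lumped
# half-lives; these are not the fugacity-model outputs of the paper.
cases = {
    "benzene":  (0.05,  0.005,  rng.lognormal(np.log(0.1), 0.5, n)),  # mg/L, mg/L, yr
    "atrazine": (0.02,  0.003,  rng.lognormal(np.log(0.5), 0.5, n)),
    "PCBs":     (0.005, 0.0005, rng.lognormal(np.log(8.0), 0.6, n)),
}
for name, (c0, mcl, hl) in cases.items():
    t = years_to_mcl(c0, mcl, hl)
    print(f"{name}: median {np.median(t):.1f} yr, "
          f"90th percentile {np.percentile(t, 90):.1f} yr")
```

Even this crude lumped model reproduces the qualitative finding that persistent, sediment-bound contaminants such as PCBs dominate the upper end of the recovery-time range.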
88.
ABSTRACT: A mathematical model is developed to optimally schedule long-term stormwater infrastructure rehabilitation activities. The model is capable of considering multiple rehabilitation projects and is driven by overall cost considerations. Rehabilitation activities are scheduled based on perceived reliabilities and future deterioration expected within the specified planning horizon. Future growth within the stormwater drainage basin is incorporated using chance constraints that limit the likelihood that a stormwater discharge exceeds system conveyance capacity. Model structure and development are discussed, and a hypothetical example using a drainage network is presented.
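For a normally distributed discharge, a chance constraint of the form P(Q > capacity) <= alpha has a simple deterministic equivalent, illustrated below with assumed discharge statistics for a single drainage link; the growth figures and the 5% exceedance level are placeholders, not values from the paper.

```python
from scipy.stats import norm

def required_capacity(mean_q, sd_q, alpha):
    """Deterministic equivalent of the chance constraint
    P(Q > capacity) <= alpha for normally distributed discharge Q:
    capacity >= mean_q + z_{1-alpha} * sd_q."""
    return mean_q + norm.ppf(1.0 - alpha) * sd_q

# Illustrative (assumed) design-discharge statistics for one drainage link,
# growing over the planning horizon as the basin urbanises.
for year, mean_q, sd_q in [(0, 12.0, 3.0), (10, 15.0, 3.6), (20, 18.0, 4.2)]:
    c = required_capacity(mean_q, sd_q, alpha=0.05)
    print(f"year {year:2d}: capacity must be at least {c:.1f} m3/s "
          f"to keep exceedance probability below 5%")
```

In a scheduling model, constraints of this kind become linear once the quantile term is evaluated, which keeps the optimisation tractable over a long planning horizon.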
89.
Abstract: A mix of causative mechanisms may be responsible for floods at a site. Floods may be caused by extreme rainfall or by rain falling on earlier rainfall events. The statistical attributes of these events differ according to the watershed characteristics and the causes. Traditional methods of flood frequency analysis are adequate only for specific situations. Also, to address the uncertainty of flood frequency estimates for hydraulic structures, a series of probabilistic analyses of rainfall-runoff and flow routing models, and their associated inputs, is used. This is a complex problem in that the probability distributions of multiple independent and derived random variables need to be estimated to evaluate the probability of floods. Therefore, the objectives of this study were to develop a flood frequency curve derivation method driven by multiple random variables and to develop a tool that can consider the uncertainties of design floods. This study focuses on developing a flood frequency curve based on nonparametric statistical methods for the estimation of probabilities of rare floods that are more appropriate in Korea. To derive the frequency curve, rainfall generation using the nonparametric kernel density estimation approach is proposed. Many flood events are simulated by nonparametric Monte Carlo simulations coupled with the center Latin hypercube sampling method to estimate the associated uncertainty. This study applies the methods described to a Korean watershed. The results provide higher physical appropriateness and reasonable estimates of the design flood.
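A minimal version of the rainfall-generation step, Gaussian kernel density estimation sampled through Latin hypercube points and pushed through a toy rainfall-runoff transform, is sketched below. The synthetic rainfall record, the runoff coefficient and the basin area are assumptions for illustration, not the Korean watershed data or the paper's routing models.

```python
import numpy as np
from scipy.stats import gaussian_kde, qmc

rng = np.random.default_rng(13)

# Illustrative "observed" annual-maximum daily rainfall record (mm); in the
# paper this would be the historical gauge data for the study watershed.
obs = rng.gamma(shape=4.0, scale=30.0, size=60)

# Nonparametric rainfall generator: Gaussian kernel density fitted to the
# observations, sampled through Latin hypercube points for better coverage.
kde = gaussian_kde(obs)
lhs = qmc.LatinHypercube(d=1, seed=13).random(n=10_000).ravel()

# Invert the KDE's CDF numerically on a grid to map LHS uniforms to rainfall.
grid = np.linspace(0.0, obs.max() * 2.5, 2_000)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]
rain = np.interp(lhs, cdf, grid)

# A toy rainfall-runoff transform (runoff coefficient x basin area) stands in
# for the paper's rainfall-runoff and flow-routing models.
peak_flow = 0.6 * rain * 1.0e6 / (3600 * 24 * 1000)   # m3/s for a 1 km2 basin

# Empirical flood frequency curve: flow quantile vs return period.
for T in (10, 50, 100, 200):
    q = np.quantile(peak_flow, 1 - 1 / T)
    print(f"T = {T:3d} yr  ->  design flood approx. {q:.2f} m3/s")
```

Replacing the toy transform with a calibrated rainfall-runoff and routing model, and repeating the whole simulation many times, yields the uncertainty bands around the derived frequency curve.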
90.
In this paper we examine the use of data augmentation techniques for simplifying iterative simulation in the context of both Bayesian and classical statistical inference for survival rate estimation. We examine two distinct model families common in population ecology, ring-recovery models and capture–recapture models, to illustrate our ideas, and we present the computational advantage of this approach. We also discuss how problems associated with identifiability in the classical framework can be overcome using data augmentation, but highlight the dangers of doing so under both inferential paradigms.
Correspondence: I. C. Olsen
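The toy Gibbs sampler below illustrates the data-augmentation idea for a single-cohort ring-recovery setting: the unreported deaths are imputed, after which conjugate updates are straightforward. It is a deliberately simplified stand-in for the models discussed in the paper, and it also exposes the identifiability caveat, since with one cohort and one occasion only the product (1 - phi) * lam is informed by the data. All counts, probabilities and priors are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(17)

# Single-cohort, single-occasion ring-recovery toy model: each of N ringed
# birds dies with prob (1 - phi); a dead bird's ring is reported with prob
# lam; only the D reported recoveries are observed.
N, phi_true, lam_true = 1_000, 0.7, 0.4
deaths_true = rng.binomial(N, 1 - phi_true)
D = rng.binomial(deaths_true, lam_true)

n_iter, burn = 10_000, 2_000
phi, lam = 0.5, 0.5
prod_draws = []
for it in range(n_iter):
    # Data augmentation: impute how many of the N - D never-recovered birds
    # actually died but went unreported.
    p_dead_unrep = (1 - phi) * (1 - lam) / (phi + (1 - phi) * (1 - lam))
    M = D + rng.binomial(N - D, p_dead_unrep)   # total (augmented) deaths

    # With M treated as known, flat Beta(1, 1) priors give conjugate updates.
    phi = rng.beta(1 + N - M, 1 + M)            # survival probability
    lam = rng.beta(1 + D, 1 + M - D)            # reporting probability
    if it >= burn:
        prod_draws.append((1 - phi) * lam)

# Only the product (1 - phi) * lam is identified in this toy setting, which
# echoes the identifiability issues discussed in the paper.
print("posterior mean of (1 - phi) * lam:", np.mean(prod_draws).round(3))
print("true value:", (1 - phi_true) * lam_true)
```

Multi-year recovery data, or an informative prior on the reporting rate, is what separates the two parameters in practice; the augmentation step itself carries over unchanged.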