1.
Here we examine species distribution models (SDMs) for a Neotropical anuran restricted to ombrophilous areas of the Brazilian Atlantic Forest hotspot. We extend the known occurrence of the treefrog Hypsiboas bischoffi (Anura: Hylidae) through GPS field surveys and use five modeling methods (BIOCLIM, DOMAIN, OM-GARP, SVM, and MAXENT) with selected bioclimatic and topographic variables to model the species distribution. Models were first trained using two calibration areas: the Brazilian Atlantic Forest (BAF) and the whole of South America (SA). All modeling methods showed good predictive power and accuracy, with mean AUC ranging from 0.77 (BIOCLIM/BAF) to 0.99 (MAXENT/SA). MAXENT and SVM were the most accurate presence-only methods among those tested here. All models except SVM predicted larger distribution areas when calibrated with SA than when calibrated with BAF. OM-GARP dramatically overpredicted the species distribution when calibrated in SA, with a predicted area about 10⁶ km² larger than those of the other SDMs. As the calibration area (and environmental space) grew, OM-GARP predictions tracked the changes in environmental space, while MAXENT models were more consistent across calibration areas. MAXENT was the only method that retrieved consistent predictions across calibration areas while allowing for some overprediction, a result that may be relevant for modeling the distributions of other spatially restricted organisms.
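The AUC figures reported above have a simple interpretation: the probability that a model scores a random presence site higher than a random background site. A minimal, stdlib-only sketch of that computation follows; the suitability scores are invented for illustration and are not the study's data (the study used each method's own evaluation tooling).

```python
def auc(scores_presence, scores_background):
    """AUC via the Mann-Whitney statistic: the probability that a random
    presence site scores higher than a random background site (ties 0.5)."""
    wins = 0.0
    for p in scores_presence:
        for b in scores_background:
            if p > b:
                wins += 1.0
            elif p == b:
                wins += 0.5
    return wins / (len(scores_presence) * len(scores_background))

presence = [0.9, 0.8, 0.75, 0.6]    # suitability at known occurrence points
background = [0.7, 0.4, 0.3, 0.2]   # suitability at random background points
model_auc = auc(presence, background)
```

An AUC of 0.5 means the model ranks sites no better than chance; values near 1.0 (as for MAXENT/SA above) mean presences are almost always ranked above background.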
2.
C. Martin  E. Ayesa 《Ecological modelling》2010,221(22):2656-2667
This paper proposes an Integrated Monte Carlo Methodology (IMCM) to solve the parameter estimation problem in water quality models. The methodology is based on a Bayesian approach and Markov Chain Monte Carlo techniques, and it operates by means of four modules: Markov Chain Monte Carlo (MCMC), Moving Feasible Ranges (MFR), Statistical Analysis of the Joint Posterior Distribution (SAD), and Uncertainty Propagation Analysis (UPA). The main innovation of the proposal lies in the combination of the MCMC and MFR modules, which provides the joint posterior distribution of the calibrated parameters following the classical Bayesian approach. While the MCMC module, based on the Shuffled Complex Evolution Metropolis (SCEM-UA) algorithm, is specially designed to sample complex joint posterior shapes within given parameter ranges, the MFR module readjusts these ranges until coverage of the feasible parameter space is guaranteed. Once the joint posterior distribution is properly defined, the SAD module provides the parameter statistics and the UPA module analyzes the propagation of uncertainty through the model. The possibilities of the new proposal have been tested on a simple model featuring different activated sludge batch experiments. IMCM has been implemented in Matlab and is prepared to be easily connected to any software package.
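The Bayesian sampling step at the heart of such a methodology can be illustrated with a toy random-walk Metropolis sampler; the actual IMCM uses the more sophisticated SCEM-UA algorithm with moving feasible ranges, in Matlab. Here a single decay-rate parameter k is sampled from its posterior given synthetic concentration data (all numbers are invented for illustration).

```python
import math
import random

random.seed(1)
t_obs = [1.0, 2.0, 4.0, 8.0]            # sampling times (arbitrary units)
c_obs = [0.60, 0.37, 0.14, 0.02]        # synthetic concentrations, noise sd 0.05

def log_posterior(k):
    """Flat prior on the feasible range times a Gaussian likelihood
    for the first-order decay model c(t) = exp(-k t)."""
    if not 0.0 < k < 5.0:
        return -math.inf
    sse = sum((c - math.exp(-k * t)) ** 2 for t, c in zip(t_obs, c_obs))
    return -sse / (2 * 0.05 ** 2)

k, lp = 1.0, log_posterior(1.0)
chain = []
for _ in range(5000):
    k_new = k + random.gauss(0.0, 0.1)                    # random-walk proposal
    lp_new = log_posterior(k_new)
    if random.random() < math.exp(min(0.0, lp_new - lp)):  # Metropolis acceptance
        k, lp = k_new, lp_new
    chain.append(k)

posterior_mean = sum(chain[1000:]) / len(chain[1000:])    # discard burn-in
```

The retained chain approximates the posterior of k, so its spread (not just its mean) is available for the uncertainty-propagation step the abstract describes.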
3.
Testing ecological models: the meaning of validation
The ecological literature reveals considerable confusion about the meaning of validation in the context of simulation models. The confusion arises as much from semantic and philosophical considerations as from the selection of validation procedures. Validation is not a procedure for testing scientific theory or for certifying the ‘truth’ of current scientific understanding, nor is it a required activity of every modelling project. Validation means that a model is acceptable for its intended use because it meets specified performance requirements. Before validation is undertaken, (1) the purpose of the model, (2) the performance criteria, and (3) the model context must be specified. The validation process can be decomposed into several components: (1) operation, (2) theory, and (3) data. Important concepts needed to understand the model evaluation process are verification, calibration, validation, credibility, and qualification. These terms are defined in a limited technical sense applicable to the evaluation of simulation models, and not as general philosophical concepts. Different tests and standards are applied to the operational, theoretical, and data components. The operational and data components can be validated; the theoretical component cannot. The most common problem with ecological and environmental models is failure to state what the validation criteria are. Criteria must be explicitly stated because there are no universal standards for selecting what test procedures or criteria to use for validation. A test based on comparison of simulated versus observed data is generally included whenever possible. Because the objective and subjective components of validation are not mutually exclusive, disagreements over the meaning of validation can only be resolved by establishing a convention.
4.
Development of a water quality model for the main stream of Suzhou Creek
廖良  徐祖信  刘东胜 《上海环境科学》2002,21(3):136-138,142
This paper presents most of the work completed under the project "Development of the Suzhou Creek Water Quality Model", including the choice of water quality model and its underlying principles, the scope and approach of the model development, and the calibration and verification of the main-stream water quality model. On this basis, parameter values characterizing the water environment of Suzhou Creek are reported; the reference period for the calculations is the third flow-augmentation experiment on Suzhou Creek in the summer of 1999. It should be noted that Suzhou Creek is undergoing a dynamic, comprehensive rehabilitation, and the relevant model parameters will change accordingly.
5.
Effects of calibration on L-THIA GIS runoff and pollutant estimation
Urbanization can result in alteration of a watershed's hydrologic response and water quality. To simulate hydrologic and water quality impacts of land use changes, the Long-Term Hydrologic Impact Assessment (L-THIA) system has been used. The L-THIA system estimates pollutant loading based on direct runoff quantity and land use based pollutant coefficients. The accurate estimation of direct runoff is important in assessing water quality impacts of land use changes. An automated program was developed to calibrate the L-THIA model using the millions of curve number (CN) combinations associated with land uses and hydrologic soil groups. L-THIA calibration for the Little Eagle Creek (LEC) watershed near Indianapolis, Indiana was performed using land use data for 1991 and daily rainfall data for six months of 1991 (January 1-June 30) to minimize errors associated with use of different temporal land use data and rainfall data. For the calibration period, the Nash-Sutcliffe coefficient was 0.60 for estimated and observed direct runoff. The calibrated CN values were used for validation of the model for the same year (July 1-December 31), and the Nash-Sutcliffe coefficient was 0.60 for estimated and observed direct runoff. The Nash-Sutcliffe coefficient was 0.52 for January 1, 1991 to December 31, 1991 using uncalibrated CN values. As shown in this study, the use of better input parameters for the L-THIA model can improve accuracy. The effects on direct runoff and pollutant estimation of the calibrated CN values in the L-THIA model were investigated for the LEC. Following calibration, the estimated average annual direct runoff for the LEC watershed increased by 34%, total nitrogen by 24%, total phosphorus by 22%, and total lead by 43%. This study demonstrates that the L-THIA model should be calibrated and validated prior to application in a particular watershed to more accurately assess the effects of land use changes on hydrology and water quality.
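The Nash-Sutcliffe coefficient used above to judge calibration quality has a simple closed form: one minus the ratio of the residual sum of squares to the variance of the observations about their mean. A minimal sketch follows; the runoff numbers are illustrative, not LEC data.

```python
def nash_sutcliffe(observed, simulated):
    """1 - SSE / SS_about_mean: 1.0 is a perfect fit; values <= 0 mean the
    model predicts no better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_mean = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_mean

obs = [1.0, 2.0, 3.0, 4.0]    # illustrative observed direct runoff (mm)
sim = [1.1, 1.9, 3.2, 3.8]    # illustrative simulated direct runoff (mm)
nse = nash_sutcliffe(obs, sim)
```

A calibration loop over candidate CN combinations, as in the abstract, would simply keep the combination whose simulated series maximizes this value.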
6.
GB/T 16488-1996 specifies infrared spectrophotometry as one method for determining petroleum content. The method uses carbon tetrachloride as the extraction solvent, and in large amounts; carbon tetrachloride is a toxic and highly volatile organic solvent. When monitoring this parameter, preparing the standard stock solution used to plot the calibration curve is rather complicated and introduces a relatively large systematic error. To reduce this error and plot the calibration curve accurately, this paper simplifies the preparation steps used to construct the curve, achieving results consistent with the method given in the operating manual of the JDS-100 infrared oil analyzer.
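The calibration curve discussed above is an ordinary least-squares line of instrument response against standard concentration. A sketch of that fit and its use to read back an unknown sample follows; the concentrations and absorbance values are made-up illustrative numbers, not data from the paper.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y = k*x + b."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

conc = [0.0, 10.0, 20.0, 40.0, 80.0]              # mg/L petroleum standards
absorbance = [0.002, 0.101, 0.199, 0.402, 0.801]  # illustrative IR readings
k, b = linear_fit(conc, absorbance)
unknown_conc = (0.305 - b) / k    # invert the curve for a sample reading
```

Simplifying the preparation of the standard series changes only the (conc, absorbance) pairs fed to the fit; the regression itself is unchanged.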
7.
This paper presents a fairly detailed technical analysis of the performance-test (calibration) results of the No. 2 sulfur recovery unit (SRU). On this basis, the unit and its operating condition are evaluated, and several recommendations are put forward.
8.
When characterizing environmental radioactivity, whether in the soil or within concrete building structures undergoing remediation or decommissioning, it is highly desirable to know the radionuclide depth distribution. This is typically modeled using continuous analytical expressions, whose forms are believed to best represent the true source distributions. In situ gamma ray spectroscopic measurements are combined with these models to fully describe the source. Currently, the choice of analytical expressions is based upon prior experimental core sampling results at similar locations, any known site history, or radionuclide transport models. This paper presents a method, employing multiple in situ measurements at a single site, for determining the analytical form that best represents the true depth distribution present. The measurements can be made using a variety of geometries, each of which has a different sensitivity variation with source spatial distribution. Using non-linear least squares numerical optimization methods, the results can be fit to a collection of analytical models and the parameters of each model determined. The analytical expression that results in the fit with the lowest residual is selected as the most accurate representation. A cursory examination is made of the effects of measurement errors on the method.
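The model-selection step described above, fitting several candidate analytical depth profiles and keeping the one with the lowest residual, can be sketched as follows. The depth/signal data are synthetic, and the exponential candidate is fitted in log space so plain linear least squares suffices (the paper uses general non-linear optimization over spectroscopic measurements).

```python
import math

depth = [0.0, 2.0, 4.0, 6.0, 8.0]          # cm below surface (synthetic)
signal = [1.00, 0.45, 0.21, 0.09, 0.04]    # synthetic relative count rates

def residual(model, params):
    """Sum of squared deviations of the model from the data."""
    return sum((s - model(z, *params)) ** 2 for z, s in zip(depth, signal))

# Candidate 1: exponential A*exp(-alpha*z); taking logs makes the fit linear.
n = len(depth)
sx, sy = sum(depth), sum(math.log(s) for s in signal)
sxx = sum(z * z for z in depth)
sxy = sum(z * math.log(s) for z, s in zip(depth, signal))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
alpha, A = -slope, math.exp((sy - slope * sx) / n)

def exp_model(z, A, alpha):
    return A * math.exp(-alpha * z)

# Candidate 2: uniform profile; the best-fit constant is the mean signal.
c = sum(signal) / n

def uniform_model(z, c):
    return c

fits = [("exponential", residual(exp_model, (A, alpha))),
        ("uniform", residual(uniform_model, (c,)))]
best_model = min(fits, key=lambda t: t[1])[0]
```

With more candidates (buried-slab, Gaussian, and so on), the same pattern applies: fit each, compare residuals, keep the minimum.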
9.
Engineering projects involving hydrogeology are faced with uncertainties because the earth is heterogeneous, and typical data sets are fragmented and disparate. In theory, predictions provided by computer simulations using calibrated models constrained by geological boundaries provide answers to support management decisions, and geostatistical methods quantify safety margins. In practice, current methods are limited by the data types and models that can be included, computational demands, or simplifying assumptions. Data Fusion Modeling (DFM) removes many of the limitations and is capable of providing data integration and model calibration with quantified uncertainty for a variety of hydrological, geological, and geophysical data types and models. The benefits of DFM for waste management, water supply, and geotechnical applications are savings in time and cost through the ability to produce visual models that fill in missing data and predictive numerical models to aid management optimization. DFM has the ability to update field-scale models in real time using PC or workstation systems and is ideally suited for parallel processing implementation. DFM is a spatial state estimation and system identification methodology that uses three sources of information: measured data, physical laws, and statistical models for uncertainty in spatial heterogeneities. What is new in DFM is the solution of the causality problem in the data assimilation Kalman filter methods to achieve computational practicality. The Kalman filter is generalized by introducing information filter methods due to Bierman coupled with a Markov random field representation for spatial variation. A Bayesian penalty function is implemented with Gauss–Newton methods. This leads to a computational problem similar to numerical simulation of the partial differential equations (PDEs) of groundwater. In fact, extensions of PDE solver ideas to break down computations over space form the computational heart of DFM. 
State estimates and uncertainties can be computed for heterogeneous hydraulic conductivity fields in multiple geological layers from the usually sparse hydraulic conductivity data and the often more plentiful head data. Further, a system identification theory has been derived based on statistical likelihood principles. A maximum likelihood theory is provided to estimate statistical parameters such as Markov model parameters that determine the geostatistical variogram. Field-scale application of DFM at the DOE Savannah River Site is presented and compared with manual calibration. DFM calibration runs converge in less than 1 h on a Pentium Pro PC for a 3D model with more than 15,000 nodes. Run time is approximately linear with the number of nodes. Furthermore, conditional simulation is used to quantify the statistical variability in model predictions such as contaminant breakthrough curves.
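The state-estimation machinery DFM generalizes can be illustrated, at its smallest, by a scalar Kalman-style measurement update that fuses a prior estimate with one noisy observation. DFM performs this jointly over a Markov random field with thousands of nodes using information-filter methods; the numbers here are purely illustrative.

```python
def kalman_update(x_prior, var_prior, z, var_meas):
    """Fuse a prior estimate (x_prior, var_prior) with one noisy
    measurement z of variance var_meas; returns the posterior pair."""
    gain = var_prior / (var_prior + var_meas)   # Kalman gain
    x_post = x_prior + gain * (z - x_prior)     # measurement-corrected estimate
    var_post = (1.0 - gain) * var_prior         # uncertainty always shrinks
    return x_post, var_post

# Prior log10 hydraulic conductivity -4.0 (variance 1.0); a field test
# suggests -3.0 with equal variance, so the fused estimate splits the
# difference and halves the variance.
x_post, var_post = kalman_update(-4.0, 1.0, -3.0, 1.0)
```

The quantified posterior variance is what gives DFM its "calibration with quantified uncertainty", rather than a single best-fit field.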
10.
This paper describes a quantitative radioactivity analysis method especially suitable for environmental samples with low-level activity. The method, consisting of a multi-group approximation based on total absorption and Compton spectra of gamma rays, is coherently formalized, and a computer algorithm based on it was designed to analyze low-level-activity NaI(Tl) gamma ray spectra of environmental samples. Milk powder from 1988 was used as the example case. A special analysis of the uncertainty estimation is included. Gamma sensitiveness is defined and numerically evaluated. The results reproduced the calibration data well, attesting to the reliability of the method. The special analysis shows that the uncertainty of the assessed activity is tied to that of the calibration activity data. More than 77% of the measured 1461-keV photons of 40K were counted at clearly lower energies. Pile-up of single-line photons (137Cs) appears negligible compared to that of a two-line cascade (134Cs). The detection limit varies with radionuclide and spectrum region and is related to the gamma sensitiveness of the detection system. The best detection limit always lies in a spectrum region holding a line of the radionuclide and the highest sensitiveness. The most radioactive milk powder sample showed an activity concentration of 21 ± 1 Bq g−1 for 137Cs, 323 ± 13 Bq g−1 for 40K, and no 134Cs.
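The multi-group idea above, that counts in each energy window are a linear mix of the nuclides' full-energy and Compton contributions, reduces in the two-nuclide, two-window case to solving a small linear system. The response values and count rates below are invented for illustration; the paper's algorithm handles many groups with a full uncertainty analysis.

```python
# Response matrix: counts per second in each energy window per unit
# activity of (Cs-137, K-40). The low window holds the 662-keV Cs peak
# plus K-40 Compton spill; the high window holds only the 1461-keV K peak.
R = [[0.80, 0.30],
     [0.00, 0.50]]
counts = [110.0, 50.0]    # measured net count rates per window (illustrative)

# Solve the 2x2 system R @ a = counts for the activities a by Cramer's rule.
det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
activity_cs = (counts[0] * R[1][1] - R[0][1] * counts[1]) / det
activity_k = (R[0][0] * counts[1] - counts[0] * R[1][0]) / det
```

Ignoring the Compton spill (the off-diagonal 0.30) would overestimate the Cs-137 activity, which is exactly the error the multi-group treatment corrects.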