Similar Articles (20 results)
1.
The recent rise in global concern over environmental issues such as global warming and air pollution is accentuating the need for environmental assessments in the construction industry. Promptly evaluating the environmental loads of the various design alternatives during the early stages of a construction project and adopting the most environmentally sustainable candidate is therefore of great importance. Yet research on the early evaluation of a construction project's environmental load to aid the decision-making process has hitherto been lacking. In light of this gap, this study proposes a model for estimating the environmental load of the pre-stressed concrete (PSC) beam bridge, the most common bridge structure, using only the most basic information accessible during the early design phases of a project. First, a life cycle assessment (LCA) was conducted on data from 99 bridges by integrating the bills of quantities (BOQ) with a life cycle inventory (LCI) database. The processed data were then used to construct a case-based reasoning (CBR) model for estimating the environmental load. The accuracy of the estimation model was validated using five test cases; the model's mean absolute error rate (MAER) for the total environmental load was 7.09%. These test results were superior to those obtained from a multiple-regression-based model and a slab-area base-unit analysis model. Application of this model during the early stages of a project is therefore expected to support environmentally friendly design and construction by facilitating the swift evaluation of the environmental load from multiple standpoints.
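As a brief illustration of the accuracy metric quoted above, the sketch below computes a mean absolute error rate for a set of test-case estimates against reference loads. The bridge loads are hypothetical placeholder values; only the usual definition MAER = mean(|estimate − actual| / actual) × 100% is assumed, not the study's exact validation procedure.

```python
def mean_absolute_error_rate(estimates, actuals):
    """Mean absolute error rate (MAER), expressed as a percentage."""
    if len(estimates) != len(actuals):
        raise ValueError("estimates and actuals must have the same length")
    errors = [abs(e - a) / a for e, a in zip(estimates, actuals)]
    return 100.0 * sum(errors) / len(errors)

# Hypothetical environmental loads for five validation bridges (arbitrary units).
actual_loads    = [1250.0, 980.0, 1430.0, 1105.0, 1320.0]
estimated_loads = [1180.0, 1032.0, 1505.0, 1068.0, 1401.0]

print(f"MAER = {mean_absolute_error_rate(estimated_loads, actual_loads):.2f}%")
```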

2.
This study investigates the possibility of acid mine drainage (AMD) generation in active and derelict mine waste piles produced over several decades at the Sarcheshmeh Copper Mine, using static tests including acid–base accounting (ABA) and net acid-generating pH (NAGpH). In this study, 51 composite samples were taken from 11 waste heaps, and static ABA and NAGpH tests were carried out on the samples. While some piles are already acid producing and AMD is discharging from them, most show no outward indication of their AMD potential, and they were investigated to define their acid-producing potential. The analysis of the data indicates that eight waste piles are potentially acid generating, with net neutralization potentials (NNPs) of −56.18 to −199.3, NAGpH values of 2.19–3.31, and neutralization potential ratios (NPRs) from 0.18 to 0.44. The other waste piles exhibited either very low sulfur, high carbonate content, or an excess of carbonate over sulfur; hence, they are not capable of acid production or can be considered weak acid producers. Consistency between the results of the ABA and NAGpH tests under a variety of classification criteria validates these tests as powerful means for preliminary evaluation of AMD/ARD potential in any mining district. It is also concluded that some of the piles with very negative NNPs are capable of producing AMD naturally, and they could be used in a heap leaching process for economic recovery of trace amounts of metals without applying any biostimulation methods.
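To make the static-test screening concrete, the sketch below classifies a single waste-rock sample by acid–base accounting. The conversion AP = 31.25 × total sulfur (%) and the screening thresholds on NNP and on the neutralization potential ratio (NPR = NP/AP) are common conventions, not necessarily the criteria applied in this study, and the sample values are hypothetical.

```python
def acid_base_accounting(total_sulfur_pct, neutralization_potential):
    """Static acid-base accounting for one waste-rock sample.

    Uses the common convention AP = 31.25 * total sulfur (%), with AP and NP
    in kg CaCO3 equivalent per tonne; exact factors and cut-offs vary between
    protocols, so treat the thresholds below as generic screening values.
    """
    ap = 31.25 * total_sulfur_pct          # acid-generating potential
    nnp = neutralization_potential - ap    # net neutralization potential
    npr = neutralization_potential / ap if ap > 0 else float("inf")
    if nnp < -20 or npr < 1:
        category = "potentially acid generating"
    elif nnp > 20 and npr > 2:
        category = "non acid generating"
    else:
        category = "uncertain"
    return nnp, npr, category

# Hypothetical composite sample: 2.1% total S, NP of 8 kg CaCO3/t.
nnp, npr, category = acid_base_accounting(2.1, 8.0)
print(f"NNP = {nnp:.1f}, NPR = {npr:.2f} -> {category}")
```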

3.
Determining the emission of contaminant gases such as ammonia, methane, or nitrous oxide (laughing gas) from naturally ventilated livestock buildings with large openings is a challenge because of the large variations in gas concentration and air velocity in the openings. The close relation between calculated animal heat production and the carbon dioxide production of the animals has in several cases been utilized to estimate the ventilation air exchange rate, and from that the ammonia and greenhouse gas emissions. Using this method, the problem of the complicated air velocity and concentration distribution in the openings is avoided; however, some important issues remain unanswered: (1) the precision of the estimations, (2) the required length of the measuring periods, and (3) the required number and location of measuring points. The purpose of this work was to investigate how the estimated average gas emission and the precision of the estimation are influenced by different calculation procedures, measuring period lengths, measuring point locations, measuring point numbers, and criteria for excluding measuring data. The analyses were based on existing data from a 6-day measuring period in a naturally ventilated building housing 150 milking cows. The results showed that the methane emission can be determined with much higher precision than the ammonia or nitrous oxide emissions and that, for methane, relatively precise estimations can be based on measuring periods as short as 3 h. This makes it feasible to investigate the influence of feed composition on methane emission in a relatively large number of operating cattle buildings and can consequently support a development towards reduced greenhouse gas emissions from cattle production.
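The CO2-balance approach described above amounts to dividing the calculated CO2 production of the herd by the indoor–outdoor CO2 concentration difference to obtain the air exchange rate, and then multiplying that rate by the concentration difference of the target gas. The sketch below illustrates this calculation with hypothetical input values; it is a simplified version of the general method, not the authors' exact procedure.

```python
def ventilation_rate_co2_balance(co2_production_m3_per_h,
                                 co2_inside_ppm, co2_outside_ppm):
    """Air exchange rate (m3/h) from the CO2 mass balance:
    Q = CO2 production / (inside - outside CO2 volume fraction)."""
    delta = (co2_inside_ppm - co2_outside_ppm) * 1e-6  # ppm -> volume fraction
    if delta <= 0:
        raise ValueError("inside CO2 must exceed outside CO2")
    return co2_production_m3_per_h / delta

def gas_emission(ventilation_m3_per_h, gas_inside_mg_m3, gas_outside_mg_m3):
    """Gas emission rate (mg/h) carried out by the ventilation air."""
    return ventilation_m3_per_h * (gas_inside_mg_m3 - gas_outside_mg_m3)

# Hypothetical hourly values for a building housing 150 cows.
q = ventilation_rate_co2_balance(co2_production_m3_per_h=75.0,
                                 co2_inside_ppm=850.0, co2_outside_ppm=400.0)
print(f"Ventilation rate ~ {q:,.0f} m3/h")
print(f"CH4 emission ~ {gas_emission(q, 8.0, 1.3) / 1e6:.2f} kg/h")
```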

4.
Ongoing marine monitoring programs are seldom designed to detect changes in the environment between different years, mainly because of the high number of samples required for sufficient statistical precision. Here we show that pooling seasonal measurements over time (time integration) provides an efficient way of reducing variability, thereby improving the precision and power in detecting inter-annual differences. Such data from weekly environmental sensor profiles at 21 stations in the northern Bothnian Sea were used in a cost-precision spatio-temporal allocation model. Time-integrated averages for six different variables over 6 months from a rather heterogeneous area showed low variability between stations (coefficient of variation, CV, range of 0.6–12.4%) compared to the variability between stations on a single day (CV range 2.4–88.6%) or the variability over time for a single station (CV range 0.4–110.7%). Reducing the sampling frequency from weekly to approximately monthly did not change the results markedly, whereas lower frequencies deviated more from the results obtained with weekly sampling. With monthly sampling, high precision and power of the estimates could therefore be achieved with a low number of stations. With input of cost factors such as ship time, labor, and analyses, the model can predict the cost for a given required precision in the time-integrated average of each variable by optimizing the sampling allocation. A subsequent power analysis can provide information on the minimum sample size needed to detect differences between years with a required power. Alternatively, the model can predict the precision of annual means for the included variables when the program has a pre-defined budget. Use of time-integrated results from sampling stations with different areal coverage and environmental heterogeneity can thus be an efficient strategy to detect environmental differences between single years, as well as a long-term temporal trend. Use of the presented allocation model will then help to minimize the cost and effort of a monitoring program.
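The gain from time integration can be illustrated with the coefficient of variation. The sketch below compares the between-station CV computed from a single sampling occasion with the between-station CV of time-integrated (period-mean) values; the station series are invented for illustration and do not come from the study.

```python
import statistics

def coefficient_of_variation(values):
    """CV in percent: standard deviation relative to the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical weekly chlorophyll readings (mg/m3) at three stations over 8 weeks.
stations = {
    "A": [2.1, 3.4, 5.0, 4.2, 3.1, 2.8, 3.9, 4.5],
    "B": [2.4, 3.1, 4.6, 4.8, 2.9, 3.0, 4.1, 4.2],
    "C": [1.9, 3.6, 5.2, 4.0, 3.3, 2.6, 3.7, 4.7],
}

# Variability between stations on a single sampling day (week 3).
single_day = [series[2] for series in stations.values()]
print(f"CV between stations, single day: {coefficient_of_variation(single_day):.1f}%")

# Variability between stations after time integration (mean over the whole period).
time_integrated = [statistics.mean(series) for series in stations.values()]
print(f"CV between stations, time-integrated: {coefficient_of_variation(time_integrated):.1f}%")
```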

5.
The need for ambient gaseous ammonia (NH3) measurements has increased in the last decade, as reactive NH3 concentrations and deposition fluxes show little change even with tightening standards on nitrogen oxide (NOx) emissions. Currently, several networks are developing methods for adding NH3 measurements in the U.S. Gaseous NH3 measurements will provide scientists and policymakers with data that can be used to estimate ecosystem inputs, validate air quality models including trends and regional variability, and evaluate changes to the environment based on additional emission reduction requirements and estimates of critical nitrogen load exceedances. The passive samplers described in this paper were deployed in duplicate or triplicate and collocated with annular denuders or continuous instruments to determine their accuracy. The samplers assessed included the Adapted Low-Cost Passive High Absorption (ALPHA), Radiello, and Ogawa passive samplers. The median relative percent differences (MRPD) between the reference method and the passive samplers were -2.4%, -37%, and -44% for the ALPHA, Radiello, and Ogawa samplers, respectively. The precision between duplicate samplers for the ALPHA and Ogawa samplers was 7% and 6%, respectively. Triplicate Radiello precision was assessed using the coefficient of variation (CV); the CV for the Radiello samplers was 10%. This article discusses the statistical results from these studies.
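The two summary statistics quoted above can be reproduced as follows. The sketch computes a median relative percent difference against a collocated reference method and a duplicate precision based on pair-wise relative differences; the concentration values are hypothetical, and the exact statistical definitions used in the study may differ slightly.

```python
import statistics

def median_relative_percent_difference(passive, reference):
    """Median of 100*(passive - reference)/reference across collocated pairs."""
    rpd = [100.0 * (p - r) / r for p, r in zip(passive, reference)]
    return statistics.median(rpd)

def duplicate_precision(pairs):
    """Median relative percent difference between duplicate samplers,
    using the pair mean as the base."""
    rpd = [100.0 * abs(a - b) / statistics.mean([a, b]) for a, b in pairs]
    return statistics.median(rpd)

# Hypothetical biweekly NH3 concentrations (ug/m3).
reference = [1.8, 2.4, 3.1, 2.0, 1.5]
passive = [1.7, 2.5, 3.0, 1.9, 1.5]
duplicates = [(1.7, 1.8), (2.5, 2.4), (3.0, 3.1), (1.9, 1.9), (1.5, 1.6)]

print(f"MRPD vs reference: {median_relative_percent_difference(passive, reference):+.1f}%")
print(f"Duplicate precision: {duplicate_precision(duplicates):.1f}%")
```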

6.
Concern about the possible deterioration of forest health led to the establishment in the 1980s of inventories of forest condition throughout Europe. International standardisation of the programmes was sought and a number of recommendations were made concerning sampling and assessment procedures. One of the most important rulings was that the assessment should be made on a systematic grid, the minimum density of which was 16×16 km. However, many countries adopted denser sampling grids, with 4×4 km being used in several countries and 1×1 km being used in the Netherlands. With five or more years of monitoring completed, there is a growing belief that a rapid and irreversible decline in forest health is not occurring. Consequently, some countries/regions are seeking to reduce their annual investment in forest health monitoring. The precision of national/regional estimates of forest health can be directly related to the sample size. As the sample size decreases, so also does the precision of the estimates. This is illustrated using data collected in Switzerland in 1992 and using grid densities of 4×4 km, 8×8 km, 12×12 km and 16×16 km. The value of the data is dependent on the sample size and the degree to which it is broken down (by region or species). The loss of precision associated with most subdivisions at the 8×8 km grid level remains acceptable, but a sharp deterioration in the precision occurs at the 12×12 km and 16×16 km grid levels. This has considerable implications for the interpretation of the inventories from those countries using a 16×16 km grid. In Switzerland, a reduction from the current 4×4 km grid to an 8×8 km grid (i.e. 75% reduction in sample size) would have relatively little impact on the overall results from the annual inventories of forest health.
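The link between grid density and precision noted above is essentially the behaviour of the standard error of a mean with sample size. The sketch below, using an invented between-plot standard deviation and rough plot counts for the different grid densities (not the Swiss figures), shows how the relative standard error of a national defoliation estimate grows as the grid is thinned.

```python
import math

def relative_se_pct(std_dev, mean, n_plots):
    """Relative standard error of the mean (percent) for a simple random sample."""
    return 100.0 * (std_dev / math.sqrt(n_plots)) / mean

# Hypothetical national defoliation survey: mean score 22%, between-plot SD 15 points.
mean_defoliation, sd = 22.0, 15.0

# Rough (invented) plot counts corresponding to each grid density.
for grid_km, n_plots in [(4, 2500), (8, 640), (12, 280), (16, 160)]:
    print(f"{grid_km:>2}x{grid_km} km grid (~{n_plots} plots): "
          f"relative SE ~ {relative_se_pct(sd, mean_defoliation, n_plots):.1f}%")
```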

7.
Only with a properly designed water quality monitoring network can data be collected that lead to accurate information extraction. One of the main components of water quality monitoring network design is the allocation of sampling locations. For this purpose, a design methodology called critical sampling points (CSP) has been developed for determining the critical sampling locations in small, rural watersheds with regard to total phosphorus (TP) load pollution. It considers hydrologic, topographic, soil, vegetative, and land use factors. The objective of the monitoring network design in this methodology is to identify the stream locations that receive the greatest TP loads from the upstream portions of a watershed. The CSP methodology has been translated into a model, called water quality monitoring station analysis (WQMSA), which integrates a geographic information system (GIS) for handling the spatial aspect of the data, a hydrologic/water quality simulation model for TP load estimation, and fuzzy logic for improved input data representation. In addition, the methodology was purposely designed to be useful in diverse rural watersheds, independent of geographic location. Three watershed case studies in Pennsylvania, Amazonian Ecuador, and central Chile were examined, each offering a different degree of data availability. It was demonstrated that the developed methodology could be successfully used in all three case studies, which suggests that the CSP methodology, in the form of the WQMSA model, has potential for applications worldwide.

8.
Addressing the current status and shortcomings of monitoring technology for particulate matter emitted by stationary sources in China, a review of the available literature shows that existing practice suffers from problems such as the low precision required by the national standards and insufficient technical detail. The deficiencies and needs of the relevant techniques are analyzed in depth, focusing on size-segregated sampling and the measurement of low-concentration particulate matter. Drawing on domestic and international experience, recommendations for improving techniques and equipment are then proposed in three areas: revising and supplementing the national standards, establishing a technical system for size-segregated sampling, and improving sampling methods for low-concentration atmospheric particulate matter.

9.
Accurately estimating solute loads in streams during storms is critical for accurately determining maximum daily loads for regulatory purposes. This study investigates the impact of sampling strategy on solute load estimates in streams in the US Midwest. Three solute types (nitrate, magnesium, and dissolved organic carbon (DOC)) and three sampling strategies are assessed. Regardless of the method, the average error on nitrate loads is higher than for magnesium or DOC loads, and all three methods generally underestimate DOC loads and overestimate magnesium loads. Increasing the sampling frequency only slightly improves the accuracy of solute load estimates but generally improves the precision of load calculations. This type of investigation is critical for water management and environmental assessment, so that errors in solute load calculations can be taken into account by landscape managers and sampling strategies can be optimized as a function of monitoring objectives.
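One common way to turn discrete samples into a storm load is to integrate the instantaneous flux (concentration × discharge) over time between samples. The sketch below compares a load computed from a dense hourly record with one computed from every third sample; the storm hydrograph and nitrate concentrations are invented for illustration and are not from the study.

```python
def load_by_interpolation(times_h, conc_mg_l, flow_m3_s):
    """Load (kg) by trapezoidal integration of the instantaneous flux C*Q.
    mg/L * m3/s = g/s, so the hourly flux is C*Q*3.6 kg/h."""
    flux_kg_h = [c * q * 3.6 for c, q in zip(conc_mg_l, flow_m3_s)]
    load = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        load += 0.5 * (flux_kg_h[i] + flux_kg_h[i - 1]) * dt
    return load

# Hypothetical storm: hourly discharge with nitrate sampled every hour vs every third hour.
hours = list(range(0, 13))
flow = [1.0, 1.4, 2.5, 4.0, 5.2, 4.8, 3.9, 3.0, 2.4, 1.9, 1.6, 1.3, 1.1]
nitrate = [2.0, 2.1, 2.6, 3.4, 3.8, 3.6, 3.2, 2.9, 2.6, 2.4, 2.2, 2.1, 2.0]

full = load_by_interpolation(hours, nitrate, flow)
sub = load_by_interpolation(hours[::3], nitrate[::3], flow[::3])
print(f"Hourly sampling: {full:.0f} kg, 3-hourly sampling: {sub:.0f} kg "
      f"({100 * (sub - full) / full:+.1f}% error)")
```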

10.
A new model is proposed for estimating the horizontal dilution potential of an area using wind data. The mean wind speed and the variation of wind direction are used as measures of the linear and angular spread of pollutants in the atmosphere. The methodology is applied to hourly wind data monitored at Vadodara, Gujarat, for each month of one year, and the monthly dilution potential is estimated. It is found that the horizontal dilution potential varies gradually over the year, with limited dilution during the post-monsoon period (October and November) and high dilution in the pre-monsoon period (May and June). This information can be used to design an air quality sampling network and the duration of sampling for a source apportionment study. Sampling during high-dilution periods can be carried out to identify urban and rural dust and wind-blown dust from mining activity, while sampling during low-dilution periods can be carried out to capture large amounts of particulate matter from anthropogenic sources such as elevated furnace stacks.
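A simple proxy in the spirit of the model described above is to multiply the monthly mean wind speed (linear spread) by the circular standard deviation of wind direction (angular spread). The sketch below uses the standard mean-resultant formula for the circular standard deviation; it is a generic illustration with invented hourly records, not the specific formulation proposed in the paper.

```python
import math

def monthly_dilution_index(speeds_m_s, directions_deg):
    """Dilution proxy: mean wind speed times the circular standard deviation
    of wind direction (sigma_theta = sqrt(-2 ln R), R = mean resultant length)."""
    u = sum(math.sin(math.radians(d)) for d in directions_deg) / len(directions_deg)
    v = sum(math.cos(math.radians(d)) for d in directions_deg) / len(directions_deg)
    r = math.hypot(u, v)
    sigma_theta = math.sqrt(-2.0 * math.log(r)) if r > 0 else math.pi  # radians
    return (sum(speeds_m_s) / len(speeds_m_s)) * sigma_theta

# Hypothetical hourly records: a windy, variable pre-monsoon month vs a calm,
# steady post-monsoon month.
may = ([3.5, 4.2, 5.0, 4.8, 3.9, 4.4], [200, 230, 260, 210, 300, 250])
nov = ([1.2, 0.9, 1.5, 1.1, 1.3, 1.0], [40, 45, 50, 42, 48, 44])
print(f"May dilution index: {monthly_dilution_index(*may):.2f}")
print(f"Nov dilution index: {monthly_dilution_index(*nov):.2f}")
```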

11.
Accurate estimation of constituent loads is important for studies of ecosystem mass balance or total maximum daily loads. In response, there has been an effort to develop methods to increase both accuracy and precision of constituent load estimates. The relationship between constituent concentration and stream discharge is often complicated, potentially leading to high uncertainty in load estimates for certain constituents, especially at longer-term (annual) scales. We used the loadflex R package to compare uncertainty in annual load estimates from concentration vs. discharge relationships in constituents of interest in agricultural systems, including ammonium as nitrogen (NH4-N), nitrate as nitrogen (NO3-N), soluble reactive phosphorus (SRP), and suspended sediments (SS). We predicted that uncertainty would be greatest in NO3-N and SS due to complex relationships between constituent concentration and discharge. We also predicted lower uncertainty with a composite method compared to regression or interpolation methods. Contrary to predictions, we observed the lowest uncertainty in annual NO3-N load estimates (relative error 1.5–23%); however, uncertainty was greatest in SS load estimates, consistent with predictions (relative error 19–96%). For all constituents, we also generally observed reductions in uncertainty by up to 34% using the composite method compared to regression and interpolation approaches, as predicted. These results highlight differences in uncertainty among different constituents and will aid in model selection for future studies requiring accurate and precise estimates of constituent load.
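A minimal sketch of the composite idea follows, under the assumption that it combines a log-log concentration-discharge regression with residuals interpolated between sampling dates; this is a simplified stand-in for illustration, not the loadflex implementation. The daily flow record and monthly grab samples are synthetic.

```python
import numpy as np

def composite_annual_load(obs_t, obs_conc, obs_flow, pred_t, pred_flow):
    """Composite-style estimate: a log-log concentration-discharge regression,
    corrected with residuals interpolated between sampling dates, then summed
    to an annual load (mg/L * m3/s * 86.4 = kg/day)."""
    b, a = np.polyfit(np.log(obs_flow), np.log(obs_conc), 1)   # log C = a + b log Q
    conc_reg = np.exp(a + b * np.log(pred_flow))               # rating-curve prediction
    resid = np.log(obs_conc) - (a + b * np.log(obs_flow))      # residuals at sample dates
    conc_comp = conc_reg * np.exp(np.interp(pred_t, obs_t, resid))
    return float(np.sum(conc_comp * pred_flow * 86.4))

# Synthetic year: daily flow (m3/s) with roughly monthly NO3-N grab samples (mg/L).
days = np.arange(365.0)
flow = 5 + 4 * np.sin(2 * np.pi * days / 365) + np.random.default_rng(1).normal(0, 0.3, 365)
conc = 1.5 * flow ** 0.4
idx = np.arange(0, 365, 30)
print(f"Annual NO3-N load ~ "
      f"{composite_annual_load(days[idx], conc[idx], flow[idx], days, flow):,.0f} kg")
```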

12.
Studies requiring ambient exposure assessments invariably ask: How often should measurements be taken? The answer to such questions is dictated by budgetary considerations as well as spatial and temporal variability in the data. For example, do we obtain measurements during all seasons, all months within seasons, weeks within months and days within weeks? On one hand, we can obtain a one-time snapshot sample and regard it as representing the "true" mean exposure. On the other hand, we may obtain a large number of measurements over time and then average these in order to represent this "true" mean exposure. The former estimate is the least expensive but may also be the least precise, while the latter may be very precise but prohibitively costly. In this paper, we demonstrate how a pilot study can be undertaken with a potentially promising and feasible sampling plan for the full-scale study. By applying the statistical methodology of variance component analysis (VCA) to the pilot study data and exploiting the mathematical relationship between the variance of the overall mean exposure and the posited variance components, we can develop a sampling design with decreased sampling costs and/or increased precision of the mean exposure. Our approach was applied to determine sampling design choices for an on-going study that aimed at assessing ambient particulate matter exposure. We conclude that a pilot study followed by the VCA analysis may often lead to sampling design choices that offer considerable cost savings and, at the same time, promise to provide relatively precise estimates of the mean exposure for the subsequent full-scale study.
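The design trade-off described above can be written down directly: for a two-level design, the variance of the overall mean is σ²_between-days/n_days + σ²_within-day/(n_days × n_reps), and a cost model then lets one search for the most precise allocation within a budget (or the cheapest allocation meeting a precision target). The sketch below does the former with hypothetical pilot-study variance components and cost figures.

```python
def variance_of_mean(var_between_days, var_within_day, n_days, n_reps):
    """Variance of the overall mean exposure for a two-level nested design
    (sampling days, with repeated measurements within each day)."""
    return var_between_days / n_days + var_within_day / (n_days * n_reps)

def sampling_cost(n_days, n_reps, cost_per_day=100.0, cost_per_sample=25.0):
    """Simple cost model: a fixed setup cost per sampling day plus a cost per sample."""
    return n_days * (cost_per_day + n_reps * cost_per_sample)

# Hypothetical variance components from a pilot study, in (ug/m3)^2.
var_day, var_within = 40.0, 15.0
budget = 5000.0

best = min((variance_of_mean(var_day, var_within, d, r), sampling_cost(d, r), d, r)
           for d in range(5, 61) for r in range(1, 5)
           if sampling_cost(d, r) <= budget)
print(f"Best allocation within budget: {best[2]} days x {best[3]} samples/day, "
      f"SE of mean = {best[0] ** 0.5:.2f} ug/m3, cost = {best[1]:.0f}")
```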

13.
Measuring hydrocarbons from aircraft represents one way to infer biogenic emissions at the surface. The focus of this paper is to show that complementary remote sensing information can be provided by optical measurements of a vegetation index, which is readily measured with high temporal coverage using reflectance data. We examine the similarities between the vegetation index and in situ measurements of the chemicals isoprene, methacrolein, and alpha-pinene to estimate whether the temporal behavior of the in situ measurements of these chemicals could be better understood by the addition of the vegetation index. Data were compared for flights conducted around Houston in August and September 2000. The three independent sets of chemical measurements examined correspond reasonably well with the vegetation index curves for the majority of flight days. While low values of the vegetation index always correspond to low values of the in situ chemical measurements, high values of the index correspond to both high and low values of the chemical measurements. In this sense it represents an upper limit when compared with in situ data (assuming the calibration constant is adequately chosen). This result suggests that while the vegetation index cannot represent a purely predictive quantity for the in situ measurements, it represents a complementary measurement that can be useful in understanding comparisons of various in situ observations, particularly when these observations occur with relatively low temporal frequency. In situ isoprene measurements and the vegetation index were also compared to an isoprene emission inventory to provide additional insight on broad issues relating to the use of vegetation indices in emission database development.

14.
Intervention analysis techniques are described for identifying and statistically modelling trends which may be present in water quality time series. At the exploratory data analysis stage, simple graphical and modelling methods can be employed for visually detecting and examining trends in a time series caused by one or more external interventions. For instance, a plot of a robust locally weighted regression smooth through a graph of the observations over time may reveal trends and other interesting statistical properties contained in the time series. In addition, statistical tests, such as different versions of the nonparametric Mann-Kendall test, can be used to detect the presence of trends caused by unknown or known external interventions. To characterize rigorously and estimate trends which may be known in advance or else detected using exploratory data analysis studies, different parametric methods can be utilized at the confirmatory data analysis stage. Specifically, the time series modelling approach to intervention analysis can be employed to estimate the magnitudes of the changes in the mean level of the series due to the interventions. Particular types of regression models can also be used for estimating trends, especially when there are many missing observations. To demonstrate how intervention analysis methods can be effectively used in environmental impact assessment, representative applications to water quality time series are presented. Invited Paper for Presentation at The Workshop on Statistical Methods for the Assessment of Point Source Pollution, The Canada Centre for Inland Waters, Burlington, Ontario, Canada, L7R 4A6, September 12–14, 1988.
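Of the exploratory tools listed above, the Mann-Kendall test is simple enough to sketch in full. The version below computes the S statistic, its large-sample variance without a tie correction, and a two-sided p-value from the normal approximation; the annual total phosphorus series is hypothetical.

```python
import math
from statistics import NormalDist

def mann_kendall(series):
    """Nonparametric Mann-Kendall trend test (no tie correction):
    returns the S statistic, the standardized Z score, and a two-sided p-value."""
    n = len(series)
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = 2.0 * (1.0 - NormalDist().cdf(abs(z)))
    return s, z, p

# Hypothetical annual mean total phosphorus (mg/L) spanning an intervention.
tp = [0.041, 0.045, 0.043, 0.048, 0.044, 0.036, 0.033, 0.031, 0.030, 0.028]
print("S = %d, Z = %.2f, p = %.3f" % mann_kendall(tp))
```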

15.
Many countries have a national forest inventory (NFI) designed to produce statistically sound estimates of forest parameters. However, this type of inventory may not provide reliable results for forest damage which usually affects only small parts of the forest in a country. For this reason, specially designed forest damage inventories are performed in many countries, sometimes in coordination with the NFIs. In this study, we evaluated a new approach for damage inventory where existing NFI data form the basis for two-phase sampling for stratification and remotely sensed auxiliary data are applied for further improvement of precision through post-stratification. We applied Monte Carlo sampling simulation to evaluate different sampling strategies linked to different damage scenarios. The use of existing NFI data in a two-phase sampling for stratification design resulted in a relative efficiency of 50% or lower, i.e., the variance was at least halved compared to a simple random sample of the same size. With post-stratification based on simulated remotely sensed auxiliary data, there was additional improvement, which depended on the accuracy of the auxiliary data and the properties of the forest damage. In many cases, the relative efficiency was further reduced by as much as one-half. In conclusion, the results show that substantial gains in precision can be obtained by utilizing auxiliary information in forest damage surveys, through two-phase sampling, through post-stratification, and through the combination of these two approaches, i.e., post-stratified two-phase sampling for stratification.
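The relative efficiency reported above compares the variance of a design-based estimator with that of simple random sampling. The sketch below runs a small Monte Carlo comparison of simple random sampling against stratified sampling with equal allocation, where strata are formed from an auxiliary variable (here idealized as the damage score itself); the plot-level damage scores are simulated, not NFI data.

```python
import random
import statistics

def relative_efficiency(population, n_sample, n_strata, runs=2000, seed=7):
    """Monte Carlo comparison of stratified sampling (equal allocation over
    equal-sized strata formed by sorting on an auxiliary variable) against
    simple random sampling of the same total size."""
    rng = random.Random(seed)
    pop_sorted = sorted(population)     # auxiliary variable: the value itself (idealized)
    size = len(pop_sorted) // n_strata
    strata = [pop_sorted[i * size:(i + 1) * size] for i in range(n_strata)]
    per_stratum = n_sample // n_strata
    srs_means, strat_means = [], []
    for _ in range(runs):
        srs_means.append(statistics.mean(rng.sample(population, n_sample)))
        strat_means.append(statistics.mean(
            x for stratum in strata for x in rng.sample(stratum, per_stratum)))
    return statistics.variance(strat_means) / statistics.variance(srs_means)

# Simulated plot-level damage scores: mostly healthy plots plus a damaged minority.
gen = random.Random(3)
plots = [gen.gauss(5, 2) for _ in range(900)] + [gen.gauss(40, 10) for _ in range(100)]
print(f"Relative efficiency (stratified vs SRS): {relative_efficiency(plots, 40, 4):.2f}")
```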

16.
An ultrasonic flow meter was installed in the flue duct of a domestic power plant, and reference comparison tests, on-line calibration tests, and a six-month continuous operation test were carried out. The reference tests showed that the ultrasonic flow meter represents the cross-sectional flue gas velocity well; even where the straight duct section was of insufficient length, the precision of the velocity field coefficient was 1.88%, and after correction the relative error against the reference method was 0.57%. The on-line calibration tests showed good zero and span stability, with no obvious drift over six months. During the continuous operation period, the valid data capture rate was 99.9%, and the measurements correlated strongly with boiler load. The experiments demonstrate that ultrasonic flow meters can improve the technical level and reliability of flue gas velocity measurement while reducing the difficulty and cost of quality control.
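As a rough illustration of the quantities reported above, the sketch below derives a velocity field coefficient from paired reference-traverse and ultrasonic readings, its precision as a coefficient of variation, and the relative error after correction. The paired velocities are hypothetical, and the exact definitions prescribed by the Chinese monitoring standards may differ from this simplification.

```python
import statistics

def velocity_field_coefficients(reference_velocities, meter_velocities):
    """Per-run velocity field coefficient: ratio of the reference
    (traverse-averaged) velocity to the fixed ultrasonic-path velocity."""
    return [ref / met for ref, met in zip(reference_velocities, meter_velocities)]

# Hypothetical paired runs: pitot traverse average vs ultrasonic reading (m/s).
reference = [12.4, 13.1, 11.8, 12.9, 13.4]
ultrasonic = [11.5, 12.2, 11.0, 12.0, 12.4]

k = velocity_field_coefficients(reference, ultrasonic)
k_mean = statistics.mean(k)
precision = 100.0 * statistics.stdev(k) / k_mean
corrected = [m * k_mean for m in ultrasonic]
rel_error = 100.0 * statistics.mean(abs(c - r) / r for c, r in zip(corrected, reference))

print(f"mean K = {k_mean:.3f}, precision (CV) = {precision:.2f}%, "
      f"corrected relative error = {rel_error:.2f}%")
```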

17.
As the requirements of the Water Framework Directive (WFD) and the US Clean Water Act (USCWA) for the maintenance of microbiological water quality in 'protected areas' highlight, there is a growing recognition that integrated management of point and diffuse sources of microbial pollution is essential. New information on catchment microbial dynamics and, in particular, the sources of faecal indicator bacteria found in bathing and shellfish harvesting waters is a pre-requisite for the design of any 'programme of measures' at the drainage basin scale to secure and maintain compliance with existing and new health-based microbiological standards. This paper reports on a catchment-scale microbial source tracking (MST) study in the Leven Estuary drainage basin, northwest England, an area for which quantitative faecal indicator source apportionment empirical data and land use information were also collected. Since previous MST studies have been based on laboratory trials using 'manufactured' samples or analyses of spot environmental samples without the contextual microbial flux data (under high and low flow conditions) and source information, such background data are needed to evaluate the utility of MST in USCWA total maximum daily load (TMDL) assessments or WFD 'Programmes of Measures'. Thus, the operational utility of MST remains in some doubt. The results of this investigation, using genotyping of Bacteroidetes using polymerase chain reaction (PCR) and male-specific ribonucleic acid coliphage (F+ RNA coliphage) using hybridisation, suggest some discrimination is possible between livestock- and human-derived faecal indicator concentrations but, in inter-grade areas, the degree to which the tracer picture reflected the land use pattern and probable faecal indicator loading was less distinct. Interestingly, the MST data were more reliable for high-flow samples, when much of the faecal indicator flux from catchment systems occurs. Whilst a useful supplementary tool, the MST information did not provide quantitative source apportionment for the study catchment. Thus, it could not replace detailed empirical measurement of microbial flux at key catchment outlets to underpin faecal indicator source apportionment. Therefore, the MST techniques reported herein currently may not meet the standards required to be a useful forensic tool, although continued development of the methods and further catchment scale studies could increase confidence in such methods for future application.

18.
Geostatistical strategy for soil sampling: the survey and the census
A soil sampling strategy for spatially correlated variables using the tools of geostatistical analysis is developed. With a minimum of equations, the logic of geostatistical analysis is traced from the modeling of a semi-variogram to the output isomaps of pollution estimates and their standard deviations. These algorithms provide a method to balance precision, accuracy, and costs. Their axiomatic assumptions dictate a two-stage sampling strategy. The first stage is a sampling survey, using a radial grid, to collect enough data to define, by a semi-variogram, the ranges of influence and the orientation of the correlation structure of the pollutant plume. The second stage is a census of the suspected area with grid shape, size, and orientation dictated by the semi-variogram. The subsequent kriging analysis of these data gives isopleth maps of the pollution field and the standard error isomap of this contouring. These outputs make the monitoring data understandable for the decision maker.
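The survey stage described above hinges on estimating an empirical semi-variogram from the first-round samples. The sketch below bins half squared differences between sample pairs by separation distance, which is the standard omnidirectional estimator; the coordinates and concentrations are invented, and the directional (anisotropic) variograms used to orient the census grid would additionally require binning by direction.

```python
import math
from collections import defaultdict

def empirical_semivariogram(points, values, lag_width, max_lag):
    """Omnidirectional empirical semi-variogram: gamma(h) is the mean of
    0.5*(z_i - z_j)^2 over all pairs whose separation falls in lag bin h."""
    bins = defaultdict(list)
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            if d <= max_lag:
                bins[int(d // lag_width)].append(0.5 * (values[i] - values[j]) ** 2)
    return {(k + 0.5) * lag_width: sum(v) / len(v) for k, v in sorted(bins.items())}

# Hypothetical survey-stage samples: coordinates (m) and a soil pollutant concentration.
coords = [(0, 0), (30, 10), (60, 0), (90, 20), (20, 70), (50, 60), (80, 80), (10, 40)]
conc = [12.0, 14.5, 18.0, 22.0, 16.5, 19.0, 25.0, 13.5]

for lag, gamma in empirical_semivariogram(coords, conc, lag_width=25.0, max_lag=100.0).items():
    print(f"lag ~ {lag:5.1f} m: gamma = {gamma:6.2f}")
```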

19.
The objective of this paper is to study the impact of the mesh size of the digital elevation model (DEM) on terrain attributes within an Annualized AGricultural NonPoint Source pollution (AnnAGNPS) model simulation at the watershed scale and to provide a correction of slope gradient for low-resolution DEMs. The effect of different DEM grid sizes on terrain attributes was examined by comparing eight DEMs (30, 40, 50, 60, 70, 80, 90, and 100 m), and the accuracy of the AnnAGNPS simulation of runoff, sediment, and nutrient loads was evaluated. The results are as follows: (1) Runoff does not vary much with decreasing DEM resolution, whereas soil erosion and total nitrogen (TN) load change markedly. Amending the slope using an adjusted 50 m DEM has little effect on the runoff simulation. (2) Sediment yield and TN load decrease as the DEM mesh size increases from 30 to 60 m, and decrease only slightly for mesh sizes larger than 60 m; a similar trend, with a smaller range of variation, is observed for total phosphorus (TP). With the slope correction applied, the simulated sediment, TN, and TP loads increase, with sediment increasing by up to 1.75 times compared to the model using the unadjusted 50 m DEM. Overall, the amended simulation still differs considerably from the results using the 30 m DEM, so AnnAGNPS is less reliable for sediment loading prediction in a small hilly watershed. (3) DEM resolution has a significant impact on slope gradient: the average, minimum, and maximum slopes from the various DEMs decrease noticeably as the DEM resolution decreases. For slopes of 0–15°, the lower-resolution DEMs generally give larger values than the higher-resolution DEMs, whereas for slopes above 15° the lower-resolution DEMs give smaller values. It is therefore necessary to adjust the slope with a fitting equation, and a cubic model is used to correct slope gradients from the lower resolutions towards those from the higher resolution. Results for the Dage watershed showed that fine meshes are needed to avoid large underestimates of sediment and total nitrogen loads and moderate underestimates of total phosphorus loads, even with the slopes from the 50 m DEM adjusted to be more similar to the slopes from the 30 m DEM. Decreasing the mesh size beyond this threshold does not substantially affect the computed runoff flux but generates prediction errors for nitrogen and sediment yields. An appropriate DEM resolution therefore controls error and keeps the simulation at an acceptable level.
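The slope correction mentioned above can be set up as an ordinary cubic regression between slopes extracted from the coarse DEM and slopes from the reference 30 m DEM at the same locations. The sketch below fits such a model with numpy.polyfit; the paired slope values are invented, not the Dage watershed data.

```python
import numpy as np

# Hypothetical paired cell slopes (degrees): the same locations sampled from a
# 50 m DEM (coarse) and a 30 m DEM (reference).
slope_50m = np.array([1.2, 2.5, 4.0, 6.1, 8.3, 10.2, 12.5, 14.8, 16.0, 18.5, 21.0])
slope_30m = np.array([1.0, 2.8, 4.6, 7.2, 9.9, 12.4, 15.1, 17.9, 19.6, 23.0, 26.5])

# Cubic correction model: slope_30m ~ a*x^3 + b*x^2 + c*x + d fitted on slope_50m.
coeffs = np.polyfit(slope_50m, slope_30m, deg=3)
adjust = np.poly1d(coeffs)

print("cubic coefficients:", np.round(coeffs, 4))
print("adjusted 50 m slopes:", np.round(adjust(slope_50m), 1))
```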

20.
Determination of BTEX in soil by headspace gas chromatography
Headspace gas chromatography was used to separate volatile organic compounds from soil samples. With this technique, benzene, toluene, ethylbenzene, and the xylene isomers (BTEX) were all effectively separated. The method shows good precision (coefficient of variation 6.5%, recoveries of 89–103%) and a low detection limit (0.5 ng/g for benzene). Adding water to the soil sample or heating it can improve the sensitivity of the method.

