Similar Literature
20 similar articles found.
1.
While water quality benchmarks for the protection of aquatic life have been in use in some jurisdictions for several decades (USA, Canada, several European countries), more and more countries are now setting up their own national water quality benchmark development programs. In doing so, they either adopt an existing method from another jurisdiction, update an existing approach, or develop their own new derivation method. Each approach has its own advantages and disadvantages, and many issues have to be addressed when setting up a water quality benchmark development program or when deriving a water quality benchmark. Each of these tasks requires specialized expertise; they may seem simple, but they are complex in their details. This paper aims to provide guidance for the process of water quality benchmark development at the program level, for the development of derivation methodology, and for the actual benchmark derivation step, as well as to point out some issues (notably the inclusion of adapted populations and cryptic species, and points to consider in the use of the species sensitivity distribution approach) and future opportunities (an international data repository and international collaboration in water quality benchmark development).

2.
This paper presents a novel approach for assessing the risk of consumer product ingredients in surface waters that receive untreated wastewater. The approach utilizes the water quality simulation model QUAL2E for predicting the impact caused by conventional pollutants, as well as the exposure concentration of consumer product ingredients. This approach invokes an impact zone concept whereby the receiving water can be thought of as a natural wastewater treatment system. After the river has recovered via self-purification, the ecosystem is then assessed by traditional risk assessment methods. This approach was validated using data collected from the Balatuin River, which is located in the Philippines. Results from this study support the use of QUAL2E for assessing the risk of consumer product ingredients in riverine systems receiving untreated wastewater.
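The impact-zone concept behind this kind of assessment can be pictured with the classic Streeter-Phelps oxygen-sag equation, the core BOD/DO kinetics that QUAL2E-type models solve (QUAL2E itself handles many more constituents). This is a minimal sketch, not the paper's implementation; all parameter values are hypothetical.

```python
import numpy as np

def do_deficit(t, L0=20.0, D0=1.0, kd=0.35, ka=0.7):
    """Dissolved-oxygen deficit (mg/L) at travel time t (days) downstream of
    a discharge; kd = deoxygenation rate, ka = reaeration rate (1/day).
    All values here are illustrative, not calibrated to any real river."""
    t = np.asarray(t, dtype=float)
    bod_term = kd * L0 / (ka - kd) * (np.exp(-kd * t) - np.exp(-ka * t))
    return bod_term + D0 * np.exp(-ka * t)

# The river self-purifies: the deficit peaks, then decays downstream; the
# reach beyond the recovery point is where conventional risk assessment applies.
times = np.linspace(0.0, 20.0, 201)
deficit = do_deficit(times)
peak_day = times[np.argmax(deficit)]
recovered = times[(times > peak_day) & (deficit < 0.5)]
end_of_impact_zone = recovered[0]  # first day the deficit falls below 0.5 mg/L
```

With these assumed rates the sag bottoms out after roughly two days of travel time and the deficit decays below 0.5 mg/L after about ten, delimiting the impact zone.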

3.
Air quality zones are used by regulatory authorities to implement ambient air standards in order to protect human health. Air quality measurements at discrete air monitoring stations are critical tools to determine whether an air quality zone complies with local air quality standards or is noncompliant. This study presents a novel approach for evaluation of air quality zone classification methods by breaking the concentration distribution of a pollutant measured at an air monitoring station into compliance and exceedance probability density functions (PDFs) and then using Monte Carlo analysis with the Central Limit Theorem to estimate long-term exposure. The purpose of this paper is to compare the risk associated with selecting one ambient air classification approach over another by testing the possible exposure an individual living within a zone may face. The chronic daily intake (CDI) is utilized to compare different pollutant exposures over the classification duration of 3 years between two classification methods. Historical data collected from air monitoring stations in Kuwait are used to build representative models of 1-hr NO2 and 8-hr O3 within a zone that meets the compliance requirements of each method. The first method, the "3 Strike" method, is a conservative winner-take-all approach common to most compliance classification methods, while the second, the 99% Rule method, allows for more robust analyses and incorporates long-term trends. A Monte Carlo analysis is used to model the CDI for each pollutant and each method with the zone at a single station and with multiple stations. The model assumes that the zone is already in compliance with air quality standards over the 3 years under the different classification methodologies.
The model shows that while the CDI of the two methods differs by 2.7% over the exposure period for the single station case, the large number of samples taken over the duration period increases the sensitivity of the statistical tests, causing the null hypothesis to be rejected. Local air quality managers can use either methodology to classify the compliance of an air zone, but must accept that the 99% Rule method may permit exposures that are statistically significantly higher than those under the 3 Strike method.

Implications: A novel method using the Central Limit Theorem and Monte Carlo analysis is used to directly compare different air standard compliance classification methods by estimating the chronic daily intake of pollutants. This method allows air quality managers to rapidly see how individual classification methods may impact individual population groups, as well as to evaluate different pollutants based on dosage and exposure when complete health impacts are not known.
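The core of such a comparison can be sketched in a few lines: draw long-term average concentrations for two hypothetical compliance scenarios (the Central Limit Theorem justifies a normal model for the 3-year average of many hourly samples) and convert each draw to a CDI. The intake equation, parameter values, and the two scenario means below are assumptions for illustration, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_cdi(conc_mean, conc_sd, n_draws=50_000,
                 inhalation_rate=20.0,  # m^3/day, assumed adult value
                 body_weight=70.0):     # kg, assumed
    """CDI in mg/kg/day for continuous exposure: CDI = C * IR / BW.
    Concentrations sampled as approximately normal long-term averages."""
    conc = rng.normal(conc_mean, conc_sd, n_draws).clip(min=0.0)  # mg/m^3
    return conc * inhalation_rate / body_weight

# Two hypothetical zones whose admitted long-term means differ by ~2.7%,
# mimicking the single-station difference reported in the abstract:
cdi_strict = simulate_cdi(0.0400, 0.004)   # stricter, "3 Strike"-like zone
cdi_lenient = simulate_cdi(0.0411, 0.004)  # more permissive, "99% Rule"-like zone
rel_diff = (cdi_lenient.mean() - cdi_strict.mean()) / cdi_strict.mean()
```

With 50,000 draws the sampling noise on each mean is tiny, which is exactly why small between-method differences become statistically detectable, as the abstract notes.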


4.
Species sensitivity distributions (SSDs) are increasingly used in both ecological risk assessment and derivation of water quality criteria. However, there has been debate about the choice of an appropriate approach for derivation of water quality criteria based on SSDs because the various methods can generate different values. The objective of this study was to compare the differences among various methods. Data sets of acute toxicities of 12 substances to aquatic organisms, representing a range of classes with different modes of action, were studied. Nine typical statistical approaches, including parametric and nonparametric methods, were used to construct SSDs for 12 chemicals. Water quality criteria, expressed as the hazardous concentration for 5% of species (HC5), were derived by use of several approaches. All approaches produced comparable results, and the data generated by the different approaches were significantly correlated. Variability among estimates of HC5 across all species decreased with increasing sample size, and variability was similar among the statistical methods applied. Of the statistical methods selected, the bootstrap method represented the best-fitting model for all chemicals, while log-triangle and Weibull were the best models among the parametric methods evaluated. The bootstrap method is the preferred choice for deriving water quality criteria when sufficient data points (more than 20) are available. If the available data are few, the other methods should be fitted and the one that best describes the distribution of the data selected.
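The two families of HC5 derivation the study compares, parametric fitting versus nonparametric bootstrap, can be sketched as follows. This is a hedged illustration: the toxicity values are invented, and the paper's nine methods include other distributions (log-triangle, Weibull) not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

def hc5_lognormal(toxicity):
    """Parametric HC5: 5th percentile of a log-normal SSD fitted to the data."""
    logs = np.log10(np.asarray(toxicity, dtype=float))
    mu, sigma = logs.mean(), logs.std(ddof=1)
    z05 = -1.6448536269514722  # 5th-percentile z-score of the standard normal
    return 10.0 ** (mu + z05 * sigma)

def hc5_bootstrap(toxicity, n_boot=2000):
    """Nonparametric bootstrap HC5: median of resampled 5th percentiles."""
    tox = np.asarray(toxicity, dtype=float)
    estimates = [np.percentile(rng.choice(tox, size=tox.size), 5)
                 for _ in range(n_boot)]
    return float(np.median(estimates))

# Hypothetical acute LC50s (mg/L) for 20 species -- illustration only:
lc50s = [0.8, 1.5, 2.2, 3.9, 5.1, 7.4, 10.2, 15.8, 22.0, 31.5,
         45.0, 60.3, 82.1, 110.0, 150.0, 210.0, 300.0, 420.0, 560.0, 750.0]
h_par, h_boot = hc5_lognormal(lc50s), hc5_bootstrap(lc50s)
```

With 20 data points the two estimates land in the same range, consistent with the study's finding that the approaches produce comparable, correlated results; the bootstrap needs the larger sample because it relies on the empirical distribution alone.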

5.
Abstract

The management of tropospheric ozone (O3) is particularly difficult. The formulation of emission control strategies requires considerable information including: (1) emission inventories, (2) available control technologies, (3) meteorological data for critical design episodes, and (4) computer models that simulate atmospheric transport and chemistry. The simultaneous consideration of this information during control strategy design can be exceedingly difficult for a decision-maker. Traditional management approaches do not explicitly address cost minimization. This study presents a new approach for designing air quality management strategies; a simple air quality model is used conjunctively with a complex air quality model to obtain low-cost management strategies. A simple air quality model is used to identify potentially good solutions, and two heuristic methods are used to identify cost-effective control strategies using only a small number of simple air quality model simulations. Subsequently, the resulting strategies are verified and refined using a complex air quality model. The use of this approach may greatly reduce the number of complex air quality model runs that are required. An important component of this heuristic design framework is the use of the simple air quality model as a screening and exploratory tool. To achieve similar results with the simple and complex air quality models, it may be necessary to "tweak" or calibrate the simple model. A genetic algorithm-based optimization procedure is used to automate this tweaking process. These methods are demonstrated to be computationally practical using two realistic case studies, which are based on data from a metropolitan region in the United States.

6.
Recent years have seen considerable improvement in water quality standards (QS) for metals by taking account of the effect of local water chemistry conditions on their bioavailability. We describe preliminary efforts to further refine water quality standards by taking account of the composition of the local ecological community (the ultimate protection objective) in addition to bioavailability. Relevance of QS to the local ecological community is critical, as it is important to minimise instances where quality classification using QS does not reconcile with a classification based on an assessment of the composition of the local ecology (e.g. using benthic macroinvertebrate quality assessment metrics such as the River InVertebrate Prediction and Classification System (RIVPACS)), particularly where ecology is assessed to be at good or better status whilst chemical quality is determined to be failing relevant standards. The alternative approach outlined here describes a method to derive a site-specific species sensitivity distribution (SSD) based on the ecological community which is expected to be present at the site in the absence of anthropogenic pressures (reference conditions). The method combines a conventional laboratory ecotoxicity dataset normalised for bioavailability with field measurements of the response of benthic macroinvertebrate abundance to chemical exposure. Site-specific QSref are then derived from the 5th percentile of this SSD. Using this method, site QSref have been derived for zinc in an area impacted by historic mining activities. Application of QSref can result in greater agreement between chemical and ecological metrics of environmental quality compared with the use of either conventional (QScon) or bioavailability-based QS (QSbio). In addition to zinc, the approach is likely to be applicable to other metals and possibly other types of chemical stressors (e.g. pesticides). However, the methodology for deriving site-specific targets requires additional development and validation before it can be robustly applied during surface water classification.

7.
Air quality models are used to make decisions regarding the construction of industrial plants, the types of fuel that will be burnt and the types of pollution control devices that will be used. It is important to know the uncertainties that are associated with these model predictions. Standard analytical methods found in elementary statistics textbooks for estimating uncertainties are generally not applicable, since the distributions of performance measures related to air quality concentrations are not easily transformed to a Gaussian shape. This paper suggests several possible resampling procedures that can be used to calculate uncertainties or confidence limits on air quality model performance. In these resampling methods, many new data sets are drawn from the original data set using an empirical set of rules. A few alternate forms of the so-called bootstrap and jackknife resampling procedures are tested using a concocted data set with a Gaussian parent distribution, with the result that the jackknife is the most efficient procedure to apply, although its confidence bounds are slightly overestimated. The resampling procedures are then applied to predictions by seven air quality models for the Carpinteria coastal dispersion experiment. Confidence intervals on the fractional mean bias and the normalized mean square error are calculated for each model and for differences between models. It is concluded that these uncertainties are sometimes so large for data sets consisting of about 20 elements that it cannot be stated with 95% confidence that the performance measure for the 'best' model is significantly different from that for another model.
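The jackknife idea described above can be sketched for one of the paper's performance measures, the fractional mean bias. This is a hedged illustration with a concocted ~20-element data set, not the paper's data or exact procedure.

```python
import numpy as np

def fractional_bias(obs, pred):
    """FB = 2 * (mean_obs - mean_pred) / (mean_obs + mean_pred)."""
    return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

def jackknife_ci(obs, pred, z=1.96):
    """Approximate 95% confidence interval on FB from leave-one-out resamples."""
    n = obs.size
    loo = np.array([fractional_bias(np.delete(obs, i), np.delete(pred, i))
                    for i in range(n)])
    est = fractional_bias(obs, pred)
    # Jackknife standard error: scaled spread of the leave-one-out estimates.
    se = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))
    return est - z * se, est + z * se

# Concocted paired data, mimicking the ~20-element sets discussed in the paper:
rng = np.random.default_rng(3)
obs = rng.lognormal(2.0, 0.5, 20)
pred = obs * rng.lognormal(0.1, 0.3, 20)  # a model with modest multiplicative bias
low, high = jackknife_ci(obs, pred)
```

Even for a modestly biased model, the interval from 20 elements is wide, which is the paper's point: with so few samples, two models' performance measures often cannot be distinguished at 95% confidence.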

8.
With the rapid development of urbanization and industrialization, many developing countries are suffering from heavy air pollution. Governments and citizens have expressed increasing concern regarding air pollution because it affects human health and sustainable development worldwide. Current air quality prediction methods mainly use shallow models; however, these methods produce unsatisfactory results, which inspired us to investigate methods of predicting air quality based on deep architecture models. In this paper, a novel spatiotemporal deep learning (STDL)-based air quality prediction method that inherently considers spatial and temporal correlations is proposed. A stacked autoencoder (SAE) model is used to extract inherent air quality features, and it is trained in a greedy layer-wise manner. Compared with traditional time series prediction models, our model can predict the air quality of all stations simultaneously and shows temporal stability across all seasons. Moreover, a comparison with the spatiotemporal artificial neural network (STANN), autoregressive moving average (ARMA), and support vector regression (SVR) models demonstrates that the proposed method achieves superior air quality prediction performance.

10.
Air pollution control officials often make the simplifying assumption that air pollutant concentrations are independent samples from a stationary probability distribution. If the parent distribution really is stationary and correctly chosen, maximum likelihood estimation almost always will provide the best possible estimate of its parameters. However, the air pollution literature makes little if any mention of this fact and often suggests using the method of moments or the method of fractiles to estimate the parameters of an assumed distribution, and using the results for computing design values to determine the control level required to meet an air quality standard. No estimate is made in the air pollution literature of the magnitude of the difference produced by these different methods.
This paper investigates the effectiveness of three different approaches for estimating parameters, using a lognormal distribution as an example: (a) method of fractiles; (b) method of moments; and (c) method of maximum likelihood. The error associated with each approach for computing emission controls is determined by sampling from a true stationary lognormal distribution using computer simulation. These results are then compared with a fourth approach, direct empirical linear rollback, in which no model is used and design values are calculated from raw observations. The latter approach is often used in practical situations by air pollution control personnel.
In 100 simulated years at a site experiencing the same lognormally distributed air pollution in the precontrol state, the correct control level was 50%. The following control levels were calculated: empirical rollback, 22–82%; method of fractiles, 32–64%; method of moments, 41–59%; and method of maximum likelihood, 46–54%, with most years very close to the true value of 50%. Thus, the maximum likelihood approach effectively reduces the variance by 'filtering out' the effect of random phenomena occurring during the year, and would be the method of choice if the observations are indeed distributed as assumed.
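The comparison above can be sketched for two of the estimators. Given a lognormal sample, maximum likelihood takes the mean and standard deviation of the logs, while the method of moments inverts the lognormal mean/variance formulas; each parameter pair yields a 99th-percentile design value and a rollback-style control level. The standard, sample size, and true parameters below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
mu_true, sigma_true = 3.0, 0.8
year = rng.lognormal(mu_true, sigma_true, 365)  # one simulated year of daily values

def fit_mle(x):
    """Maximum likelihood for a lognormal: sample mean/sd of log(x)."""
    logs = np.log(x)
    return logs.mean(), logs.std(ddof=0)

def fit_moments(x):
    """Method of moments: invert E[X] = exp(mu + s^2/2), Var via exp(s^2) - 1."""
    m, v = x.mean(), x.var(ddof=0)
    sigma2 = np.log1p(v / m**2)
    return np.log(m) - sigma2 / 2.0, np.sqrt(sigma2)

def control_level(mu, sigma, standard, z99=2.3263):
    design = np.exp(mu + z99 * sigma)        # 99th-percentile design value
    return max(0.0, 1.0 - standard / design)  # linear-rollback control fraction

# Choose a hypothetical standard so the true required control is exactly 50%:
standard = np.exp(mu_true + 2.3263 * sigma_true) / 2.0
ctl_mle = control_level(*fit_mle(year), standard)
ctl_mom = control_level(*fit_moments(year), standard)
```

Repeating this over many simulated years reproduces the paper's qualitative finding: both estimators center on the true 50%, with maximum likelihood showing the tighter spread.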

11.
Of many available methods for limiting ground level pollutant concentrations, tall stacks are often the simplest, most effective, and least costly. Although this is well established in theory, field validation of the soundness of this approach is often hampered by lack of comparable "before" and "after" data. In this study at the Alma Power Plant, appropriate air quality and meteorological measurements were made for several years before and after conversion from short to tall stacks. Comparison of these data shows that the tall stack has reduced ambient levels of SO2 by 50 to 95% in the vicinity of the plant. This study also found that use of a Turner-Briggs dispersion model in a valley situation gave fairly accurate and reliable estimates of air quality. The model was useful in designing the tall stack, assessing its impact, and locating air quality monitors.

12.
Traffic emission estimation in developing countries is a key issue for air pollution management. In most cases, comprehensive bottom-up methodologies cannot be applied in mid-sized cities because of the resource cost related to their application. In this paper, a simplified emission estimation model (SEEM) is evaluated. The model is based on a top-down approach and yields annual total hot emissions. Particular attention is paid to the quality of the input traffic data. The quality of results is assessed by application of the SEEM model in the Chilean Gran Concepción urban area and by comparison with a bottom-up approach carried out for the year 2000. The SEEM model estimates emissions with an accuracy of about 20% at a substantially lower resource cost. The results of the SEEM model are then distributed in space with a disaggregation approach using GIS techniques. The relevance of the disaggregation approach is evaluated among several possibilities through statistical methods. A spatial disaggregation using principal road density gives the best results in terms of emission distribution and gives a globally accurate image of the distribution of hot emissions in a mid-sized city.
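The top-down calculation behind a model of this kind reduces to fleet totals times emission factors, then a proportional spatial split. The fleet figures, emission factors, and zone shares below are invented for illustration and are not the SEEM inputs.

```python
# Top-down "hot" emission sketch: E = sum over vehicle classes of
# (number of vehicles) * (annual km per vehicle) * (emission factor),
# then disaggregated over zones in proportion to principal-road density.

# vehicle class: (number of vehicles, annual km per vehicle, CO g/km) -- all assumed
fleet = {
    "passenger_car": (120_000, 12_000, 2.5),
    "bus":           (1_800,   60_000, 6.0),
    "truck":         (9_000,   35_000, 4.0),
}

total_g = sum(n * km * ef for n, km, ef in fleet.values())
total_t = total_g / 1e6  # grams -> tonnes

# Spatial disaggregation by principal-road density (zone shares must sum to 1):
road_density_share = {"zone_A": 0.5, "zone_B": 0.3, "zone_C": 0.2}
zone_emissions_t = {z: total_t * s for z, s in road_density_share.items()}
```

The appeal of the top-down route is visible even in this toy version: three fleet-level numbers per vehicle class replace the link-by-link traffic counts a bottom-up inventory would require.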

13.
The body of information presented in this paper is directed to those individuals concerned with developing or implementing screening strategies for characterizing organic emissions from incinerators and other combustion sources. The need to characterize hazardous waste incinerator emissions for multiple organic compounds has been steadily increasing for several years. The regulatory approach makes use of a type of indicator compound procedure that concentrates on principal organic hazardous constituents (POHCs). In addition to continuing interest in POHCs, interest has been growing in the types and concentrations of products of incomplete combustion (PICs). Sampling and analysis methods have been developed previously for approximately 225 of the more important POHCs and PICs. These methods may be used as components of a cost-effective screening protocol aimed at maximum characterization of emissions, whether the project budget is large or small. This paper contains a discussion of fundamental principles of several kinds of screening strategies and recommends an approach suitable for incinerators and other combustion sources. The concept of a risk-driven analysis strategy is introduced and illustrated with a simplified example.

14.
Abstract

The field of ozone air quality modeling, or as it is commonly referred to, photochemical air quality modeling, has undergone rapid change in recent years. Improvements in model components, as well as in methods of interpreting model performance, have contributed to this change. Attendant with this rapid change has been a growing need for those developing and using air quality models and policy makers to have a common understanding of the use and role of models in the decision making process. This Critical Review highlights recent advances and continuing problem areas in photochemical air quality modeling. Emphasis is placed on the components and input data for such models, model performance evaluation, and the implications for their use in regulatory decisions.

15.
Estimates of forest soil organic carbon (SOC) have applications in carbon science, soil quality studies, carbon sequestration technologies, and carbon trading. Forest SOC has been modeled using a regression coefficient methodology that applies mean SOC densities (mass/area) to broad forest regions. A higher resolution model is based on an approach that employs a geographic information system (GIS) with soil databases and satellite-derived landcover images. Despite this advancement, the regression approach remains the basis of current state and federal level greenhouse gas inventories. Both approaches are analyzed in detail for Wisconsin forest soils from 1983 to 2001, applying rigorous error-fixing algorithms to soil databases. Resulting SOC stock estimates are 20% larger when determined using the GIS method rather than the regression approach. Average annual rates of increase in SOC stocks are 3.6 and 1.0 million metric tons of carbon per year for the GIS and regression approaches, respectively.

16.
To maximize the findings of animal experiments to inform likely health effects in humans, a thorough review and evaluation of the animal evidence is required. Systematic reviews and, where appropriate, meta-analyses have great potential in facilitating such an evaluation, making efficient use of the animal evidence while minimizing possible sources of bias. The extent to which systematic review and meta-analysis methods have been applied to evaluate animal experiments to inform human health is unknown. Using systematic review methods, we examine the extent and quality of systematic reviews and meta-analyses of in vivo animal experiments carried out to inform human health. We identified 103 articles meeting the inclusion criteria: 57 reported a systematic review, 29 a systematic review and a meta-analysis, and 17 reported a meta-analysis only. The use of these methods to evaluate animal evidence has increased over time. Although the reporting of systematic reviews is of adequate quality, the reporting of meta-analyses is poor. The inadequate reporting of meta-analyses observed here leads to questions on whether the most appropriate methods were used to maximize the use of the animal evidence to inform policy or decision-making. We recommend that guidelines proposed here be used to help improve the reporting of systematic reviews and meta-analyses of animal experiments. Further consideration of the use and methodological quality and reporting of such studies is needed.

18.
A rapid analytical approach for determination of polycyclic aromatic hydrocarbons (PAHs) present in real samples of particulate matter (PM10 filters) was investigated, based on the use of water under subcritical conditions and subsequent determination by GC-MS (SIM). The method avoids the use of large volumes of organic solvents such as dichloromethane, toluene, or other hazardous organic mixtures which are normally used in time-consuming conventional sample preparation methods. By using leaching times <1 h, the method allows determination of PAHs in the range of ng/m3 (detection limits between 0.05 and 0.2 ng/m3 for 1458 m3 of sampled air) with a precision expressed as RSD between 5.6% and 11.2%. The main idea behind this approach is to raise the temperature and pressure of water inside a miniaturized laboratory-made extraction unit and thereby decrease its dielectric constant from 80 to nearly 20. This effect increases the solubility of low-polarity hydrocarbons such as PAHs. In this way, an extraction step of a few minutes can be sufficient for quantitative extraction of airborne particles collected in high volume PM10 samplers. Parameters such as extraction flow, static and dynamic extraction times, and water volume were optimized by using a standard reference material. Technical details are given, and a comparison using real samples is made between the conventional Soxhlet extraction method and the proposed approach. The proposed approach can be used as a quantitative method to characterize low-molecular-weight PAHs and simultaneously as a screening method for high-molecular-weight PAHs, because the recoveries are not quantitative for molecular weights over 202. In the specific case of the Santiago metropolitan area, due to the frequent occurrence of particulate matter during high pollution episodes, this approach was applied as an efficient short-time screening method for urban PAHs. Application of this screening method is recommended especially during winter, when atmospheric and meteorological conditions in the area deteriorate markedly.

19.
Dry deposition contributes significantly to the acidification of ecosystems. However, difficulties in measuring dry deposition of reactive gases and fine particles make routine direct monitoring impractical. An alternate approach is to use the "concentration monitoring" method, in which dry deposition flux is estimated as the product of measured concentration and estimated deposition velocity. A sampling system that performs over periods of 6 hours to 7 days, depending on atmospheric concentrations, has been developed. It consists of a Teflon cyclone to exclude particles larger than about 2 μm, selective solid adsorption media for reactive gases (some of which are sampled from a transition flow to avoid possible bias from particle evaporation), a particle filter, and a final gas adsorption filter to collect the remaining trace gas. The sampler is the first reported application of transition flow mass transfer for the collection and quantitative measurement of trace atmospheric gases. Laboratory and field tests have shown that the sampler performs well for HNO3(g).

20.
The purpose of this paper is to demonstrate the use of some statistical methods for examining trends in ambient ozone air quality downwind of major urban areas. To this end, daily maximum 1-hr ozone concentrations measured over New Jersey, metropolitan New York City and Connecticut for the period 1980 to 1989 were assembled and analyzed. This paper discusses the application of the bootstrap method, extreme value statistics, and a nonparametric test for evaluating trends in urban ozone air quality. The results indicate that although there is an improvement in ozone air quality downwind of New York City, there has been little change in ozone levels upwind of New York City during this ten-year period.
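One widely used nonparametric trend test of the kind this paper applies is the Mann-Kendall test (the abstract does not name its exact test, so this is an illustrative stand-in). It counts concordant versus discordant pairs in a time series, making no distributional assumption. The ozone values below are invented.

```python
import math

def mann_kendall(series):
    """Return (S, z) for the Mann-Kendall trend test: S > 0 suggests an upward
    trend, S < 0 a downward one. Variance formula assumes no tied values."""
    x = list(series)
    n = len(x)
    s = sum((0 if x[j] == x[i] else (1 if x[j] > x[i] else -1))
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Hypothetical annual 1-hr ozone maxima (ppm), 1980-1989, drifting downward:
ozone = [0.19, 0.18, 0.185, 0.17, 0.175, 0.16, 0.165, 0.15, 0.155, 0.145]
s_stat, z_stat = mann_kendall(ozone)  # z < -1.96 -> significant decline at the 5% level
```

For this fabricated series S = -37 and z is about -3.2, so a downward trend would be declared at the 5% level despite the year-to-year noise, which is exactly the robustness that makes nonparametric tests attractive for ozone records.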

