Similar Articles
1.
ABSTRACT

Designing air quality management strategies is complicated by the difficulty in simultaneously considering large amounts of relevant data, sophisticated air quality models, competing design objectives, and unquantifiable issues. For many problems, mathematical optimization can be used to simplify the design process by identifying cost-effective solutions. Optimization applications for controlling nonlinearly reactive pollutants such as tropospheric ozone, however, have been lacking because of the difficulty in representing nonlinear chemistry in mathematical programming models.

We discuss the use of genetic algorithms (GAs) as an alternative optimization approach for developing ozone control strategies. A GA formulation is described and demonstrated for an urban-scale ozone control problem in which controls are considered for thousands of pollutant sources simultaneously. A simple air quality model is integrated into the GA to represent ozone transport and chemistry. Variations of the GA formulation for multiobjective and chance-constrained optimization are also presented. The paper concludes with a discussion of the practicality of using more sophisticated, regulatory-scale air quality models with the GA. We anticipate that such an approach will be practical in the near term for supporting regulatory decision-making.
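
As a rough illustration of the kind of GA formulation the abstract describes (not the authors' actual model), the sketch below evolves a yes/no control decision for each source so as to minimize control cost plus a penalty for exceeding an assumed ozone target; all costs, per-source ozone benefits, and the linear ozone response are hypothetical.

```python
# Illustrative sketch only: a toy genetic algorithm that picks which sources to
# control so that total control cost is minimized while a (linearized, made-up)
# ozone estimate stays below a target. All numbers are hypothetical.
import random

random.seed(0)

N_SOURCES = 50
COST = [random.uniform(1.0, 10.0) for _ in range(N_SOURCES)]       # $M per source controlled
O3_BENEFIT = [random.uniform(0.1, 1.0) for _ in range(N_SOURCES)]  # ppb reduction if controlled
O3_BASELINE, O3_TARGET = 140.0, 120.0                              # ppb
PENALTY = 100.0                                                    # $M per ppb of violation

def fitness(strategy):
    """Lower is better: control cost plus a penalty for any remaining exceedance."""
    cost = sum(c for c, on in zip(COST, strategy) if on)
    ozone = O3_BASELINE - sum(b for b, on in zip(O3_BENEFIT, strategy) if on)
    return cost + PENALTY * max(0.0, ozone - O3_TARGET)

def crossover(a, b):
    cut = random.randrange(1, N_SOURCES)
    return a[:cut] + b[cut:]

def mutate(s, rate=0.02):
    return [bit ^ 1 if random.random() < rate else bit for bit in s]

population = [[random.randint(0, 1) for _ in range(N_SOURCES)] for _ in range(60)]
for generation in range(200):
    population.sort(key=fitness)
    parents = population[:20]                      # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(len(population) - len(parents))]
    population = parents + children

best = min(population, key=fitness)
print("controlled sources:", sum(best), "objective:", round(fitness(best), 2))
```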

2.
Carbon bond (CB-III) fractions for non-methane organic carbon compounds (NMOC) measured in the background air mass advected into several urban areas in the eastern and southern United States are reported. These, together with ozone measured aloft, were used in an Empirical Kinetic Modeling Approach (EKMA) to model urban ozone production and urban ozone control strategies.

Over a range of zero to double the mean of the measured NMOC concentrations aloft (0 to 70 ppbC) and zero to the highest ozone levels recorded aloft (0 to 65 ppb), it was found that urban ozone production and control strategies were relatively insensitive to NMOC from aloft. However, urban ozone production was sensitive to ozone from aloft, while ozone control strategies were insensitive to ozone from aloft.

3.
The 1990 Clean Air Act Amendments require states with O3 nonattainment areas to adopt regulations to enforce reasonably available control technology (RACT) for NOx stationary sources by November 1992. However, if the states can demonstrate that such measures will have an adverse effect on air quality, NOx requirements may be waived. To assist the states in making this decision, the U.S. EPA is attempting to develop guidelines for the states to use in deciding whether NOx reductions will have a positive or negative impact on O3 air quality. Although NOx is a precursor of O3, at low VOC/NOx ratios the reduction of NOx can result in increased peak O3. EPA is examining existing information on VOC/NOx ratios to develop “rules of thumb” to guide the states in their decision-making process. An examination of 6 a.m. to 9 a.m. VOC/NOx ratios at a number of sites in the eastern U.S. indicates that the ratio is highly variable from day to day and that there is no apparent relationship between ratios measured at different sites within the same area. In addition, statistical analysis failed to identify significant relationships between the 6 a.m. to 9 a.m. VOC/NOx ratio and the maximum 1-hr O3 within a given area. Since smog chamber and modeling studies show that such a relationship exists, this further undermines the assumption that a ratio measured at a single site is representative of the ratio for the entire region. Based on this information, we conclude that the 6 a.m. to 9 a.m. ambient VOC/NOx ratio for a given area is, by itself, insufficient for deciding whether a VOC-alone, a NOx-alone, or a combined VOC-NOx reduction strategy is a viable or optimum O3-reduction strategy.
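
A hedged sketch of the statistical check described above, correlating morning VOC/NOx ratios with the same-day maximum 1-hr O3; the concentrations are synthetic placeholders rather than the study's measurements.

```python
# Sketch (synthetic data): test whether the 6-9 a.m. VOC/NOx ratio correlates
# with the same-day maximum 1-hr O3, as in the analysis described above.
import numpy as np

rng = np.random.default_rng(1)
n_days = 60
voc = rng.uniform(100, 600, n_days)          # ppbC, hypothetical morning VOC
nox = rng.uniform(10, 80, n_days)            # ppb, hypothetical morning NOx
ratio = voc / nox
max_o3 = rng.uniform(60, 150, n_days)        # ppb, hypothetical daily 1-hr max O3

r = np.corrcoef(ratio, max_o3)[0, 1]
print(f"Pearson r = {r:.2f}, r^2 = {r**2:.2f}")
# A small r^2, as reported in the article, would indicate that the ambient
# ratio alone explains little of the day-to-day variability in peak O3.
```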

4.
ABSTRACT

An intercomparison study has been performed with six empirical ozone interpolation procedures to predict hourly concentrations in ambient air between monitoring stations. The objective of the study is to use monitoring network data to empirically identify an improved procedure for estimating ozone concentrations at subject exposure points. Four of the procedures in the study are currently used in human exposure models (nearest monitor's daily mean and maximum, the regression estimate used in the U.S. Environmental Protection Agency's (EPA) pNEM, and inverse distance weighting), and two are being evaluated for this purpose (kriging in space, and kriging in space and time). The study focused on spatial estimation during June 1-June 5, 1996, a period with relatively high observed ozone levels over Houston, Texas. The study evaluated these procedures at three types of locations with monitors of varying proximity. Results from the empirical evaluation indicate that kriging in space and time provides excellent estimates of ozone concentrations within a monitoring network, while the more commonly used techniques failed to capture observed pollutant concentrations. Improved estimation of pollutant concentrations within the region, and thus at subject locations, should result in improved exposure modeling.
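
For orientation, the sketch below implements inverse distance weighting, one of the simpler procedures in the comparison; the monitor coordinates and ozone values are made up. Kriging in space and time, the best performer in the study, additionally fits a spatiotemporal covariance model and is normally done with a geostatistics package.

```python
# Sketch: inverse distance weighting (IDW), one of the interpolation procedures
# compared above. Monitor locations and ozone values are hypothetical.
import numpy as np

monitors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])  # km
ozone_ppb = np.array([85.0, 110.0, 95.0, 120.0])                           # observed 1-hr O3

def idw(point, stations, values, power=2.0):
    """Estimate the value at `point` as a distance-weighted mean of station values."""
    d = np.linalg.norm(stations - point, axis=1)
    if np.any(d < 1e-9):                  # exactly on a monitor: return its value
        return float(values[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))

print(idw(np.array([4.0, 6.0]), monitors, ozone_ppb))   # estimate at an exposure point
```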

5.
6.
Abstract

The U.S. Environmental Protection Agency (EPA) Quality Assurance (QA) Guidance Document 2.12: Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I Equivalent Methods (Document 2.12) requires conditioning of PM2.5 filters at 20-23 °C and 30-40% relative humidity (RH) for 24 hr prior to gravimetric analysis. Variability of temperature and humidity may not exceed ±2 °C and ±5% RH during the conditioning period. The quality assurance team at EPA Region 2’s regional laboratory designed a PM2.5 weighing facility that operates well within these strict performance requirements.

The traditional approach to meeting the performance requirements of Document 2.12 for PM2.5 filter analysis is to build a walk-in room, with costs typically exceeding $100,000. The initial one-time capital cost for the laboratory at EPA’s Edison, NJ, facility was approximately $24,000. Annual costs [e.g., National Institute of Standards and Technology (NIST) recertifications and nitrogen replacement cylinders used for humidity control] are approximately $500. The average 24-hr variabilities in temperature and RH in the Region 2 weighing chamber are small, ±0.2 °C and ±0.8% RH, respectively. The mass detection limit for the PM2.5 weighing system of 47-mm stretched Teflon (lab blank) filters is 6.3 μg. This facility demonstrates an effective and economical example for states and other organizations planning PM2.5 weighing facilities.
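
A minimal sketch of the kind of acceptance check the quoted Document 2.12 limits imply, applied to a hypothetical 24-hr chamber log; interpreting the ±2 °C and ±5% RH variability limits as a total allowed spread is an assumption made here.

```python
# Sketch: check a hypothetical conditioning-chamber log against the Document 2.12
# limits quoted above (20-23 degC, 30-40% RH; variability within +/-2 degC, +/-5% RH).
temps_c = [21.3, 21.4, 21.2, 21.5, 21.3]       # hypothetical hourly readings
rh_pct  = [34.8, 35.1, 35.0, 34.6, 35.2]

def within_limits(values, low, high, max_spread):
    """True if all readings fall in [low, high] and their spread stays within max_spread."""
    return low <= min(values) and max(values) <= high and (max(values) - min(values)) <= max_spread

print("temperature OK:", within_limits(temps_c, 20.0, 23.0, 2.0 * 2))   # +/-2 degC band = 4 degC spread
print("humidity OK:   ", within_limits(rh_pct, 30.0, 40.0, 5.0 * 2))    # +/-5% RH band = 10% spread
```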

7.
The management of tropospheric ozone (O3) is particularly difficult. The formulation of emission control strategies requires considerable information, including: (1) emission inventories, (2) available control technologies, (3) meteorological data for critical design episodes, and (4) computer models that simulate atmospheric transport and chemistry. The simultaneous consideration of this information during control strategy design can be exceedingly difficult for a decision-maker, and traditional management approaches do not explicitly address cost minimization. This study presents a new approach for designing air quality management strategies in which a simple air quality model is used conjunctively with a complex air quality model to obtain low-cost management strategies. The simple air quality model is used to identify potentially good solutions, and two heuristic methods are used to identify cost-effective control strategies using only a small number of simple air quality model simulations. The resulting strategies are then verified and refined using a complex air quality model. This approach may greatly reduce the number of complex air quality model runs that are required. An important component of this heuristic design framework is the use of the simple air quality model as a screening and exploratory tool. To achieve similar results with the simple and complex air …
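
A schematic sketch of the screening idea described above: a cheap surrogate model screens many candidate strategies, and only the most promising are re-evaluated with an expensive model. Both model functions below are hypothetical stand-ins, not the study's models.

```python
# Sketch of the two-tier screening idea described above: a cheap surrogate model
# screens candidate control strategies; only the best few are re-evaluated with
# an expensive model. Both model functions are hypothetical stand-ins.
import itertools, random

random.seed(2)

def simple_model_peak_o3(strategy):
    """Fast, approximate peak-O3 estimate (placeholder)."""
    return 140.0 - 8.0 * sum(strategy) + random.uniform(-2, 2)

def complex_model_peak_o3(strategy):
    """Slow, detailed peak-O3 estimate (placeholder for a grid-model run)."""
    return 141.0 - 7.5 * sum(strategy)

def cost(strategy):
    return sum(strategy) * 3.0            # $M, hypothetical uniform control cost

candidates = [list(bits) for bits in itertools.product([0, 1], repeat=4)]   # 4 source groups
feasible = [s for s in candidates if simple_model_peak_o3(s) <= 120.0]      # cheap screen
shortlist = sorted(feasible, key=cost)[:3]                                  # cheapest survivors
verified = [s for s in shortlist if complex_model_peak_o3(s) <= 120.0]      # expensive check
print("verified low-cost strategies:", verified)
```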

8.
ABSTRACT

A hybrid nonlinear regression (NLR) model and a neural network (NN) model, each designed to forecast the next-day maximum 1-hr average ground-level O3 concentration in Louisville, KY, were compared for two O3 seasons, 1998 and 1999. The model predictions were compared in forecast mode, using forecasted meteorological data as input, and in hindcast mode, using observed meteorological data as input. The two models performed nearly the same in forecast mode. For the two seasons combined, the mean absolute forecast error was 12.5 ppb for the NLR model and 12.3 ppb for the NN model. The detection rate of 120-ppb threshold exceedances was 42% for each model in forecast mode. In hindcast mode, the NLR model performed marginally better than the NN model.
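
The verification statistics quoted above, mean absolute error and the exceedance detection rate, are simple to compute; the sketch below shows the arithmetic on made-up forecast/observation pairs.

```python
# Sketch: the two verification statistics quoted above, computed on hypothetical
# forecast/observation pairs (ppb of daily maximum 1-hr O3).
observed  = [95, 130, 88, 124, 70, 141, 102, 118]
forecast  = [101, 119, 94, 127, 78, 125, 96, 122]
THRESHOLD = 120   # ppb exceedance threshold used in the article

mae = sum(abs(f - o) for f, o in zip(forecast, observed)) / len(observed)

exceedances = [(o, f) for o, f in zip(observed, forecast) if o > THRESHOLD]
detected = sum(1 for o, f in exceedances if f > THRESHOLD)
detection_rate = detected / len(exceedances) if exceedances else float("nan")

print(f"mean absolute error: {mae:.1f} ppb")
print(f"detection rate of >{THRESHOLD} ppb days: {detection_rate:.0%}")
```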

9.
ABSTRACT

Project MOHAVE was a major monitoring, modeling, and data analysis study whose objectives included the estimation of the contributions of the Mohave Power Project (MPP) and other sources to visibility impairment in the southwestern United States, in particular at Grand Canyon National Park. A major element of Project MOHAVE was the release of perfluorocarbon tracers at MPP and other locations during 50-day summer and 30-day winter intensive study periods. Tracer data (from about 30 locations) were sequestered until several source and receptor models were used to predict tracer concentrations. None of the models was successful in predicting the tracer concentrations; squared correlation coefficients between predicted and measured tracer were all less than 0.2, and most were less than 0.1.
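
The evaluation metric cited above, the squared correlation coefficient between predicted and measured tracer concentrations, can be computed as in the sketch below; the concentration values are invented.

```python
# Sketch: squared correlation coefficient (r^2) between predicted and measured
# tracer concentrations, the evaluation metric cited above. Values are made up.
import numpy as np

measured  = np.array([0.8, 1.5, 0.3, 2.2, 0.9, 1.1])   # hypothetical tracer concentrations
predicted = np.array([1.4, 0.6, 1.0, 0.9, 1.8, 0.4])

r = np.corrcoef(measured, predicted)[0, 1]
print(f"r^2 = {r**2:.2f}")   # values below ~0.2, as reported, indicate little predictive skill
```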

10.
Abstract

The location of the northeastern Iberian Peninsula (NEIP) in the northwestern Mediterranean basin, the presence of the Pyrenees mountain range (with altitudes >3000 m), and the influence of the Mediterranean Sea and the strong channeling of flows along the Ebro river valley produce an extremely complicated dispersion structure for photochemical pollutants. Air pollution studies in very complex terrain such as the NEIP require high-resolution modeling to resolve the very complex flow dynamics. To account for the influence of larger-scale transport, however, high-resolution models have to be nested in larger models that generate appropriate initial and boundary conditions for the finer-resolution domains. This article presents results obtained with the MM5-EMICAT2000-CMAQ multiscale nested air quality model regarding the ozone (O3)-nitrogen oxides (NOx)-volatile organic compounds (VOCs) sensitivity regimes in an area of high geographical complexity, the industrial area of Tarragona in the NEIP. The model was applied with fine temporal (one-hour) and spatial resolution (cells of 24 km, 2 km, and 1 km) to represent the chemistry and transport of tropospheric O3 and other photochemical species under different hypothetical emission-control scenarios and to quantify the influence of different emission sources in the area. Results indicate that O3 chemistry in the industrial domain of Tarragona is strongly sensitive to VOCs; the largest reductions in ground-level O3 are achieved when industrial VOC emissions are reduced by 25%. In contrast, reductions in industrial NOx emissions lead to a strong increase in hourly peak O3 levels. At the same time, the contribution of on-road traffic and biogenic emissions to ground-level O3 concentrations in the area is negligible compared with the dominant influence of industrial sources. This analysis provides an assessment of the effectiveness of different precursor emission-control policies by comparing the modeled results for the different scenarios.

11.
Achievement of air quality goals now more than ever requires careful consideration of alternative control strategies in view of national concerns with energy and the economy. Three strategies which might be used by coal-fired steam electric plants to achieve ambient air quality standards for sulfur dioxide have been compared, and the analysis shows that the desired objective can be achieved using the intermittent control strategy with substantially less impact on the environment, less consumption of energy, and at a much lower economic cost than using either stack gas scrubbing or low-sulfur coal.

12.
13.
ABSTRACT

This article describes an effort to re-examine the scientific bases of the existing U.S. Environmental Protection Agency (EPA) policy on volatile organic compound reactivity, now more than two decades old, in light of recent scientific knowledge and understanding. The existing policy allows “negligibly reactive” organic emissions, that is, emissions with ambient ozone production potential lower than that of ethane, to be exempted from all ozone regulations. It relies on kOH and incremental reactivity data to determine whether an organic compound is negligibly reactive. Recent scientific evidence suggests that (1) exempting the negligibly reactive organic emissions from all regulations is unjustifiable, (2) the choice of ethane as the benchmark organic species for distinguishing reactive from negligibly reactive organics may be inappropriate, (3) the assumptions and methods used for classifying organic compounds as “reactive” and “negligibly reactive” should be reconsidered, and (4) the volatility factor should be considered, more appropriately, in much the same way as the reactivity factor.

14.
Foliar injury and shoot fresh weight responses of soybeans (Glycine max L.) ‘Lee 68’ and ‘Dare’ exposed to mixtures of ozone (O3) and sulfur dioxide (SO2) were greater than additive (synergistic), less than additive (antagonistic), or additive. The result depended on the concentrations of O3 and SO2, the exposure duration, and the amount of injury caused by each gas singly. Synergism usually occurred when injury from O3 or SO2 singly was slight to moderate. Antagonism usually occurred when injury from either gas singly was severe. In many cases of antagonism, the injury and fresh weight effects of the mixture were less than those from SO2 alone, suggesting that O3 can sometimes protect soybeans from SO2.

15.
In 2010, the U.S. National Aeronautics and Space Administration (NASA) initiated the Air Quality Applied Science Team (AQAST) as a 5-year, $17.5-million award with 19 principal investigators. AQAST aims to increase the use of Earth science products in air quality-related research and to help meet air quality managers’ information needs. We conducted a Web-based survey and a limited number of follow-up interviews to investigate federal, state, tribal, and local air quality managers’ perspectives on usefulness of Earth science data and models, and on the impact AQAST has had. The air quality managers we surveyed identified meeting the National Ambient Air Quality Standards for ozone and particulate matter, emissions from mobile sources, and interstate air pollution transport as top challenges in need of improved information. Most survey respondents viewed inadequate coverage or frequency of satellite observations, data uncertainty, and lack of staff time or resources as barriers to increased use of satellite data by their organizations. Managers who have been involved with AQAST indicated that the program has helped build awareness of NASA Earth science products, and assisted their organizations with retrieval and interpretation of satellite data and with application of global chemistry and climate models. AQAST has also helped build a network between researchers and air quality managers with potential for further collaborations.

Implications: NASA’s Air Quality Applied Science Team (AQAST) aims to increase the use of satellite data and global chemistry and climate models for air quality management purposes, by supporting research and tool development projects of interest to both groups. Our survey and interviews of air quality managers indicate they found value in many AQAST projects and particularly appreciated the connections to the research community that the program facilitated. Managers expressed interest in receiving continued support for their organizations’ use of satellite data, including assistance in retrieving and interpreting data from future geostationary platforms meant to provide more frequent coverage for air quality and other applications.


16.
Simplified algorithms are presented for estimating the cost of controlling sulfur dioxide (SO2) emissions from existing coal-fired power plants on a state-by-state basis. Results are obtained using the detailed Utility Control Strategy Model (UCSM) to calculate the impacts of emission reductions ranging from approximately 30 percent to 90 percent of projected 1995 emissions for 18 different scenarios and 36 states. Scenarios include the use of two dry SO2 removal technologies (lime spray dryers and LIMB) as potential options for power plant retrofit, in addition to currently available emission control options including coal switching, coal cleaning and wet flue gas desulfurization (FGD). Technical assumptions relating to FGD system performance and the upgrading of existing cold-side electrostatic precipitators (ESP) for reduced sulfur levels are also analyzed, along with the effects of interest rates, coal prices, coal choice restrictions, plant lifetime, and plant operating levels. Results are summarized in the form of a 3-term polynomial equation for each state, giving total annualized SO2 control cost as a function of the total SO2 emissions reduction for each scenario. Excellent statistical fits to UCSM results are obtained for these generalized equations.
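
A hedged sketch of the curve-fitting step described above: fitting a three-coefficient polynomial (interpreted here as a quadratic, which is an assumption) that gives annualized SO2 control cost as a function of emission reduction. The cost points below are invented and stand in for, but are not, UCSM output.

```python
# Sketch: fit a three-coefficient (quadratic) cost curve, total annualized SO2
# control cost vs. emission reduction, to hypothetical points standing in for
# UCSM results for one state.
import numpy as np

reduction_kton = np.array([100, 200, 300, 400, 500, 600])      # SO2 emissions reduced
cost_million   = np.array([40, 95, 170, 280, 430, 650])        # annualized cost, $M (made up)

coeffs = np.polyfit(reduction_kton, cost_million, deg=2)       # [c2, c1, c0]
cost_fn = np.poly1d(coeffs)

print("fitted coefficients:", np.round(coeffs, 4))
print("predicted cost at 450 kton reduction: $%.0fM" % cost_fn(450))
r2 = 1 - np.sum((cost_million - cost_fn(reduction_kton))**2) / np.sum((cost_million - cost_million.mean())**2)
print(f"R^2 of fit: {r2:.3f}")
```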

17.
The stomatal resistance, measured with a ventilated diffusion porometer at various times before, during, and after exposure to 20–25 pphm ozone, was followed in water-stressed or well-watered beans, beans exposed at either low (37%) or high (73%) atmospheric humidity, and two tobacco cultivars exposed at the same two humidities. The two tobacco cultivars that were compared were the O3-susceptible Bel W-3 and the O3-resistant Consolidated L. The stomata of the water-stressed but unwilted bean plants closed quickly from a resistance of 2.9 ± 0.3 sec/cm to 8.4 ± 1.0 sec/cm when exposed to O3, whereas those in the unstressed plants closed slowly from a resistance of 2.5 ± 0.6 sec/cm to 5.2 ± 0.8 sec/cm after exposure to O3 for 10 min. Exposure to O3 for 30 min in the moist atmosphere caused no change in stomatal resistance of the bean plants, whereas in the dry atmosphere the stomata closed from a resistance of 3.7 ± 0.4 sec/cm to 6.7 ± 0.6 sec/cm but opened again when ozonation was terminated. With tobacco exposed to O3 in a dry atmosphere, the stomata of the O3-resistant cultivar closed more rapidly than those of the O3-susceptible variety, whereas in a moist atmosphere the stomata of both cultivars closed slowly and equally during the 60 min of ozonation.

18.
Present evidence suggests that ozone is the most damaging of all air pollutants affecting vegetation. It is the principal oxidant in the photochemical smog complex. Concentrations of ozone have exceeded 0.5 part per million (ppm) in the Los Angeles area. One-tenth of this level for 8 hours is known to injure very sensitive tobacco varieties. Many plant species are visibly affected after a few hours exposure at concentrations much lower than 0.5 ppm. There is also some evidence that ozone reduces plant growth. Many factors must be taken into account when considering standards to protect vegetation from ozone damage. These include ozone concentration and methods of measurement, time of exposure, possible additive effects of other pollutants, sensitivity of plant species, their economic value, and the extent of injury which can be tolerated. The response of a species to the pollutant is conditioned by genetic factors and environmental conditions. Lack of specific routine methods for measuring ozone in ambient air is a handicap. California and Colorado established standards for oxidants at 0.15 and 0.10 ppm, respectively, for 1 hour. How these standards relate to the ozone dosage causing acute and chronic injury to various plant species is discussed.

19.
Ozone disintegration of excess biomass and application to nitrogen removal.
A pilot-scale facility integrated with an ozonation unit was built to investigate the feasibility of using ozone-disintegration byproducts of wasted biomass as a carbon source for denitrification. Ozonation of biomass resulted in mass reduction by mineralization as well as by ozone-disintegrated biosolids recycling. Approximately 50% of wasted solids were recovered as available organic matter (ozonolysate), which included nonsettleable microparticles and soluble fractions. Microparticles were observed in abundance at relatively low levels of ozone doses, while soluble fractions became dominant at higher levels of ozone doses in ozone-disintegrated organics. Batch denitrification experiments showed that the ozonolysate could be used as a carbon source with a maximum denitrification rate of 3.66 mg nitrogen (N)/g volatile suspended solids (VSS)·hr. Ozonolysate was also proven to enhance total nitrogen removal efficiency in the pilot-scale treatment facility. An optimal chemical oxygen demand (COD)-to-nitrogen ratio for complete denitrification was estimated as 5.13 g COD/g N. The nitrogen-removal performance of the modified intermittently decanted extended aeration process dependent on an external carbon supply could be described as a function of solids retention time.
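
The reported optimum COD-to-nitrogen ratio translates directly into a carbon requirement; the short calculation below applies it to a hypothetical nitrogen load.

```python
# Sketch: applying the reported optimum of 5.13 g COD per g N to a hypothetical
# nitrogen load to estimate how much ozonolysate COD would be needed.
COD_PER_N = 5.13                      # g COD / g N (value reported in the abstract)
nitrate_n_load_kg_per_day = 12.0      # hypothetical nitrogen load to denitrify

cod_needed_kg_per_day = COD_PER_N * nitrate_n_load_kg_per_day
print(f"ozonolysate COD required: {cod_needed_kg_per_day:.1f} kg COD/day")
```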

20.
ABSTRACT

A pollution source may release residuals to any of several environmental media, depending on the process design and control strategies. These residuals then are subject to transfer, transport, and transformation within the interconnected compartments of the environmental system. The exposure and susceptibility of people and other receptors to pollutants are different in these various media, and so the risks imposed will vary according to the fate of the pollutants in the system. Because of interactions between compartments in the system, a single-medium approach to environmental management that mitigates problems in one environmental medium at a time independently of risks through other media may not minimize the aggregate risk a receptor receives from all pathways. Alternatively, a multimedia approach advocates focusing on the full environmental system providing pathways for exposure and selecting risk management strategies based on minimization of the aggregate and cumulative risk from all pathways and all compounds. This study combines multimedia risk analysis and an optimization framework to examine a methodology for selecting waste treatment/disposal and pollution control measures, applies the methodology to a sludge management decision problem, and considers the implications for continued use of single-medium analyses.
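
As a schematic of the kind of multimedia formulation described above (not the study's actual methodology), the sketch below selects one management option per waste stream to minimize aggregate risk summed over air, water, and soil pathways under a cost budget; all coefficients are invented.

```python
# Schematic sketch: choose one management option per waste stream to minimize
# aggregate multimedia risk (air + water + soil pathways) under a cost budget.
# All risk and cost numbers are hypothetical.
import itertools

# option name -> (cost $M, risk via air, risk via water, risk via soil)
OPTIONS = {
    "incinerate":   (6.0, 0.8, 0.1, 0.1),
    "landfill":     (2.0, 0.1, 0.3, 0.9),
    "land-apply":   (1.0, 0.2, 0.7, 0.6),
    "digest+reuse": (4.0, 0.2, 0.2, 0.2),
}
STREAMS = ["sludge_A", "sludge_B"]
BUDGET = 8.0   # $M

best = None
for combo in itertools.product(OPTIONS, repeat=len(STREAMS)):
    cost = sum(OPTIONS[o][0] for o in combo)
    if cost > BUDGET:
        continue
    aggregate_risk = sum(sum(OPTIONS[o][1:]) for o in combo)   # sum over all pathways
    if best is None or aggregate_risk < best[0]:
        best = (aggregate_risk, cost, combo)

print(f"minimum aggregate risk {best[0]:.2f} at cost ${best[1]:.1f}M using {best[2]}")
```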
