1.
In order to calculate total concentrations for comparison to ambient air quality standards, monitored background concentrations are often combined with model predicted concentrations. Models have low skill in predicting the locations or time series of observed concentrations. Further, adding fixed points on the probability distributions of monitored and predicted concentrations is very conservative and not mathematically correct. Simply adding the 99th percentile predicted to the 99th percentile background will not yield the 99th percentile of the combined distributions. Instead, an appropriate distribution can be created by calculating all possible pairwise combinations of the 1-hr daily maximum observed background and daily maximum predicted concentration, from which a 99th percentile total value can be obtained. This paper reviews some techniques commonly used for determining background concentrations and combining modeled and background concentrations. The paper proposes an approach to determine the joint probabilities of occurrence of modeled and background concentrations. The pairwise combinations approach yields a more realistic prediction of total concentrations than the U.S. Environmental Protection Agency's (EPA) guidance approach and agrees with the probabilistic form of the National Ambient Air Quality Standards.

Implications: EPA's current approaches to determining background concentrations for compliance modeling purposes often lead to “double counting” of background concentrations and actual plume impacts and thus lead to overpredictions of total impacts. Further, the current Tier 1 approach of simply adding the top ends of the background and model predicted concentrations (e.g., adding the 99th percentiles of these distributions together) results in design value concentrations at probabilities in excess of the form of the National Ambient Air Quality Standards.
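The pairwise-combination idea described above lends itself to a short numerical sketch. The sketch below uses synthetic lognormal concentration data (invented, not values from the paper) to contrast the Tier 1 sum-of-percentiles with the 99th percentile of all pairwise sums:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily-maximum 1-hr concentrations (ug/m3) for one year:
# monitored background and model-predicted values as independent samples.
background = rng.lognormal(mean=2.0, sigma=0.5, size=365)
predicted = rng.lognormal(mean=1.5, sigma=0.8, size=365)

# Tier 1 style: add the two 99th percentiles directly (conservative).
tier1 = np.percentile(background, 99) + np.percentile(predicted, 99)

# Pairwise approach: form all possible background + predicted combinations,
# then take the 99th percentile of the combined distribution.
pairwise = (background[:, None] + predicted[None, :]).ravel()
combined = np.percentile(pairwise, 99)

print(tier1, combined)  # the pairwise total is typically lower
```

On data like this the pairwise 99th percentile sits well below the sum of the individual 99th percentiles, illustrating why adding fixed points of the two distributions is conservative.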

2.
3.
Previous analyses of continuously measured compounds in Fort McKay, an indigenous community in the Athabasca Oil Sands, have detected increasing concentrations of nitrogen dioxide (NO2) and total hydrocarbons (THC), but not of sulfur dioxide (SO2), ozone (O3), total reduced sulfur compounds (TRS), or particulate matter (aerodynamic diameter <2.5 μm; PM2.5). Yet the community frequently experiences odors, dust, and reduced air quality. The authors used Fort McKay’s continuously monitored air quality data (1998–2014) as a case study to assess techniques for air quality analysis that make no assumptions regarding the type of change. Linear trend analysis detected increasing concentrations in the higher percentiles of NO2, nitric oxide (NO), nitrogen oxides (NOx), and THC. However, comparisons of all compounds between an early industrial expansion period (1998–2001) and the current day (2011–2014) show that concentrations of NO2, SO2, THC, TRS, and PM2.5 have significantly increased, whereas concentrations of O3 are significantly lower. An assessment of the frequency and duration of periods when concentrations of each compound were above a variety of thresholds indicated that the frequency of air quality events is increasing for NO2 and THC. Assessment of change over time with odds ratios of the 25th, 50th, 75th, and 90th percentile concentrations for each compound, compared with an estimate of natural background variability, showed that concentrations of TRS, SO2, and THC are dynamic, higher than background, and changing nonlinearly and nonmonotonically. An assessment of concentrations as a function of wind direction showed a clear and generally increasing influence of industry on air quality. This work shows that evaluating air quality without assumptions of linearity reveals dynamic changes in air quality in Fort McKay, and that the community is increasingly being affected by oil sands operations.

Implications: Understanding the nature and types of air quality changes occurring in a community or region is essential for the development of appropriate air quality management policies. Time-series trending of air quality data is a common tool for assessing air quality changes and is often used to assess the effectiveness of current emission management programs. The use of this tool, in the context of oil sands development, has significant limitations, and alternate air quality change analysis approaches need to be applied to ensure that the impact of this development on air quality is fully understood so that appropriate emission management actions can be taken.
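As a rough illustration of the percentile and odds-ratio comparisons described above, the following sketch contrasts two synthetic monitoring periods; the distributions and threshold are invented placeholders, not Fort McKay data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic hourly NO2 concentrations (ppb) for two periods; the "current"
# period is shifted upward to mimic the reported increase.
early = rng.gamma(shape=2.0, scale=5.0, size=5000)    # 1998-2001 proxy
current = rng.gamma(shape=2.0, scale=6.5, size=5000)  # 2011-2014 proxy

# Compare the distributions at several percentiles rather than the mean alone.
for p in (25, 50, 75, 90):
    print(p, np.percentile(early, p), np.percentile(current, p))

# Odds ratio of exceeding a threshold in the current vs. the early period.
threshold = 30.0
def odds(x):
    return np.mean(x > threshold) / np.mean(x <= threshold)

odds_ratio = odds(current) / odds(early)
print(odds_ratio)  # > 1 indicates higher odds of exceedance in the current period
```

Comparing whole distributions this way detects nonlinear, nonmonotonic changes that a single linear trend through the mean would miss.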


4.

This work assessed the usefulness of a current air quality model (American Meteorological Society/Environmental Protection Agency Regulatory Model [AERMOD]) for predicting air concentrations and deposition of perfluorooctanoate (PFO) near a manufacturing facility. Air quality models play an important role in providing information for verifying permitting conditions and for exposure assessment purposes. It is important to ensure that traditional modeling approaches are applicable to perfluorinated compounds, which are known to have unusual properties. Measured field data were compared with modeling predictions to show that AERMOD adequately located the maximum air concentration in the study area, provided representative or conservative air concentration estimates, and demonstrated bias and scatter not significantly different from those reported for other compounds. Surface soil/grass concentrations resulting from modeled deposition flux also showed acceptable bias and scatter compared with measured concentrations of PFO in soil/grass samples. Errors in predictions of air concentrations or deposition may be best explained by meteorological input uncertainty and by conservatism in the PRIME algorithm used to account for building downwash. In general, AERMOD was found to be a useful screening tool for modeling the dispersion and deposition of PFO in air near a manufacturing facility.

5.
The U.S. Environmental Protection Agency (EPA) and state and local agencies have focused their efforts on assessing secondary fine particulate matter (aerodynamic diameter ≤2.5 µm; PM2.5) formation in prevention of significant deterioration (PSD) air dispersion modeling. The National Association of Clean Air Agencies (NACAA) developed a method to account for secondary PM2.5 formation by using sulfur dioxide (SO2) and nitrogen oxides (NOx) offset ratios. These ratios are used to estimate the secondary formation of sulfate and nitrate PM2.5. The ratios were first introduced by the EPA for nonattainment areas in the Implementation of the New Source Review (NSR) Program for Particulate Matter Less than 2.5 Micrometers (PM2.5), 73 FR 28321, to offset emission increases of direct PM2.5 with reductions of PM2.5 precursors and vice versa. Some regulatory agencies, such as the Minnesota Pollution Control Agency (MPCA), have developed area-specific offset ratios for SO2 and NOx based on Comprehensive Air Quality Model with Extensions (CAMx) evaluations for air dispersion modeling analyses. The current study evaluates the effect of the EPA- and MPCA-developed ratios on American Meteorological Society/Environmental Protection Agency Regulatory Model (AERMOD) predicted concentrations. The study assesses the effect of these ratios on an electric generating utility (EGU), a taconite mine, a food processing plant, and a pulp and paper mill. The inputs used for these four scenarios are based on common stack parameters and emissions from available data. The effect of background concentrations is also evaluated by presenting results based on uniform annual PM2.5 background values. This evaluation study helps assess the viability of the offset ratio method developed by NACAA for estimating primary and secondary PM2.5 concentrations. An alternative Tier 2 approach to combining modeled and monitored concentrations is also presented.

Implications:

On January 4, 2012, the EPA committed to engage in rulemaking to evaluate updates to the Guideline on Air Quality Models (Appendix W of 40 CFR 51) and, as appropriate, incorporate new analytical techniques or models for secondary PM2.5. As a result, the National Association of Clean Air Agencies (NACAA) developed a screening method involving offset ratios to account for secondary PM2.5 formation. This method is a promising way to evaluate total (direct and indirect) PM2.5 impacts for permitting purposes. Evaluating the method is therefore important to determine its viability for widespread use.
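The offset-ratio bookkeeping can be sketched as follows; the ratios and emission rates below are illustrative placeholders, not EPA or MPCA values:

```python
# Illustrative sketch of the NACAA-style offset-ratio adjustment: precursor
# emissions are converted to "equivalent" direct PM2.5 emissions before
# dispersion modeling. Ratios and inputs are hypothetical.

def equivalent_pm25_emissions(direct_pm25_tpy, so2_tpy, nox_tpy,
                              so2_offset_ratio=40.0, nox_offset_ratio=100.0):
    """Return total PM2.5-equivalent emissions (tons/yr).

    An offset ratio of 40 means 40 tons of SO2 are treated as equivalent
    to 1 ton of direct PM2.5 (via secondary sulfate formation); likewise
    for NOx and secondary nitrate.
    """
    secondary = so2_tpy / so2_offset_ratio + nox_tpy / nox_offset_ratio
    return direct_pm25_tpy + secondary

# Hypothetical electric generating unit:
total = equivalent_pm25_emissions(direct_pm25_tpy=50.0, so2_tpy=400.0, nox_tpy=800.0)
print(total)  # 50 + 400/40 + 800/100 = 68.0
```

The scaled total is then modeled in AERMOD as if it were all primary PM2.5, which is what makes this a screening-level treatment of secondary formation.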


6.
An evaluation of the steady-state dispersion model AERMOD was conducted to determine its accuracy at predicting hourly ground-level concentrations of sulfur dioxide (SO2) by comparing model-predicted concentrations to a full year of monitored SO2 data. The two study sites comprise three coal-fired electrical generating units (EGUs) located in southwest Indiana and are characterized by tall, buoyant stacks, flat terrain, multiple SO2 monitors, and relatively isolated locations. AERMOD v12060 and AERMOD v12345 with BETA options were evaluated at each study site. For the six monitor–receptor pairs evaluated, AERMOD showed generally good agreement with monitor values for the hourly 99th percentile SO2 design value, with design value ratios that ranged from 0.92 to 1.99. AERMOD was within acceptable performance limits for the Robust Highest Concentration (RHC) statistic (RHC ratios ranged from 0.54 to 1.71) at all six monitors. Analysis of the top 5% of hourly concentrations at the six monitor–receptor sites, paired in time and space, indicated poor model performance in the upper concentration range: the fraction of hourly model predictions within a factor of 2 of observations at these higher concentrations ranged from 14 to 43% across the six sites. Analysis of subsets of the data showed consistent overprediction during low wind speed and unstable meteorological conditions, and underprediction during stable, low wind conditions. Hourly paired comparisons represent a stringent measure of model performance; however, given the potential for application of hourly model predictions to the SO2 NAAQS design value, this may be appropriate. At these two sites, the AERMOD v12345 BETA options do not improve model performance.

Implications:

A regulatory evaluation of AERMOD utilizing quantile-quantile (Q–Q) plots, the RHC statistic, and 99th percentile design value concentrations indicates that model performance is acceptable according to widely accepted regulatory performance limits. However, a scientific evaluation examining hourly paired monitor and model values at concentrations of interest indicates overprediction and underprediction bias that is outside of acceptable model performance measures. Overprediction of 1-hr SO2 concentrations by AERMOD has major ramifications for state and local permitting authorities when establishing emission limits.
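The evaluation statistics mentioned above (design value ratios and factor-of-two agreement on the highest hours) can be illustrated with synthetic paired data; nothing below reproduces the Indiana results:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic paired hourly SO2 concentrations (ug/m3): one year of
# observations and model predictions with multiplicative scatter.
obs = rng.lognormal(mean=3.0, sigma=0.6, size=8760)
pred = obs * rng.lognormal(mean=0.0, sigma=0.5, size=8760)

# 99th-percentile "design value" ratio (unpaired comparison of distributions).
dv_ratio = np.percentile(pred, 99) / np.percentile(obs, 99)

# Paired-in-time check: fraction of the top 5% of observed hours where the
# prediction is within a factor of 2 of the observation (FAC2).
top = obs >= np.percentile(obs, 95)
fac2 = np.mean((pred[top] >= obs[top] / 2) & (pred[top] <= obs[top] * 2))
print(dv_ratio, fac2)
```

A model can score a design value ratio near 1 (unpaired) while the paired FAC2 on the highest hours is poor, which is exactly the tension the abstract describes.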


7.
Determination of the effect of vehicle emissions on air quality near roadways is important because vehicles are a major source of air pollution. A near-roadway monitoring program was undertaken in Chicago between August 4 and October 30, 2014, to measure ultrafine particles, carbon dioxide, carbon monoxide, traffic volume and speed, and wind direction and speed. The objective of this study was to develop a method to relate short-term changes in traffic mode of operation to air quality near roadways using data averaged over 5-min intervals to provide a better understanding of the processes controlling air pollution concentrations near roadways. Three different types of data analysis are provided to demonstrate the type of results that can be obtained from a near-roadway sampling program based on 5-min measurements: (1) development of vehicle emission factors (EFs) for ultrafine particles as a function of vehicle mode of operation, (2) comparison of measured and modeled CO2 concentrations, and (3) application of dispersion models to determine concentrations near roadways. EFs for ultrafine particles are developed that are a function of traffic volume and mode of operation (free flow and congestion) for light-duty vehicles (LDVs) under real-world conditions. Two air quality models—CALINE4 (California Line Source Dispersion Model, version 4) and AERMOD (American Meteorological Society/U.S. Environmental Protection Agency Regulatory Model)—are used to predict the ultrafine particulate concentrations near roadways for comparison with measured concentrations. When using CALINE4 to predict air quality levels in the mixing cell, changes in surface roughness and stability class have no effect on the predicted concentrations. However, when using AERMOD to predict air quality in the mixing cell, changes in surface roughness have a significant impact on the predicted concentrations.

Implications: The paper provides emission factors (EFs) that are a function of traffic volume and mode of operation (free flow and congestion) for LDVs under real-world conditions. The good agreement between monitoring and modeling results indicates that high-resolution, simultaneous measurements of air quality and meteorological and traffic conditions can be used to determine real-world, fleet-wide vehicle EFs as a function of vehicle mode of operation under actual driving conditions.
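A mixing-cell back-calculation in the spirit of the near-road approach above can be sketched briefly; all numbers are hypothetical placeholders, not values from the Chicago study:

```python
# Simplified mixing-cell estimate of a fleet-average emission factor.

def emission_factor(c_road, c_background, wind_speed, mix_height, traffic_rate):
    """Fleet-average EF per vehicle-meter of travel.

    Mixing-cell assumption: the above-background concentration
    (c_road - c_background) in a cell of height mix_height (m), advected at
    wind_speed (m/s), is sustained by a line source of strength
    q = (c_road - c_background) * wind_speed * mix_height per meter of road,
    produced by traffic_rate vehicles passing per second.
    """
    q = (c_road - c_background) * wind_speed * mix_height
    return q / traffic_rate

ef = emission_factor(c_road=5.0e4, c_background=1.0e4,
                     wind_speed=2.0, mix_height=3.0, traffic_rate=1.0)
print(ef)  # (5e4 - 1e4) * 2.0 * 3.0 / 1.0 = 2.4e5
```

Separating the 5-min intervals into free-flow and congestion regimes before applying such a calculation is what yields mode-of-operation-specific EFs.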


8.
AERCOARE is a meteorological data preprocessor for the American Meteorological Society and U.S. Environmental Protection Agency (EPA) Regulatory Model (AERMOD). AERCOARE includes algorithms developed during the Coupled Ocean-Atmosphere Response Experiment (COARE) to predict surface energy fluxes and stability from routine overwater measurements. The COARE algorithm is described and its implementation in AERCOARE is presented. Model performance for the combined AERCOARE-AERMOD modeling approach was evaluated against tracer measurements from four overwater field studies. Model performance was relatively better when lateral turbulence measurements were available and when several key input variables to AERMOD were constrained: requiring the mixed layer height to be greater than 25 m and not allowing the Monin-Obukhov length to be less than 5 m improved model performance in low wind speed stable conditions. Several options for low wind speed dispersion in AERMOD also affected the model performance results. Model performance for the combined AERCOARE-AERMOD modeling approach was found to be comparable to that of the current EPA regulatory Offshore and Coastal Dispersion (OCD) model for the same tracer studies. AERCOARE-AERMOD predictions were also compared to simulations using the California Puff Model (CALPUFF), which also includes the COARE algorithm. Many model performance measures were similar, but CALPUFF had significantly less scatter and better performance for one of the four field studies. For many offshore regulatory applications, the combined AERCOARE-AERMOD modeling approach was found to be a viable alternative to OCD, the currently recommended model.

Implications: A new meteorological preprocessor called AERCOARE was developed for offshore source dispersion modeling using the U.S. Environmental Protection Agency (EPA) regulatory model AERMOD. The combined AERCOARE-AERMOD modeling approach allows stakeholders to use the same dispersion model for both offshore and onshore applications. This approach could replace current regulatory practices involving two completely different modeling systems. As improvements and features are added to the dispersion model component, AERMOD, such techniques can now also be applied to offshore air quality permitting.


9.
The Deepwater Horizon oil spill is considered one of the largest marine oil spills in the history of the United States. Air emissions associated with the oil spill caused concern among residents of Southeast Louisiana. The purpose of this study was to assess ambient concentrations of benzene (n=3,887) and fine particulate matter (n=102,682) during the oil spill and to evaluate potential exposure disparities in the region. Benzene and fine particulate matter (PM2.5) concentrations in the targeted parishes were generally higher following the oil spill, as expected. Benzene concentrations reached 2 to 19 times higher than background, and daily exceedances of PM2.5 were 10 to 45 times higher than background. Both benzene and PM2.5 concentrations were considered high enough to exceed public health criteria, with measurable exposure disparities in the coastal areas closer to the spill and clean-up activities. These findings raise questions about public disclosure of environmental health risks associated with the oil spill. The findings also provide a science-based rationale for establishing health-based action levels in future disasters.

Implications: Benzene and particulate matter monitoring during the Deepwater Horizon oil spill revealed that ambient air quality was a likely threat to public health and that residents in coastal Louisiana experienced significantly greater exposures than urban residents. Threshold air pollution levels established for the oil spill apparently were not used as a basis for informing the public about these potential health impacts. Also, despite carrying out the most comprehensive air monitoring ever conducted in the region, none of the agencies involved provided integrated analysis of the data or conclusive statements about public health risk. Better information about real-time risk is needed in future environmental disasters.


10.
Air quality sensors are becoming increasingly available to the general public, providing individuals and communities with information on fine-scale, local air quality in increments as short as 1 min. Current health studies do not support linking 1-min exposures to adverse health effects; therefore, the potential health implications of such ambient exposures are unclear. The U.S. Environmental Protection Agency (EPA) establishes the National Ambient Air Quality Standards (NAAQS) and Air Quality Index (AQI) on the best science available, which typically uses longer averaging periods (e.g., 8 hr; 24 hr). Another consideration for interpreting sensor data is the variable relationship between pollutant concentrations measured by sensors, which are short-term (1 min to 1 hr), and the longer term averages used in the NAAQS and AQI. In addition, sensors often do not meet federal performance or quality assurance requirements, which introduces uncertainty in the accuracy and interpretation of these readings. This article describes a statistical analysis of data from regulatory monitors and new real-time technology from Village Green benches to inform the interpretation and communication of short-term air sensor data. We investigate the characteristics of this novel data set and the temporal relationships of short-term concentrations to 8-hr average (ozone) and 24-hr average (PM2.5) concentrations to examine how sensor readings may relate to the NAAQS and AQI categories, and ultimately to inform breakpoints for sensor messages. We consider the empirical distributions of the maximum 8-hr averages (ozone) and 24-hr averages (PM2.5) given the corresponding short-term concentrations, and provide a probabilistic assessment. The result is a robust, empirical comparison that includes events of interest for air quality exceedances and public health communication. Concentration breakpoints are developed for short-term sensor readings such that, to the extent possible, the related air quality messages that are conveyed to the public are consistent with messages related to the NAAQS and AQI.

Implications: Real-time sensors have the potential to provide important information about fine-scale current air quality and local air quality events. The statistical analysis of short-term regulatory and sensor data, coupled with policy considerations and known health effects experienced over longer averaging times, supports interpretation of such short-term data and efforts to communicate local air quality.
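The conditional-probability framing above, relating a short-term sensor reading to the likelihood that a daily NAAQS-relevant average is exceeded, can be sketched with synthetic ozone data; the numbers are invented, not Village Green measurements:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic ozone: 1-min readings (ppb) for 100 days; for each day compute
# the daily maximum 8-hr rolling average and compare it with the standard.
minutes_per_day = 24 * 60
naaqs = 70.0  # ppb, 8-hr ozone standard level

days_exceed, short_term_max = [], []
for _ in range(100):
    base = rng.uniform(20, 75)                       # day-to-day variation
    series = base + rng.normal(0, 5, size=minutes_per_day)
    window = 8 * 60                                  # 8-hr window in minutes
    kernel = np.ones(window) / window
    roll = np.convolve(series, kernel, mode="valid") # 8-hr rolling means
    days_exceed.append(roll.max() > naaqs)
    short_term_max.append(series.max())

short_term_max = np.array(short_term_max)
days_exceed = np.array(days_exceed)

# Empirical P(daily max 8-hr average exceeds the standard | 1-min max >= cut):
cut = 80.0
sel = short_term_max >= cut
p_exceed = days_exceed[sel].mean() if sel.any() else float("nan")
print(p_exceed)
```

Scanning candidate values of `cut` against such conditional probabilities is one way to choose message breakpoints that stay consistent with the longer-averaging-time standards.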


11.
It is axiomatic that good measurements are integral to good public policy for environmental protection. The generalized term for “measurements” includes sampling and quantitation, data integrity, documentation, network design, sponsorship, operations, archiving, and accessing for applications. Each of these components has evolved and advanced over the last 200 years as knowledge of atmospheric chemistry and physics has matured. Air quality was first detected by what people could see and smell in contaminated air. Gaseous pollutants were found to react with certain materials or chemicals, changing the color of dissolved reagents such that their light absorption at selected wavelengths could be related to both the pollutant chemistry and its concentration. Airborne particles have challenged the development of a variety of sensory devices and laboratory assays for characterization of their enormous range of physical and chemical properties. Advanced electronics made possible the sampling, concentration, and detection of gases and particles, both in situ and in laboratory analysis of collected samples. Accurate and precise measurements by these methods have made possible advanced air quality management practices that led to decreasing concentrations over time. New technologies are leading to smaller and cheaper measurement systems that can further expand and enhance current air pollution monitoring networks.

Implications: Ambient air quality measurement systems have a large influence on air quality management by determining compliance, tracking trends, elucidating pollutant transport and transformation, and relating concentrations to adverse effects. These systems consist of more than just instrumentation, and involve extensive support efforts for siting, maintenance, calibration, auditing, data validation, data management and access, and data interpretation. These requirements have largely been attained for criteria pollutants regulated by National Ambient Air Quality Standards, but they are rarely attained for nonroutine measurements and research studies.


12.
Air quality zones are used by regulatory authorities to implement ambient air standards in order to protect human health. Air quality measurements at discrete air monitoring stations are critical tools for determining whether an air quality zone complies with local air quality standards or is noncompliant. This study presents a novel approach for evaluating air quality zone classification methods by breaking the concentration distribution of a pollutant measured at an air monitoring station into compliance and exceedance probability density functions (PDFs) and then using Monte Carlo analysis with the Central Limit Theorem to estimate long-term exposure. The purpose of this paper is to compare the risk associated with selecting one ambient air classification approach over another by testing the possible exposure an individual living within a zone may face. The chronic daily intake (CDI) is utilized to compare different pollutant exposures over the classification duration of 3 years between two classification methods. Historical data collected from air monitoring stations in Kuwait are used to build representative models of 1-hr NO2 and 8-hr O3 within a zone that meets the compliance requirements of each method. The first method, the “3 Strike” method, is a conservative approach based on a winner-take-all approach common with most compliance classification methods, while the second, the 99% Rule method, allows for more robust analyses and incorporates long-term trends. A Monte Carlo analysis is used to model the CDI for each pollutant and each method with the zone at a single station and with multiple stations. The model assumes that the zone is already in compliance with air quality standards over the 3 years under the different classification methodologies. The model shows that while the CDI of the two methods differs by only 2.7% over the exposure period for the single-station case, the large number of samples taken over the duration period increases the sensitivity of the statistical tests, causing the null hypothesis of no difference to be rejected. Local air quality managers can use either methodology to classify the compliance of an air zone, but must accept that the 99% Rule method may allow exposures that are statistically significantly higher than those under the 3 Strike method.

Implications: A novel method using the Central Limit Theorem and Monte Carlo analysis is used to directly compare different air standard compliance classification methods by estimating the chronic daily intake of pollutants. This method allows air quality managers to rapidly see how individual classification methods may impact individual population groups, as well as to evaluate different pollutants based on dosage and exposure when complete health impacts are not known.
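A minimal sketch of the Monte Carlo CDI comparison follows, using the standard intake relation CDI = C x IR / BW with invented concentration distributions (not the Kuwait data, and not the paper's fitted PDFs):

```python
import numpy as np

rng = np.random.default_rng(4)

def chronic_daily_intake(conc_mg_m3, inhalation_m3_day=20.0, body_weight_kg=70.0):
    """Standard intake estimate: CDI (mg/kg-day) = C * IR / BW."""
    return conc_mg_m3 * inhalation_m3_day / body_weight_kg

# Monte Carlo over 3 years of daily concentrations drawn from two
# hypothetical compliance distributions (one per classification method).
n_days = 3 * 365
conc_strict = rng.lognormal(mean=np.log(0.040), sigma=0.3, size=n_days)   # "3 Strike" proxy
conc_lenient = rng.lognormal(mean=np.log(0.042), sigma=0.3, size=n_days)  # "99% Rule" proxy

cdi_strict = chronic_daily_intake(conc_strict).mean()
cdi_lenient = chronic_daily_intake(conc_lenient).mean()
print(cdi_strict, cdi_lenient, 100 * (cdi_lenient / cdi_strict - 1))
```

With thousands of simulated days, even a small percentage difference in mean CDI can be statistically distinguishable, which mirrors the paper's point about sample size driving the hypothesis test.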


13.
To improve U.S. air quality, there are many regulations on-the-way (OTW) and on-the-books (OTB), including the mobile source California Low Emission Vehicle third generation (LEV III) and federal Tier 3 standards. This study explores the effects of those regulations by using the U.S. Environmental Protection Agency's (EPA) Community Multiscale Air Quality (CMAQ) model for 8-hr ozone concentrations in the western and eastern United States in the years 2018 and 2030 during July, a month with typically high ozone concentrations. Alterations in pollutant emissions can be due to technological improvements, regulatory amendments, and changes in growth, and the impacts of all of these factors were estimated in order to project emission rates for future years. The study emphasizes year-by-year changes in light-duty vehicle emissions when predicting ozone levels. The results show that most areas have decreases in 8-hr ozone concentrations in the year 2030, although some areas show increased concentrations. Additionally, there are areas with 8-hr ozone concentrations greater than the current U.S. National Ambient Air Quality Standard level, which is 75 ppb.

Implications:

To improve U.S. air quality, many regulations are on the way and on the books, including mobile source California LEV III and federal Tier 3 standards. This study explores the effects of those regulations for 8-hr ozone concentrations in the western and eastern United States in the years 2018 and 2030. The results of this study show that most areas have decreases in 8-hr ozone concentrations in 2030, although there are some areas with increased concentrations. Additionally, there are areas with 8-hr ozone concentrations greater than the current U.S. National Ambient Air Quality Standard level.


14.
An explicit NOx chemistry method, ADMSM, has been implemented in AERMOD version 15181. The scheme has been evaluated by comparison with the methodologies currently recommended by the U.S. EPA for Tier 3 NO2 calculations, that is, OLM and PVMRM2. Four data sets have been used for NO2 chemistry method evaluation. Overall, ADMSM-modeled NO2 concentrations show the most consistency with the AERMOD calculations of NOx and the highest Index of Agreement; they are also on average lower than those of both OLM and PVMRM2. OLM shows little consistency with modeled NOx concentrations and markedly overpredicts NO2. PVMRM2 performs closer to ADMSM than to OLM; however, its behavior is inconsistent with modeled NOx in some cases and its NO2 statistics are poorer. The trend in model performance can be explained by examining the features particular to each chemistry method: OLM can be considered a screening model, as it calculates the upper bound of conversion from NO to NO2 possible with the background O3 concentration; PVMRM2 includes a much-improved estimate of in-plume O3 but is otherwise similar to OLM, assuming instantaneous reaction of NO with O3; and ADMSM allows for the finite rate of this reaction and also for the photolysis of NO2. Evaluation with additional data sets is needed to further clarify the relative performance of ADMSM and PVMRM2.

Implications: Extensive evaluation of the current AERMOD Tier 3 chemistry methods OLM and PVMRM2, alongside a new scheme that explicitly calculates the oxidation of NO by O3 and the reverse photolytic reaction, shows that OLM consistently overpredicts NO2 concentrations. PVMRM2 performs well in general, but there are some cases where this method overpredicts NO2. The new explicit NOx chemistry scheme, ADMSM, predicts NO2 concentrations that are more consistent with both the modeled NOx concentrations and the observations.
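For context, the ozone-limiting idea that OLM embodies, conversion of NO to NO2 capped by the available background O3, can be sketched as follows; the in-stack NO2 fraction and the concentrations used are illustrative, and this is the basic OLM idea rather than the full regulatory implementation:

```python
def olm_no2(nox_conc, o3_background, in_stack_no2_fraction=0.1):
    """Ozone Limiting Method style estimate of NO2 from modeled NOx (ug/m3).

    NO-to-NO2 conversion is capped by the O3 available; the factor 46/48
    converts an O3 mass concentration to the equivalent NO2 mass.
    """
    no2_direct = in_stack_no2_fraction * nox_conc
    no_available = (1.0 - in_stack_no2_fraction) * nox_conc
    converted = min(no_available, (46.0 / 48.0) * o3_background)
    return no2_direct + converted

# O3-limited case: plenty of NOx, modest background ozone.
print(olm_no2(nox_conc=400.0, o3_background=80.0))  # 40 + min(360, 76.67) ~ 116.67
# NOx-limited case: all available NO can be converted.
print(olm_no2(nox_conc=50.0, o3_background=80.0))   # 5 + min(45, 76.67) = 50.0
```

Because this is an upper bound on conversion, OLM's tendency to overpredict NO2 relative to rate-based schemes such as ADMSM follows directly from the formula.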


15.
Air pollution caused by ship exhaust emissions is receiving more and more attention. The physical and chemical properties of fuels, such as sulfur content and PAH content, potentially have a significant influence on air pollutant emissions from inland vessels. To investigate the effects of fuel quality on atmospheric pollutant emissions systematically, a series of experiments was conducted based on actual ship testing. SO2, PM, and NOx emission rates all increased with main engine rotating speed under cruise mode, while PM and NOx emission factors were inversely proportional to the main engine rotating speed. The SO2 emission factor, by contrast, changed little with increasing main engine rotating speed. In summary, the fuel-dependent specific emission of SO2 was a direct reflection of the sulfur content of the fuel. PM emissions increased with increasing sulfur and PAH content in the fuel. However, fuel quality had little impact on NOx emissions from inland vessels because of NOx formation mechanisms and conditions.

Implications: Ship activity is considered to be the third largest source of air pollution in China. In particular, air pollutants emitted from ships in river ports and waterways have a direct impact on regional air quality and pose a threat to the health of local residents owing to high pollutant concentrations and poor dispersion. Studying the relationship between air pollutant emissions and the fuel quality of inland vessels can provide foundational data for local authorities to formulate reasonable and appropriate policies for reducing atmospheric pollution from inland vessels.


16.
The Imperial County Community Air Monitoring Network was developed as part of a community-engaged research study to provide real-time particulate matter (PM) air quality information at a high spatial resolution in Imperial County, California. The network augmented the few existing regulatory monitors and increased monitoring near susceptible populations. Monitors were both calibrated and field validated, a key component of evaluating the quality of the data produced by the community monitoring network. This paper examines the performance of a customized version of the low-cost Dylos optical particle counter used in the community air monitors compared with both PM2.5 and PM10 (particulate matter with aerodynamic diameters <2.5 and <10 μm, respectively) federal equivalent method (FEM) beta-attenuation monitors (BAMs) and federal reference method (FRM) gravimetric filters at a collocation site in the study area. A conversion equation was developed that estimates particle mass concentrations from the native Dylos particle counts, taking into account relative humidity. The R2 for converted hourly averaged Dylos mass measurements versus a PM2.5 BAM was 0.79 and that versus a PM10 BAM was 0.78. The performance of the conversion equation was evaluated at six other sites with collocated PM2.5 environmental beta-attenuation monitors (EBAMs) located throughout Imperial County. The agreement of the Dylos with the EBAMs was moderate to high (R2 = 0.35–0.81).

Implications: The performance of low-cost air quality sensors in community networks is currently not well documented. This paper provides a methodology for quantifying the performance of a next-generation Dylos PM sensor used in the Imperial County Community Air Monitoring Network. This air quality network provides data at a much finer spatial and temporal resolution than has previously been possible with government monitoring efforts. Once calibrated and validated, these high-resolution data may provide more information on susceptible populations, assist in the identification of air pollution hotspots, and increase community awareness of air pollution.
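A conversion equation of the kind described, estimating mass from particle counts with a relative-humidity term, can be sketched as a least-squares fit on synthetic collocation data; nothing below uses the Imperial County measurements, and the functional form is an assumption:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic collocation data: hourly Dylos-style particle counts and "true"
# PM2.5 mass from a reference BAM, with a humidity-dependent bias in counts.
n = 500
mass_bam = rng.gamma(shape=3.0, scale=4.0, size=n)      # ug/m3
rh = rng.uniform(20, 90, size=n)                        # relative humidity, %
counts = mass_bam * (120 + 0.8 * rh) + rng.normal(0, 40, size=n)

# Fit mass ~ intercept + counts + counts*RH by ordinary least squares,
# mimicking a conversion equation that accounts for relative humidity.
X = np.column_stack([np.ones(n), counts, counts * rh])
coef, *_ = np.linalg.lstsq(X, mass_bam, rcond=None)
pred = X @ coef

# Coefficient of determination against the reference monitor.
ss_res = np.sum((mass_bam - pred) ** 2)
ss_tot = np.sum((mass_bam - mass_bam.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(r2)
```

Evaluating the fitted equation at independent collocation sites, as the study does with the EBAMs, is the field-validation step that guards against overfitting to one location.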


17.
In 2010, the U.S. National Aeronautics and Space Administration (NASA) initiated the Air Quality Applied Science Team (AQAST) as a 5-year, $17.5-million award with 19 principal investigators. AQAST aims to increase the use of Earth science products in air quality-related research and to help meet air quality managers’ information needs. We conducted a Web-based survey and a limited number of follow-up interviews to investigate federal, state, tribal, and local air quality managers’ perspectives on usefulness of Earth science data and models, and on the impact AQAST has had. The air quality managers we surveyed identified meeting the National Ambient Air Quality Standards for ozone and particulate matter, emissions from mobile sources, and interstate air pollution transport as top challenges in need of improved information. Most survey respondents viewed inadequate coverage or frequency of satellite observations, data uncertainty, and lack of staff time or resources as barriers to increased use of satellite data by their organizations. Managers who have been involved with AQAST indicated that the program has helped build awareness of NASA Earth science products, and assisted their organizations with retrieval and interpretation of satellite data and with application of global chemistry and climate models. AQAST has also helped build a network between researchers and air quality managers with potential for further collaborations.

Implications: NASA’s Air Quality Applied Science Team (AQAST) aims to increase the use of satellite data and global chemistry and climate models for air quality management purposes, by supporting research and tool development projects of interest to both groups. Our survey and interviews of air quality managers indicate they found value in many AQAST projects and particularly appreciated the connections to the research community that the program facilitated. Managers expressed interest in receiving continued support for their organizations’ use of satellite data, including assistance in retrieving and interpreting data from future geostationary platforms meant to provide more frequent coverage for air quality and other applications.


18.
19.
Visibility degradation, one of the most noticeable indicators of poor air quality, can occur even when particulate matter levels are relatively low and the risk to human health is small. The availability of timely and reliable visibility forecasts can provide a more comprehensive understanding of the anticipated air quality conditions to better inform local jurisdictions and the public. This paper describes the development of a visibility forecasting modeling framework, which leverages the existing air quality and meteorological forecasts from Canada’s operational Regional Air Quality Deterministic Prediction System (RAQDPS) for the Lower Fraser Valley of British Columbia. A baseline model (GM-IMPROVE) was constructed using the revised IMPROVE algorithm based on unprocessed forecasts from the RAQDPS. Three additional prototypes (UMOS-HYB, GM-MLR, GM-RF) were also developed and assessed for forecast performance of up to 48 hr lead time during various air quality and meteorological conditions. Forecast performance was assessed by examining their ability to provide both numerical and categorical forecasts in the form of 1-hr total extinction and Visual Air Quality Ratings (VAQR), respectively. While GM-IMPROVE generally overestimated extinction more than twofold, it had skill in forecasting the relative species contribution to visibility impairment, including ammonium sulfate and ammonium nitrate. Both statistical prototypes, GM-MLR and GM-RF, performed well in forecasting 1-hr extinction during daylight hours, with correlation coefficients (R) ranging from 0.59 to 0.77. UMOS-HYB, a prototype based on postprocessed air quality forecasts without additional statistical modeling, provided reasonable forecasts during most daylight hours. In terms of categorical forecasts, the best prototype was approximately 75 to 87% correct when forecasting for a condensed three-category VAQR. A case study, focusing on an episode with poor visual air quality yet a low Air Quality Health Index, illustrated that the statistical prototypes were able to provide timely and skillful visibility forecasts with lead times up to 48 hr.

Implications: This study describes the development of a visibility forecasting modeling framework, which leverages the existing air quality and meteorological forecasts from Canada’s operational Regional Air Quality Deterministic Prediction System. The main applications include tourism and recreation planning, input into air quality management programs, and educational outreach. Visibility forecasts, when supplemented with the existing air quality and health based forecasts, can assist jurisdictions to anticipate the visual air quality impacts as perceived by the public, which can potentially assist in formulating the appropriate air quality bulletins and recommendations.
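The GM-IMPROVE baseline above computes light extinction from speciated concentrations via the revised IMPROVE algorithm. As a rough illustration of the idea, the sketch below sums species concentrations weighted by extinction-efficiency coefficients, with hygroscopic species scaled by a relative-humidity growth factor f(RH). It omits the revised algorithm's small/large size split, sea salt, and NO2 terms, and the single coefficients shown are simplified placeholders rather than the published values.

```python
# Simplified sketch of an IMPROVE-style light-extinction estimate.
# Omits the revised algorithm's size-split terms; coefficients are
# illustrative, not the published revised-IMPROVE values.

def improve_extinction(conc, f_rh, rayleigh=10.0):
    """Estimate total extinction (Mm^-1) from species concentrations (ug/m^3).

    conc : dict with keys 'amm_sulfate', 'amm_nitrate', 'organic',
           'elemental_carbon', 'soil', 'coarse_mass' (missing keys -> 0)
    f_rh : hygroscopic growth factor at ambient relative humidity
           (applied to ammonium sulfate and ammonium nitrate)
    """
    return (3.0 * f_rh * conc.get('amm_sulfate', 0.0)     # hygroscopic
          + 3.0 * f_rh * conc.get('amm_nitrate', 0.0)     # hygroscopic
          + 4.0 * conc.get('organic', 0.0)
          + 10.0 * conc.get('elemental_carbon', 0.0)
          + 1.0 * conc.get('soil', 0.0)
          + 0.6 * conc.get('coarse_mass', 0.0)
          + rayleigh)  # site-specific Rayleigh scattering term
```

Because each term is additive, a per-species breakdown of the same sums is what lets a model like GM-IMPROVE attribute visibility impairment to ammonium sulfate, ammonium nitrate, and the other species.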


20.
Air and water quality are impacted by extreme weather and climate events on time scales ranging from minutes to many months. This review paper discusses the state of knowledge of how and why extreme events are changing and are projected to change in the future. These events include heat waves, cold waves, floods, droughts, hurricanes, strong extratropical cyclones such as nor'easters, heavy rain, and major snowfalls. Some of these events, such as heat waves, are projected to increase, while others, with cold waves being a good example, will decrease in intensity in our warming world. Each extreme's impact on air or water quality can be complex and can even vary over the course of the event.

Implications: Because extreme weather and climate events impact air and water quality, understanding how the various extremes are changing and are projected to change in the future has ramifications for air and water quality management.

