Similar Literature

20 similar documents found.
1.
Abstract

Despite the widespread application of photochemical air quality models (AQMs) in U.S. state implementation planning (SIP) for attainment of the ambient ozone standard, documentation for the reliability of projections has remained highly subjective. An “idealized” evaluation framework is proposed that provides a means for assessing reliability. The framework is applied to 18 cases of regulatory modeling conducted in North America in the early 1990s, and a comparative review of these applications is reported. The intercomparisons suggest that more than two thirds of these AQM applications suffered from inadequate air quality and meteorological databases. Emissions representations were often unreliable, with excessively high uncertainties. More than two thirds of the performance evaluation efforts were judged to be substandard compared with idealized goals. Meteorological conditions chosen according to regulatory guidelines were limited to one or two cases and tended to be similar, thus limiting the extent to which public policy makers could be confident that the emission controls adopted would yield attainment for a broad range of adverse atmospheric conditions. More than half of the studies reviewed did not give sufficient attention to addressing the potential for compensating errors. Corroborative analyses were conducted in only one of the 18 studies reviewed. Insufficient attention was given to the estimation of model and/or input database errors, uncertainties, or variability in all of the cases examined. However, recent SIP and policy-related regional modeling provides evidence of substantial improvements in the underlying science and available modeling systems used for regulatory decision making. Nevertheless, the availability of suitable databases to support increasingly sophisticated modeling continues to be a concern for many locations. Thus, AQM results may still be subject to significant uncertainties. The evaluative process used here provides a framework for modelers and public policy makers to assess the adequacy of contemporary and future modeling work.

2.
To comply with the federal 8-hr ozone standard, the state of Texas is creating a plan for Houston that strictly follows the U.S. Environmental Protection Agency's (EPA) guidance for demonstrating attainment. EPA's attainment guidance methodology rests on several key assumptions that are shown not to be entirely appropriate for the unique observed ozone conditions found in Houston. Houston's ozone violations at monitoring sites occur either as gradual hour-to-hour increases in ozone concentrations or as large hourly increases of up to 100 parts per billion per hour. Given the time profiles at the violating monitors and those of nearby monitors, these large increases appear to be associated with small parcels of spatially limited plumes of high ozone in a lower background of urban ozone. Some of these high ozone parcels and plumes have been linked to a combination of unique wind conditions and episodic hydrocarbon emission events from the Houston Ship Channel. However, the regulatory air quality model (AQM) does not predict these sharp ozone gradients. Instead, the AQM predicts gradual hourly increases with broad regions of high ozone covering the entire Houston urban core. The AQM's performance can be partly attributed to EPA attainment guidance that prescribes removing any episodic hydrocarbon emissions from the baseline model simulation, thereby potentially removing any nontypical causes of ozone exceedances. This paper shows that attainment is achieved at all monitors when days with observed large hourly variability in ozone concentrations are filtered from attainment metrics. Thus, the modeling and observational data support a second, distinct cause of high ozone in Houston, and the current EPA methodology addresses only one of these two causes.
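For readers who want to see what the filtering step in the preceding abstract amounts to in practice, the sketch below removes days whose maximum hour-to-hour ozone increase exceeds a threshold before computing a simple attainment statistic. This is a minimal illustration, not the study's actual procedure: the column names, the 40 ppb/hr threshold, and the use of the fourth-highest daily maximum 8-hr average (computed within each calendar day) are assumptions.

```python
# Illustrative sketch: drop high-variability days before computing an
# attainment statistic. Column names, threshold, and the statistic itself
# are assumptions, not the paper's exact method.
import pandas as pd

def filter_high_variability_days(obs: pd.DataFrame, threshold_ppb: float = 40.0) -> pd.DataFrame:
    """obs: hourly O3 (ppb) for one monitor with columns ['date', 'hour', 'o3']."""
    hourly_jump = (obs.sort_values(["date", "hour"])
                      .groupby("date")["o3"]
                      .apply(lambda s: s.diff().max()))
    calm_days = hourly_jump[hourly_jump.fillna(0) <= threshold_ppb].index
    return obs[obs["date"].isin(calm_days)]

def fourth_highest_daily_max_8hr(obs: pd.DataFrame) -> float:
    """Crude stand-in for the design-value statistic: 4th-highest daily max 8-hr mean,
    with the 8-hr rolling window restricted to each calendar day for simplicity."""
    o3 = obs.sort_values(["date", "hour"]).set_index(["date", "hour"])["o3"]
    daily_max_8hr = o3.groupby(level="date").apply(lambda s: s.rolling(8).mean().max())
    return daily_max_8hr.sort_values(ascending=False).iloc[3]
```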

3.
Past studies indicate a nationwide potential low-sulfur coal supply deficit in 1975 arising from extremely low-sulfur State Implementation Plan requirements, which cannot all be met in time by available coal and gas cleaning technology. One means to alleviate this net deficit would be to grant variances where at least primary air quality standards would be maintained.

An extensive modeling analysis was conducted by EPA and Walden Research on a large number of power plants in 51 AQCRs located in 20 states to determine if compliance extensions at these plants could significantly reduce the projected deficit of low-sulfur coal. Using simulation modeling, the air quality impact at each plant for projected 1975 operations was determined with application of SIP regulatory requirements and with a full variance from SIP requirements for coal-fired boilers. The results from this investigation indicate that the attainment of primary SO2 air quality standards for the coal-fired plants would probably not be jeopardized by the application of full variance status to 34% of the plants and limited variance status to an additional 22% of the plants. No variance is appropriate for the remaining plants. The projected annual reduction in low-sulfur coal demand (less than 1.0% sulfur) is approximately 137 million tons. The projected shift in the average coal sulfur distribution is from 1.2% under SIP status to 2.1% under the applicable variance status. The power plant variance strategy appears, then, to offer a potentially feasible approach toward alleviating the low-sulfur coal deficit problem without jeopardizing attainment of primary air quality standards. It should be emphasized that compliance extensions are not the only way, or the most desirable way, of dealing with this problem. The final selection of a strategy for a given state or AQCR and the implementation of that strategy involve many questions and policy matters beyond the scope of this study.

4.
The updated regulatory framework for demonstrating that future 8-hr ozone (O3) design values will be at or below the National Ambient Air Quality Standards (NAAQS) provides guidelines for the development of a State Implementation Plan (SIP) that includes methods based on photochemical modeling and analytical techniques. One of the suggested approaches is the relative reduction factor (RRF) for estimating the efficacy of emission reductions. In this study, the sensitivity of model-predicted responses to emission reductions with respect to the choice of meteorology and chemical mechanism was examined. While the different modeling simulations generally were found to be in agreement on whether predicted future-year design values would be above or below the NAAQS for 8-hr O3 at a majority of the monitoring locations in the eastern United States, differences existed for a small percentage of monitors (approximately 6.4%). Another issue investigated was the ability of the attainment demonstration procedure to predict changes in monitored O3 design values. A retrospective analysis was performed by comparing predicted O3 design values from model simulations using emission estimates for 1996 and 2001 with monitored O3 design values for 2001. Results indicated that an average gross error of approximately 5 ppb was present between modeled and observed design values and that, at approximately 27% of all sites, model-predicted and observed design values disagreed as to whether the design value was above or below the NAAQS. Retrospective analyses such as the one presented in this study can provide valuable insights into the strengths and limitations of modeling and analysis techniques used to predict future design values over time periods of a decade or more for the purpose of developing SIPs. Furthermore, such analyses could provide avenues for improvement and added confidence in the use of the RRF approach for addressing attainment of the NAAQS.
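A worked sketch of the relative reduction factor arithmetic referenced above, assuming the general form used in EPA's modeled attainment test (future design value = RRF × current design value, with the RRF taken as the ratio of modeled future to modeled baseline concentrations near a monitor). The numerical values are invented for illustration.

```python
# Minimal sketch of the RRF calculation: a monitor's future design value is
# the current (baseline) design value scaled by the ratio of modeled future
# to modeled baseline 8-hr O3 near that monitor. All numbers are invented.
def rrf_future_design_value(dv_current_ppb, modeled_base_ppb, modeled_future_ppb):
    rrf = modeled_future_ppb / modeled_base_ppb
    return rrf * dv_current_ppb

# Example: baseline design value 92 ppb; modeled mean 8-hr O3 drops from
# 95 ppb to 82 ppb under the control scenario.
dv_future = rrf_future_design_value(92.0, 95.0, 82.0)   # ~79.4 ppb
print(f"RRF = {82.0 / 95.0:.3f}, projected design value = {dv_future:.1f} ppb")
```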

5.
In Houston, some of the highest measured 8-hr ozone (O3) peaks are characterized by sudden increases in observed concentrations of at least 40 ppb in 1 hr, or 60 ppb in 2 hr. Measurements show that these large hourly changes appear at only a few monitors and span a narrow geographic area, suggesting a spatially heterogeneous field of O3 concentrations. This study assessed whether a regulatory air quality model (AQM) can simulate this observed behavior. The AQM did not reproduce the magnitude or location of some of the highest observed hourly O3 changes, and it also failed to capture the limited spatial extent. On days with measured large hourly changes in O3 concentrations, the AQM predicted high O3 over large regions of Houston, resulting in overpredictions at several monitors. This analysis shows that the model can produce high O3, but on these days the predicted spatial field suggests that the model had a different cause. Some observed large hourly changes in O3 concentrations have been linked to random releases of industrial volatile organic compounds (VOCs). In the AQM emission inventory, there are several emission events when an industrial point source increases VOC emissions in excess of 10,000 mol/hr. One instance increased predicted downwind O3 concentrations up to 25 ppb. These results show that the modeling system is responsive to a large VOC release, but the timing and location of the release, and meteorological conditions, are critical requirements. Attainment of the O3 standard requires the use of observational data and AQM predictions. If the large observed hourly changes are indicative of a separate cause of high O3, then the model may not include that cause, which might result in regulators enacting control strategies that could be ineffective.

Implications: To show the attainment of the O3 standard, the U.S. Environmental Protection Agency (EPA) requires the use of observations and model predictions under the assumption that simulations are capable of reproducing observed phenomena. The regulatory model is unable to reproduce observed behavior measured in the observational database. If the large observed hourly changes were indicative of a separate cause of high O3, then the model would not include that cause. Inaccurate model predictions may prompt air quality regulators to enact control strategies that are effective in the modeling system, but prove ineffective in the real world.

6.
States rely upon photochemical models to predict the impacts of air quality attainment strategies, but the performance of those predictions is rarely evaluated retrospectively. State implementation plans (SIPs) developed to attain the 1997 U.S. standard for fine particulate matter (PM2.5; denoting particles smaller than 2.5 microns in diameter) by 2009 provide the first opportunity to assess modeled predictions of PM2.5 reductions at the state level. The SIPs were the first to rely upon a speciated modeled attainment test methodology recommended by the U.S. Environmental Protection Agency to predict PM2.5 concentrations and attainment status. Of the 23 eastern U.S. regions considered here, all but one achieved the 15 μg/m3 standard by 2009, and the other achieved it the following year, with downward trends sustained in subsequent years. The attainment tests predicted 2009 PM2.5 design values at individual monitors with a mean bias of 0.38 μg/m3 and mean error of 0.68 μg/m3, and were 95% accurate in predicting whether a monitor would achieve the standard. All of the errors were false alarms, in which the monitor observed attainment after a modeled prediction of an exceedance; in these cases, the states used weight-of-evidence determinations to argue that attainment was likely. Overall, PM2.5 concentrations at monitors in the SIP regions declined by 2.6 μg/m3 from 2000–2004 to 2007–2009, compared with 1.6 μg/m3 in eastern U.S. regions originally designated as attainment. Air quality improvements tended to be largest at monitors that were initially the most polluted.
Implications: As states prepare to develop plans for attaining a more stringent standard for fine particulate matter, this retrospective analysis documents substantial and sustained air quality improvements achieved under the previous standard. Significantly larger air quality improvements in regions initially designated nonattainment of the 1997 standard indicate that this status prompted heightened control efforts. The speciated modeled attainment test is found to be accurate and slightly conservative in predicting particulate concentrations for the cases considered here, providing confidence for its use in upcoming attainment plans.
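The scoring described in this abstract can be reproduced in outline with a few lines of code. The sketch below computes mean bias, mean error, and categorical agreement against the 15 μg/m3 standard; the arrays are placeholders, not the study's monitor data, and the exact definitions used in the paper may differ.

```python
# Hedged sketch of the retrospective scoring: mean bias, mean (absolute)
# error, and categorical agreement of predicted vs. observed PM2.5 design
# values against the 15 ug/m3 standard. Values are placeholders.
import numpy as np

STANDARD = 15.0  # ug/m3, 1997 annual PM2.5 NAAQS

predicted = np.array([14.2, 15.3, 13.8, 14.9])   # modeled 2009 design values
observed  = np.array([13.9, 14.8, 13.5, 14.6])   # monitored 2009 design values

mean_bias  = np.mean(predicted - observed)
mean_error = np.mean(np.abs(predicted - observed))
# Categorical agreement: do model and observations agree on attainment status?
# Monitor 2 here is a "false alarm": predicted exceedance, observed attainment.
agree = np.mean((predicted <= STANDARD) == (observed <= STANDARD))

print(f"bias = {mean_bias:.2f} ug/m3, error = {mean_error:.2f} ug/m3, "
      f"categorical accuracy = {agree:.0%}")
```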

7.
The U.S. Environmental Protection Agency in 1997 revised the 1-hr ozone (O3) National Ambient Air Quality Standard (NAAQS) to one based on an 8-hr average, resulting in potential nonattainment status for substantial portions of the eastern United States. The regulatory process provides for the development of a state implementation plan that includes a demonstration that the projected future O3 concentrations will be at or below the NAAQS based on photochemical modeling and analytical techniques. In this study, four photochemical modeling systems, based on two photochemical models, the Community Multiscale Air Quality model and the Comprehensive Air Quality Model with Extensions, and two emissions processing models, the Sparse Matrix Operator Kernel Emissions system and the Emissions Modeling System, were applied to the eastern United States, with emphasis on the northeastern Ozone Transport Region, to examine the response of estimated design values to controls focused on oxides of nitrogen and volatile organic compounds. With the 8-hr O3 NAAQS set as a bright-line test, it was found that a given area could be termed as being in or out of attainment of the NAAQS depending upon the modeling system. This suggests the need to provide an estimate of model-to-model uncertainty in the relative reduction factor (RRF) for a better understanding of the uncertainty in projecting the status of an area's attainment. Results indicate that the model-to-model differences considered in this study introduce an uncertainty in the future estimated design value of ~3–5 ppb.

8.
The capping of stationary source emissions of NOx in 22 states and the District of Columbia is federally mandated by the NOx SIP Call legislation with the intended purpose of reducing downwind O3 concentrations. Monitors for NO, NO2, and the reactive oxides of nitrogen into which these two compounds are converted will record data to evaluate air quality model (AQM) predictions. Guidelines for testing these models indicate the need for semicontinuous measurements as close to real time as possible but no less frequently than once per hour. The measurement uncertainty required for AQM testing must be less than ±20% (±10% for NO2) at mixing ratios of 1 ppbv and higher for NO, individual NOz component compounds, and NOy. This article is a review and discussion of different monitoring methods, some currently used in research and others used for routine monitoring. The performance of these methods is compared with the monitoring guidelines. Recommendations for advancing speciated and total NOy monitoring technology and a listing of demonstrated monitoring approaches are also presented.

9.
In Houston, some of the highest measured 8-hr ozone (O3) peaks are characterized by sudden increases in observed concentrations of at least 40 ppb in 1 hr or 60 ppb in 2 hr. Measurements show that these large hourly changes appear at only a few monitors and span a narrow geographic area, suggesting a spatially heterogeneous field of O3 concentrations. This study assessed whether a regulatory air quality model (AQM) can simulate this observed behavior. The AQM did not reproduce the magnitude or location of some of the highest observed hourly O3 changes, and it also failed to capture the limited spatial extent. On days with measured large hourly changes in O3 concentrations, the AQM predicted high O3 over large regions of Houston, resulting in overpredictions at several monitors. This analysis shows that the model can produce high O3, but on these days the predicted spatial field suggests that the model had a different cause. Some observed large hourly changes in O3 concentrations have been linked to random releases of industrial volatile organic compounds (VOCs). In the AQM emission inventory, there are several emission events when an industrial point source increases VOC emissions in excess of 10,000 mol/hr. One instance increased predicted downwind O3 concentrations up to 25 ppb. These results show that the modeling system is responsive to a large VOC release, but the timing and location of the release, and meteorological conditions, are critical requirements. Attainment of the O3 standard requires the use of observational data and AQM predictions. If the large observed hourly changes are indicative of a separate cause of high O3, then the model may not include that cause, which might result in regulators enacting control strategies that could be ineffective.

10.
Abstract

The U.S. Environmental Protection Agency in 1997 revised the 1-hr ozone (O3) National Ambient Air Quality Standard (NAAQS) to one based on an 8-hr average, resulting in potential nonattainment status for substantial portions of the eastern United States. The regulatory process provides for the development of a state implementation plan that includes a demonstration that the projected future O3 concentrations will be at or below the NAAQS based on photochemical modeling and analytical techniques.

In this study, four photochemical modeling systems, based on two photochemical models, the Community Multiscale Air Quality model and the Comprehensive Air Quality Model with Extensions, and two emissions processing models, the Sparse Matrix Operator Kernel Emissions system and the Emissions Modeling System, were applied to the eastern United States, with emphasis on the northeastern Ozone Transport Region, to examine the response of estimated design values to controls focused on oxides of nitrogen and volatile organic compounds. With the 8-hr O3 NAAQS set as a bright-line test, it was found that a given area could be termed as being in or out of attainment of the NAAQS depending upon the modeling system. This suggests the need to provide an estimate of model-to-model uncertainty in the relative reduction factor (RRF) for a better understanding of the uncertainty in projecting the status of an area's attainment. Results indicate that the model-to-model differences considered in this study introduce an uncertainty in the future estimated design value of ~3–5 ppb.

11.
ABSTRACT

Photochemical air quality simulation models are now used widely in evaluating the merits of alternative emissions control strategies on spatial scales from metropolitan to sub-continental. Greatly varying levels of resources have been available to support modeling, from relatively comprehensive databases and evaluation of performance to a paucity of aerometric data for developing model inputs. Where data are sparse, many alternative outcomes are consistent with the knowledge at hand. Where performance evaluation is inadequately supported, the probability of error may be high. In each instance, uncertainties may be large when compared with the signal of interest, and thus confidence in the reliability of the model as an estimator of future air quality may come into question.

This paper proposes a qualitative procedure for assessing whether a particular application of a modeling system is likely to be potentially unreliable, suggesting that either (1) modification and further evaluation are needed, if supportable, prior to adoption for regulatory application; or (2) the model should not be used if improvement is not supportable. The procedure is proposed for use by policy-makers, staffs of public agencies, air quality managers, environmental staffs of industrial organizations, and other interested parties. The proposed use of the procedure is (1) to assess, a priori, whether a proposed application is likely to be judged questionable or unacceptably uncertain in outcome; and (2) to provide, a posteriori, a basis for judging quickly the likely quality of model performance. The procedure is presented with tropospheric ozone as the pollutant of concern. With adjustments, however, the procedure should be applicable to particulate matter and other pollutants of interest.

12.
For a large-scale, unanticipated release of a toxic chemical into the atmosphere, it is recommended for nearby populations to shelter indoors. Two new metrics to quantify the community-scale effectiveness of shelter-in-place (SIP) are introduced. The casualty reduction factor (CRF) quantifies the expected reduction in casualties if SIP is performed. The safety-factor multiplier (SFM) quantifies the extent of toxic-load reduction for individuals in each exposed building. In this paper, idealized models are combined to explore the relationships among important input parameters and the SIP-effectiveness metrics. A Gaussian plume model predicts ambient concentrations for a hypothetical release event. A box model predicts indoor concentrations in buildings. A toxic-load model links exposure to health consequences. SIP effectiveness varies significantly with the toxic-load exponent, m, which characterizes the dose–response relationship. Another influential variable is a dimensionless time scale, ξ, equal to the release duration multiplied by the building air-exchange rate. Other factors that influence SIP effectiveness include the magnitude of the release relative to the toxicity of the pollutant, atmospheric transport and dispersion rates, and punctual termination of SIP once the toxic cloud has passed. SIP can be effective for short-duration releases (ξ<1), especially for chemicals with m of 2–3 or higher. If m=1, punctual termination at the end of the event can be important to ensure SIP effectiveness.
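The box-model and toxic-load pieces named in this abstract can be illustrated with a minimal simulation. The sketch below assumes a rectangular outdoor concentration pulse, a single-zone indoor box model dC_in/dt = a(C_out − C_in) with air-exchange rate a, and a toxic load defined as the time integral of C^m; the release parameters, air-exchange rate, and exponent m are illustrative, and the SFM is approximated here simply as the ratio of outdoor to indoor toxic load.

```python
# Minimal sketch of shelter-in-place effectiveness for one building.
# Assumptions: rectangular outdoor pulse, single-zone box model integrated
# with explicit Euler, toxic load TL = sum(C**m) * dt. Numbers are invented.
import numpy as np

def indoor_concentration(c_out, dt, ach_per_hr):
    a = ach_per_hr * dt                # air exchange per time step
    c_in = np.zeros_like(c_out)
    for i in range(1, len(c_out)):
        c_in[i] = c_in[i - 1] + a * (c_out[i - 1] - c_in[i - 1])
    return c_in

dt = 1.0 / 60.0                        # hours per step (1-min resolution)
t = np.arange(0.0, 4.0, dt)            # 4-hour window
c_out = np.where(t < 0.5, 100.0, 0.0)  # 30-min pulse at 100 (arbitrary units)
c_in = indoor_concentration(c_out, dt, ach_per_hr=0.5)   # xi = 0.5 hr * 0.5/hr = 0.25 < 1

m = 2.5                                # assumed toxic-load exponent
tl_out = np.sum(c_out ** m) * dt       # toxic load for a person who stays outdoors
tl_in  = np.sum(c_in ** m) * dt        # toxic load for a person sheltering indoors
print(f"approximate safety-factor multiplier (SFM) ~ {tl_out / tl_in:.1f}")
```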

13.
The current status of the mathematical modeling of atmospheric particulate matter (PM) is reviewed in this paper. Simulating PM requires treating various processes, including the formation of condensable species, the gas/particle partitioning of condensable compounds, and in some cases, the evolution of the particle size distribution. The algorithms available to simulate these processes are reviewed and discussed. Eleven 3-dimensional (3-D) Eulerian air quality models for PM are reviewed in terms of their formulation and past applications. Results of past performance evaluations of 3-D Eulerian PM models are presented. Currently, 24-hr average PM2.5 concentrations appear to be predicted within 50% for urban-scale domains. However, there are compensating errors among individual particulate species. The lowest errors tend to be associated with SO4(2-), while NO3-, black carbon (BC), and organic carbon (OC) typically show larger errors due to uncertainties in emissions inventories and the prediction of the secondary OC fraction. Further improvements and performance evaluations are recommended.

14.
ABSTRACT

Three-dimensional air quality models (AQMs) represent the most powerful tool to follow the dynamics of air pollutants at urban and regional scales. Current AQMs can account for the complex interactions between gas-phase chemistry, aerosol growth, cloud and scavenging processes, and transport. However, errors in model applications still exist due in part to limitations in the models themselves and in part to uncertainties in model inputs. Four-dimensional data assimilation (FDDA) can be used as a top-down tool to validate several of the model inputs, including emissions inventories, based on ambient measurements. Previously, this FDDA technique was used to estimate adjustments in the strength and composition of emissions of gas-phase primary species and O3 precursors.

In this paper, we present an extension to the FDDA technique to incorporate the analysis of particulate matter (PM) and its precursors. The FDDA approach consists of an iterative optimization procedure in which an AQM is coupled to an inverse model, and the emissions are adjusted to minimize the difference between ambient measurements and model-derived concentrations.

15.
Three-dimensional air quality models (AQMs) represent the most powerful tool to follow the dynamics of air pollutants at urban and regional scales. Current AQMs can account for the complex interactions between gas-phase chemistry, aerosol growth, cloud and scavenging processes, and transport. However, errors in model applications still exist due in part to limitations in the models themselves and in part to uncertainties in model inputs. Four-dimensional data assimilation (FDDA) can be used as a top-down tool to validate several of the model inputs, including emissions inventories, based on ambient measurements. Previously, this FDDA technique was used to estimate adjustments in the strength and composition of emissions of gas-phase primary species and O3 precursors. In this paper, we present an extension to the FDDA technique to incorporate the analysis of particulate matter (PM) and its precursors. The FDDA approach consists of an iterative optimization procedure in which an AQM is coupled to an inverse model, and the emissions are adjusted to minimize the difference between ambient measurements and model-derived concentrations. Here, the FDDA technique was applied to two episodes, with the modeling domain covering the eastern United States, to derive emission adjustments of domainwide sources of NOx, volatile organic compounds (VOCs), CO, SO2, NH3, and fine organic aerosol emissions. Ambient measurements used include gas-phase inorganic and organic species and speciated fine PM. Results for the base-case inventories used here indicate that emissions of SO2 and CO appear to be estimated reasonably well (requiring minor revisions), while emissions of NOx, VOC, NH3, and organic PM with aerodynamic diameter less than 2.5 μm (PM2.5) require more significant revision.
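A drastically simplified sketch of the iterative emissions adjustment described above: a fixed linear source-receptor matrix stands in for the coupled AQM, and each source category's scaling factor is updated multiplicatively until modeled concentrations best match the observations (an ISRA-style update). The matrix, emissions, and observations are invented; the real procedure re-runs the air quality model at every iteration.

```python
# Toy inverse adjustment of emissions: scale source categories so that
# forward-model concentrations best match observations. A fixed linear
# source-receptor matrix K replaces the AQM; all values are invented.
import numpy as np

K = np.array([[0.8, 0.1],            # concentration per unit emission
              [0.2, 0.7],            # (3 receptors x 2 source categories)
              [0.5, 0.4]])
e0  = np.array([100.0, 50.0])        # prior (inventory) emissions
obs = np.array([70.0, 55.0, 65.0])   # ambient measurements at the receptors

scale = np.ones(2)
for _ in range(50):                  # multiplicative (ISRA-style) update
    modeled = K @ (scale * e0)
    scale *= (K.T @ obs) / (K.T @ modeled)

print("emission scaling factors:", np.round(scale, 2))
```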

16.
Abstract

The capping of stationary source emissions of NOx in 22 states and the District of Columbia is federally mandated by the NOx SIP Call legislation with the intended purpose of reducing downwind O3 concentrations. Monitors for NO, NO2, and the reactive oxides of nitrogen into which these two compounds are converted will record data to evaluate air quality model (AQM) predictions. Guidelines for testing these models indicate the need for semicontinuous measurements as close to real time as possible but no less frequently than once per hour. The measurement uncertainty required for AQM testing must be less than ±20% (±10% for NO2) at mixing ratios of 1 ppbv and higher for NO, individual NOz component compounds, and NOy. This article is a review and discussion of different monitoring methods, some currently used in research and others used for routine monitoring. The performance of these methods is compared with the monitoring guidelines. Recommendations for advancing speciated and total NOy monitoring technology and a listing of demonstrated monitoring approaches are also presented.  相似文献   

17.

In this study, a multi-level-factorial risk-inference-based possibilistic-probabilistic programming (MRPP) method is proposed for supporting water quality management under multiple uncertainties. The MRPP method can handle uncertainties expressed as fuzzy-random-boundary intervals, probability distributions, and interval numbers, and analyze the effects of uncertainties as well as their interactions on modeling outputs. It is applied to plan water quality management in the Xiangxihe watershed. Results reveal that a lower probability of satisfying the objective function (θ) as well as a higher probability of violating environmental constraints (q_i) would correspond to a higher system benefit with an increased risk of violating system feasibility. Chemical plants are the major contributors to biological oxygen demand (BOD) and total phosphorus (TP) discharges; total nitrogen (TN) would be mainly discharged by crop farming. It is also discovered that optimistic decision makers should pay more attention to the interactions between chemical plants and water supply, while decision makers who possess a risk-averse attitude would focus on the interactive effect of q_i and the benefit of water supply. The findings can help enhance the model's applicability and identify a suitable water quality management policy for environmental sustainability according to the practical situations.


18.
Air toxics emission inventories play an important role in air quality regulatory activities. Recently, Minnesota Pollution Control Agency (MPCA) staff compiled a comprehensive air toxics emission inventory for 1996. While acquiring data on the mass of emissions is a necessary first step, equally important is developing information on the potential toxicity of the emitted pollutants. To account for the toxicity of the pollutants in the emission inventory, inhalation health benchmarks for acute effects, chronic effects, and cancer were used to weight the mass of emissions. The 1996 Minnesota emissions inventory results were ranked by mass of emissions and by an index consisting of emissions divided by health benchmarks. The results show that six of the eight pollutants ranked highest by toxicity were also the pollutants of concern indicated in environmental monitoring and modeling data. Monitoring data and modeling results did not show high impacts for the other two pollutants identified by the toxicity-based emission ranking method. The biggest limitation of this method is the lack of health benchmark values for many pollutants. Despite uncertainties and limited information, this analysis provides useful information for further targeting pollutants and source categories for control.
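The toxicity-weighted ranking described above reduces to dividing each pollutant's emitted mass by its inhalation health benchmark and sorting on the quotient. The sketch below illustrates this with placeholder pollutants, masses, and benchmark values (not MPCA's 1996 inventory); note how a high-tonnage, low-toxicity pollutant drops in rank.

```python
# Toxicity-weighted ranking sketch: index = emitted mass / health benchmark.
# Pollutants, masses, and benchmark values are placeholders for illustration.
emissions_tons = {"formaldehyde": 1200.0, "benzene": 900.0, "toluene": 15000.0}
benchmark      = {"formaldehyde": 9.8,    "benzene": 30.0,  "toluene": 5000.0}  # illustrative units

toxicity_index = {p: emissions_tons[p] / benchmark[p] for p in emissions_tons}
for pollutant, index in sorted(toxicity_index.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{pollutant:>12s}: index = {index:8.1f}")
# Toluene has the largest mass but the smallest toxicity-weighted index.
```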

19.
Abstract

Understanding ozone response to its precursor emissions is crucial for effective air quality management practices. This nonlinear response is usually simulated using chemical transport models, and the modeling results are affected by uncertainties in emissions inputs. In this study, a high ozone episode in the southeastern United States is simulated using the Community Multiscale Air Quality (CMAQ) model. Uncertainties in ozone formation and response to emissions controls due to uncertainties in emission rates are quantified using the Monte Carlo method. Instead of propagating emissions uncertainties through the original CMAQ, a reduced form of CMAQ is formulated using directly calculated first- and second-order sensitivities that capture the nonlinear ozone concentration-emission responses. This modification greatly reduces the associated computational cost. Quantified uncertainties in modeled ozone concentrations and responses to various emissions controls are much less than the uncertainties in emissions inputs. Average uncertainties in modeled ozone concentrations for the Atlanta area are less than 10% (as measured by the inferred coefficient of variance [ICOV]) even when emissions uncertainties are assumed to vary between a factor of 1.5 and 2. Uncertainties in the ozone responses generally decrease with increased emission controls. Average uncertainties (ICOV) in emission-normalized ozone responses range from 4 to 22%, with the smaller values associated with controls on the relatively certain point-source nitrogen oxide (NOx) emissions and the larger values resulting from controls on the less certain mobile-source NOx emissions. These small uncertainties provide confidence in model applications such as performance evaluation, attainment demonstration, and control strategy development.
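A hedged sketch of the reduced-form approach described above: ozone response to a fractional emissions perturbation is approximated with a second-order Taylor expansion built from directly calculated sensitivities, and emissions uncertainty is propagated by Monte Carlo sampling. The base concentration, sensitivity coefficients, lognormal uncertainty factor, and the std/mean definition of ICOV used here are all assumptions for illustration.

```python
# Reduced-form Monte Carlo sketch: propagate a lognormal emissions
# uncertainty through a second-order Taylor response surface instead of
# re-running the full model. All coefficients are invented.
import numpy as np

rng = np.random.default_rng(0)

c_base = 95.0    # base-case 8-hr O3, ppb
s1 = 15.0        # first-order sensitivity (ppb per unit fractional emissions change)
s2 = -6.0        # second-order sensitivity (ppb per unit^2), captures nonlinearity

def reduced_form_o3(delta_e):
    """Second-order Taylor approximation of the O3 response."""
    return c_base + s1 * delta_e + 0.5 * s2 * delta_e ** 2

# Emissions uncertainty factor of ~1.5 treated as a lognormal multiplier.
sigma = np.log(1.5)
factors = rng.lognormal(mean=0.0, sigma=sigma, size=10_000)
o3_samples = reduced_form_o3(factors - 1.0)       # delta_e = (perturbed/base) - 1

icov = np.std(o3_samples) / np.mean(o3_samples)   # assumed definition of ICOV
print(f"mean O3 = {np.mean(o3_samples):.1f} ppb, ICOV = {icov:.1%}")
```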

20.
ABSTRACT

The current status of the mathematical modeling of atmospheric particulate matter (PM) is reviewed in this paper. Simulating PM requires treating various processes, including the formation of condensable species, the gas/particle partitioning of condensable compounds, and in some cases, the evolution of the particle size distribution. The algorithms available to simulate these processes are reviewed and discussed. Eleven 3-dimensional (3-D) Eulerian air quality models for PM are reviewed in terms of their formulation and past applications. Results of past performance evaluations of 3-D Eulerian PM models are presented. Currently, 24-hr average PM2.5 concentrations appear to be predicted within 50% for urban-scale domains. However, there are compensating errors among individual particulate species. The lowest errors tend to be associated with SO4(2-), while NO3-, black carbon (BC), and organic carbon (OC) typically show larger errors due to uncertainties in emissions inventories and the prediction of the secondary OC fraction. Further improvements and performance evaluations are recommended.
