Similar Articles
1.
The Danish Meteorological Institute (DMI) has developed an operational forecasting system for ozone concentrations in the Atmospheric Boundary Layer; this system is called the Danish Atmospheric Chemistry FOrecasting System (DACFOS). At specific sites where real-time ozone concentration measurements are available, a statistical after-treatment of DACFOS' results adjusts the next 48 h ozone forecasts. This post-processing of DACFOS' forecasts is based on an adaptive linear regression model using an optimal state estimator algorithm. The regression analysis uses different linear combinations of meteorological parameters (such as temperature, wind speed, surface heat flux and atmospheric boundary layer height) supplied by the Numerical Weather Prediction model DMI-HIRLAM. Several regressions have been tested for six monitoring stations in Denmark and in England, and four of the linear combinations have been selected for use in an automatic forecasting system. A statistical study comparing observations and forecasts shows that this system yields higher correlation coefficients as well as smaller biases and RMSE values than DACFOS; the post-processing thus improves DACFOS' forecasts. The system has been operational since June 1998 at DMI's monitoring station in the north of Copenhagen, for which a new ozone forecast is presented every 6 h on DMI's public website.
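As a rough illustration of the adaptive-regression post-processing described above, the sketch below applies a recursive least-squares update (one simple form of optimal state estimation) to adjust a raw ozone forecast with a few meteorological predictors. The predictor set, the forgetting factor, and the synthetic data are illustrative assumptions, not DACFOS internals.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.98):
    """One recursive-least-squares step: update coefficients theta and
    covariance P from predictor vector x and observed ozone y.
    lam is a forgetting factor that lets the coefficients adapt over time."""
    x = x.reshape(-1, 1)
    gain = P @ x / (lam + float(x.T @ P @ x))   # Kalman-like gain vector
    err = y - float(x.T @ theta)                # innovation: obs minus current fit
    theta = theta + gain * err
    P = (P - gain @ x.T @ P) / lam
    return theta, P

# Illustrative predictors: raw model ozone forecast, temperature, wind speed.
rng = np.random.default_rng(0)
n, p = 200, 3
X = np.column_stack([rng.uniform(40, 120, n),   # raw model forecast (ppb)
                     rng.uniform(10, 30, n),    # temperature (deg C)
                     rng.uniform(1, 8, n)])     # wind speed (m/s)
obs = 0.8 * X[:, 0] + 1.5 * X[:, 1] - 2.0 * X[:, 2] + rng.normal(0, 5, n)

theta, P = np.zeros((p, 1)), np.eye(p) * 1e3
for t in range(n):
    adjusted = float(X[t] @ theta)              # post-processed forecast for day t
    theta, P = rls_update(theta, P, X[t], obs[t])
```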

2.
In this study, the particulate matter (with an aerodynamic diameter <10 μm; PM10) profile of Turkey was characterized using data from the air quality monitoring stations located throughout the country. The number of stations (119) was reduced to 55 after a missing data treatment for statistical analyses. First, a classification method was developed based on ongoing national and international (European Commission directives) legislation to categorize air zones into six groups, from a “Very Clear Air Zone” to a “Polluted Air Zone.” Then, a Geographic Information System (GIS)-based interpolation technique and statistical analyses (correlation analysis and factor analysis) were used to generate PM10 pollution profiles of the annual heating time and nonheating time periods. Finally, the coherent air pollution management zones of Turkey, based on air quality criteria and measured data using a GIS-based model supported by statistical analyses, were suggested. Based on the analysis, four hot spots were identified: (i) the eastern part of the Black Sea region; (ii) the northeastern part of inland Anatolia; (iii) the western part of Northeastern Anatolia; and (iv) the eastern part of Turkey. The possible reasons for the elevated PM10 levels are discussed using topographic, climatologic, land use, and energy utilization parameters. Finally, the suggested air zones were compared with the administrative air zones, which were newly developed by the Turkish Ministry of Environment and Forestry, to evaluate the level of agreement between the two.

Implications: The evaluation of air quality profiles of specific regions is important in the development and/or application of an effective air quality management strategy. Factor analysis (FA), together with correlation analysis (CA), provides useful information to classify air pollution management areas over regional networks that have historical time-series air quality data. In this study, which relied on an FA- and CA-based methodology, coherent air pollution management zones for Turkey were suggested using a GIS-based model. Policy makers and scientists can use these suggested zones to construct better air quality management strategies.
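A minimal sketch of the correlation-analysis and factor-analysis step is given below, assuming a days-by-stations matrix of PM10 measurements; the synthetic data, the number of factors, and the crude zone assignment are illustrative choices, not the study's actual settings.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical layout: rows are days, columns are monitoring stations (PM10, ug/m3).
rng = np.random.default_rng(1)
n_days, n_stations = 365, 55
pm10 = rng.lognormal(mean=3.5, sigma=0.4, size=(n_days, n_stations))

# Correlation analysis: pairwise station correlations over the common record.
corr = np.corrcoef(pm10, rowvar=False)

# Factor analysis: a few latent factors summarize co-varying station groups;
# stations loading on the same factor are candidates for one management zone.
standardized = (pm10 - pm10.mean(axis=0)) / pm10.std(axis=0)
fa = FactorAnalysis(n_components=4, random_state=0).fit(standardized)
loadings = fa.components_.T                  # shape: (n_stations, n_factors)
zone = np.abs(loadings).argmax(axis=1)       # crude zone label per station
print(zone)
```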

3.
Abstract

The National Oceanic and Atmospheric Administration recently sponsored the New England Forecasting Pilot Program to serve as a “test bed” for chemical forecasting by providing all of the elements of a National Air Quality Forecasting System, including the development and implementation of an evaluation protocol. This Pilot Program enlisted three regional-scale air quality models, serving as prototypes, to forecast ozone (O3) concentrations across the northeastern United States during the summer of 2002. A suite of statistical metrics was identified as part of the protocol that facilitated evaluation of both discrete forecasts (observed versus modeled concentrations) and categorical forecasts (observed versus modeled exceedances/nonexceedances) for both the maximum 1-hr (125 ppb) and 8-hr (85 ppb) forecasts produced by each of the models. Implementation of the evaluation protocol took place during a 25-day period (August 5–29), utilizing hourly O3 concentration data obtained from more than 450 monitors in the U.S. Environmental Protection Agency’s Air Quality System network.
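The kind of discrete and categorical metrics such a protocol relies on can be computed as sketched below; the helper function and the toy data are assumptions, with the 85 ppb 8-hr threshold taken from the abstract.

```python
import numpy as np

def verify(obs, fcst, threshold):
    """Discrete (bias, RMSE, correlation) and categorical (exceedance)
    skill measures of the kind used in a forecast evaluation protocol."""
    obs, fcst = np.asarray(obs, float), np.asarray(fcst, float)
    bias = np.mean(fcst - obs)
    rmse = np.sqrt(np.mean((fcst - obs) ** 2))
    r = np.corrcoef(obs, fcst)[0, 1]
    # 2x2 contingency table for exceedance / non-exceedance forecasts
    hits = np.sum((obs >= threshold) & (fcst >= threshold))
    misses = np.sum((obs >= threshold) & (fcst < threshold))
    false_alarms = np.sum((obs < threshold) & (fcst >= threshold))
    pod = hits / (hits + misses) if hits + misses else np.nan            # detection rate
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else np.nan  # false alarm ratio
    csi = hits / (hits + misses + false_alarms) if hits + misses + false_alarms else np.nan
    return dict(bias=bias, rmse=rmse, r=r, pod=pod, far=far, csi=csi)

# Example with made-up daily 8-hr maxima (ppb) against the 85 ppb standard.
obs = [62, 91, 78, 88, 70, 95, 60]
fcst = [65, 84, 80, 90, 74, 97, 58]
print(verify(obs, fcst, threshold=85))
```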

4.
A simplified hybrid statistical-deterministic chemistry-transport model is used in real time for the prediction of ozone in the area of Paris during summer 1999. We present here a statistical validation of this experiment. We distinguish the forecasts in the urban area from forecasts in the pollution plume downwind of the city. The validation of model forecasts, up to 3 days ahead, is performed against ground-based observations within and up to 50 km outside of Paris. In the urban area, ozone levels are fairly well forecast, with correlation coefficients between forecasts and observations ranging between 0.7 and 0.8 and root mean square errors in the range 15–20 μg m−3 at short lead times. While the bias of the urban forecasts is very low, the largest peaks are somewhat underestimated. The ozone plume amplitude is generally well reproduced, even at long lead times (root mean square errors of about 20–30 μg m−3), while the direction of the plume is only captured at short lead times (about 70% of the time). The model has difficulties in forecasting the direction of the plume under stagnant weather conditions. We estimate the model's ability to forecast concentrations above 180 μg m−3, which are of practical relevance to air quality managers. It is found that about 60% of these events are well forecast, even at long lead times, while the exact monitoring station where the exceedance is observed can only be forecast at short lead times. Finally, we found that about half of the forecast error is due to the error in the estimation of the boundary conditions, which are forecast by a simple linear regression model here.

5.
Visibility degradation, one of the most noticeable indicators of poor air quality, can occur despite relatively low levels of particulate matter when the risk to human health is low. The availability of timely and reliable visibility forecasts can provide a more comprehensive understanding of the anticipated air quality conditions to better inform local jurisdictions and the public. This paper describes the development of a visibility forecasting modeling framework, which leverages the existing air quality and meteorological forecasts from Canada’s operational Regional Air Quality Deterministic Prediction System (RAQDPS) for the Lower Fraser Valley of British Columbia. A baseline model (GM-IMPROVE) was constructed using the revised IMPROVE algorithm based on unprocessed forecasts from the RAQDPS. Three additional prototypes (UMOS-HYB, GM-MLR, GM-RF) were also developed and assessed for forecast performance of up to 48 hr lead time during various air quality and meteorological conditions. Forecast performance was assessed by examining their ability to provide both numerical and categorical forecasts in the form of 1-hr total extinction and Visual Air Quality Ratings (VAQR), respectively. While GM-IMPROVE generally overestimated extinction more than twofold, it had skill in forecasting the relative species contribution to visibility impairment, including ammonium sulfate and ammonium nitrate. Both statistical prototypes, GM-MLR and GM-RF, performed well in forecasting 1-hr extinction during daylight hours, with correlation coefficients (R) ranging from 0.59 to 0.77. UMOS-HYB, a prototype based on postprocessed air quality forecasts without additional statistical modeling, provided reasonable forecasts during most daylight hours. In terms of categorical forecasts, the best prototype was approximately 75 to 87% correct, when forecasting for a condensed three-category VAQR. A case study, focusing on a poor visual air quality yet low Air Quality Health Index episode, illustrated that the statistical prototypes were able to provide timely and skillful visibility forecasts with lead time up to 48 hr.

Implications: This study describes the development of a visibility forecasting modeling framework, which leverages the existing air quality and meteorological forecasts from Canada’s operational Regional Air Quality Deterministic Prediction System. The main applications include tourism and recreation planning, input into air quality management programs, and educational outreach. Visibility forecasts, when supplemented with the existing air quality and health-based forecasts, can assist jurisdictions in anticipating the visual air quality impacts as perceived by the public, which can potentially assist in formulating the appropriate air quality bulletins and recommendations.
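For orientation, the original IMPROVE extinction equation is sketched below as a simple stand-in; the GM-IMPROVE baseline in the study uses the revised algorithm, which splits hygroscopic species into small and large size modes with separate humidity growth curves, so the coefficients and the single f(RH) factor here are simplifying assumptions.

```python
def improve_extinction(amm_so4, amm_no3, omc, ec, soil, coarse, f_rh=3.0):
    """Light extinction (Mm^-1) from speciated PM using the *original* IMPROVE
    formula as an illustration (not the revised algorithm used in the paper).
    Concentrations in ug/m3; f_rh is the relative-humidity growth factor."""
    rayleigh = 10.0                                  # nominal Rayleigh scattering
    return (3.0 * f_rh * (amm_so4 + amm_no3)
            + 4.0 * omc + 10.0 * ec
            + 1.0 * soil + 0.6 * coarse + rayleigh)

# Illustrative speciated composition (all values are made up):
b_ext = improve_extinction(amm_so4=1.5, amm_no3=3.0, omc=4.0, ec=0.5,
                           soil=0.8, coarse=5.0, f_rh=2.4)
print(f"total extinction ~ {b_ext:.0f} Mm-1")
```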


6.
Air quality forecasting is a recent development, with most programs initiated only in the last 20 years. During the last decade, the forecast preparation procedure—the forecast rote—has changed dramatically. This paper summarizes the unique challenges posed by air quality forecasting, details the current forecast rote, and analyzes prospects for future improvements. Because air quality forecasts must diagnose and predict several pollutants and their precursors in addition to standard meteorological variables, it is, compared with weather forecasts, a higher-uncertainty forecast. Forecasters seek to contain the uncertainty by “anchoring” the forecast, using an a priori field, and then “adjusting” the forecast using additional information. The air quality a priori, or first guess, field is a blend of past, current, and near-term future observations of the pollutants of interest, on both local and regional scales, and is typically coupled with predicted air parcel trajectories. Until recently, statistical methods, based on long-term training data sets, were used to adjust the first guess. However, reductions in precursor emissions in the United States, beginning in the late 1990s and continuing to the present, eroded the stationarity assumption for the training data sets and degraded forecast skill. Beginning in the mid-2000s, output from modified numerical air quality prediction (NAQP) models, originally developed to test pollution control strategies, became available in near real time for forecast support. The current adjustment process begins with the analyses and postprocessing of individual NAQP models and their ad hoc ensembles, often in concert with new statistical techniques. The final adjustment step uses forecaster expertise to assess the impact of mesoscale features not resolved by the NAQP models. It is expected that advances in model resolution, chemical data assimilation, and the formulation of emissions fields will improve mesoscale predictions by NAQP models and drive future changes in the forecast rote.

Implications: Routine air quality forecasts are now issued for nearly all the major U.S. metropolitan areas. Methods of forecast preparation—the forecast rote—have changed significantly in the last decade. Numerical air quality models have matured and are now an indispensable part of the forecasting process. All forecasting methods, particularly statistically based models, must be continually calibrated to account for ongoing local- and regional-scale emission reductions.


8.
Abstract

Airborne fine particle sulfur data from the summer intensive of Project MOHAVE (Measurement of Haze and Visual Effects) were analyzed with the Receptor Model Applied to Patterns in Space (RMAPS), a novel multivariate receptor-oriented model that applies to secondary and primary species. The sulfur data from 17 sites were found to be well predicted by three spatial patterns interpreted as sources along the valley of the Colorado River; transport from sources located to the southwest; and transport from sources located to the southeast. The model was tested by using parameters derived from the 17-site data set to apportion sulfur for six sites that were not part of the original data set. The sulfur apportionment for these six sites was in agreement with the original apportionment and the physical interpretation of the spatial patterns given above. The effects of systematic and random error on the sulfur apportionment were estimated. The amount of sulfur associated with the Colorado River valley sources was rather insensitive to both types of error. For the two sites in the Grand Canyon National Park, the fraction of total particulate sulfur from the Colorado River valley source is estimated to be in the range of 27-65% at Meadview and 11-28% at Hopi Point.
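RMAPS itself is not described here in enough detail to reproduce, so the sketch below uses a generic non-negative factorization of a periods-by-sites sulfur matrix to illustrate the underlying receptor-model idea of decomposing observations into a few spatial patterns; the data, the three-pattern choice, and the apportionment step are illustrative assumptions, not the RMAPS algorithm.

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical matrix: rows are sampling periods, columns are the 17 sites;
# entries are particulate sulfur concentrations (ng/m3). This is *not* RMAPS,
# just a generic pattern decomposition to illustrate the idea.
rng = np.random.default_rng(2)
S = rng.gamma(shape=2.0, scale=50.0, size=(120, 17))

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(S)     # time-varying strength of each pattern
H = model.components_          # three spatial patterns across the 17 sites

# Apportionment at one site: mean contribution of each pattern to its sulfur.
site = 0
contrib = (W * H[:, site]).mean(axis=0)
print(contrib / contrib.sum())
```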

9.
There is an urgent need to provide accurate air quality information and forecasts to the general public and environmental health decision-makers. This paper develops a hierarchical space–time model for daily 8-h maximum ozone concentration (O3) data covering much of the eastern United States. The model combines observed data and forecast output from a computer simulation model known as the Eta Community Multi-scale Air Quality (CMAQ) forecast model in a very flexible, yet computationally fast way, so that next-day forecasts can be computed in real-time operational mode. The model adjusts for spatio-temporal biases in the Eta CMAQ forecasts and avoids a change-of-support problem often encountered in data fusion settings where real data have been observed at point-level monitoring sites, but the forecasts from the computer model are provided at the grid-cell level. The model is validated with a large amount of set-aside data and is shown to provide much improved forecasts of daily O3 concentrations in the eastern United States.
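The hierarchical space–time model in the paper is far richer than this, but the simplest form of the bias-adjustment idea — calibrating Eta CMAQ output against monitor observations and applying the fitted relation to the next forecast — can be sketched as follows; all numbers are synthetic.

```python
import numpy as np

# Hypothetical paired series at one monitor: Eta CMAQ next-day forecasts of
# daily 8-h max ozone (ppb) and the corresponding observations.
rng = np.random.default_rng(3)
cmaq = rng.uniform(30, 110, 300)
obs = 0.85 * cmaq + 8.0 + rng.normal(0, 7, 300)   # synthetic "truth"

# Simplest possible fusion: regress obs on the raw forecast, then apply the
# fitted relation to tomorrow's CMAQ value to get a bias-adjusted forecast.
slope, intercept = np.polyfit(cmaq, obs, 1)
tomorrow_raw = 95.0
tomorrow_adjusted = slope * tomorrow_raw + intercept
print(f"raw {tomorrow_raw:.0f} ppb -> adjusted {tomorrow_adjusted:.1f} ppb")
```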

10.
Abstract

Many large metropolitan areas experience elevated concentrations of ground-level ozone pollution during the summertime “smog season”. Local environmental or health agencies often need to make daily air pollution forecasts for public advisories and for input into decisions regarding abatement measures and air quality management. Such forecasts are usually based on statistical relationships between weather conditions and ambient air pollution concentrations. Multivariate linear regression models have been widely used for this purpose, and well-specified regressions can provide reasonable results. However, pollution-weather relationships are typically complex and nonlinear—especially for ozone—and these properties might be better captured by neural networks. This study investigates the potential for using neural networks to forecast ozone pollution, as compared to traditional regression models. Multiple regression models and neural networks are examined for a range of cities under different climate and ozone regimes, enabling a comparative study of the two approaches. Model comparison statistics indicate that neural network techniques are somewhat (but not dramatically) better than regression models for daily ozone prediction, and that all types of models are sensitive to different weather-ozone regimes and the role of persistence in aiding predictions.
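A side-by-side comparison of the two approaches can be set up as in the sketch below, using ordinary least squares against a small multilayer perceptron; the predictor set, network size, and synthetic data are illustrative assumptions rather than the study's configuration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical daily predictors: max temperature, wind speed, relative humidity,
# and yesterday's peak ozone (persistence); target is today's peak ozone (ppb).
rng = np.random.default_rng(4)
n = 600
X = np.column_stack([rng.uniform(15, 40, n), rng.uniform(0.5, 8, n),
                     rng.uniform(20, 95, n), rng.uniform(20, 120, n)])
y = (2.5 * X[:, 0] - 4.0 * X[:, 1] - 0.2 * X[:, 2] + 0.3 * X[:, 3]
     + 0.05 * X[:, 0] ** 2 + rng.normal(0, 8, n))   # mildly nonlinear response

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

mlr = LinearRegression().fit(X_tr, y_tr)
mlp = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                 random_state=0)).fit(X_tr, y_tr)

print("MLR MAE:", mean_absolute_error(y_te, mlr.predict(X_te)))
print("MLP MAE:", mean_absolute_error(y_te, mlp.predict(X_te)))
```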

11.
The objective of this research was to develop a statistical model to predict one day in advance both the maximum and 8 h (10 am–5 pm) average ozone for Houston (TX). A loess/generalized additive model (GAM) approach was taken to model development. Ozone data (1983–1991) from ten stations in the immediate Houston area were used in the study. The meteorological data came from the Houston International Airport. The models were developed using data for April through October for 1983–1987 and 1989–1990. Forecasts were developed for 1988 and 1991. The final model, which was multiplicative in nature, contained three interaction terms for the west/east and south/north wind components (average of hourly values from 8 pm to 5 am, 6 am to 9 am, and 10 am to 5 pm). Opaque cloud cover (averaged over the period 10 am to 5 pm), yesterday’s maximum ozone, today’s maximum temperature and morning mixing depth were also important variables in the model. Individual forecasts were generated for all ten stations in the Houston area using observed meteorology. In addition forecasts were produced for three measures of the network as a whole. The root-mean-square prediction error for the 8 h average forecasts ranged from 13.2 to 16.3 ppb (with R2 ranging from 0.66 to 0.73) for the individual stations and from 18.5 to 22.0 ppb (with R2 ranging from 0.61 to 0.68) for maximum ozone. A detailed examination was undertaken for a day on which the forecast was much too low.
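A loess/GAM-style fit of this kind can be sketched with the pyGAM package as below; fitting on the log of ozone mimics the multiplicative structure mentioned above, but the predictor list, the tensor interaction term, and the synthetic data are illustrative assumptions, not the published model.

```python
import numpy as np
from pygam import LinearGAM, s, te   # pip install pygam

# Hypothetical predictors loosely mirroring the abstract: afternoon west/east
# and south/north wind components, opaque cloud cover, yesterday's max ozone,
# and today's max temperature; the target is log of the 8-h average ozone, so
# the fitted smooth terms combine multiplicatively on the original scale.
rng = np.random.default_rng(5)
n = 800
X = np.column_stack([rng.normal(0, 3, n),        # u wind component (m/s)
                     rng.normal(0, 3, n),        # v wind component (m/s)
                     rng.uniform(0, 1, n),       # opaque cloud fraction
                     rng.uniform(20, 120, n),    # yesterday's max O3 (ppb)
                     rng.uniform(20, 40, n)])    # max temperature (C)
log_o3 = (0.01 * X[:, 3] + 0.04 * X[:, 4] - 0.5 * X[:, 2]
          - 0.02 * X[:, 0] * X[:, 1] + 3.0 + rng.normal(0, 0.15, n))

gam = LinearGAM(te(0, 1) + s(2) + s(3) + s(4)).fit(X, log_o3)
forecast_ppb = np.exp(gam.predict(X[:5]))        # back-transform to ppb
print(forecast_ppb)
```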

12.
This study was undertaken to identify seasonal and source effects on the particulate contaminants of the New York City atmosphere and ultimately to relate the concentrations of these contaminants to the tissue concentrations in residents of New York City. Continual weekly samples of particulates have been collected at three stations in the New York area on 8 by 10 in. glass fiber filters at a flow rate of 20 cfm.

The sample is ashed with a Tracerlab Low Temperature Asher and leached with nitric acid. Metals analyzed by the Atomic Absorption method include Pb, V, Cd, Cr, Cu, Mn, Ni, and Zn. Lead-210, total particulate, and benzene and acetone soluble organic material are also determined.

The data have been related to various meteorological parameters over a one-year period to define significant seasonal and source influences, as well as site-to-site variations. Very significant inverse correlations to temperature are obtained for suspended particulates, vanadium, and nickel at both Manhattan and Bronx sites. Particulates show a less significant inverse correlation to temperature in lower Manhattan. Oil-fired space heating sources appear to account for as much as 50% of the particulates in the Bronx at the peak demand period.

Lead, copper, and cadmium show a general inverse correlation to average wind speed, and a direct correlation to temperature. The latter is most likely due to an inverse relation between wind speed and temperature. The heating season input for particulates, vanadium, and nickel is so great as to overcome most of the dilution effect due to winds. The other elements, which have more constant, nonseasonal inputs, definitely reflect the effects of the wind.

The most significant site effect occurs with cadmium, which has a concentration in lower Manhattan three times that of the Bronx over a period of six to seven months in the summer and fall. The differences observed for cadmium and particulates may be explained by emission source factors which have not as yet been studied.

13.
The contribution of ZAMG to MONARPOP consists of special weather forecasts to control the SOCs sampling procedure and of the analysis of the specific transport processes for SOCs, which is still in progress. In this paper, air pollutant transport into the Alps is demonstrated by examples of inorganic pollutants: measurements of NOx and ozone provide evidence for air pollutant transport by local wind systems (valley and slope winds), especially at low-elevation sites of the Alps. In addition, trajectory analyses for the high-elevation sites demonstrate the importance of large-scale synoptic air pollutant transport. The effects of these transport processes with different spatial and temporal scales are governed by the physical and chemical properties of the particular pollutant. First results for the high alpine MONARPOP stations show that air masses from eastern Europe mostly influence Sonnblick (Austria), whereas the influence of the Po basin is strongest at Weissfluhjoch (Switzerland).

14.
This two-part paper reports on the development, implementation, and improvement of a version of the Community Multi-Scale Air Quality (CMAQ) model that assimilates real-time remotely-sensed aerosol optical depth (AOD) information and ground-based PM2.5 monitor data in routine prognostic application. The model is being used by operational air quality forecasters to help guide their daily issuance of state or local-agency-based air quality alerts (e.g., action days, health advisories). Part 1 describes the development and testing of the initial assimilation capability, which was implemented offline in partnership with NASA and the Visibility Improvement State and Tribal Association of the Southeast (VISTAS) Regional Planning Organization (RPO). In the initial effort, MODIS-derived AOD data are input into a variational data-assimilation scheme using both the traditional Dark Target and the relatively new “Deep Blue” retrieval methods. Evaluation of the developmental offline version, reported in Part 1 here, showed sufficient promise to implement the capability within the online, prognostic operational model described in Part 2. In Part 2, the addition of real-time surface PM2.5 monitoring data to improve the assimilation and an initial evaluation of the prognostic modeling system across the continental United States (CONUS) are presented.

Implications: Air quality forecasts are now routinely used to understand when air pollution may reach unhealthy levels. For the first time, an operational air quality forecast model that includes the assimilation of remotely-sensed aerosol optical depth and ground-based PM2.5 observations is being used. The assimilation enables quantifiable improvements in model forecast skill, which improves confidence in the accuracy of the officially-issued forecasts. This helps air quality stakeholders be more effective in taking mitigating actions (reducing power consumption, ride-sharing, etc.) and avoiding exposures that could otherwise result in more serious air quality episodes or more deleterious health effects.
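The operational scheme is variational and multidimensional; the toy single-observation optimal-interpolation update below is only meant to show the generic assimilation step (analysis = background + gain × innovation) on a 1-D grid, with the error variances, the Gaussian correlation, and the data all invented for illustration.

```python
import numpy as np

def oi_update(background, obs, obs_idx, obs_err_var, bg_err_var, length_scale, x):
    """Single-observation optimal-interpolation analysis on a 1-D grid:
    analysis = background + K (obs - H background), with a Gaussian
    background-error correlation of the given length scale."""
    h_bg = background[obs_idx]                       # model value at the obs point
    corr = np.exp(-0.5 * ((x - x[obs_idx]) / length_scale) ** 2)
    bht = bg_err_var * corr                          # B H^T for one observation
    hbht = bg_err_var                                # H B H^T (scalar)
    gain = bht / (hbht + obs_err_var)                # Kalman-type gain
    return background + gain * (obs - h_bg)

x = np.linspace(0, 500, 101)                         # grid coordinate (km)
background = np.full_like(x, 12.0)                   # first-guess PM2.5 (ug/m3)
analysis = oi_update(background, obs=25.0, obs_idx=50,
                     obs_err_var=4.0, bg_err_var=9.0, length_scale=60.0, x=x)
print(analysis[45:56])
```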

15.
Abstract

A time series approach using autoregressive integrated moving average (ARIMA) modeling has been used in this study to obtain maximum daily surface ozone (O3) concentration forecasts. The order of the fitted ARIMA model is found to be (1,0,1) for the surface O3 data collected at the airport in Brunei Darussalam during the period July 1998–March 1999. The model forecasts of one-day-ahead maximum O3 concentrations have been found to be reasonably close to the observed concentrations. The model performance has been evaluated on the basis of certain commonly used statistical measures. The overall model performance is found to be quite satisfactory, as indicated by values of the Fractional Bias, Normalized Mean Square Error, and Mean Absolute Percentage Error of 0.025, 0.02, and 13.14%, respectively.
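A rolling one-day-ahead ARIMA(1,0,1) forecast with the three reported performance measures can be reproduced in outline as below, using statsmodels; the ozone series is simulated, so the metric values are illustrative only.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical daily maximum ozone series (ppb); order (1,0,1) follows the abstract.
rng = np.random.default_rng(6)
o3 = 30 + np.cumsum(rng.normal(0, 1, 250)) * 0.2 + rng.normal(0, 4, 250)
o3 = np.clip(o3, 5, None)

train, test = o3[:-30], o3[-30:]
history, preds = list(train), []
for actual in test:                       # rolling one-day-ahead forecasts
    fit = ARIMA(history, order=(1, 0, 1)).fit()
    preds.append(fit.forecast(1)[0])
    history.append(actual)

preds, test = np.array(preds), np.array(test)
fb = 2 * (preds.mean() - test.mean()) / (preds.mean() + test.mean())   # fractional bias
nmse = np.mean((preds - test) ** 2) / (preds.mean() * test.mean())     # normalized MSE
mape = np.mean(np.abs((preds - test) / test)) * 100                    # percent error
print(fb, nmse, mape)
```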

16.

Two experiments were conducted in male Sprague Dawley (SD) rats (175–200 g) to determine changes in the activities of the endogenous antioxidants superoxide dismutase (SOD), glutathione peroxidase (GPX), cytochrome P450 (ethoxyresorufin deethylase; EROD) and concentrations of glutathione (GSH) in the blood, liver, and small intestinal mucosa (IM). In both experiments, six rats/group were fed diets based on the AIN-93M diet (Control) or the same modified to contain either 500 mg calcium (Low Ca), 7 mg zinc (Low Zn), 2 mg copper (Low Cu), 60 mg zinc (High Zn) or 12 mg copper (High Cu) in the following combination: Control, LCa/LZn, LCa/LZn/LCu, or HZn/HCu, with and without a pesticide mixture containing acephate, endosulfan, and thiram at 25% LD50 for four or two weeks. Pesticides decreased feed intake and weight gain in all groups by 28%. Erythrocyte SOD was higher than control in the HZn/HCu group and in the LCa/LZn/LCu and HZn/HCu groups with pesticide (P ≤ 0.05). Plasma GPX declined by more than 55% in all the groups with and without pesticides compared to the control. The LCa/LZn/LCu and HZn/HCu diets with and without pesticides reduced GPX in the IM by up to 88%, 40%, and 74%, respectively, compared with the control. Plasma GSH was about 20% higher than the control in most groups with and without pesticides in the diet. Liver and IM GSH were higher than the control in the HZn/HCu group, whereas IM GSH concentrations were lower than the control in the LCa/LZn and LCa/LZn/LCu groups (P ≤ 0.05). All three experimental diets with and without pesticides had a significant effect on liver EROD activity (P ≤ 0.05). The results indicate that endogenous antioxidants and EROD were independently modified by dietary zinc and copper levels and pesticides.

17.
Concerns about the levels of dust deposition in the vicinity of coal-fired power stations in North Yorkshire, in particular Drax Power Station, prompted the commissioning of a detailed monitoring study in the area. This paper describes the first two years' work. The first 12-month study concentrated on the village of Barlow close to Drax Power Station, whilst in the second 12-month study, monitoring sites were spread along a transect passing through the power station belt formed by Ferrybridge, Eggborough and Drax Power Stations. Two monitoring sites were common to both 12-month studies, thus giving two years of continuous monitoring. Pairs of wet Frisbee dust deposit gauges (based on inverted Frisbees) were located at each site. Undissolved particulate matter from each gauge was weighed and characterized by microscopic examination of individual particles. The first 12-month study revealed a downward gradient in dust deposition rate and cenosphere content with distance from Drax Power Station. The high cenosphere content at Barlow, especially at the eastern end, suggested that there was a significant contribution from coal-fired power stations. In the second year, the overall pattern of dust deposition rate and cenosphere content across the power station belt suggested that power stations were contributing to higher levels. In particular, relatively high levels were again found at Barlow. Wind direction correlations point to the fly-ash tip next to Drax Power Station as being the source of cenospheres arriving at Barlow. It is concluded that in both years the fly-ash tip next to Drax Power Station was making a significant contribution to higher-than-expected dust deposition rates at Barlow, particularly at its eastern end. Other villages in the area may also have been affected by dust originating from coal-fired power stations.

18.
The post-harvest burning of agricultural fields is commonly used to dispose of crop residue and provide other desired services such as pest control. Despite careful regulation of burning, smoke plumes from field burning in the Pacific Northwest commonly degrade air quality, particularly for rural populations. In this paper, ClearSky, a numerical smoke dispersion forecast system for agricultural field burning that was developed to support smoke management in the Inland Pacific Northwest, is described. ClearSky began operation during the summer through fall burn season of 2002 and continues to the present. ClearSky utilizes Mesoscale Meteorological Model version 5 (MM5v3) forecasts from the University of Washington, data on agricultural fields, a web-based user interface for defining burn scenarios, the Lagrangian CALPUFF dispersion model and web-served animations of plume forecasts. The ClearSky system employs a unique hybrid source configuration, which treats the flaming portion of a field as a buoyant line source and the smoldering portion of the field as a buoyant area source. Limited field observations show that this hybrid approach yields reasonable plume rise estimates using source parameters derived from recent field-burning emission studies. The performance of this modeling system was evaluated for 2003 by comparing forecast meteorology against meteorological observations, and comparing model-predicted hourly averaged PM2.5 concentrations against observations. Examples from this evaluation illustrate that while the ClearSky system can accurately predict PM2.5 surface concentrations due to field burning, the overall model performance depends strongly on meteorological forecast error. Statistical evaluation of the meteorological forecast at seven surface stations indicates a strong relationship between topographical complexity near the station and absolute wind direction error, with wind direction errors increasing from approximately 20° for sites in open areas to 70° or more for sites in very complex terrain. The analysis also showed some days with good forecast meteorology (absolute mean wind direction error less than 30°) on which ClearSky correctly predicted PM2.5 surface concentrations at receptors affected by field burns. On several other days with similar levels of wind direction error, the model did not predict apparent plume impacts. In most of these cases, there were no reported burns in the vicinity of the monitor and, thus, it appeared that other, non-reported burns were responsible for the apparent plume impact at the monitoring site. These cases do not provide information on the performance of the model, but rather indicate that further work is needed to identify all burns and to improve burn reports in an accurate and timely manner. There were also a number of days with wind direction errors exceeding 70° when the forecast system did not correctly predict plume behavior.
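The wind-direction error statistic used in that evaluation has to respect the circular nature of direction; a small helper of the assumed form is sketched below (the station values are made up).

```python
import numpy as np

def abs_wind_dir_error(forecast_deg, observed_deg):
    """Absolute wind-direction error in degrees, taking the shorter way around
    the compass (so 350 vs 10 is a 20-degree error, not 340)."""
    diff = np.abs(np.asarray(forecast_deg) - np.asarray(observed_deg)) % 360
    return np.where(diff > 180, 360 - diff, diff)

# Illustrative hourly values at one surface station (degrees from north).
fcst = [200, 220, 355, 10, 180]
obs = [230, 210, 15, 350, 250]
print(abs_wind_dir_error(fcst, obs).mean())   # mean absolute error, degrees
```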

19.
In operational forecasting of surface O3 by statistical modelling, it is customary to assume that the O3 time series is generated by a homoskedastic process. In the present work, we have taken the heteroskedasticity of the O3 time series explicitly into account and have shown how this yields O3 forecasts with improved forecast confidence intervals. It also enabled us to make more accurate probability forecasts of ozone episodes in the urban areas. The study has been conducted on daily maximum O3 time series for four urban sites of two major European cities, Brussels and London. The sites are: Brussels (Molenbeek) (B1), Brussels (PARL.EUROPE) (B2), London (Brent) (L1) and London (Bloomsbury) (L2). The Fast Fourier Transform (FFT) has been used to model the periodicities (the annual periodicity is especially distinct) exhibited by the time series. The residuals obtained by subtracting the corresponding FFT component from the actual data were stationary and have been modelled using an ARIMA (Autoregressive Integrated Moving Average) process. The MAPEs (mean absolute percentage errors) using FFT–ARIMA for 100 one-day-ahead out-of-sample forecasts were 20%, 17.8%, 19.7% and 23.6% at sites B1, B2, L1 and L2, respectively. The residuals obtained through FFT–ARIMA have been modelled using a GARCH (Generalized Autoregressive Conditional Heteroskedastic) process. The conditional standard deviations obtained from GARCH have been used to estimate the improved forecast confidence intervals and to make probability forecasts of ozone episodes. At sites B1, B2, L1 and L2, probability forecasts of ozone episodes (for 30 one-day-ahead out-of-sample forecasts) were made correctly 91.3%, 90%, 70.6% and 53.8% of the time using GARCH, as against 82.6%, 80%, 58.8% and 38.4% without GARCH. The incorporation of GARCH also significantly reduced the number of false alarms raised by the models.
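The GARCH step — turning the conditional variance of the FFT–ARIMA residuals into a widened forecast interval and an episode probability — can be sketched with the arch package as below; the simulated residuals, the assumed episode threshold, and the Gaussian form of the exceedance probability are illustrative choices, not the paper's exact setup.

```python
import numpy as np
from arch import arch_model            # pip install arch
from scipy.stats import norm

# Suppose `resid` are the one-step FFT-ARIMA residuals of daily max ozone,
# as in the abstract; here they are simulated with volatility clustering.
rng = np.random.default_rng(7)
n = 500
sigma, eps = np.empty(n), np.empty(n)
sigma[0] = 5.0
for t in range(n):
    if t:
        sigma[t] = np.sqrt(2.0 + 0.1 * eps[t - 1] ** 2 + 0.85 * sigma[t - 1] ** 2)
    eps[t] = sigma[t] * rng.standard_normal()
resid = eps

am = arch_model(resid, mean="Zero", vol="GARCH", p=1, q=1)
res = am.fit(disp="off")

# Next-day conditional standard deviation -> forecast interval and a
# probability that tomorrow's ozone exceeds an assumed episode threshold.
sigma_next = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0])
point_forecast = 165.0                   # hypothetical FFT-ARIMA point forecast (ug/m3)
threshold = 180.0                        # assumed episode level
p_exceed = 1 - norm.cdf((threshold - point_forecast) / sigma_next)
ci_95 = (point_forecast - 1.96 * sigma_next, point_forecast + 1.96 * sigma_next)
print(p_exceed, ci_95)
```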

20.
An automated forecast system for ozone in seven Kentucky metropolitan areas has been operational since 2004. The forecast system automatically downloads the required input data twice each day, produces next-day forecasts of metro-area peak 8-h average ozone concentration using a computer-coded hybrid nonlinear regression (NLR) model, and posts the results on a website. The automated models were similar to previous NLR models, first applied to forecasting ozone in the Louisville metro area. The forecast system operated reliably during the 2004 and 2005 O3 seasons, producing at least one forecast per day more than 99% of the time. The forecast accuracy of the automated system was good. For all 2004 and 2005 forecasts, the mean absolute error was 8.7 ppb, or 15.6% of the overall mean concentration. The overall detection rate of air quality standard exceedances was 56%, and the overall false alarm rate was 42%. In Louisville, the performance of the automated system was comparable to that of expert forecasters using the NLR model as a forecast tool.
