Similar Documents
20 similar documents found
1.
The many advances made in air quality model evaluation procedures during the past ten years are discussed and some components of model uncertainty are presented. Simplified statistical procedures for operational model evaluation are suggested. The fundamental model performance measures are the mean bias, the mean square error, and the correlation. The bootstrap resampling technique is used to estimate confidence limits on the performance measures, in order to determine whether a model agrees satisfactorily with data or whether one model is significantly different from another. Applications to two tracer experiments are described.

It is emphasized that review and evaluation of the scientific components of models are often of greater importance than the strictly statistical evaluation. A necessary condition for acceptance of a model should be that it is scientifically correct. It is shown that even in research-grade tracer experiments, data input errors can cause errors in hourly-average model predictions of point concentrations almost as large as the predictions themselves. The turbulent or stochastic component of model uncertainty has a similar magnitude. These components of the uncertainty decrease as averaging time increases.
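The bootstrap evaluation procedure described above can be sketched as follows; the paired prediction/observation values, resample count, and 95% level are illustrative assumptions, not data from the study:

```python
import random
import statistics

def mean_bias(pred, obs):
    """Fundamental performance measure: mean of (predicted - observed)."""
    return statistics.fmean(p - o for p, o in zip(pred, obs))

def bootstrap_ci(pred, obs, stat, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence limits for a paired performance
    statistic: resample (pred, obs) pairs with replacement and take the
    alpha/2 and 1 - alpha/2 quantiles of the resampled statistic."""
    rng = random.Random(seed)
    pairs = list(zip(pred, obs))
    vals = sorted(
        stat(*zip(*[rng.choice(pairs) for _ in pairs]))
        for _ in range(n_boot)
    )
    return vals[int(alpha / 2 * n_boot)], vals[int((1 - alpha / 2) * n_boot) - 1]

# hypothetical hourly concentrations (same units for model and tracer data)
pred = [1.2, 0.8, 1.5, 2.0, 0.9, 1.1, 1.7, 1.3]
obs = [1.0, 1.0, 1.4, 1.8, 1.2, 1.0, 1.5, 1.2]
lo, hi = bootstrap_ci(pred, obs, mean_bias)
```

If the interval (lo, hi) excludes zero, the model's bias is significant at roughly the 5% level; comparing two models works the same way, by bootstrapping the difference of their performance measures.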

2.
A simple air pollution model has been tested for New York City with data from five mornings which were characterized by urban heat island effects. The model is nondiffusive and relies primarily upon conservation of mass. The correlation coefficient between over 400 predicted and observed values of SO2 mixing ratio was 0.83.

3.
This paper presents an evaluation of four Gaussian (GM, HIWAY, AIRPOL-4, CALINE-2) and three numerical (DANARD, MROAD 2, ROADS) models with the tracer gas data collected in the General Motors experiment. Various statistical techniques are employed to quantify the predictive capability of each of the above models. In general, the three numerical models performed rather poorly compared to the Gaussian models. For this data set, the model with the best performance in accurately predicting the measured concentrations was the GM model, followed in order by AIRPOL-4, HIWAY, CALINE-2, DANARD, MROAD 2, and ROADS. Although the GM model provides by far a better simulation than any of the other models tested here, it is skewed toward underprediction. As a screening tool for regulatory purposes, however, the HIWAY model would be useful, since it has the highest percentage in the category of overprediction when the concentration data in the range of the 50th through 100th percentile are included in the analysis. The present version of the HIWAY model for stable and parallel wind-road conditions warrants modifications to improve its predictive capability. Current studies indicate that the modified HIWAY model can be used with greater confidence by regulatory agencies.

4.
This paper reviews four commonly used statistical methods for environmental data analysis and discusses potential pitfalls associated with application of these methods through real case study data. The four statistical methods are percentile and confidence interval, correlation coefficient, regression analysis, and analysis of variance (ANOVA). The potential pitfall for estimation of percentiles and confidence intervals is the automatic assumption of a normal distribution for environmental data, which very often follow a log-normal distribution. The potential pitfall for the correlation coefficient is the use of a wide range of data points, in which the maximum value may trivialize the smaller data points and consequently skew the correlation coefficient. The potential pitfall for regression analysis is the propagation of uncertainties in the input variables to the regression model prediction, which may be even more uncertain. The potential pitfall for ANOVA is accepting a hypothesis as a weak argument to imply a strong conclusion. As demonstrated in this paper, very different conclusions may be drawn from statistical analysis if the pitfalls are not identified. Lessons drawn from these pitfalls are given at the end of the article.
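The first pitfall can be made concrete with a short sketch; the concentration values below are hypothetical, chosen only to be right-skewed:

```python
import math
import statistics
from statistics import NormalDist

# hypothetical right-skewed (roughly log-normal) concentration data
x = [1.2, 0.8, 2.5, 0.9, 5.1, 1.6, 0.7, 3.3, 1.1, 8.9]
z95 = NormalDist().inv_cdf(0.95)

# Pitfall: applying the normal assumption directly to skewed data
m, s = statistics.fmean(x), statistics.stdev(x)
p95_normal = m + z95 * s

# Log-normal treatment: estimate on the log scale, then back-transform
logs = [math.log(v) for v in x]
ml, sl = statistics.fmean(logs), statistics.stdev(logs)
p95_lognormal = math.exp(ml + z95 * sl)
```

The two 95th-percentile estimates disagree noticeably, and with more strongly skewed data the gap widens, which is exactly the pitfall the paper warns about.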

6.
The RAM model provided by the U.S. EPA has been applied to the metropolitan Detroit area for SO2 concentrations and is compared to concentrations predicted by a numerical model and to field data obtained by the 14-station air sampling network maintained by the Wayne County Air Pollution Control Division. Great care was taken to develop the emission inventory. Based upon examination of the temporal and spatial correspondence of the respective model predictions and observed concentrations, the correlation coefficients for the 24-hour averaged data, the correlation coefficients for over 700 3-hour averaged observations, and the cumulative frequency distributions of the model output and observations, it is concluded that the numerical model provides a superior predictive tool to evaluate cause-and-effect relations, but that the RAM model, at far lower cost, predicts the correct magnitude of the worst events. Hence RAM might well be used in the Detroit area for statistically based regulatory decisions.

7.
Statistical analysis of regulatory ecotoxicity tests.
ANOVA-type data analysis, i.e., determination of lowest-observed-effect concentrations (LOECs) and no-observed-effect concentrations (NOECs), has been widely used for statistical analysis of chronic ecotoxicity data. However, it is increasingly criticised for several reasons, the most important of which is probably that the NOEC depends on the choice of test concentrations and the number of replications, and rewards poor experiments, i.e., those with high variability, with high NOEC values. Thus, a recent OECD workshop concluded that the use of the NOEC should be phased out and that a regression-based estimation procedure should be used. Following this workshop, a working group was established at the French level between government, academia and industry representatives. Twenty-seven sets of chronic data (algae, daphnia, fish) were collected and analysed by ANOVA and regression procedures. Several regression models were compared and relations between NOECs and ECx, for different values of x, were established in order to find an alternative summary parameter to the NOEC. Biological arguments to help define a negligible level of effect x for the ECx are scarce. With regard to their use in risk assessment procedures, a convenient methodology would be to choose x so that the ECx is on average similar to the present NOEC. This would lead to no major change in the risk assessment procedure. However, experimental data show that the ECx depends on the regression model and that its accuracy decreases in the low-effect zone. This disadvantage could probably be reduced by adapting existing experimental protocols, but it could mean more experimental effort and higher cost. An ECx (derived with existing test guidelines, e.g., regarding the number of replicates) whose lower confidence bound is on average similar to the present NOEC would improve this approach by a priori encouraging more precise experiments.
However, narrow confidence intervals are not only linked to good experimental practice, but also depend on the distance between the best model fit and the experimental data. In any case, these approaches still use the NOEC as a reference, although this reference is statistically not correct. By contrast, EC50s are the most precise values to estimate on a concentration-response curve, but they are clearly different from the NOEC, and their use would require a modification of existing assessment factors.
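For a regression-based summary such as the ECx discussed above, a two-parameter log-logistic concentration-response model gives a closed form; the model choice and parameter values here are illustrative assumptions, not those of the study:

```python
def ecx(ec50, slope, x):
    """Concentration producing x% effect under a two-parameter log-logistic
    model, effect(c) = (c/EC50)**slope / (1 + (c/EC50)**slope).
    Solving effect(c) = x/100 for c gives the expression below."""
    p = x / 100.0
    return ec50 * (p / (1.0 - p)) ** (1.0 / slope)

# e.g. with a hypothetical EC50 of 10 mg/L and a slope of 2:
ec10 = ecx(10.0, 2.0, 10)        # low-effect estimate
ec50_check = ecx(10.0, 2.0, 50)  # recovers the EC50 itself
```

As the abstract notes, ECx accuracy degrades in the low-effect zone: small x pushes the estimate into the flat tail of the fitted curve, where the slope parameter dominates the answer.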

8.
In air pollution epidemiology, error in measurements of correlated pollutants has been advanced as a reason to distrust regressions that find statistically significant weak associations. Much of the related debate in the literature and elsewhere has been qualitative. To promote quantitative evaluation of such errors, this paper develops an air pollution time-series model based on correlations among unit-normal variables. Assuming there are no other sources of bias present, the model shows the expected amount of relative bias in the regression coefficients of a bivariate regression of coarse and fine particulate matter measurements on daily mortality. The model only requires information on instrumental error and spatial variability, along with the observed regression coefficients and information on the true fine-coarse correlation. Analytical results show that if one pollutant is truly more harmful than the other, then it must be measured more precisely than the other in order not to bias the ratio of the fine and coarse regression coefficients. Utilizing published data, a case study of the Harvard Six-Cities study illustrates use of the model and emphasizes the need for data on spatial variability across the study area. Current epidemiology time-series regressions can use this model to address the general concern of correlated pollutants with differing measurement errors.

9.

Confidence interval construction for central tendency is a problem of practical consequence for those who must analyze air contaminant data. Determination of compliance with relevant ambient air quality criteria and assessment of associated health risks depend upon quantifying the uncertainty of estimated mean pollutant concentrations. The bootstrap is a resampling technique that has been steadily gaining popularity and acceptance during the past several years. A potentially powerful application of the bootstrap is the construction of confidence intervals for any parameter of any underlying distribution. Properties of bootstrap confidence intervals were determined for samples generated from lognormal, gamma, and Weibull distributions. Bootstrap t intervals, while having smaller coverage errors than Student's t or other bootstrap methods, under-cover for small samples from skewed distributions. Therefore, we caution against using the bootstrap to construct confidence intervals for the mean without first considering the effects of sample size and skew. When sample sizes are small, one might consider using the median as an estimate of central tendency. Confidence intervals for the median are easy to construct and do not under-cover. Data collected by the Northeast States for Coordinated Air Use Management (NESCAUM) are used to illustrate application of the methods discussed.
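The median interval recommended above for small samples is distribution-free and easy to construct from order statistics; this sketch uses the exact binomial argument (the example data are hypothetical):

```python
from math import comb

def median_ci(data, conf=0.95):
    """Distribution-free confidence interval for the median. The interval
    (x[j], x[n-1-j]) of the sorted data has exact coverage
    1 - 2*P(Binomial(n, 0.5) <= j); return the narrowest such interval
    whose coverage still meets the requested level."""
    x = sorted(data)
    n = len(x)
    best = (x[0], x[-1])
    j = 0
    while j < n // 2:
        tail = sum(comb(n, i) for i in range(j + 1)) / 2 ** n
        if 1.0 - 2.0 * tail < conf:
            break
        best = (x[j], x[n - 1 - j])
        j += 1
    return best
```

For n = 20 the 95% interval runs from the 6th to the 15th order statistic; because coverage is computed exactly from the binomial, the interval never under-covers.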

10.
Emission projections are important for environmental policy, both to evaluate the effectiveness of abatement strategies and to determine legislation compliance in the future. Moreover, including uncertainty is an essential added value for decision makers. In this work, projection values and their associated uncertainty are computed for pollutant emissions corresponding to the most significant activities from the national atmospheric emission inventory in Spain. Until now, projections had been calculated under three main scenarios: “without measures” (WoM), “with measures” (WM) and “with additional measures” (WAM). For the first one, regression techniques had been applied, which are inadequate for time-dependent data. For the other scenarios, values had been computed taking into account expected activity growth, as well as policies and measures. However, only point forecasts had been computed. In this work statistical methodology has been applied for: a) inclusion of projection intervals for future time points, where the width of the intervals is a measure of uncertainty; b) for the WoM scenario, ARIMA models are applied to model the dynamics of the processes; c) in the WM scenario, bootstrap is applied as an additional non-parametric tool, which does not rely on distributional assumptions and is thus more general. The advantages of using ARIMA models for the WoM scenario including uncertainty are shown. Moreover, presenting the WM scenario allows observing whether projected emission values fall within the intervals, thus showing whether the measures to be taken to reach the scenario imply a significant improvement. Results also show how bootstrap techniques incorporate stochastic modelling to produce forecast intervals for the WM scenario.
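The way projection intervals widen with horizon can be illustrated with the simplest member of the ARIMA family, an AR(1) model fitted by Yule-Walker; this is a stand-in sketch with hypothetical data, not the models or inventory series used in the study:

```python
import math
import statistics

def ar1_forecast(series, horizon, z=1.96):
    """h-step forecasts from an AR(1) model with approximate 95% projection
    intervals; the half-width grows with horizon, which is the uncertainty
    carried by the projection."""
    mean = statistics.fmean(series)
    dev = [v - mean for v in series]
    # Yule-Walker estimate of the AR(1) coefficient
    phi = sum(dev[i] * dev[i - 1] for i in range(1, len(dev))) / sum(d * d for d in dev)
    # innovation variance from the one-step residuals
    resid = [dev[i] - phi * dev[i - 1] for i in range(1, len(dev))]
    s2 = statistics.fmean(r * r for r in resid)
    out, last = [], dev[-1]
    for h in range(1, horizon + 1):
        last *= phi  # point forecast decays toward the series mean
        var = s2 * sum(phi ** (2 * i) for i in range(h))  # forecast error variance
        half = z * math.sqrt(var)
        out.append((mean + last - half, mean + last, mean + last + half))
    return out

# hypothetical annual emission index values
emissions = [10.0, 11.0, 10.5, 11.5, 10.8, 11.2, 10.9, 11.4, 10.7, 11.1]
forecast = ar1_forecast(emissions, 3)  # (lower, point, upper) per year ahead
```

A full ARIMA treatment adds differencing and moving-average terms, but the widening-interval behaviour shown here is the same.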

11.
Fuzzy QSARs for predicting logKoc of persistent organic pollutants
Uddameri V, Kuchanur M. Chemosphere 2004, 54(6): 771-776.

12.
The Danish Meteorological Institute (DMI) has developed an operational forecasting system for ozone concentrations in the Atmospheric Boundary Layer; this system is called the Danish Atmospheric Chemistry FOrecasting System (DACFOS). At specific sites where real-time ozone concentration measurements are available, a statistical after-treatment of DACFOS’ results adjusts the next 48 h ozone forecasts. This post-processing of DACFOS’ forecasts is based on an adaptive linear regression model using an optimal state estimator algorithm. The regression analysis uses different linear combinations of meteorological parameters (such as temperature, wind speed, surface heat flux and atmospheric boundary layer height) supplied by the Numerical Weather Prediction model DMI-HIRLAM. Several regressions have been tested for six monitoring stations in Denmark and in England, and four of the linear combinations have been selected to be employed in an automatic forecasting system. A statistical study comparing observations and forecasts shows that this system yields higher correlation coefficients as well as smaller biases and RMSE values than DACFOS; the present post-processing thus improves DACFOS’ forecasts. This system has been operational since June 1998 at the DMI's monitoring station in the north of Copenhagen, for which a new ozone forecast is presented every 6 h on the DMI's public internet homepage.

13.
The purpose of this project was to investigate the relationship of ambient air quality measurements between two analytical methods, referred to as the total oxidant method and the chemiluminescent method. These two well documented analytical methods were run simultaneously, side by side, at a site on the Houston ship channel. They were calibrated daily. The hourly averages were analyzed by regression techniques and the confidence intervals were calculated for the regression lines. Confidence intervals for point estimates were also calculated. These methods were used with all data sets with values greater than 10 parts per billion and again with values greater than 30 parts per billion. A regression line was also calculated for a second set of data for the preceding year. These data were generated before a chromium trioxide scrubber was installed to eliminate possible chemical interferences with the KI method.

The results show that in general the chemiluminescent ozone method tends to produce values as much as two times higher than the simultaneous total oxidant values. In one set of data, an 80 ppb chemiluminescent ozone value predicted a total oxidant value of 43.9 ppb with a 95% confidence interval of 7.7 to 80.4 ppb. In the second set of data, an 80 ppb chemiluminescent ozone value predicted a total oxidant value of 78 ppb with a 95% confidence interval of 0.4 to 156 ppb. Other statistical analyses confirmed that either measurement was a very poor predictor of the other.

14.
Tens of thousands of chemicals are currently marketed worldwide, but only a small number of these compounds have been measured in effluents or the environment to date. The need for screening methodologies to select candidates for environmental monitoring is therefore significant. To meet this need, the Swedish Chemicals Agency developed the Exposure Index (EI), a model for ranking emissions to a number of environmental matrices based on chemical quantity used and use pattern. Here we evaluate the EI. Data on measured concentrations of organic chemicals in sewage treatment plants (STPs), one of the recipients considered in the EI model, were compiled from the literature, and the correlation between predicted emission levels and observed concentrations was assessed by linear regression analysis. The adequacy of the parameters employed in the EI was further explored by calibration of the model to measured concentrations. The EI was found to be of limited use for ranking contaminant levels in STPs; the r2 values for the regressions between predicted and observed values ranged from 0.02 (p = 0.243) to 0.14 (p = 0.007) depending on the dataset. The calibrated version of the model produced only slightly better predictions although it was fitted to the experimental data. However, the model is a valuable first step in developing a high throughput screening tool for organic contaminants, and there is potential for improving the EI algorithm.

15.
This paper presents a statistical model that is capable of predicting ozone levels from precursor concentrations and meteorological conditions during daylight hours in the Shuaiba Industrial Area (SIA) of Kuwait. The model has been developed from ambient air quality data that was recorded for one year starting from December 1994 using an air pollution mobile monitoring station. The functional relationship between ozone level and the various independent variables has been determined by using a stepwise multiple regression modelling procedure. The model contains two terms that describe the dependence of ozone on nitrogen oxides (NOx) and nonmethane hydrocarbon precursor concentrations, and other terms that relate to wind direction, wind speed, sulphur dioxide (SO2) and solar energy. In the model, the levels of the precursors are inversely related to ozone concentration, whereas SO2 concentration, wind speed and solar radiation are positively correlated. Typically, 63% of the variation in ozone levels can be explained by the levels of NOx. The model is shown to be statistically significant, and model predictions and experimental observations are shown to be consistent. A detailed analysis of the ozone-temperature relationship is also presented; at temperatures less than 27 °C there is a positive correlation between temperature and ozone concentration, whereas at temperatures greater than 27 °C a negative correlation is seen. This is the first time a non-monotonic relationship between ozone levels and temperature has been reported and discussed.

16.
Species sensitivity distributions (SSDs) are increasingly used in both ecological risk assessment and derivation of water quality criteria. However, there has been debate about the choice of an appropriate approach for derivation of water quality criteria based on SSDs because the various methods can generate different values. The objective of this study was to compare the differences among various methods. Data sets of acute toxicities of 12 substances to aquatic organisms, representing a range of classes with different modes of action, were studied. Nine typical statistical approaches, including parametric and nonparametric methods, were used to construct SSDs for the 12 chemicals. Water quality criteria, expressed as the hazardous concentration for 5% of species (HC5), were derived by use of several approaches. All approaches produced comparable results, and the data generated by the different approaches were significantly correlated. Variability among estimates of HC5 of all inclusive species decreased with increasing sample size, and variability was similar among the statistical methods applied. Of the statistical methods selected, the bootstrap method represented the best-fitting model for all chemicals, while log-triangle and Weibull were the best models among the parametric methods evaluated. The bootstrap method was the primary choice to derive water quality criteria when data points are sufficient (more than 20). If the available data are few, the other methods should all be constructed, and the one that best describes the distribution of the data selected.
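A minimal sketch of the parametric route to an HC5, assuming a log-normal SSD; the LC50 values below are hypothetical, not the study's data:

```python
import math
import statistics
from statistics import NormalDist

def hc5_lognormal(toxicity_values, p=0.05):
    """HC5 from a log-normal species sensitivity distribution: fit a normal
    distribution to log10-transformed toxicity values and back-transform
    its p-th quantile."""
    logs = [math.log10(v) for v in toxicity_values]
    mu, sigma = statistics.fmean(logs), statistics.stdev(logs)
    return 10 ** (mu + NormalDist().inv_cdf(p) * sigma)

# hypothetical acute LC50s (mg/L) for a set of aquatic species
lc50 = [0.5, 1.2, 2.8, 4.1, 6.3, 9.0, 15.0, 22.0, 40.0, 85.0]
hc5 = hc5_lognormal(lc50)
```

The bootstrap alternative favoured by the study instead resamples the species set with replacement and takes the distribution of recomputed HC5s, avoiding the distributional assumption when enough species (more than about 20) are available.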

17.
This paper describes the results of a measurement and modeling study of carbon monoxide (CO) concentrations in the proximity of intersections. Analyses of model performance for paired observed and predicted CO concentrations are presented. Two methodologies of pollutant prediction were used: the Intersection Midblock Model (IMM) and a statistical multiple linear regression. The results showed that both methods underpredicted frequently and produced results that were site-specific. In addition, correlations of IMM predicted concentrations to observed concentrations were poor (typically r2 values < 0.25). Various explanations for this observation are proposed. The statistical approach exhibited an improved accuracy over that of IMM. However, some of the independent variables used might be difficult to obtain as a routine measurement, and use of a one- or two-independent-parameter model yielded adjusted R2 values comparable to the r2 values observed with IMM. Based on these results, an intersection model applicable under a wide range of conditions of traffic, meteorology, and geometry is not available. Research is needed to develop one, since its use would often be called for in the development of air quality sections of Environmental Assessments or Environmental Impact Statements.

18.
W. Brock Neely. Chemosphere 1984, 13(7): 813-819.
A theoretical relation has been established between the water solubility of an organic chemical and the ratio of the acute fish LC50 at two different time periods. The theory was tested by examining a data base of 24 chemicals. The finding of a positive correlation between the observed and calculated ratio of the 96 hr LC50 to the 24 hr LC50 helped to substantiate the theory.

19.
Inorganic arsenic (InAs) is a ubiquitous metalloid that has been shown to exert multiple adverse health outcomes. Urinary InAs and its metabolite concentrations have been used as biomarkers of arsenic (As) exposure in some epidemiological studies; however, the quantitative relationship between daily InAs exposure and urinary InAs metabolite concentration has not been well characterized. We collected a set of 24-h duplicate diet samples and spot urine samples, taken the morning after diet sampling, from 20 male and 19 female subjects in Japan from August 2011 to October 2012. Concentrations of As species in the duplicate diet and urine samples were determined by liquid chromatography-ICP mass spectrometry with a hydride generation system. The sum of the concentrations of urinary InAs and methylarsonic acid (MMA) was used as a measure of InAs exposure. Daily dietary InAs exposure was estimated to be 0.087 µg kg⁻¹ day⁻¹ (geometric mean, GM), and the GM of urinary InAs+MMA concentrations was 3.5 ng mL⁻¹. Analysis of covariance found no significant gender difference in the regression coefficients (P > 0.05). The regression equation log10[urinary InAs+MMA concentration] = 0.570 × log10[dietary InAs exposure level per body weight] + 1.15 was obtained for the whole data set. This equation would be valuable in converting urinary InAs concentration to daily InAs exposure, which will be important information in risk assessment.
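The reported equation can be applied, and inverted for the risk-assessment use the authors mention, in a few lines; the coefficients are those given in the abstract, while the example input is the study's geometric-mean intake:

```python
import math

# Reported regression: log10[urinary InAs+MMA, ng/mL]
#   = 0.570 * log10[dietary InAs exposure, ug/kg/day] + 1.15

def urinary_from_intake(intake_ug_per_kg_day):
    """Predicted urinary InAs+MMA concentration (ng/mL) from daily intake."""
    return 10 ** (0.570 * math.log10(intake_ug_per_kg_day) + 1.15)

def intake_from_urinary(urinary_ng_per_ml):
    """Inverse of the regression: estimate daily intake from a urine level."""
    return 10 ** ((math.log10(urinary_ng_per_ml) - 1.15) / 0.570)

predicted_urine = urinary_from_intake(0.087)  # GM intake from the study
```

Consistently with the abstract, the geometric-mean intake of 0.087 µg kg⁻¹ day⁻¹ maps to a predicted urinary level of about 3.5 ng mL⁻¹.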

20.
Background, Aim and Scope Air quality is a field of major concern in large cities. This problem has led administrations to introduce plans and regulations to reduce pollutant emissions. The analysis of variations in the concentration of pollutants is useful when evaluating the effectiveness of these plans. However, such an analysis cannot be undertaken using standard statistical techniques, due to the fact that concentrations of atmospheric pollutants often exhibit a lack of normality and are autocorrelated. On the other hand, if long-term trends of any pollutant’s emissions are to be detected, meteorological effects must be removed from the time series analysed, due to their strong masking effects. Materials and Methods The application of statistical methods to analyse temporal variations is illustrated using monthly carbon monoxide (CO) concentrations observed at an urban site. The sampling site is located at a street intersection in central Valencia (Spain) with a high traffic density. Valencia is the third largest city in Spain. It is a typical Mediterranean city in terms of its urban structure and climatology. The sampling site started operation in January 1994 and monitored CO ground level concentrations until February 2002. Its geographic coordinates are W0°22′52″ N39°28′05″ and its altitude is 11 m. Two nonparametric trend tests are applied. One of these is robust against serial correlation with regard to the false rejection rate, when observations have a strong persistence or when the sample size per month is small. A nonparametric analysis of the homogeneity of trends between seasons is also discussed. A multiple linear regression model is used with the transformed data, including the effect of meteorological variables. The method of generalized least squares is applied to estimate the model parameters to take into account the serial dependence of the residuals of this model.
This study also assesses temporal changes using the Kolmogorov-Zurbenko (KZ) filter. The KZ filter has been shown to be an effective way to remove the influence of meteorological conditions on O3 and PM to examine underlying trends. Results The nonparametric tests indicate a decreasing, significant trend at the sampled site. The application of the linear model yields a significant decrease of 15.8% every twelve months for the average monthly CO concentration. The 95% confidence interval for the trend ranges from 13.9% to 17.7%. The seasonal cycle also provides significant results. There are no differences in trends throughout the months. The percentage of CO variance explained by the linear model is 90.3%. The KZ filter separates out long-term, short-term and seasonal variations in the CO series. The estimated significant long-term trend is 10.3% per year with this method. The 95% confidence interval ranges from 8.8% to 11.9%. This approach explains 89.9% of the CO temporal variations. Discussion The differences between the linear model and KZ filter trend estimations are due to the fact that the KZ filter performs the analysis on the smoothed data rather than the original data. In the KZ filter trend estimation, the effect of meteorological conditions has been removed. The CO short-term component is attributable to weather and short-term fluctuations in emissions. There is a significant seasonal cycle. This component is a result of changes in the traffic, the yearly meteorological cycle and the interactions between these two factors. There are peaks during the autumn and winter months, which have higher traffic density at the sampled site. There is a minimum during the month of August, reflecting the very low level of vehicle emissions which is a direct consequence of the holiday period. Conclusions The significant, decreasing trend implies to a certain extent that the urban environment in the area is improving.
This trend results from changes in overall emissions, pollutant transport, climate, policy and economics. It is also due to the effect of introducing reformulated gasoline. The additives enable vehicles to burn fuel with a higher air/fuel ratio, thereby lowering the emission of CO. The KZ filter has been the most effective method to separate the CO series components and to obtain an estimate of the long-term trend due to changes in emissions, removing the effect of meteorological conditions. Recommendations and Perspectives Air quality managers and policy-makers must understand the link between climate and pollutants to select optimal pollutant reduction strategies and avoid exceeding emission directives. This paper analyses eight years of ambient CO data at a site with a high traffic density, and provides results that are useful for decision-making. The assessment of long-term changes in air pollutants to evaluate reduction strategies has to be done while taking into account meteorological variability.
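The KZ filter used in the study is simply an iterated moving average; a minimal sketch follows (the window length and iteration count are illustrative, not the study's settings):

```python
def moving_average(x, m):
    """Centered moving average with window length m (odd); the window
    shrinks near the series edges."""
    h = m // 2
    out = []
    for i in range(len(x)):
        window = x[max(0, i - h): i + h + 1]
        out.append(sum(window) / len(window))
    return out

def kz_filter(x, m, k):
    """Kolmogorov-Zurbenko filter KZ(m, k): apply the m-point moving
    average k times; components with periods shorter than roughly
    m * sqrt(k) are suppressed."""
    for _ in range(k):
        x = moving_average(x, m)
    return x

# a constant series passes through unchanged; oscillations are damped
smoothed = kz_filter([1.0, 9.0] * 6, 3, 3)
```

Applying KZ filters with different (m, k) pairs and differencing the results is how the long-term, seasonal and short-term components of a series such as the monthly CO record can be separated.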
