Similar Articles
20 similar articles found.
1.
This paper examines the effects of measurement uncertainty on various summary statistics that are routinely used in air quality data analysis. Analytical approximations and computer simulation techniques are employed to illustrate and quantify how the uncertainty associated with an individual measurement results in an uncertainty for different summary statistics. Measurement uncertainty may be viewed as consisting of bias and imprecision. It is shown that even when there is no bias for individual measurements, it is possible for imprecision alone to result in bias for certain commonly used summary statistics. Different types of statistics are shown to be influenced by measurement imprecision to different degrees and, consequently, a data set may be acceptable for some purposes but not for others. The desired precision of the summary statistic may be viewed as a guide in determining an acceptable level of imprecision for individual measurements.
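
The effect described above, unbiased individual measurements still producing a biased summary statistic, is easy to reproduce with a short simulation. The sketch below is illustrative only: the lognormal "true" concentrations, the noise level, and the exceedance threshold are assumptions, not values from the paper. It shows that zero-mean measurement noise leaves the mean essentially unchanged but inflates the count of threshold exceedances.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "true" hourly concentrations (lognormal, arbitrary parameters).
true = rng.lognormal(mean=3.0, sigma=0.5, size=100_000)

# Unbiased measurement error: zero-mean Gaussian noise (imprecision only, no bias).
measured = true + rng.normal(loc=0.0, scale=10.0, size=true.size)

threshold = 60.0  # assumed threshold for illustration

# The mean is essentially unaffected, but the exceedance count is biased high,
# because noise pushes more below-threshold values above it than vice versa.
print("mean (true, measured):", true.mean(), measured.mean())
print("exceedances (true, measured):",
      (true > threshold).sum(), (measured > threshold).sum())
```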

2.
Helsel DR. Chemosphere, 2006, 65(11): 2434-2439.
The most commonly used method in environmental chemistry to deal with values below detection limits is to substitute a fraction of the detection limit for each nondetect. Two decades of research has shown that this fabrication of values produces poor estimates of statistics, and commonly obscures patterns and trends in the data. Papers using substitution may conclude that significant differences, correlations, and regression relationships do not exist, when in fact they do. The reverse may also be true. Fortunately, good alternative methods for dealing with nondetects already exist, and are summarized here with references to original sources. Substituting values for nondetects should be used rarely, and should generally be considered unacceptable in scientific research. There are better ways.
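
A minimal simulation along these lines (not from the paper; the lognormal parameters, sample size, and detection limit are assumptions) illustrates why substitution is discouraged: compared with a maximum-likelihood fit that treats nondetects as left-censored observations, substituting DL/2 distorts the estimated mean and standard deviation, with the size of the distortion depending on the censoring fraction.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)

# Hypothetical lognormal concentrations with a single detection limit (DL).
mu, sigma, dl = 1.0, 1.0, 2.0
x = rng.lognormal(mu, sigma, size=500)
detected = x >= dl                      # nondetects are only known to be < DL

# Common (discouraged) practice: substitute DL/2 for every nondetect.
substituted = np.where(detected, x, dl / 2.0)

# Better: maximum likelihood treating nondetects as left-censored observations.
def neg_loglik(params):
    m, s = params
    if s <= 0:
        return np.inf
    ll = stats.norm.logpdf(np.log(x[detected]), m, s).sum()
    ll += (~detected).sum() * stats.norm.logcdf(np.log(dl), m, s)
    return -ll

m_hat, s_hat = optimize.minimize(neg_loglik, x0=[0.0, 1.0],
                                 method="Nelder-Mead").x
mle_mean = np.exp(m_hat + s_hat**2 / 2)            # lognormal mean from the fit
mle_sd = mle_mean * np.sqrt(np.exp(s_hat**2) - 1)  # lognormal sd from the fit

print("true mean / sd:", np.exp(mu + sigma**2 / 2),
      np.exp(mu + sigma**2 / 2) * np.sqrt(np.exp(sigma**2) - 1))
print("DL/2 substitution:", substituted.mean(), substituted.std(ddof=1))
print("censored MLE:", mle_mean, mle_sd)
```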

3.
Numerous ozone exposure statistics were calculated using hourly ozone data from crop yield loss experiments previously conducted for alfalfa, fresh market and processing tomatoes, cotton, and dry beans in an ambient ozone gradient near Los Angeles, California. Exposure statistics examined included peak (maximum daily hourly) and mean concentrations above specific threshold levels, and concentrations during specific time periods of the day. Peak and mean statistics weighted for ozone concentration and time period statistics weighted for hour of the day were also determined. Polynomial regression analysis was used to relate each of 163 ozone statistics to crop yield. Performance of the various statistics was rated by comparing residual mean square (RMS) values. The analyses demonstrated that no single statistic was best for all crop species. Ozone statistics with a threshold level performed well for most crops, but optimum threshold level was dependent upon crop species and varied with the particular statistics calculated. The data indicated that daily hours of exposure above a critical high-concentration threshold related well to crop yield for alfalfa, market tomatoes, and dry beans. The best statistic for cotton yield was an average of all daily peak ozone concentrations. Several different types of ozone statistics performed similarly for processing tomatoes. These analyses suggest that several ozone summary statistics should be examined in assessing the relationship of ambient ozone exposure to crop yield. Where no clear statistical preference is indicated among several statistics, those most biologically relevant should be selected.
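
For readers unfamiliar with these exposure statistics, the sketch below computes a few representative examples (daily 1-hr peak, hours above a threshold, mean of hours above a threshold, and a time-of-day window mean) from a generic hourly ozone series using pandas. The simulated record and the 0.10 ppm threshold are assumptions for illustration, not any of the paper's 163 statistics.

```python
import numpy as np
import pandas as pd

# Hypothetical hourly ozone record (ppm) indexed by timestamp.
idx = pd.date_range("2024-06-01", periods=24 * 30, freq="h")
rng = np.random.default_rng(1)
ozone = pd.Series(np.clip(rng.normal(0.05, 0.03, idx.size), 0, None), index=idx)

daily = ozone.resample("D")
threshold = 0.10  # assumed threshold (ppm) for illustration

exposure_stats = pd.DataFrame({
    "daily_1hr_peak": daily.max(),                              # peak statistic
    "hours_above_0.10": daily.apply(lambda d: (d > threshold).sum()),
    "mean_above_0.10": daily.apply(lambda d: d[d > threshold].mean()),
    # time-of-day window statistic: mean of the 09:00-16:00 hours
    "daytime_mean": ozone.between_time("09:00", "16:00").resample("D").mean(),
})
print(exposure_stats.head())
```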

4.
Much progress has been made in recent years to address the estimation of summary statistics, using data that are subject to censoring of results that fall below the limit of detection (LOD) for the measuring instrument. Truncated data methods (e.g., Tobit regression) and multiple-imputation are two approaches for analyzing data results that are below the LOD. To apply these methods requires an assumption about the underlying distribution of the data. Because the log-normal distribution has been shown to fit many data sets obtained from environmental measurements, the common practice is to assume that measurements of environmental factors can be described by log-normal distributions. This article describes methods for obtaining estimates of percentiles and their associated confidence intervals when the results are log-normal and a fraction of the results are below the LOD. We present limited simulations to demonstrate the bias of the proposed estimates and the coverage probability of their associated confidence intervals. Estimation methods are used to generate summary statistics for 2,3,7,8-tetrachlorodibenzo-p-dioxin (2,3,7,8-TCDD) using data from a 2001 background exposure study in which PCDDs/PCDFs/cPCBs in human blood serum were measured in a Louisiana population. Because the congener measurements used in this study were subject to variable LODs, we also present simulation results to demonstrate the effect of variable LODs on the multiple-imputation process.
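
A compressed sketch of the multiple-imputation idea is given below. It is not the authors' exact procedure: the lognormal parameters, the variable LODs, and the number of imputations are assumptions, and for brevity the "fitted" parameters are simply the true ones (in practice they would come from a censored-data fit such as the MLE sketched under item 2). Nondetects are replaced by draws from the fitted lognormal truncated above at each sample's LOD, a percentile is computed from each completed data set, and the results are pooled.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical serum measurements with sample-specific (variable) LODs.
mu, sigma = 0.5, 0.8
x = rng.lognormal(mu, sigma, size=300)
lod = rng.uniform(0.8, 1.6, size=x.size)       # variable detection limits
detected = x >= lod

def impute_once(m, s):
    """Fill nondetects with draws from LogN(m, s) truncated above at each LOD."""
    filled = x.copy()
    # Inverse-CDF sampling on the log scale, truncated at log(LOD).
    u = rng.uniform(0, stats.norm.cdf((np.log(lod[~detected]) - m) / s))
    filled[~detected] = np.exp(m + s * stats.norm.ppf(u))
    return filled

# Assume (m, s) come from a censored-data fit; here we plug in the truth.
m_hat, s_hat = mu, sigma
p95 = [np.percentile(impute_once(m_hat, s_hat), 95) for _ in range(50)]
print("pooled 95th percentile:", np.mean(p95))
print("true 95th percentile:", np.exp(mu + sigma * stats.norm.ppf(0.95)))
```

A full implementation would also combine the within- and between-imputation variances (Rubin's rules) to obtain the confidence intervals discussed in the abstract.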

5.
Assessments of past environmental policies—termed accountability studies—contribute important information to the decision-making process used to review the efficacy of past policies, and subsequently aid in the development of effective new policies. These studies have used a variety of methods that have achieved varying levels of success at linking improvements in air quality and/or health to regulations. The Health Effects Institute defines the air pollution accountability framework as a chain of events that includes the regulation of interest, air quality, exposure/dose, and health outcomes, and suggests that accountability research should address impacts for each of these linkages. Early accountability studies investigated short-term, local regulatory actions (for example, coal use banned city-wide on a specific date or traffic pattern changes made for Olympic Games). Recent studies assessed regulations implemented over longer time and larger spatial scales. Studies on broader scales require accountability research methods that account for effects of confounding factors that increase over time and space. Improved estimates of appropriate baseline levels (sometimes termed “counterfactual”—the expected state in a scenario without an intervention) that account for confounders and uncertainties at each link in the accountability chain will help estimate causality with greater certainty. In the direct accountability framework, researchers link outcomes with regulations using statistical methods that bypass the link-by-link approach of classical accountability. Direct accountability results and methods complement the classical approach. New studies should take advantage of advanced planning for accountability studies, new data sources (such as satellite measurements), and new statistical methods. Evaluation of new methods and data sources is necessary to improve investigations of long-term regulations, and associated uncertainty should be accounted for at each link to provide a confidence estimate of air quality regulation effectiveness. The final step in any accountability study is the comparison of results with the proposed benefits of an air quality policy.

Implications: The field of air pollution accountability continues to grow in importance to a number of stakeholders. Two frameworks, the classical accountability chain and direct accountability, have been used to estimate impacts of regulatory actions, and both require careful attention to confounders and uncertainties. Researchers should continue to develop and evaluate both methods as they investigate current and future air pollution regulations.


6.
The modeling of transport of organic liquid contaminants through the vadose zone often requires three-phase relative permeabilities. Since these are difficult to measure, predictive models are usually used. The objective of this study is to assess the ability of eight common models to predict the drainage relative permeability to oil in a three-phase system (water-oil-air). A comparison of the models' estimates using a data set from Oak [Oak, M.J., 1990. Three-phase relative permeability of water-wet Berea. In: Seventh Symposium on Enhanced Oil Recovery, Paper SPE/DOE 20183. Tulsa, OK, April 22-25] showed that they provide very different predictions for the same system. The quality of the models' predictions does not increase with the amount of data or computation that the models require. Also, the calculations showed how different interpretations of the models and of the terminology associated with them can significantly impact the predictions. Thus, considerable error may be introduced into the simulations of organic liquid transport in the vadose zone depending on the selection and interpretation of the three-phase relative permeability model.
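
As one concrete example of the model-interpretation issue the study highlights, the sketch below implements one commonly cited normalized form of Stone's Model II, in which the three-phase oil relative permeability is built from the two two-phase curves. Other published forms normalize or truncate differently, which is exactly the kind of ambiguity the abstract warns about. The input values and end point here are invented for illustration.

```python
def stone_ii_kro(krow, krog, krw, krg, krocw):
    """One normalized form of Stone's Model II for three-phase oil rel. perm.

    krow : oil rel. perm. from the water-oil (two-phase) curve at Sw
    krog : oil rel. perm. from the gas-oil (two-phase) curve at Sg
    krw  : water rel. perm. at Sw;  krg : gas rel. perm. at Sg
    krocw: oil rel. perm. at connate water and zero gas saturation
    """
    kro = krocw * ((krow / krocw + krw) * (krog / krocw + krg) - (krw + krg))
    return max(kro, 0.0)  # the formula can go negative; truncate at zero

# Illustrative two-phase values (assumed, not measured data).
print(stone_ii_kro(krow=0.30, krog=0.25, krw=0.10, krg=0.05, krocw=0.80))
```

Even the choice of whether to normalize by krocw, and how to treat negative predictions, differs between implementations, which is one way "different interpretations of the models" translate into different simulated fluxes.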

7.
Californians are exposed daily to concentrations of ozone (O3) that are among the highest in the United States. Recently, the state adopted a new 8-hr ambient standard of 0.070 ppm, more stringent than the current federal standard. The new standard is based on controlled human studies and on dozens of epidemiologic studies reporting associations between O3 at current ambient levels and a wide range of adverse health outcomes. Clearly, the new O3 standards will require further reductions in the precursor pollutants and additional expenditures for pollution control. Therefore, it is important to quantify the incremental health benefits of moving from current conditions to the new California standard. In this paper, a standard methodology is applied to quantify the health benefits associated with O3 concentration reductions in California. O3 concentration reductions are estimated using ambient monitoring data and a proportional rollback approach in which changes are specific to each air basin, and control strategies may impact concentrations both below and above the standard. Health impacts are based on published epidemiologic studies, including O3-related mortality and morbidity, and economic values are assigned to these outcomes based on willingness-to-pay and cost-of-illness studies. Central estimates of this research indicate that attaining the California 8-hr standard, relative to current concentrations, would result in annual reductions of 630 cases of premature mortality, 4200 respiratory hospital admissions, 660 pediatric emergency room visits for asthma, 4.7 million days of school loss, and 3.1 million minor restricted activity days, with a median estimated economic value of $4.5 billion. Sensitivity analyses indicate that these findings are robust with respect to exposure assessment methods but are influenced by assumptions about the slope of the concentration-response function in threshold models and the magnitude of the O3-mortality relationship. Although uncertainties exist for several components of the methodology, these results indicate that the benefits of reducing O3 to the California standard may be substantial and that further research on the shape of the O3-mortality concentration-response function and economic value of O3-related mortality would best reduce these uncertainties.
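
The two core calculations, a proportional rollback of monitored concentrations toward the standard and a log-linear concentration-response health impact function, can be sketched as follows. The concentrations, background level, baseline rate, population, and beta coefficient are placeholders for illustration, not the values used in the paper.

```python
import numpy as np

def proportional_rollback(conc, design_value, standard, background):
    """Scale the above-background part of each concentration so the design
    value just meets the standard; values below the standard also change."""
    factor = (standard - background) / (design_value - background)
    return background + (conc - background) * factor

def attributable_cases(y0, pop, beta, delta_c):
    """Log-linear health impact function: cases avoided for a drop delta_c."""
    return y0 * pop * (1.0 - np.exp(-beta * delta_c))

# Hypothetical basin with an 8-hr design value of 0.095 ppm.
conc = np.array([0.040, 0.065, 0.080, 0.095])   # assumed 8-hr O3 metrics (ppm)
rolled = proportional_rollback(conc, design_value=0.095,
                               standard=0.070, background=0.025)
delta = (conc - rolled).mean() * 1000           # ppm -> ppb, averaged

# Placeholder epidemiologic inputs (NOT the study's values).
cases = attributable_cases(y0=0.008, pop=1_000_000, beta=0.0005, delta_c=delta)
print("rolled-back concentrations:", rolled.round(3))
print("illustrative avoided cases:", round(cases, 1))
```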

8.
A synthesis of research on the responses of terrestrial biota (1095 effect sizes) to industrial pollution (206 point emission sources) was conducted to reveal regional and global patterns from small-scale observational studies. A meta-analysis, in combination with other statistical methods, showed that the effects of pollution depend on characteristics of the specific polluter (type, amount of emission, duration of impact on biota), the affected organism (trophic group, life history), the level at which the response was measured (organism, population, community), and the environment (biome, climate). In spite of high heterogeneity in responses, we have detected several general patterns. We suggest that the development of evolutionary adaptations to pollution is a common phenomenon and that the harmful effects of pollution on terrestrial ecosystems are likely to increase as the climate warms. We argue that community- and ecosystem-level responses to pollution should be explored directly, rather than deduced from organism-level studies.
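
For readers unfamiliar with the machinery, pooling effect sizes in a random-effects meta-analysis takes only a few lines; the DerSimonian-Laird estimator below is a standard choice, although the synthesis described above also used other statistical methods. The effect sizes and variances are invented.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird tau^2."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical standardized effect sizes from studies around point sources.
pooled, se, tau2 = dersimonian_laird([-0.8, -0.3, -1.1, 0.1],
                                     [0.04, 0.09, 0.06, 0.12])
print(f"pooled effect {pooled:.2f} +/- {1.96 * se:.2f}, tau^2 = {tau2:.2f}")
```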

9.
Data from well-designed experiments provide the strongest evidence of causation in biodiversity studies. However, for many species the collection of these data is not scalable to the spatial and temporal extents required to understand patterns at the population level. Only citizen science projects can gather data in sufficient quantities, but data collected from volunteers are inherently noisy and heterogeneous. Here we describe a ‘Big Data’ approach to improve the data quality in eBird, a global citizen science project that gathers bird observations. First, eBird’s data submission design ensures that all data meet high standards of completeness and accuracy. Second, we take a ‘sensor calibration’ approach to measure individual variation in eBird participants’ ability to detect and identify birds. Third, we use species distribution models to fill in data gaps. Finally, we provide examples of novel analyses exploring population-level patterns in bird distributions.

10.
The role of temperate forests in the global carbon balance is difficult to determine because many uncertainties exist in the data, and many assumptions must be made in these determinations. Still, there is little doubt that increases in atmospheric CO2 and global warming would have major effects on temperate forest ecosystems. Increases in atmospheric CO2 may result in increases in photosynthesis, changes in water and nitrogen use efficiency, and changes in carbon allocation. Indirect effects of changes in global carbon balance on regional climate and on microenvironmental conditions, particularly temperature and moisture, may be more important than direct effects of increased CO2 on vegetation. Increased incidence of forest perturbations might also be expected. The evidence suggests that conditions favorable to forest growth and development may exist in the northern latitudes, while southern latitude forests may undergo drought stress. Current harvest of temperate and world forests contributes substantial amounts of carbon to the atmosphere, possibly as much as 3 gigatons (Gt) per year. Return of this carbon to forest storage may require decades. Forest managers should be aware of the global as well as local impact their management decisions will have on the atmospheric carbon balance of the ecosystems they oversee.

11.
In order to determine what effects human activities have on natural processes, it is important to thoroughly understand those processes. Unfortunately, we know little about what natural processes are operating, and even less about how they have functioned historically. This paper discusses the importance of natural processes in affecting surface water acidification and the necessity for developing quantitative estimates of natural, as well as anthropogenic, contributions to the acidification of surface waters. A review of the literature and the analysis of chemistry data from six limed lakes in New York and Massachusetts have identified a number of possible processes that may play important roles in acidifying surface waters. At present, these processes are poorly understood and require further research. Once we have such knowledge, we will be able to clearly see the effects of human activities on natural processes and modify those activities in ways that will mitigate negative impact in a predictable manner.

12.
A new generic approach for estimating chemical concentrations in rivers at catchment and national scales is presented. Domestic chemical loads in waste water are estimated using gridded population data. River flows are estimated by combining predicted runoff with topographically derived flow direction. Regional scale exposure is characterised by two summary statistics: PEC(works), the average concentration immediately downstream of emission points, and PEC(area), the catchment-average chemical concentration. The method was applied to boron at national (England and Wales) and catchment (Aire-Calder) scales. Predicted concentrations were within 50% of measured mean values in the Aire-Calder catchment and in agreement with results from the GREAT-ER model. The concentration grids generated provide a picture of the spatial distribution of expected chemical concentrations at various scales, and can be used to identify areas of potentially high risk.
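
The mass balance behind a PEC(works)-type statistic can be sketched in a few lines. The per-capita usage, removal fraction, and dilution flow below are placeholder values, not the boron parameters used in the study; a PEC(area)-type statistic would aggregate the same loads and flows over a whole catchment rather than a single discharge point.

```python
def pec_works(population, per_capita_use_g_d, removal_fraction,
              river_flow_m3_s, background_mg_l=0.0):
    """Concentration immediately downstream of a discharge point (mg/L)."""
    load_mg_d = population * per_capita_use_g_d * 1000 * (1 - removal_fraction)
    flow_l_d = river_flow_m3_s * 1000 * 86400          # m3/s -> L/day
    return background_mg_l + load_mg_d / flow_l_d

# Hypothetical works: 150,000 people, 0.6 g/person/day, 20% removal, 4 m3/s flow.
print(round(pec_works(150_000, 0.6, 0.20, 4.0), 3), "mg/L")
```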

13.
One of two topics explored is the limitations of the daily average in summarizing pollutant hourly profiles. The daily average of hourly measurements of air pollutant constituents provides continuity with previous studies using monitoring technology that only provided the daily average. However, other summary statistics are needed that make better use of all available information in 24-hr profiles. The daily average reflects the total daily dose, obscuring hourly resolution of the dose rate. Air pollutant exposures with comparable total daily doses may have very different effects when occurring at high levels over a few hours as opposed to low levels over a longer time. Alternative data-based choices for summary statistics are provided using principal component analysis to capture the exposure dose rate, while preserving ease of interpretation. This is demonstrated using the earliest hourly particle concentration data available for El Paso from archived records of particulate matter (PM10). In this way, a significant association between evening PM10 exposures and nonaccidental daily mortality is found in El Paso from 1992 to 1995, otherwise missed using the daily average. Secondly, the nature and, hence, effects of particles in the ambient aerosol during El Paso sandstorms is believed different from that of particles present during still-air conditions resulting from atmospheric temperature inversions. To investigate this, wind speed (ws) is used as a surrogate variable to label PM10 exposures as Low-ws (primarily fine particles), High-ws (primarily coarse particles), or Mid-ws (a mixture of fine and coarse particles). A High-ws evening is significantly associated with a 10% lower risk of mortality on the succeeding third day, as compared with comparable exposures at Low- or Mid-ws. Although this analysis cannot be used to form firm conclusions because it uses a very small data set, it demonstrates the limitations of the daily average and suggests differential toxicity for different particle compositions.
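
The alternative summary statistics referred to above come from a principal component analysis of the 24-hour profiles. A minimal sketch of that idea is shown below, using simulated daily profiles and scikit-learn rather than the archived El Paso PM10 record; the shapes, peak hours, and component count are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

# Simulated daily profiles: 365 days x 24 hourly PM10 values, with an
# artificial evening peak on some days (a stand-in for the archived record).
base = rng.gamma(shape=4.0, scale=10.0, size=(365, 24))
evening_peak = np.zeros(24)
evening_peak[18:23] = 40.0
base[::3] += evening_peak                      # every third day gets a peak

pca = PCA(n_components=3)
scores = pca.fit_transform(base)               # per-day summary statistics

# Each day is now summarized by a few scores instead of a single daily mean;
# the loadings show which hours each score weights (e.g., evening hours).
print("variance explained:", pca.explained_variance_ratio_.round(2))
print("hours most heavily weighted by PC1:",
      np.argsort(np.abs(pca.components_[0]))[-5:])
```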

14.
In environmental monitoring, variables with analytically non-detected values are commonly encountered. For the statistical evaluation of these data, most of the methods that produce a less biased performance require specific computer programs. In this paper, a statistical method based on the median semi-variance (SemiV) is proposed to estimate location and spread statistics in a dataset with single left-censoring. The performances of the SemiV method and 12 other statistical methods are evaluated using real and complete datasets. The performances of all the methods are influenced by the percentage of censored data. In general, the simple substitution and deletion methods showed biased performance, with the exception of the L/2, Inter, and L/√2 methods, which can be used with caution under specific conditions. In general, the SemiV method and other parametric methods showed similar performances and were less biased than the other methods. The SemiV method is a simple and accurate procedure that can be used in the analysis of datasets with less than 50% left-censored data.

15.
Turgut C. Chemosphere, 2007, 66(3): 469-473.
Pesticides pose a serious risk to aquatic macrophytes in the environment, and they are also detrimental to the rooted macrophytes used in bioassays for assessment. Currently, no data are available on the impact of pesticides on parrotfeather when they are present at predicted environmental concentrations. The calculated expected environmental concentrations were applied to the plants and the effects were compared. Eight of the 18 pesticides showed significantly different impacts. All of the other tested pesticides induced a significant change in the pigment content of parrotfeather. The risk quotient (RQ) values were higher than 0.5, indicating that regulatory action for the environment is needed. This study may be the first to evaluate the predicted environmental concentrations reported for pesticide registration in Europe. Additional studies are required to test all pesticides within one group, since the compounds tested may span a wide range of toxicity. Furthermore, the tests should include more than one macrophyte, e.g., one rooted and one non-rooted species, in order to provide a better understanding of pesticide toxicity.

16.
Commonly used sums-of-squares-based error or deviation statistics—like the standard deviation, the standard error, the coefficient of variation, and the root-mean-square error—often are misleading indicators of average error or variability. Sums-of-squares-based statistics are functions of at least two dissimilar patterns that occur within data. Both the mean of a set of error or deviation magnitudes (the average of their absolute values) and their variability influence the value of a sum-of-squares-based error measure, which confounds clear assessment of its meaning. Interpretation problems arise, according to Paul Mielke, because sums-of-squares-based statistics do not satisfy the triangle inequality. We illustrate the difficulties in interpreting and comparing these statistics using hypothetical data, and recommend the use of alternate statistics that are based on sums of error or deviation magnitudes.
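
The confounding of average magnitude and variability is easy to see numerically: the two invented error sets below have the same mean absolute error but very different root-mean-square errors, because the second set concentrates its error in one large deviation.

```python
import numpy as np

def mae(e):  return np.mean(np.abs(e))
def rmse(e): return np.sqrt(np.mean(np.square(e)))

errors_uniform = np.array([2.0, -2.0, 2.0, -2.0])   # same magnitude everywhere
errors_spiky   = np.array([0.5, -0.5, 0.5, -6.5])   # one large error dominates

for name, e in [("uniform", errors_uniform), ("spiky", errors_spiky)]:
    print(f"{name}: MAE = {mae(e):.2f}, RMSE = {rmse(e):.2f}")
```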

17.
Public transportation automatic fare collection (AFC) systems are able to continuously record large amounts of passenger travel information, providing massive, low-cost data for research on regulations pertaining to public transport. These data can be used not only to analyze characteristics of passengers’ trips but also to evaluate transport policies that promote a travel mode shift and emission reduction. In this study, models combining card, survey, and geographic information systems (GIS) data are established with a research focus on the private driving restriction policies being implemented in an ever-increasing number of cities. The study aims to evaluate the impact of these policies on the travel mode shift, as well as relevant carbon emission reductions. The private driving restriction policy implemented in Beijing is taken as an example. The impact of the restriction policy on the travel mode shift from cars to subways is analyzed through a model based on metro AFC data. The routing paths of these passengers are also analyzed based on the GIS method and on survey data, while associated carbon emission reductions are estimated. The analysis method used in this study can provide a reference for the application of big data in evaluating transport policies.

Implications: Motor vehicles have become the most prevalent source of emissions, and consequently of air pollution, within Chinese cities. The evaluation of the effects of driving restriction policies on the travel mode shift and vehicle emissions will be useful for other cities in the future. Transport big data, which play an important supporting role in estimating the travel mode shift and the associated emission reductions, can help the relevant departments to estimate the effects of traffic congestion alleviation and environmental improvement before such restriction policies are implemented, and can provide a reference for related decisions.
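
The emission-reduction arithmetic summarized in this item's abstract can be sketched roughly as follows. The emission factors, number of shifted trips, and mean GIS-routed trip length are invented placeholders, not the Beijing estimates.

```python
def co2_saved_kg(shifted_trips_per_day, mean_trip_km,
                 car_ef_kg_per_km=0.2, metro_ef_kg_per_pkm=0.04):
    """CO2 avoided when car trips shift to metro (placeholder emission factors)."""
    per_trip = mean_trip_km * (car_ef_kg_per_km - metro_ef_kg_per_pkm)
    return shifted_trips_per_day * per_trip

# Hypothetical output of an AFC-based mode-shift model: 50,000 shifted
# trips/day averaging 12 km along the GIS-routed path.
print(round(co2_saved_kg(50_000, 12.0) / 1000, 1), "t CO2 per day")
```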


18.
The paper presents the results of the development of a standard driving cycle in the urban areas of Hong Kong. On-road speed–time data were collected by an instrumented diesel vehicle along two fixed routes located in two urban districts in Hong Kong. The collected data were analyzed and compared with mandatory driving cycles used elsewhere. It was found that none of these mandatory cycles could satisfactorily describe the driving characteristics in Hong Kong. A unique driving cycle was therefore developed for Hong Kong. The cycle was built up by extracting parts of the on-road speed data such that the summary statistics of the sample are close to those derived from the data population of the test runs.
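
A simplified version of the cycle-construction step, choosing speed segments whose combined summary statistics best match those of the full data population, might look like the greedy sketch below. The statistics used, the simulated speed traces, the segment length, and the matching criterion are all illustrative assumptions rather than the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(5)

def summary(speed):
    """Summary statistics used to compare a candidate cycle with the population."""
    accel = np.diff(speed)
    return np.array([speed.mean(), (speed < 1).mean(), accel[accel > 0].mean()])

# Stand-in second-by-second speed record and a pool of micro-trips cut from it.
population = rng.gamma(2.0, 10.0, size=20_000)
micro_trips = [population[i:i + 300] for i in range(0, 20_000, 300)]
target = summary(population)

# Greedy assembly: keep adding the micro-trip that brings the cycle's
# statistics closest to the population's until about 1800 s is collected.
cycle = np.array([])
while cycle.size < 1800:
    best = min(micro_trips,
               key=lambda seg: np.abs(summary(np.concatenate([cycle, seg])) - target).sum())
    cycle = np.concatenate([cycle, best])

print("target stats:", target.round(2))
print("cycle stats: ", summary(cycle).round(2))
```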

19.
Recent advances in the development of receptor-oriented source apportionment techniques (models) have provided a new approach to evaluating the performance of particulate dispersion models. Rather than limiting performance evaluations to comparisons of particulate mass, receptor model estimates of source impacts can be used to open new opportunities for in-depth analysis of dispersion model performance. Recent experiences in the joint application of receptor and dispersion models have proven valuable in developing increased confidence in source impact projections used for control strategy development. Airshed studies that have followed this approach have identified major errors in emission inventory data bases and provided technical support for modeling assumptions.

This paper focuses on the joint application of dispersion and receptor models to particulate source impact analysis and dispersion model performance evaluation. The limitations and advantages of each form of modeling are reviewed and case studies are examined. The paper is offered to provide several new perspectives into the model evaluation process in the hope that they may prove useful to those who manage our nation’s air resources.
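
As a concrete illustration of the receptor-oriented side of this pairing, a chemical mass balance can be posed as a non-negative least-squares fit of source contributions to an ambient species vector. The source profiles and ambient concentrations below are invented for illustration only.

```python
import numpy as np
from scipy.optimize import nnls

# Columns are hypothetical source profiles (mass fraction of each chemical
# species per unit mass emitted); rows are species measured at the receptor.
profiles = np.array([
    [0.30, 0.02, 0.10],   # species 1: e.g., a crustal marker
    [0.05, 0.40, 0.08],   # species 2: e.g., a combustion marker
    [0.10, 0.10, 0.50],   # species 3: e.g., another marker
])
ambient = np.array([8.0, 6.0, 9.0])   # measured concentrations (ug/m3)

# Non-negative least squares gives source contributions in ug/m3, which can
# then be compared with the impacts predicted by a dispersion model.
contributions, residual = nnls(profiles, ambient)
print("estimated source contributions:", contributions.round(1))
print("reconstructed species mass:", (profiles @ contributions).round(1))
```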

20.
For assessing the efficacy of a specific form of the National Ambient Air Quality Standard for O3, those exposure patterns that result in vegetation and human health effects must be identified. For vegetation, it has been found that the higher hourly average concentrations should be weighted more than the lower concentrations. Controlled human exposure work supports the suggestion that concentration may be more important than exposure duration and ventilation rates. It has been indicated in the literature that the current form of the federal O3 standard may not be appropriate for protecting vegetation and human health from O3 exposures. The proposed use of the cumulative index alone as a form of the standard may not provide sufficient protection to vegetation. An extended-period average index, such as a daily maximum 8-hour average concentration, may not be appropriate to protect human health because of the reduced ability to observe differences among hourly O3 concentrations exhibited within exposure regimes. For both vegetation and human health effects research, additional experimentation is required to identify differences in responses that occur when ambient-type exposure regimes are applied. Any standard promulgated to protect vegetation and human health from O3 exposures should consider combining cumulative exposure indices with other parameters so that those unique exposures that have the potential for eliciting an adverse effect can be adequately described.
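
For reference, exposure indices of the kinds discussed here are typically computed from hourly concentrations as in the sketch below. The SUM06 threshold and the sigmoidal W126 weighting constants are the commonly cited forms (treated here as assumptions), and the sample data are invented.

```python
import numpy as np

def sum06(hourly_ppm):
    """Sum of hourly concentrations at or above 0.06 ppm (a cumulative index)."""
    c = np.asarray(hourly_ppm)
    return c[c >= 0.06].sum()

def w126(hourly_ppm):
    """Sigmoidally weighted cumulative index; higher hours count much more."""
    c = np.asarray(hourly_ppm)
    return np.sum(c / (1.0 + 4403.0 * np.exp(-126.0 * c)))

def daily_max_8hr(hourly_ppm):
    """Daily maximum 8-hour running mean (an extended-period average index)."""
    c = np.asarray(hourly_ppm)
    windows = np.convolve(c, np.ones(8) / 8.0, mode="valid")
    return windows.max()

hours = np.array([0.03, 0.05, 0.06, 0.08, 0.11, 0.09, 0.07, 0.05, 0.04, 0.03])
print(sum06(hours), round(w126(hours), 3), round(daily_max_8hr(hours), 3))
```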
