Similar Articles
1.
This paper reviews four statistical methods commonly used in environmental data analysis and, through real case-study data, discusses pitfalls that can arise in applying them. The four methods are percentiles and confidence intervals, the correlation coefficient, regression analysis, and analysis of variance (ANOVA). For percentiles and confidence intervals, the pitfall is the automatic assumption of a normal distribution for environmental data, which often follow a log-normal distribution. For the correlation coefficient, the pitfall is the use of data spanning a wide range of values, where the largest values can dominate the smaller ones and skew the coefficient. For regression analysis, the pitfall is the propagation of uncertainties in the input variables into the regression model prediction, which may be even more uncertain. For ANOVA, the pitfall is using the acceptance of a hypothesis, a weak argument, to imply a strong conclusion. As demonstrated in this paper, very different conclusions may be drawn from the same statistical analysis if these pitfalls are not recognized. Reminders and lessons drawn from the pitfalls are given at the end of the article.
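A minimal sketch of two of these pitfalls, using simulated concentrations rather than the paper's case-study data: a normal-theory 95th percentile applied to log-normal data, and a Pearson correlation dominated by one extreme pair of values.

```python
# Illustrative only: simulated data, not the paper's case studies.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Pitfall: assuming normality for log-normally distributed concentrations.
conc = rng.lognormal(mean=1.0, sigma=1.0, size=200)
p95_normal = conc.mean() + 1.645 * conc.std(ddof=1)                    # normal-theory estimate
p95_lognormal = np.exp(np.log(conc).mean() + 1.645 * np.log(conc).std(ddof=1))
print(f"95th percentile, normal assumption:     {p95_normal:.2f}")
print(f"95th percentile, log-normal assumption: {p95_lognormal:.2f}")

# Pitfall: one extreme pair of values can dominate the correlation coefficient.
x = rng.normal(10, 1, 30)
y = rng.normal(10, 1, 30)                 # x and y are unrelated by construction
x_ext, y_ext = np.append(x, 100.0), np.append(y, 100.0)
print(f"r without the extreme point: {stats.pearsonr(x, y)[0]:.2f}")
print(f"r with the extreme point:    {stats.pearsonr(x_ext, y_ext)[0]:.2f}")
```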

2.
3.
Statistical techniques are useful for interpreting monitoring data for pollutants, process variables, etc. Simplified nomographic methods are presented for relating the number of samples to confidence intervals for their mean values, and for determining the proportion of the population exceeding a specified concentration together with confidence intervals for that proportion. A chart is also given for the design of a sampling program for quality control. Illustrative frequency-distribution data are given for hourly-averaged methane concentrations in air over three-week periods; they show trimodal lognormal distributions. The charts are applicable to lognormal as well as normal distributions and are convenient for many common problems.
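A computational analogue of what such nomographs provide, sketched under assumptions of my own rather than taken from the article: a confidence interval for the geometric mean of log-normally distributed concentrations, and a confidence interval for the proportion of samples exceeding a specified concentration.

```python
# Sketch only; the data are simulated, not the article's methane measurements.
import numpy as np
from scipy import stats

def logmean_ci(conc, alpha=0.05):
    """t-based confidence interval for the mean of log(concentration), back-transformed."""
    logs = np.log(conc)
    n = logs.size
    se = logs.std(ddof=1) / np.sqrt(n)
    t = stats.t.ppf(1 - alpha / 2, df=n - 1)
    return np.exp(logs.mean() - t * se), np.exp(logs.mean() + t * se)

def exceedance_ci(conc, threshold, alpha=0.05):
    """Wilson score interval for the proportion of samples above `threshold`."""
    n = conc.size
    k = int((conc > threshold).sum())
    ci = stats.binomtest(k, n).proportion_ci(confidence_level=1 - alpha, method="wilson")
    return k / n, (ci.low, ci.high)

rng = np.random.default_rng(1)
methane = rng.lognormal(mean=0.5, sigma=0.4, size=120)   # simulated hourly averages
print("CI for geometric mean:", logmean_ci(methane))
print("exceedance fraction and CI:", exceedance_ci(methane, threshold=2.0))
```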

4.
The Clean Air Act of 1970 established the authority to control hazardous air pollutants. Section 112 of the legislation requires the Administrator to publish, and from time to time revise, a list of hazardous air pollutants for which emission standards are intended, and to establish emission standards for those pollutants. These national emission standards for hazardous air pollutants are commonly referred to as "NESHAP" standards. All of the NESHAP promulgated as of April 1984 are summarized in the table accompanying this article. Two types of references are included in the table. The first identifies the issue of the Federal Register in which the NESHAP is explained in detail; the second identifies the background information document (BID) containing the technical and economic information developed to support the NESHAP.

5.
Strategies for the control of ozone aim at regulating its chemical precursors, non-methane organic compounds (NMOC) and nitrogen oxides (NOx). It is therefore important to analyze how these precursors vary over time and geographically. This study finds significant and important differences in NMOC, NOx, and their ratio (NNR) among four Texas ozone nonattainment sites, Dallas, Ft. Worth, El Paso, and Houston, for 1984, 1985, and 1986. These differences were detected through nonparametric analysis of variance and the Student-Newman-Keuls test for multiple comparisons on rank-transformed data. A noteworthy feature of the data analysis is its attention to the assumptions underlying the statistical methods: classical models based on normal or lognormal theory had to be abandoned for lack of realism. It is demonstrated how alternative models may be applied to yield appropriate, rather than inappropriate, conclusions.
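A sketch of the rank-transform approach described above, using illustrative values rather than the Texas measurements: pool the observations, replace them by ranks, run a one-way ANOVA on the ranks, and compare with the Kruskal-Wallis test; a multiple-comparison procedure such as Student-Newman-Keuls would then be applied to the ranked data.

```python
# Illustrative data only (site names from the abstract; the numbers are simulated).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
sites = {
    "Dallas":    rng.lognormal(2.0, 0.6, 60),   # simulated NMOC/NOx ratios (NNR)
    "Ft. Worth": rng.lognormal(2.1, 0.6, 60),
    "El Paso":   rng.lognormal(2.4, 0.6, 60),
    "Houston":   rng.lognormal(2.7, 0.6, 60),
}

# Rank-transform the pooled data, then split the ranks back out by site.
pooled = np.concatenate(list(sites.values()))
ranks = stats.rankdata(pooled)
groups = np.split(ranks, np.cumsum([v.size for v in sites.values()])[:-1])

f, p_ranked = stats.f_oneway(*groups)        # one-way ANOVA on the ranks
h, p_kw = stats.kruskal(*sites.values())     # Kruskal-Wallis on the raw data
print(f"ANOVA on ranks: p = {p_ranked:.3g}; Kruskal-Wallis: p = {p_kw:.3g}")
```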

6.
A research project has been under way to investigate air pollution problems in Los Angeles County with the help of data supplied by the Los Angeles County Air Pollution Control District. These data consist of measurements of primary pollutants such as nitric oxide, hydrocarbons, carbon monoxide, sulfur dioxide, and particulates, and secondary pollutants such as ozone and nitrogen dioxide, recorded hourly at a number of stations in Los Angeles County over the past seventeen years. The present discussion deals in a preliminary way with one aspect of this analysis, namely the occurrence of photochemical smog in Los Angeles. The paper is divided into two main sections. The first provides a brief survey of the problem of photochemical smog in Los Angeles as presently understood, in relation both to the available field data and to chamber experiments run in various laboratories. The second part discusses a class of intervention problems that arise in studying the data; parallel problems occur in the study of other ecological material and elsewhere. Statistical methods for dealing with this class of problems are illustrated with some of the Los Angeles data.
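A generic sketch of the intervention-analysis idea discussed in the second part, not the authors' model: fit a time-series model with a step regressor that switches on at an assumed intervention date and examine the step coefficient.

```python
# Assumed, simulated example; the series and intervention date are not the Los Angeles data.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
n, t0 = 200, 120                                  # series length and assumed intervention time
e = rng.normal(0, 1, n)
ar = np.empty(n)
ar[0] = 0.0
for t in range(1, n):                             # AR(1) background variation
    ar[t] = 0.7 * ar[t - 1] + e[t]
step = (np.arange(n) >= t0).astype(float)         # 0 before the intervention, 1 after
oxidant = 10 + ar - 2.0 * step                    # simulated pollutant series with a level drop

res = ARIMA(oxidant, exog=step, order=(1, 0, 0)).fit()
print(res.summary())                              # the exog coefficient estimates the step change
```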

7.
Data review in environmental monitoring
It is proposed that, in reviewing environmental monitoring data, particular attention should be paid to field sampling, laboratory analysis, and overall judgment, so as to ensure that the monitoring data are correct and reliable.

8.
Over the past few decades, the development of environmental regulations, advances in analytical chemistry and other scientific disciplines, and increased rigor in quality-control procedures have created a new discipline, environmental forensics. The need for analytical methods that determine organic compounds in the environment, especially in drinking waters, both qualitatively and quantitatively, was recognized in the early 1950s, and such methods were developed gradually by the early 1960s. The important tools of gas chromatography and mass spectrometry that evolved in the 1970s gave the early environmental forensic chemist, for the first time, the ability to produce scientifically sound data that were admissible in court. By the 1990s, multivariate statistical techniques became available and accepted, including principal component analysis (PCA) and polytopic vector analysis (PVA). These techniques, coupled with the advancing analytical methods, have given the forensic investigator tools to evaluate and demonstrate unique attributes of a data set. Analyses of marker compounds, PCBs, PCDD/Fs, and petroleum hydrocarbons are all shown to be potentially valuable in deciphering the source and fate of contamination. This paper shows how advances in environmental analytical chemistry provide the forensic chemist with tools to assess the source(s) of site contamination.

10.
Environmental forensic analysis has evolved significantly from the early days of qualitative chemical-fingerprint evaluations. The need for quantitative rigor has made numerical methods critical in identifying and mapping contaminant sources in complex environmental systems. Given multiple contaminant sources, the environmental scientist faces the challenge of unraveling the contributions of multiple plumes with overlapping spatial and temporal distributions. The problem may be addressed through a multivariate statistical approach, but there is a mind-boggling array of available "chemometric" methods. This paper provides an overview of these methods, along with a review of their advantages, disadvantages, and pitfalls. Methods discussed include principal component analysis and several receptor-modeling techniques.
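An illustrative sketch of the principal component analysis step that these chemometric methods share, using an assumed two-source mixing example rather than data from the review: standardize a samples-by-analytes matrix, extract components, and inspect loadings for source-related patterns.

```python
# Assumed synthetic mixing example; the profiles and analytes are hypothetical.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
# Two hypothetical source profiles over six analytes, mixed in varying proportions.
profiles = np.array([[5.0, 1.0, 0.5, 2.0, 0.1, 0.1],
                     [0.2, 3.0, 4.0, 0.5, 2.0, 1.0]])
contrib = rng.uniform(0, 1, size=(50, 2))                  # source contributions per sample
data = contrib @ profiles + rng.normal(0, 0.1, (50, 6))    # mixed concentrations plus noise

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(data))
print("explained variance ratio:", pca.explained_variance_ratio_)
print("component loadings:\n", pca.components_)
```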

11.
A brief analysis of the indicator system and assessment methods for planning environmental impact assessment
陆军  郝大举 《污染防治技术》2006,19(1):26-27,66
The content of planning environmental impact assessment (plan EIA) is introduced in overview. Based on the characteristics of plan EIA, it is concluded that its indicator system should comprise five categories: natural environment indicators, ecological environment indicators, resource utilization indicators, energy utilization indicators, and socio-economic indicators. The technical methods currently applicable to plan EIA are listed and briefly compared.

12.
The purpose of this paper is to demonstrate the use of some statistical methods for examining trends in ambient ozone air quality downwind of major urban areas. To this end, daily maximum 1-hr ozone concentrations measured over New Jersey, metropolitan New York City, and Connecticut for the period 1980 to 1989 were assembled and analyzed. The paper discusses the application of the bootstrap method, extreme value statistics, and a nonparametric test for evaluating trends in urban ozone air quality. The results indicate that although there is an improvement in ozone air quality downwind of New York City, there has been little change in ozone levels upwind of New York City during this ten-year period.
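A minimal sketch of two of the tools mentioned above, using simulated values rather than the New Jersey/New York/Connecticut record: a nonparametric (Theil-Sen/Kendall) trend estimate and a bootstrap confidence interval for the least-squares slope.

```python
# Simulated annual ozone statistics; not the monitoring data analyzed in the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
years = np.arange(1980, 1990)
ozone = 130 - 1.5 * (years - 1980) + rng.normal(0, 5, years.size)   # hypothetical ppb values

slope, intercept, lo, hi = stats.theilslopes(ozone, years, 0.95)    # nonparametric slope and CI
tau, p = stats.kendalltau(years, ozone)                             # Mann-Kendall-type trend test
print(f"Theil-Sen slope {slope:.2f} ppb/yr (95% CI {lo:.2f} to {hi:.2f}); Kendall tau p = {p:.3f}")

# Bootstrap a 95% confidence interval for the ordinary least-squares slope.
idx = np.arange(years.size)
boot = []
for _ in range(2000):
    s = rng.choice(idx, size=idx.size, replace=True)
    boot.append(np.polyfit(years[s], ozone[s], 1)[0])
print("bootstrap 95% CI for OLS slope:", np.percentile(boot, [2.5, 97.5]))
```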

13.
Airborne particulate matter was sampled at a copper smelter and at an aluminum casting plant. The size, shape, quantity, and microlocalization of chemical species in the particulates were measured using closed cassettes, cascade impactors, scanning electron microscopy, X-ray diffraction, infrared and atomic absorption spectrophotometry, secondary ion mass spectrometry, and photoelectron spectroscopy. Cluster and principal components analyses were used in interpreting the results. Aerosol chemistry varies as a function of size, and composition becomes more complex as the aerosol size drops into the respirable fraction and lower. Surface chemical properties are evident, with volatile species generally enriched at particle surfaces. A few site-specific elements and characteristics were identified. The formation of particulates may often be related to processes and practices, yet the actual distribution of species in the air remains an intricate matter.
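A brief, assumed illustration of the cluster-analysis step: hierarchically cluster particles by standardized elemental composition to group chemically similar particles; the elements and values here are hypothetical, not the smelter or casting-plant measurements.

```python
# Hypothetical composition matrix; rows are particles, columns are elemental fractions.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(7)
composition = np.vstack([rng.normal([5.0, 3.0, 0.5, 1.0, 2.0], 0.3, (20, 5)),
                         rng.normal([0.5, 1.0, 6.0, 4.0, 0.2], 0.3, (20, 5))])

Z = linkage(zscore(composition, axis=0), method="ward")   # Ward linkage on standardized data
labels = fcluster(Z, t=2, criterion="maxclust")           # cut the dendrogram into two clusters
print(labels)
```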

14.
This is part one of a two-part discussion providing an overview of the use of aerial photography, topographic mapping, and photogrammetry in environmental enforcement actions. The visualization of spatial relationships of natural and man-made features can focus the scope of an environmental investigation and provide a simple yet quantitative historical record of changes in conditions at a site. Aerial photography has been used in environmental remote sensing since the early part of the 20th century. Aerial photos are valuable tools for environmental assessment because they provide objective, detailed documentation of surface conditions at a specific time; furthermore, they can generally be obtained even in cases where access on the ground is denied to investigators. From aerial photos, precise quantitative information can be collected using photogrammetry. Such measurement and positional data can be produced in digital format for input into a Geographic Information System (GIS) for computerized analysis and display. Other information derived from aerial photographs requires specialized photointerpretive skills and experience; examples include the recognition of vegetation mortality, oil-spill damage, and the ecological quality of water bodies. The location, extent, and historical change of hazardous waste sites can be documented on topographic maps. These maps are often created from aerial photographs and display the extent and location of real-world features by symbolizing them. The major advantage of maps over aerial photos is that maps can show things that are not visible from the air while omitting unnecessary and distracting information. Because maps are derived products, they may contain bias in content and presentation, and they must be backed up by careful documentation and quality-assurance protocols.

16.
The new millennium ushers in changes for refiners of automobile gasoline in the United States, as well as for the state and federal regulators who establish guidelines for gasoline formulation and environmental regulations governing the fate of gasoline-related chemicals in the nation's air, soil, and groundwater. One current issue in the gasoline-formulation debate centers on weighing the proven benefits of adding chemical oxygenates, especially methyl tert-butyl ether (MTBE), to gasoline to improve tailpipe emission quality against the presumed environmental problems caused by the presence of oxygenates in ground and surface waters due to fugitive releases of gasoline. Credible debate on this subject presumes that current and past environmental monitoring data for MTBE are accurate and precise. Experience suggests that this assumption is not correct, in part because certain analytical methodologies, particularly older methods supported by the U.S. Environmental Protection Agency, can fall short of reasonable data-quality goals for the measurement of MTBE. This Technical Note summarizes the standard EPA methods available to site investigators who need to measure MTBE in environmental media, the limitations and advantages of these measurement techniques, and recommendations for improving these standard EPA methods to yield the highest-quality MTBE environmental residue data.

17.
Six reported methods for determining the n-octanol/water partition coefficient (Kow) of pesticides are briefly described and compared, and the correlations between pesticide Kow values and other environmental parameters (Sw, Koc, BCF) are reviewed.

18.
Starting from the role of environmental monitoring in serving environmental management, the current state of comprehensive environmental quality analysis at grassroots environmental monitoring stations is systematically examined. In view of the problems encountered in practice, countermeasures and suggestions are put forward for raising the level of comprehensive environmental quality analysis.

19.
Title I of the Clean Air Act Amendments of 1990 calls for "enhanced monitoring" of ozone, which is planned to include measurements of atmospheric non-methane organic compounds (NMOCs). NMOC concentration data gathered by two methods in Atlanta, Georgia during July and August 1990 are compared in order to assess the reliability of such measurements in an operational setting. During that period, automated gas chromatography (GC) systems (the Field systems) were used to collect NMOC continuously as one-hour averages. In addition, canister samples of ambient air were collected on an intermittent schedule for quality-control purposes and analyzed by laboratory GC (the Lab system). Data from the six-site network included concentrations of nitrogen oxides (NOx), carbon monoxide (CO), ozone, total NMOC (TNMOC), and 47 identified NMOCs. Regression analysis indicates that the average TNMOC concentration from the Lab system is about 50 percent higher than that from the Field system, and that the bulk of the difference is due to unidentified NMOCs recorded by the Lab system. Also, there are substantial uncertainties in predicting a single Field TNMOC concentration from a measured Lab concentration. For individual identified NMOCs, agreement between the systems is poor for many olefins that occur at low concentrations but may be photochemically important. Regressions of TNMOC against CO and NOx lead to the conclusion that the larger unidentified component reported by the Lab system is not closely related to local combustion or automotive sources.
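A hedged sketch of the Lab-versus-Field comparison described above, with synthetic numbers rather than the Atlanta data: regress collocated Lab-system TNMOC values on Field-system values and examine the slope and scatter.

```python
# Synthetic collocated measurements; the 1.5 factor mimics the reported ~50 percent difference.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
field = rng.lognormal(mean=5.5, sigma=0.5, size=80)      # hypothetical Field TNMOC (ppbC)
lab = 1.5 * field * rng.lognormal(0.0, 0.2, size=80)     # Lab reads higher, with scatter

fit = stats.linregress(field, lab)
print(f"slope = {fit.slope:.2f}, intercept = {fit.intercept:.1f}, r^2 = {fit.rvalue**2:.2f}")
```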

20.
Environmental monitoring is a government function. Under the conditions of the socialist market economy, how to correctly handle the contradiction between this "government function" and the market economy has become a question that environmental protection departments at all levels are jointly exploring and that needs to be properly resolved. Accordingly, drawing on relevant economic theory, this paper gives a comprehensive account of the position and role of environmental monitoring in the socialist market economy, and offers views and insights on how the "government function" of environmental monitoring should be understood.
