Similar Documents
20 similar documents found (search time: 15 ms)
1.
A stochastic, three-parameter Weibull frequency-distribution probability generator was tested using theoretical data. It was then applied to replace missing values of hourly atmospheric concentrations of trace gases monitored continuously at three study sites for two years. The results were highly accurate and realistic. The cumulative means and medians calculated by the Weibull method fell between the corresponding values obtained by uniformly substituting missing values with zero or with half of the minimum detection limit of the measurement instrument used. Furthermore, the Weibull method allowed the replacement of as many as 100 missing values on either side of a measured data subset without altering the overall characteristics of the true frequency distribution of the entire data set.
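The replacement scheme described above can be sketched in a few lines. This is a minimal illustration under our own assumptions: the three Weibull parameters (shape, scale, location) are taken as already fitted to the observed part of the series, and the function name and values are hypothetical, not from the study.

```python
import numpy as np

def weibull3_fill(series, shape, scale, location, seed=None):
    """Replace NaN gaps in `series` with random draws from a
    three-parameter Weibull: location + scale * Weibull(shape)."""
    rng = np.random.default_rng(seed)
    out = np.asarray(series, dtype=float).copy()
    gaps = np.isnan(out)
    # draws are >= 0, so every filled value is >= location
    out[gaps] = location + scale * rng.weibull(shape, size=gaps.sum())
    return out

# hourly concentrations with two missing hours (values are made up)
hourly = np.array([1.2, np.nan, 0.8, np.nan, 1.5])
filled = weibull3_fill(hourly, shape=1.5, scale=0.6, location=0.5, seed=0)
```

Observed values pass through unchanged; only the gaps receive stochastic draws, which is what preserves the overall frequency distribution.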

2.
Although networks of environmental monitors are constantly improving through advances in technology and management, instances of missing data still occur. Many methods of imputing values for missing data are available, but they are often difficult to use or produce unsatisfactory results. I-Bot (short for "Imputation Robot") is a context-intensive approach to the imputation of missing data in data sets from networks of environmental monitors. I-Bot is easy to use and routinely produces imputed values that are highly reliable. I-Bot is described and demonstrated using more than 10 years of California data for daily maximum 8-hr ozone, 24-hr PM2.5 (particulate matter with an aerodynamic diameter <2.5 μm), mid-day average surface temperature, and mid-day average wind speed. I-Bot performance is evaluated by imputing values for observed data as if they were missing and then comparing the imputed values with the observed values. In many cases, I-Bot is able to impute values for long periods of missing data, such as a week, a month, a year, or even longer. Qualitative visual methods and standard quantitative metrics demonstrate the effectiveness of the I-Bot methodology.

Implications: Many resources are expended every year to analyze and interpret data sets from networks of environmental monitors. A large fraction of those resources is used to cope with difficulties due to the presence of missing data. The I-Bot method of imputing values for such missing data may help convert incomplete data sets into virtually complete data sets that facilitate the analysis and reliable interpretation of vital environmental data.

3.
Exposure measurements of concentrations that are non-detectable or near the detection limit (DL) are common in environmental research. Proper statistical treatment of non-detects is critical to avoid bias and unnecessary loss of information. In the present work, we present an overview of possible statistical strategies for handling non-detectable values, including deletion, simple substitution, distributional methods, and distribution-based imputation. Simple substitution methods (e.g., substituting 0, DL/2, DL/√2, or DL for the non-detects) are the most commonly applied, even though the EPA Guidance for Data Quality Assessment discourages their use when the percentage of non-detects exceeds 15%. Distribution-based multiple imputation methods, also known as robust or "fill-in" procedures, may produce dependable results even when 50-70% of the observations are non-detects, and can be performed using commonly available statistical software. Any statistical analysis can be conducted on the imputed datasets; the results properly reflect the presence of non-detectable values and support valid statistical inference. We describe the use of distribution-based multiple imputation in a recent investigation of subjects from the Seveso population exposed to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), in which 55.6% of plasma TCDD measurements were non-detects. We suggest that distribution-based multiple imputation be the preferred method for analyzing environmental data when a substantial proportion of the observations are non-detects.
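The simple substitution rules listed above are easy to state in code. A minimal sketch follows (the detection limit and values are hypothetical); note the abstract's caveat that these rules are discouraged when non-detects exceed roughly 15% of the data, in favor of distribution-based multiple imputation.

```python
import numpy as np

def substitute_nondetects(values, dl, rule="dl/2"):
    """Fill non-detects (encoded as NaN) with a constant:
    0, DL/2, DL/sqrt(2), or DL."""
    fills = {"zero": 0.0, "dl/2": dl / 2.0,
             "dl/sqrt2": dl / np.sqrt(2.0), "dl": dl}
    out = np.asarray(values, dtype=float).copy()
    out[np.isnan(out)] = fills[rule]
    return out

dl = 2.0                                        # hypothetical detection limit
x = np.array([np.nan, 3.1, np.nan, 5.6, 2.4])   # NaN marks a non-detect
half = substitute_nondetects(x, dl, "dl/2")
```

The choice of constant bounds the resulting summary statistics: substituting 0 and DL gives the extreme cases, with DL/2 and DL/√2 in between.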

4.
The efficiency of a water treatment or water monitoring program can be checked only if it is accompanied by water analysis procedures that allow meaningful statements about water quality. Meaningful statements require not only high accuracy but also high precision, i.e., good repeatability and reproducibility. Repeatability and reproducibility may be monitored either through regular inter-laboratory trials, without prescribing a particular analytical method, or by applying a standardized method that has undergone thorough checks of its reliability and efficiency. The article presents the structure of the ISO, CEN and DIN standardization work in water analysis.

5.
Polycyclic aromatic hydrocarbons (PAHs) are common contaminants in soil at former industrial sites, and in Sweden some of the most contaminated sites are being remediated. Generic guideline values for soil use after so-called successful remediation of PAH-contaminated soil are based on the 16 EPA priority pollutants, which constitute only a small part of the complex cocktail of toxicants in many contaminated soils. The aim of the study was to elucidate whether the actual toxicological risks of soil samples from successful remediation projects could be reflected by chemical determination of these PAHs. We compared chemical analysis (GC-MS) and bioassay analysis (H4IIE-luc) of a number of remediated PAH-contaminated soils. The H4IIE-luc bioassay is an aryl hydrocarbon (Ah) receptor-based assay that detects compounds activating the Ah receptor, one important mechanism of PAH toxicity. The comparison showed that the bioassay-determined toxicity in the remediated soil samples could be explained only to a minor extent by the concentrations of the 16 priority PAHs. The current risk assessment method for PAH-contaminated soil used in Sweden and other countries, based on chemical analysis of selected PAHs, misses toxicologically relevant PAHs and other similar substances. It is therefore reasonable to include bioassays in risk assessment and in the classification of remediated PAH-contaminated soils. This could minimise environmental and human health risks and enable greater safety in the subsequent reuse of remediated soils.

6.
Soil pollution data scatter strongly even at small spatial scales; sampling of composite samples is therefore recommended for pollution assessment. Different statistical methods are available to provide information about the accuracy of the sampling process. Autocorrelation and variogram analysis can be applied to investigate spatial relationships. Analysis of variance is a useful method for homogeneity testing. The main source of the total measurement uncertainty is the uncertainty arising from sampling. The sample mass required for analysis can also be estimated using an analysis of variance. The number of increments to be taken for a composite sample can be estimated by means of simple statistical formulae. Analytical results of composite samples obtained from different procedures for fusing increments can be compared by multiple mean comparison. The applicability of these statistical methods and their advantages are demonstrated in a case study investigating metals in soil at a very small spatial scale. The paper describes important statistical tools for the quantitative assessment of the sampling process. Detailed results clearly depend on the purpose of sampling, the spatial scale of the object under investigation and the specific case study, and have to be determined for each particular case.

7.
An iterative regression procedure is presented to estimate missing air pollution measurements when the data are measured at two or more sampling stations in the same vicinity. The procedure utilizes the measurements taken at other stations, on neighboring days, and of other pollutants.

The procedure is applied to a set of Philadelphia pollution data with five to seventeen per cent of the observations missing. The method is tested by comparing the known observed pollutant values with their estimates given by the procedure. Correlations between the observations and their estimates are uniformly high, ranging from 0.87 to 0.91, and compare favorably with those for estimates given by simple linear interpolation. The magnitude of the correlations suggests that estimates given by this iterative regression procedure may be used where missing observations occur without fear of undesirable effects on subsequent work. This procedure may therefore be a valuable tool for handling the problem of missing observations in air pollution data.
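The iterative regression idea can be sketched as follows for a days-by-stations matrix. This is a simplified reconstruction under our own assumptions (plain ordinary least squares on co-located stations, column-mean initialization, a fixed iteration count), not the exact Philadelphia procedure, which also used neighboring days and other pollutants.

```python
import numpy as np

def iterative_regression_impute(X, n_iter=20):
    """Fill NaN entries of a days-by-stations matrix X by iterated
    regression: initialize each gap with its station (column) mean,
    then repeatedly regress each station on the other stations via
    OLS and refresh the imputed entries with the fitted values."""
    X = np.asarray(X, dtype=float).copy()
    miss = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    for j in range(X.shape[1]):
        X[miss[:, j], j] = col_means[j]          # crude starting values
    n = X.shape[0]
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            others = np.delete(X, j, axis=1)
            A = np.column_stack([np.ones(n), others])  # intercept + predictors
            beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
            X[miss[:, j], j] = A[miss[:, j]] @ beta    # update only the gaps
    return X

# toy data: station B tracks station A exactly; one day at B is missing
days = np.array([[1.0, 2.0],
                 [2.0, 4.0],
                 [3.0, np.nan],
                 [4.0, 8.0],
                 [5.0, 10.0]])
filled = iterative_regression_impute(days)
```

With perfectly correlated stations the gap is recovered exactly; with real data the quality of the fill depends on the inter-station correlations, which is why the reported 0.87-0.91 correlations matter.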

8.
Compositional analysis consists of a group of techniques used to handle closed (compositional) data, i.e., multivariate data summing to a fixed quantity (proportions, percentages). It is based on the analysis of the relations among variables and the use of logarithmic transformations. It has been claimed that this group of techniques should be used to analyse profiles of pollutant sources because the profiles themselves are proportions. We show in this paper that, for the exploratory analysis of these data, a good strategy is to combine analyses done with and without transformation, because they give different and complementary insights into the structure of the data. We discuss in particular the study of processes such as the mixing of pollutants produced by different sources and the exponential decay of concentrations with distance from the source found in many studies. The clr transformation is also appropriate for studying variables with small proportions, which remain concealed by the abundant variables when analysed without transformation. We present simulations to illustrate these ideas and apply the techniques to two data sets of PCDD/F content in moss tissues.
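The clr (centred log-ratio) transformation mentioned above has a compact definition: each part is divided by the geometric mean of the composition and the ratio is logged. A minimal sketch, with a made-up four-part source profile:

```python
import numpy as np

def clr(parts):
    """Centred log-ratio transform: log of each part divided by the
    geometric mean, i.e. log(x) minus the mean of log(x)."""
    x = np.asarray(parts, dtype=float)
    logx = np.log(x)
    return logx - logx.mean()

profile = np.array([0.70, 0.20, 0.08, 0.02])  # hypothetical source profile
z = clr(profile)
```

Two properties make clr useful here: the transformed vector sums to zero, and the result is invariant to the closure constant (a profile expressed in percentages gives the same clr values as one in proportions), which is what lets the small, otherwise concealed components be analysed on an equal footing.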

9.
A model which quantifies the relationship between the monthly time series for CO emissions, the monthly time series in ambient CO concentration, and meteorologically driven dispersion was developed. Fifteen cities representing a wide range of geographical and climatic conditions were selected. An eight-year time series (1984–1991 inclusive) of monthly averaged data were examined in each city. A new method of handling missing ambient concentration values which is designed to calculate city-wide average concentrations that follow the trend seen at individual monitor sites is presented. This method is general and can be used in other applications involving missing data. The model uses emissions estimates along with two meteorological variables (wind speed and mixing height) to estimate monthly averages of ambient air pollution concentrations. The model is shown to have a wide range of applicability; it works equally well for a wide range of cities that have very different temporal CO distributions. The model is suited for assessing long-term trends in ambient air pollutants and can also be used for estimating seasonal variations in concentration, estimation of trends in emissions, and for filling in gaps in the ambient concentration record.

10.
The assessment of the deposition of both wet (rain and cloud) and dry sedimenting particles is a prerequisite for estimating element fluxes in ecosystem research. Many nations and institutions operate deposition networks using different types of sampler. However, these samplers have rarely been characterized with respect to their sink properties. Major errors in assessing bulk deposition can result from poor sampling properties and defective sampling strategies. Relevant properties are: sampler geometry and material, in particular the shape of the rim; sink properties for gases and aerosols; and microbial transformations of the collected samples. An adequate number of replicates allows the identification of samples which are contaminated, in particular by bird droppings. The paper discusses physical and chemical properties of the samplers themselves. The dependence of measurement accuracy on the number of replicates and the sampling area exposed is discussed. Recommendations are given for sampling strategies, and for making corrections and substitution of missing data.

11.
The use of munitions constituents (MCs) at military installations can produce soil and groundwater contamination that requires periodic monitoring even after training or manufacturing activities have ceased. Traditional groundwater monitoring methods require large volumes of aqueous samples (e.g., 2-4 L) to be shipped under chain of custody to fixed laboratories for analysis. The samples must also be packed on ice and shielded from light to minimize degradation during transport and storage. The laboratory's turn-around time for sample analysis and reporting can be as long as 45 days. This process hinders the timely reporting of data to customers, yields data that are not necessarily representative of current site conditions owing to the lag between sample collection and reporting, and incurs significant shipping costs.

The current work compares a field-portable gas chromatograph-mass spectrometer (GC-MS) for on-site analysis of MCs with traditional laboratory-based analysis using high-performance liquid chromatography with UV absorption detection. The field method provides near real-time (within ∼1 h of sampling) concentrations of MCs in groundwater samples. Mass spectrometry provides reliable confirmation of MCs and a means to identify unknown compounds that are potential false positives for methods with UV and other non-selective detectors.

12.
Goal, Scope, Background. Lake Skadar is the largest lake on the Balkan Peninsula, located on the Montenegro-Albania border. The unique features of the lake and its wide range of endemic, rare or endangered plant and animal species have led to the classification of Skadar as a wetland site of international significance. Despite its importance, the lake is influenced by inflowing waters from the river Morača and other regional rivers contaminated by industrial, municipal and agricultural activities in the area. The lake has therefore been the subject of various physical, chemical, biological and toxicological examinations. However, community-level analyses are most relevant for assessing the effect of stressors on aquatic ecosystems. In the present study, bacterial community structure was compared among differently polluted sites of the lake by a genetic fingerprinting technique. Methods. Water and sediment samples were collected from five differently polluted sampling sites on Lake Skadar in the spring and autumn of the same year. The bacterial community structure in the samples was characterized and compared by temporal temperature gradient gel electrophoresis (TTGE) analysis of PCR-amplified bacterial 16S rRNA genes. Results and Discussion. The TTGE analysis produced many distinguishable and reproducible band patterns, allowing reliable comparison of bacterial communities among sampling sites. Three of the selected locations showed no pollution-related degradation detectable by our method, as the bacterial community structure in their sediment samples was similar. In contrast, significant shifts in bacterial community structure were found at the mouth of the river Morača and at Plavnica. Since these results coincide with some of the bioassays and chemical analyses performed previously, the changes in bacterial community structure are attributed to anthropogenic pollution of the lake ecosystem by the waters of the river Morača and the stream Plavnica. Conclusion. TTGE has proven to be an efficient and reliable method for monitoring bacterial dynamics and community shifts in aquatic environments, especially in sediments. Among environmental quality assessments, the use of TTGE analysis of bacterial communities is strongly recommended, particularly as an initial investigation. However, any conclusion on the state of the environment should combine the TTGE results with other biological, chemical and hydrological data. Recommendation and Outlook. Since prokaryotes are a crucial group of organisms in the biosphere, ecosystem function studies are largely based on bacterial communities. Bacterial community structure analysis should therefore be part of an integrated weight-of-evidence approach to pollution assessment. In the Triad approach, consisting of chemical analyses, bioassays and field community studies, TTGE bacterial community structure analysis belongs in the latter Triad leg. In comparison with other community studies based on various biotic indices, TTGE bacterial community analysis has proven to be sensitive, reliable and less time-consuming.

13.
In environmental monitoring, variables with analytically non-detected values are commonly encountered. For the statistical evaluation of these data, most of the methods with less biased performance require specialized computer programs. In this paper, a statistical method based on the median semi-variance (SemiV) is proposed to estimate position and spread statistics in a dataset with single left-censoring. The performances of the SemiV method and 12 other statistical methods are evaluated using real and complete datasets. The performances of all the methods are influenced by the percentage of censored data. In general, the simple substitution and deletion methods showed biased performance, with the exception of the L/2, Inter and L/√2 methods, which can be used with caution under specific conditions. The SemiV method and the other parametric methods showed similar performances and were less biased than the remaining methods. The SemiV method is a simple and accurate procedure that can be used for the analysis of datasets with less than 50% left-censored data.

14.
A method for determining atrazine in soil extracts was evaluated by flow injection analysis with spectrophotometric detection. The method is based on the reaction of atrazine with pyridine in an acid medium followed by the reaction with NaOH and sulfanilic acid. Several analytical conditions were previously studied and optimized. Under the best conditions of analysis, the limits of detection and quantification were 0.15 and 0.45 mg L−1, respectively, for a linear response between 0.50 and 2.50 mg L−1, and a sampling throughput of 21 determinations per hour. Using the standard addition method, a maximum relative standard deviation of 17% and recovery values between 80 and 100% were observed for three extracts from soil samples with different composition. The proposed method is simple, low-cost and easy to use, and can be employed for studies involving atrazine in soil samples or for screening of atrazine in soils.

15.
We present a new computed tomography method, the low third derivative (LTD) method, that is particularly suited for reconstructing the spatial distribution of gas concentrations from path-integral data for a small number of optical paths. The method finds a spatial distribution of gas concentrations that (1) has path integrals that agree with measured path integrals, and (2) has a low third spatial derivative in each direction, at every point. The trade-off between (1) and (2) is controlled by an adjustable parameter, which can be set based on analysis of the path-integral data. The method produces a set of linear equations, which can be solved with a single matrix multiplication if the constraint that all concentrations must be positive is ignored; the method is therefore extremely rapid. Analysis of experimental data from thousands of concentration distributions shows that the method works nearly as well as smooth basis function minimization (the best method previously available), yet is about 100 times faster.
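Ignoring the positivity constraint, the LTD reconstruction reduces to a single regularized linear solve. The sketch below is a simplified 1-D analogue (the paper treats 2-D distributions with a third-derivative penalty in each direction); the path-integral matrix, grid size and tuning parameter are illustrative, not from the paper.

```python
import numpy as np

def ltd_reconstruct(A, b, lam):
    """Solve min ||A c - b||^2 + lam * ||D3 c||^2 for concentrations c
    on a 1-D grid, where A maps concentrations to path integrals and
    D3 is the discrete third-derivative operator. A single linear
    solve, with the positivity constraint ignored (the fast variant)."""
    n = A.shape[1]
    D3 = np.zeros((n - 3, n))
    for i in range(n - 3):
        D3[i, i:i + 4] = [-1.0, 3.0, -3.0, 1.0]  # finite-difference stencil
    return np.linalg.solve(A.T @ A + lam * D3.T @ D3, A.T @ b)

# toy check: with A = identity and a quadratic truth (third derivative
# exactly zero), the penalty costs nothing and the data are reproduced
A = np.eye(6)
b = np.array([0.0, 0.1, 0.4, 0.9, 1.6, 2.5])
c = ltd_reconstruct(A, b, lam=1e-3)
```

The adjustable parameter `lam` implements the trade-off the abstract describes: larger values favor smoothness (low third derivative) over exact agreement with the measured path integrals.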

16.
This paper describes a novel statistical approach to deriving ecologically relevant sediment quality guidelines (SQGs) from field data using a nonparametric empirical Bayesian method (NEBM). We made use of the Norwegian Oil Industry Association database and extracted concurrently obtained data on species density and contaminant levels in sediment samples collected between 1996 and 2001. In brief, effect concentrations (ECs) of each installation (i.e., oil platform) at a given reduction in species density were first derived by fitting a logistic-type regression function to the relationship between species density and the corresponding concentration of a chemical of concern. The estimated ECs were then improved by the NEBM, which incorporated information from the other installations. The distribution of these improved ECs from all installations was determined nonparametrically by the kernel method, and then used to determine the hazardous concentration (HC), which can be directly linked to the species loss (or the species being protected) in the sediment. The method also enables an accurate estimation of the lower confidence limit of the HC, even when the number of observations is small. To illustrate the effectiveness of this technique, barium, cadmium, chromium, copper, mercury, lead, total hydrocarbon content, and zinc were chosen as example contaminants. This approach can generate ecologically sound SQGs for environmental risk assessment and cost-effectiveness analysis in sediment remediation or mud disposal projects, since sediment quality is closely linked to species density.

17.
Adair BM, Cobb GP. Chemosphere 1999, 38(12): 2951–2958
Concentrations of mercury in biological samples collected for environmental studies are often less than 0.1 μg/g. Low mercury concentrations and small organ sizes in many wildlife species (approximately 0.1 g) increase the difficulty of mercury determination at environmentally relevant concentrations. We have developed a digestion technique to extract mercury from small (0.1 g) biological samples at these relevant concentrations. Mean recoveries (± standard error) from validation trials of mercury-fortified tissue samples, using cold vapor atomic absorption spectroscopy for analysis, ranged from 102 ± 4.3% (2.5 μg/L, n = 15) to 108 ± 1.4% (25 μg/L, n = 15). Recoveries of inorganic mercury were 99 ± 5% (n = 19) for quality assurance samples analyzed during environmental evaluations conducted over a 24-month period. This technique can be used to determine total mercury concentrations of 60 ng Hg/g sample. Samples can be analyzed in standard laboratories in a short time, at minimal cost. The technique is versatile and can be used to determine mercury concentrations in several different matrices, limiting the time and expense of method development and validation.

18.
To define the soil properties of a given area or country, including the level of pollution, soil survey and inventory programs are essential tools. Data transformations enable the expression of the original data on a new scale more suitable for data analysis. In computer-aided interactive analysis of large data files of soil characteristics containing outliers, the diagnostic plots of exploratory data analysis (EDA) often show that the sample distribution is systematically skewed, or they reject sample homogeneity. Under such circumstances the original data should be transformed. The Box-Cox transformation improves sample symmetry and stabilizes spread. A logarithmic plot of the profile likelihood function enables the optimum transformation parameter to be found. Here, the proposed procedure for data transformation in univariate data analysis is illustrated on a determination of cadmium content in the plough zone of agricultural soils. A typical soil pollution survey concerns the determination of the elements Be (16,544 values available), Cd (40,317 values), Co (22,176 values), Cr (40,318 values), Hg (32,344 values), Ni (34,989 values), Pb (40,344 values), V (20,373 values) and Zn (36,123 values) in large samples.
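The Box-Cox step can be illustrated with a grid search over the profile log-likelihood. This is a minimal sketch on simulated lognormal "concentrations" (an assumption for illustration, not the survey data); for lognormal data the optimum λ should land near 0, i.e. the log transform.

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform: (x**lam - 1)/lam, with the log limit at lam = 0."""
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

def profile_loglik(x, lam):
    """Profile log-likelihood of lambda under a normal model for the
    transformed values (additive constants dropped)."""
    y = boxcox(x, lam)
    return -len(x) / 2.0 * np.log(y.var()) + (lam - 1.0) * np.log(x).sum()

# right-skewed sample, as is typical for trace-element concentrations
x = np.random.default_rng(1).lognormal(mean=0.0, sigma=0.5, size=500)
grid = np.arange(-20, 21) / 10.0          # lambda from -2.0 to 2.0, step 0.1
lam_hat = grid[np.argmax([profile_loglik(x, l) for l in grid])]
```

Plotting `profile_loglik` against the grid (on the log scale the abstract mentions) shows a single peak; the maximizer is the optimum transformation parameter.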

19.
Monte Carlo simulations were conducted on a set of flux chamber measurements at a landfill to estimate how the number of flux chamber samples and the size of the study area affect the accuracy of the measured emission rate. The spatial variability of flux was addressed by utilizing an existing flux chamber data set that is among the densest sampling arrays published to date for a landfill. At a probability of 95%, the Monte Carlo simulations indicated that achieving an accuracy within 10% with the flux chamber method is highly unlikely. An accuracy within 20% was achieved for small areas of less than about 0.2 hectares using 220 flux chamber measurements, but achieving this level of accuracy for larger area emission sources of similar or greater variability is highly unlikely. An accuracy within 30% was achieved up to the Full Area of about 0.4 hectares if more than approximately 120 samples were obtained. Even for an accuracy within 50%, at least 40 flux chamber measurements were needed for the Full Area of about 0.4 hectares. Available methods of estimating the required number of samples were compared with the Monte Carlo simulation results. The simulations indicate that, in general, more samples are required than determined by an existing statistical method that is a function of the mean and standard deviation of the population. Specifying the number of samples based on a regulatory method results in very poor accuracy. A modification to the statistical method for estimating the number of samples, or for estimating the accuracy for a given probability and number of samples, is proposed.

Implications: The flux chamber method is the most widely used method of measuring fugitive emission rates from area sources. However, extrapolation of a set of individual flux chamber samples to a larger area results in area flux measurement values of unknown accuracy. Quantification of the accuracy of the extrapolation of a set of flux chamber measurements would be beneficial for understanding the confidence that can be placed on the measurement results. Guidance as to the appropriate number of flux chamber measurements to achieve a desired level of accuracy would benefit flux chamber method practitioners.
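The "existing statistical method" the abstract compares against, a function of the population mean and standard deviation, is typically of the following form. The notation, the z-value and the ceiling convention are our assumptions for illustration; the abstract's point is that its Monte Carlo results generally call for more samples than this formula gives.

```python
import math

def n_samples(mean, sd, rel_error, z=1.96):
    """Classical estimate of the number of independent samples needed
    so the sample mean lies within a fractional `rel_error` of the
    true mean at the confidence level implied by z (1.96 ~ 95%)."""
    return math.ceil((z * sd / (rel_error * mean)) ** 2)

# e.g. flux mean 10, standard deviation 5 (same units, made-up values),
# targeting +/-20% accuracy at 95% confidence
n = n_samples(10.0, 5.0, 0.20)
```

For these illustrative numbers the formula returns 25 samples; the simulations in the abstract suggest treating such figures as lower bounds when spatial variability is high.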


20.
Ranking of aquatic toxicity of esters modelled by QSAR
