Similar Articles
1.

The efficiency of four sample processing methods was tested with eight different types of soils representing the major proportion of cultivated soils. The principle of the sampling constant was applied to characterize the efficiency of the procedures and to test the well-mixed status of the prepared soil. The test material was 14C-labeled atrazine, which made it possible to keep the random error of the analyses at or below about 1%. Adding water to the soil proved to be the most efficient and most generally applicable procedure, resulting in about 6% relative sample processing uncertainty for 20 g test portions. The expected error is inversely proportional to the mass of the test portion. Smashing and manual mixing of the soil resulted in about four times higher uncertainty than mixing with water. Grinding of soil is applicable to dry soils only, but the test procedure applied was not suitable for estimating a typical uncertainty of processing dry soil samples. Adding dry ice did not improve the efficiency of sample processing.
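As a rough illustration of the sampling-constant idea used above (not the authors' own calculation), the sketch below assumes the common Ingamells relation Ks = m·R², where R is the relative sampling uncertainty in percent for a test portion of mass m in grams; the 6% at 20 g figure is taken from the abstract, while the other masses are purely illustrative.

    # Sketch: Ingamells' sampling constant, assuming the usual relation Ks = m * R^2,
    # where R is the relative sampling uncertainty (%) for a test portion of mass m (g).
    # The 6 % at 20 g figure comes from the abstract; the other masses are illustrative.

    def sampling_constant(mass_g: float, rel_uncertainty_pct: float) -> float:
        """Mass (g) that would give a 1 % relative sampling uncertainty."""
        return mass_g * rel_uncertainty_pct ** 2

    def rel_uncertainty(mass_g: float, ks: float) -> float:
        """Predicted relative sampling uncertainty (%) for a given test-portion mass."""
        return (ks / mass_g) ** 0.5

    ks = sampling_constant(20.0, 6.0)   # ~720 g
    for m in (5, 20, 50, 100):
        print(f"{m:>4} g test portion -> ~{rel_uncertainty(m, ks):.1f} % relative uncertainty")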

2.
Abstract

Confidence interval construction for central tendency is a problem of practical consequence for those who must analyze air contaminant data. Determination of compliance with relevant ambient air quality criteria and assessment of associated health risks depend upon quantifying the uncertainty of estimated mean pollutant concentrations. The bootstrap is a resampling technique that has been steadily gaining popularity and acceptance during the past several years. A potentially powerful application of the bootstrap is the construction of confidence intervals for any parameter of any underlying distribution. Properties of bootstrap confidence intervals were determined for samples generated from lognormal, gamma, and Weibull distributions. Bootstrap t intervals, while having smaller coverage errors than Student's t or other bootstrap methods, under-cover for small samples from skewed distributions. Therefore, we caution against using the bootstrap to construct confidence intervals for the mean without first considering the effects of sample size and skew. When sample sizes are small, one might consider using the median as an estimate of central tendency. Confidence intervals for the median are easy to construct and do not under-cover. Data collected by the Northeast States for Coordinated Air Use Management (NESCAUM) are used to illustrate application of the methods discussed.
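A minimal sketch of the percentile bootstrap described above, applied to the mean and the median of a skewed sample; the lognormal test data, sample size, and number of replicates are arbitrary choices, not NESCAUM data or the authors' code.

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.lognormal(mean=1.0, sigma=1.0, size=30)   # skewed sample, illustrative only

    def percentile_ci(x, stat=np.mean, n_boot=2000, alpha=0.05, rng=rng):
        """Percentile bootstrap confidence interval for an arbitrary statistic."""
        boot = np.array([stat(rng.choice(x, size=x.size, replace=True))
                         for _ in range(n_boot)])
        return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

    print("95 % CI for the mean  :", percentile_ci(data, np.mean))
    print("95 % CI for the median:", percentile_ci(data, np.median))

As the abstract cautions, percentile-type intervals for the mean of a small, skewed sample tend to under-cover, which is why the median is suggested as a more robust measure of central tendency.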

3.
Abstract

The quality of stationary source emission factors is typically described using data quality ratings, which provide no quantification of the precision of the emission factor for an average source, nor of the variability from one source to another within a category. Variability refers to actual differences caused by differences in feedstock composition, design, maintenance, and operation. Uncertainty refers to lack of knowledge regarding the true emissions. A general methodology for the quantification of variability and uncertainty in emission factors, activity factors, and emission inventories (EIs) is described, featuring the use of bootstrap simulation and related techniques. The methodology is demonstrated via a case study for a selected example of NOx emissions from coal-fired power plants. A prototype software tool was developed to implement the methodology. The range of interunit variability in selected activity and emission factors was shown to be as much as a factor of 4, and the range of uncertainty in mean emissions is shown to depend on the interunit variability and sample size. The uncertainty in the total inventory of −16% to +19% was attributed primarily to one technology group, suggesting priorities for collecting data and improving the inventory. The implications for decision-making are discussed.

4.
Point velocity measurements conducted by traversing a Pitot tube across the cross section of a flow conduit continue to be the standard practice for evaluating the accuracy of continuous flow-monitoring devices. Such velocity traverses were conducted in the exhaust duct of a reduced-scale analog of a stationary source, and mean flow velocity was computed using several common integration techniques. Sources of random and systematic measurement uncertainty were identified and applied in the uncertainty analysis. When applicable, the minimum requirements of the standard test methods were used to estimate measurement uncertainty due to random sources. Estimates of the systematic measurement uncertainty due to discretized measurements of the asymmetric flow field were determined by simulating point velocity traverse measurements in a flow distribution generated using computational fluid dynamics. For the evaluated flow system, estimates of relative expanded uncertainty for the mean flow velocity ranged from ±1.4% to ±9.3% and depended on the number of measurement locations and the method of integration.
Implications: Accurate flow measurements in smokestacks are critical for quantifying the levels of greenhouse gas emissions from fossil-fuel-burning power plants, the largest emitters of carbon dioxide. A systematic uncertainty analysis is necessary to evaluate the accuracy of these measurements. This study demonstrates such an analysis and its application to identify specific measurement components and procedures needing focused attention to improve the accuracy of mean flow velocity measurements in smokestacks.
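A minimal sketch of the kind of calculation the abstract describes: averaging point velocities from a traverse and combining assumed random and systematic standard uncertainties into an expanded uncertainty with a coverage factor k = 2. The velocities and uncertainty components below are invented placeholders, not the study's data or its CFD-based estimates.

    import math

    # Hypothetical point velocities (m/s) from a Pitot traverse; equal-area weighting assumed.
    v = [12.1, 12.6, 13.0, 12.8, 12.3, 11.9, 12.4, 12.7]
    v_mean = sum(v) / len(v)

    # Assumed relative standard uncertainty components, purely illustrative.
    u_random     = 0.006   # repeatability of the point measurements
    u_systematic = 0.020   # e.g. discretization of an asymmetric flow field

    u_combined = math.sqrt(u_random**2 + u_systematic**2)   # root-sum-square combination
    U_expanded = 2 * u_combined                              # coverage factor k = 2 (~95 %)

    print(f"mean velocity = {v_mean:.2f} m/s  +/- {U_expanded * 100:.1f} % (k=2)")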


6.
Abstract

Variability refers to real differences in emissions among multiple emission sources at any given time or over time for any individual emission source. Variability in emissions can be attributed to variation in fuel or feedstock composition, ambient temperature, design, maintenance, or operation. Uncertainty refers to lack of knowledge regarding the true value of emissions. Sources of uncertainty include small sample sizes, bias or imprecision in measurements, nonrepresentativeness, or lack of data. Quantitative methods for characterizing both variability and uncertainty are demonstrated and applied to case studies of emission factors for lawn and garden (L&G) equipment engines. Variability was quantified using empirical and parametric distributions. Bootstrap simulation was used to characterize confidence intervals for the fitted distributions. The 95% confidence intervals for the mean grams per brake horsepower-hour (g/hp-hr) emission factors for two-stroke engine total hydrocarbon (THC) and NOx emissions were from −30 to +41% and from −45 to +75%, respectively. The confidence intervals for four-stroke engines were from −33 to +46% for THCs and from −27 to +35% for NOx. These quantitative measures of uncertainty convey information regarding the quality of the emission factors and serve as a basis for calculation of uncertainty in emission inventories (EIs).

7.

In order to estimate the level of uncertainty arising from sampling, 54 samples (primary and duplicate) of the moss species Pleurozium schreberi (Brid.) Mitt. were collected within three forested areas (Wierna Rzeka, Piaski, Posłowice Range) in the Holy Cross Mountains (south-central Poland). During the fieldwork, each primary sample composed of 8 to 10 increments (subsamples) was taken over an area of 10 m2, whereas duplicate samples were collected in the same way at a distance of 1–2 m. Subsequently, all samples were rinsed three times with deionized water, dried, milled, and digested (8 mL HNO3 (1:1) + 1 mL 30 % H2O2) in a closed Multiwave 3000 microwave system. The prepared solutions were analyzed twice for Cu, Fe, Mn, and Zn using FAAS and GFAAS techniques. All datasets were checked for normality, and for the normally distributed elements (Cu from Piaski; Zn from Posłowice; Fe and Zn from Wierna Rzeka) the sampling uncertainty was computed with (i) classical ANOVA, (ii) classical RANOVA, (iii) modified RANOVA, and (iv) range statistics. For the remaining elements, the sampling uncertainty was calculated with traditional and/or modified RANOVA (if the proportion of outliers did not exceed 10 %) or with classical ANOVA after Box-Cox transformation (if the proportion of outliers exceeded 10 %). The highest concentrations of all elements were found in the moss samples from Piaski, and the sampling uncertainty calculated with the different statistical methods ranged from 4.1 to 22 %.

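The duplicate design described in entry 7 (a primary and a duplicate sample per site, each analysed twice) is commonly evaluated with a balanced nested ANOVA. The sketch below shows a classical-ANOVA version with invented concentrations; the robust (RANOVA), modified RANOVA, range-statistics, and Box-Cox variants used in the paper are not reproduced here.

    import statistics as st

    # Hypothetical duplicate design: for each site, a primary (S1) and duplicate (S2) sample,
    # each analysed twice (A1, A2).  Values are invented Zn concentrations (mg/kg).
    data = {
        "site1": {"S1": (41.0, 42.5), "S2": (38.0, 39.2)},
        "site2": {"S1": (55.1, 54.0), "S2": (58.3, 57.6)},
        "site3": {"S1": (47.2, 48.1), "S2": (44.9, 45.5)},
    }

    # Analytical variance: from within-sample analysis duplicates.
    pairs = [a for site in data.values() for a in site.values()]
    var_analysis = st.mean((a1 - a2) ** 2 / 2 for a1, a2 in pairs)

    # Sampling variance: from between-sample differences within each site,
    # after removing the analytical contribution carried by the sample means.
    sample_means = [{s: st.mean(v) for s, v in site.items()} for site in data.values()]
    ms_between = st.mean((m["S1"] - m["S2"]) ** 2 / 2 for m in sample_means)
    var_sampling = max(ms_between - var_analysis / 2, 0.0)

    grand_mean = st.mean(a for pair in pairs for a in pair)
    print(f"relative sampling uncertainty ~ {100 * var_sampling ** 0.5 / grand_mean:.1f} %")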

8.
ABSTRACT

Originally constructed to develop gaseous emission factors for heavy-duty diesel trucks, the U.S. Environmental Protection Agency's (EPA) On-Road Diesel Emissions Characterization Facility has been modified to incorporate particle measurement instrumentation. An electrical low-pressure impactor designed to continuously measure and record size distribution data was used to monitor the particle size distribution of heavy-duty diesel truck exhaust. For this study, which involved a high-mileage (900,000 mi) truck running at full load, samples were collected by two different methods. One sample was obtained directly from the exhaust stack using an adaptation of the University of Minnesota's air-ejector-based mini-dilution sampler. The second sample was pulled from the plume just above the enclosed trailer, at a point ~11 m from the exhaust discharge. Typical dilution ratios of about 300:1 were obtained for both the dilution and plume sampling systems. Hundreds of particle size distributions were obtained at each sampling location. These were compared both selectively and cumulatively to evaluate the performance of the dilution system in simulating real-world exhaust plumes. The data show that, in its current residence-time configuration, the dilution system imposes a statistically significant bias toward smaller particles, with substantially more nanoparticles being collected than from the plume sample.

9.
Abstract

A fuel-based methodology for calculating motor vehicle emission inventories is presented. In the fuel-based method, emission factors are normalized to fuel consumption and expressed as grams of pollutant emitted per gallon of gasoline burned. Fleet-average emission factors are calculated from the measured on-road emissions of a large, random sample of vehicles. Gasoline use is known at the state level from sales tax data, and may be disaggregated to individual air basins. A fuel-based motor vehicle CO inventory was calculated for the South Coast Air Basin in California for summer 1991. Emission factors were calculated from remote sensing measurements of more than 70,000 in-use vehicles. Stabilized exhaust emissions of CO were estimated to be 4400 tons/day for cars and 1500 tons/day for light-duty and medium-duty trucks, with an estimated uncertainty of ±20% for cars and ±30% for trucks. Total motor vehicle CO emissions, including incremental start emissions and emissions from heavy-duty vehicles, were estimated to be 7900 tons/day. Fuel-based inventory estimates were greater than those of California's MVEI 7F model by factors of 2.2 for cars and 2.6 for trucks. A draft version of California's MVEI 7G model, which includes increased contributions from high-emitting vehicles and off-cycle emissions, predicted CO emissions that closely matched the fuel-based inventory. An analysis of CO mass emissions as a function of vehicle age revealed that cars and trucks that were ten or more years old were responsible for 58% of stabilized exhaust CO emissions from all cars and trucks.
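The core arithmetic of the fuel-based approach is an emission factor in grams of pollutant per gallon of fuel multiplied by fuel sales; a sketch with hypothetical numbers (not the paper's 1991 South Coast inputs) is shown below.

    G_PER_TON = 907_184.74   # grams per short ton

    def fuel_based_inventory(ef_g_per_gal: float, fuel_gal_per_day: float) -> float:
        """Daily emissions (short tons/day) from a fuel-normalized emission factor."""
        return ef_g_per_gal * fuel_gal_per_day / G_PER_TON

    # Illustrative values only: a fleet-average CO emission factor and basin-wide gasoline use.
    ef_co = 250.0   # g CO per gallon of gasoline (hypothetical)
    fuel  = 16e6    # gallons of gasoline per day (hypothetical)

    print(f"CO inventory ~ {fuel_based_inventory(ef_co, fuel):.0f} tons/day")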

10.
ABSTRACT

Easy-to-use commercial kit-based enzyme-linked immunosorbent assays (ELISAs) were used to detect the neonicotinoids dinotefuran, clothianidin, and imidacloprid in Chinese chives, which are considered a troublesome matrix for chromatographic techniques. Because of the high water solubility of these insecticides, water was used as the extractant. Matrix interference could be largely avoided simply by diluting the sample extracts. Average recoveries of the insecticides from spiked samples were 85–113%, with relative standard deviations of <15%. The insecticide concentrations detected in the spiked samples with the proposed ELISA methods correlated well with those obtained by the reference high-performance liquid chromatography (HPLC) method. The residues determined by the ELISA methods were consistently 1.24 times those found by the HPLC method, attributable to loss of analyte during sample clean-up for the HPLC analyses. These results show that the ELISA methods can be applied easily to pesticide residue analysis in troublesome matrices such as Chinese chives.

11.
Abstract

Most of the synthetic gypsum generated from wet flue gas desulfurization (FGD) scrubbers is currently used for wallboard production. Because oxidized mercury is readily captured by the wet FGD scrubber, and coal-fired power plants equipped with wet scrubbers wish to benefit from the partial mercury control that these systems provide, some mercury is likely to be bound in the FGD gypsum and wallboard. In this study, the feasibility of identifying mercury species in FGD gypsum and wallboard samples was investigated using a large-sample-size thermal desorption method. Potential candidates among pure mercury standards, including mercuric chloride (HgCl2), mercurous chloride (Hg2Cl2), mercury oxide (HgO), mercury sulfide (HgS), and mercuric sulfate (HgSO4), were analyzed to compare their results with those obtained from FGD gypsum and dry wallboard samples. Although none of the thermal evolution curves obtained from these pure mercury standards exactly matched those of the FGD gypsum and wallboard samples, Hg2Cl2 and HgCl2 were identified as possible candidates. An additional chlorine analysis of the gypsum and wallboard samples indicated that the chlorine concentrations were approximately 2 orders of magnitude higher than the mercury concentrations, suggesting a possible chlorine association with mercury.

12.
This work applied a propagation of uncertainty method to typical total suspended particulate (TSP) sampling apparatus in order to estimate the overall measurement uncertainty. The objectives of this study were to estimate the uncertainty for three TSP samplers, develop an uncertainty budget, and determine the sensitivity of the total uncertainty to environmental parameters. The samplers evaluated were the TAMU High Volume TSP Sampler at a nominal volumetric flow rate of 1.42 m3 min–1 (50 CFM), the TAMU Low Volume TSP Sampler at a nominal volumetric flow rate of 17 L min–1 (0.6 CFM) and the EPA TSP Sampler at the nominal volumetric flow rates of 1.1 and 1.7 m3 min–1 (39 and 60 CFM). Under nominal operating conditions the overall measurement uncertainty was found to vary from 6.1 × 10–6 g m–3 to 18.0 × 10–6 g m–3, which represented an uncertainty of 1.7% to 5.2% of the measurement. Analysis of the uncertainty budget determined that three of the instrument parameters contributed significantly to the overall uncertainty: the uncertainty in the pressure drop measurement across the orifice meter during both calibration and testing and the uncertainty of the airflow standard used during calibration of the orifice meter. Five environmental parameters occurring during field measurements were considered for their effect on overall uncertainty: ambient TSP concentration, volumetric airflow rate, ambient temperature, ambient pressure, and ambient relative humidity. Of these, only ambient TSP concentration and volumetric airflow rate were found to have a strong effect on the overall uncertainty. The technique described in this paper can be applied to other measurement systems and is especially useful where there are no methods available to generate these values empirically.

Implications: This work addresses measurement uncertainty of TSP samplers used in ambient conditions. Estimation of uncertainty in gravimetric measurements is of particular interest, since as ambient particulate matter (PM) concentrations approach regulatory limits, the uncertainty of the measurement is essential in determining the sample size and the probability of type II errors in hypothesis testing. This is an important factor in determining if ambient PM concentrations exceed regulatory limits. The technique described in this paper can be applied to other measurement systems and is especially useful where there are no methods available to generate these values empirically.
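As a generic illustration of the propagation-of-uncertainty approach (not the TAMU or EPA sampler models themselves), the TSP concentration can be written as C = m / (Q·t); for independent inputs of a product/quotient model, the relative combined standard uncertainty is the root sum of squares of the relative input uncertainties. All values below are assumed placeholders.

    import math

    # C = m / (Q * t): filter mass gain m (g), volumetric flow Q (m^3/min), sampling time t (min).
    m, u_m = 0.10, 0.001    # g, assumed standard uncertainty of the gravimetric analysis
    Q, u_Q = 1.42, 0.02     # m^3/min, assumed flow-calibration uncertainty
    t, u_t = 1440.0, 1.0    # min, assumed timing uncertainty

    C = m / (Q * t)

    # First-order (GUM-style) propagation for a product/quotient model:
    rel_u = math.sqrt((u_m / m) ** 2 + (u_Q / Q) ** 2 + (u_t / t) ** 2)

    print(f"C = {C:.2e} g/m^3  +/- {100 * rel_u:.1f} % (combined standard uncertainty)")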

13.
Energy supply utilities release significant amounts of greenhouse gases (GHGs) into the atmosphere. It is essential to accurately estimate GHG emissions, together with their uncertainties, for reducing GHG emissions and mitigating climate change. GHG emissions can be calculated by an activity-based method (i.e., from fuel consumption) and by continuous emission measurement (CEM). In this study, GHG emissions of CO2, CH4, and N2O are estimated for a heat generation utility, which uses bituminous coal as fuel, by applying both the activity-based method and CEM. CO2 emissions by the activity-based method are 12–19% less than those by the CEM, while N2O and CH4 emissions by the activity-based method are two orders of magnitude and 60% less than those by the CEM, respectively. Comparing GHG emissions (as CO2 equivalent) from both methods, total GHG emissions by the activity-based method are 12–27% lower than those by the CEM, as the CO2 and N2O emissions are lower than those by the CEM. Results from the uncertainty estimation show that uncertainties in the GHG emissions by the activity-based method range from 3.4% to about 20%, from 67% to 900%, and from about 70% to about 200% for CO2, N2O, and CH4, respectively, while uncertainties in the GHG emissions by the CEM range from 4% to 4.5%. For the activity-based method, the uncertainty in the Intergovernmental Panel on Climate Change (IPCC) default net calorific value (NCV) is the major contributor to the uncertainty of the CO2 emissions, while the uncertainty in the IPCC default emission factor is the major contributor for CH4 and N2O. For the CEM, the uncertainty in the volumetric flow measurement, especially in the distribution of the volumetric flow rate across the stack, is the major contributor for all GHG emissions, while uncertainties in the concentration measurements contribute little.
Implications: Energy supply utilities contribute a significant portion of global greenhouse gas (GHG) emissions. It is important to accurately estimate GHG emissions, together with their uncertainties, for reducing GHG emissions and mitigating climate change. GHG emissions can be estimated by an activity-based method and by continuous emission measurement (CEM), yet few studies have calculated GHG emissions together with an uncertainty analysis. This study estimates GHG emissions and their uncertainties and identifies the major uncertainty contributors for each method.
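A schematic of the activity-based calculation (IPCC Tier 1 structure, E = fuel × NCV × EF) with a simple root-sum-square combination of relative input uncertainties. The fuel quantity and uncertainty values are hypothetical, and the NCV and emission factor are only of the order of the IPCC 2006 defaults for bituminous coal; check the guidelines before substituting real numbers.

    import math

    # Activity-based estimate: E = fuel * NCV * EF  (IPCC Tier 1 structure)
    fuel   = 50_000.0   # t of bituminous coal per year (hypothetical)
    ncv    = 25.8e-3    # TJ per t (of the order of the IPCC default net calorific value)
    ef_co2 = 94_600.0   # kg CO2 per TJ (of the order of the IPCC default emission factor)

    e_co2 = fuel * ncv * ef_co2 / 1e3    # tonnes CO2 per year

    # Relative standard uncertainties of the inputs (assumed values for illustration).
    u_fuel, u_ncv, u_ef = 0.02, 0.03, 0.05
    u_rel = math.sqrt(u_fuel**2 + u_ncv**2 + u_ef**2)

    print(f"CO2 ~ {e_co2:,.0f} t/yr  +/- {100 * u_rel:.1f} %")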

14.
Abstract

In many countries it is current practice to shred refrigerators and freezers at the end of their useful lives, and the shredder residue is deposited in landfills. During the shredding process a significant fraction of the blowing agent (BA) in the insulation foam may be released into the atmosphere. The objective of this study was to determine the fraction of BA released from foam during shredding by comparing the BA content in the insulation foam of refrigerator units before shredding with the BA content of the shredded foam. All foam samples analyzed were manufactured with trichlorofluoromethane [CFC-11 (CCl3F)] as the BA. The average BA content in the insulation foam from eight U.S. refrigerator units manufactured before 1993 was found to be 14.9% ± 3.3% w/w. Several refrigerator units also identified as manufactured before 1993 were stockpiled and shredded at three shredder facilities, one of which was operated in both wet and dry modes. The selected shredder facilities represent typical American facilities for shredding automobiles, refrigerators, freezers, and other iron-containing waste products. Shredded material was collected and separated on location into four particle size categories: more than 32 mm, 16–32 mm, 8–16 mm, and 0–8 mm. Adjusting for sample purity, it was found that the majority (>81%) of the foam mass was shredded into particles larger than 16 mm. The smallest size fraction of foam (0–8 mm) was found to contain significantly less BA than the larger size categories, showing that up to 68% ± 4% of the BA is released from these fine particles during the shredding process. Because only a minor fraction of the foam is shredded into particles smaller than 8 mm, this has only a minor impact on the calculated total BA release from the shredding process. Comparing the BA content in shredded samples from the three shredder facilities with the measured average BA content of the eight refrigerator units, it was found that on average 24.2% ± 7.5% of the initial BA content is released during the shredding process.
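The headline release figure follows from a simple mass balance comparing BA content of the foam before and after shredding; the sketch below illustrates the arithmetic only, with the shredded-foam content invented, and does not reproduce the study's purity adjustments or facility weighting.

    # Mass-balance sketch: fraction of blowing agent released during shredding,
    # estimated from BA content of foam before and after shredding (w/w).
    ba_intact   = 0.149    # 14.9 % w/w in intact pre-1993 refrigerator foam (from the abstract)
    ba_shredded = 0.113    # hypothetical measured BA content of the shredded foam

    release_fraction = 1 - ba_shredded / ba_intact
    print(f"estimated BA release during shredding ~ {100 * release_fraction:.1f} %")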

15.
ABSTRACT

The effects of the spread of residue concentrations in samples from selected supervised trials, and of the number of trials, on the magnitude and uncertainty of the short-term dietary intakes were studied for both the proposed new procedure (IESTIp) and the procedure currently used by the FAO (Food and Agriculture Organization)/WHO (World Health Organization) Joint Meeting on Pesticide Residues (JMPR) (IESTIc). The residue data for 10 pesticides were obtained from supervised trials conducted on apples and pears. The methods described in Part I were used for the calculation of the uncertainties. The results indicate that the ratio of IESTIp to IESTIc (φIESTI) is directly proportional to the ratio of the estimated maximum residue level (mrl) recommended by the JMPR to the highest residue (HR) observed in supervised trials, and it may have a wide range depending on the particular conditions. The φIESTI becomes greater as the difference between the mrl, or the maximum residue limit (MRL, established by the Codex Alimentarius Commission, CAC), and HR increases, and becomes smaller as the difference between the large portion (LP) and unit mass (U) decreases. The φIESTI ranged between 2 and 5.1 in the 16 cases examined, indicating that the IESTIp calculation method leads to higher intake estimates. The ratio of CVIESTIp to CVIESTIc typically ranged between 0.62 and 1.71. It increased rapidly up to 12 trials; for a larger number of trials, the ratio remained practically constant (1.69–1.71). The processing factor (PF) affects the mrl and HR values equally and therefore has practically no influence on φIESTI. The uncertainty of the estimated median residues depends on the spread and number of values in the residue datasets, which affects the uncertainty of the conversion factor (CF) and, subsequently, the uncertainty of the estimated IESTIp. Residue values obtained from a minimum of nine independent trials are required for the correct calculation of the 95% confidence intervals of the calculated median residues. The uncertainty of the analytical results directly affects the median and HR values and, indirectly, the calculated mrl and the MRL derived from it. Therefore, it should also be considered in the calculation of the combined uncertainty of the conversion factors. For the correct interpretation of the results of dietary exposure calculations, the upper 95% confidence limit of the short-term intake should also be considered; however, this is not the current practice of regulatory agencies or the JMPR.

16.
ABSTRACT

Measurements of residual perchloroethylene (PCE), a dry-cleaning solvent associated with human health effects, were made in dry-cleaned acetate cloth to enable improved characterizations of both occupational and environmental exposure. A limited sample size (25 acetate cloths) was used to explore the extent of inter-dry-cleaner variability in residual PCE and to characterize the effect of the pressing operation on residual PCE. A new method, which uses carbon-disulfide as the direct extracting agent, proved effective in the analysis of residual PCE, with a recovery-efficiency ≈ 75%. Inter-dry-cleaner variability of residual PCE, although marginally statistically significant, was relatively low, showing only a fourfold range compared to a 5-order-of-magnitude range obtained from Kawauchi and Nishiyama1. Pairwise comparison of residual PCE in nonpressed versus pressed acetate samples revealed a statistically significant reduction (p < 0.008), which amounted to a consistent (among dry-cleaners) pressing-related removal efficiency of 75 ± 4%. A preliminary assessment of the source term associated with the pressing operation (mass PCE liberated per kg cloth dry-cleaned, SPCE ≈ 30 mg/kg) indicates a minor contribution to the average ambient air concentrations within dry-cleaning establishments.

17.
Abstract

In the present work, a dispersive micro-solid-phase extraction (D-μ-SPE) method using a magnetic graphene oxide/tert-butylamine (GO/Fe3O4/TBA) nanocomposite as an efficient sorbent was applied to determine 2,4-dichlorophenoxyacetic acid (2,4-D) in water and food samples. Detection was carried out using a high-performance liquid chromatography (HPLC) instrument. Influential D-μ-SPE parameters, such as the sorbent and its amount, the elution solvent and its volume, the adsorption and desorption times, and the pH of the sample solution, were investigated and optimized. Under the optimized conditions, the limit of detection and the limit of quantitation were 0.007 and 0.02 μg/mL, respectively. Recoveries for several real samples were within the range of 88.0–94.0%, with a relative standard deviation (RSD) of less than 7.5%. The proposed method was successfully applied to the quantitative determination of 2,4-D in several vegetable and water samples.

18.
Abstract

In this study, an interval minimax regret programming (IMMRP) method is developed for the planning of municipal solid waste (MSW) management under uncertainty. It improves on the existing interval programming and minimax regret analysis methods by allowing uncertainties presented as both intervals and random variables to be effectively communicated into the optimization process. The IMMRP can account for economic consequences under all possible scenarios without any assumption on their probabilities. The developed method is applied to a case study of long-term MSW management planning under uncertainty. Multiple scenarios associated with different cost and risk levels are analyzed. Reasonable solutions are generated, demonstrating complex tradeoffs among system cost, regret level, and system-failure risk. The method can also facilitate examination of the difference between the cost incurred with the identified strategy and the least cost under ideal conditions. The results can help determine desired plans and policies for waste management under a variety of uncertainties.
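The minimax-regret idea at the core of IMMRP can be illustrated with a tiny deterministic example: the regret of a plan under a scenario is its cost minus the best achievable cost in that scenario, and the chosen plan minimizes the worst-case regret. The cost matrix below is invented and unrelated to the case study, and the interval and stochastic extensions of IMMRP are not shown.

    # cost[plan][scenario]: invented total system costs ($ million) under three waste-generation scenarios
    cost = {
        "expand_landfill":   {"low": 120, "medium": 150, "high": 210},
        "build_incinerator": {"low": 160, "medium": 165, "high": 175},
        "mixed_strategy":    {"low": 140, "medium": 155, "high": 190},
    }

    scenarios = ["low", "medium", "high"]
    best_per_scenario = {s: min(c[s] for c in cost.values()) for s in scenarios}

    # Regret = cost under a scenario minus the best achievable cost in that scenario.
    max_regret = {
        plan: max(c[s] - best_per_scenario[s] for s in scenarios)
        for plan, c in cost.items()
    }

    chosen = min(max_regret, key=max_regret.get)
    print("max regret per plan:", max_regret)
    print("minimax-regret choice:", chosen)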

19.
Abstract

Quantitative methods for characterizing variability and uncertainty were applied to case studies of oxides of nitrogen and total organic carbon emission factors for lean-burn, natural gas-fueled internal combustion engines. Parametric probability distributions were fitted to represent inter-engine variability in specific emission factors. Bootstrap simulation was used to quantify uncertainty in the fitted cumulative distribution function and in the mean emission factor. Some methodological challenges were encountered in analyzing the data. For example, in one instance only five data points were available, with each data point representing a different market share. Therefore, an approach was developed in which parametric distributions were fitted to population-weighted data. The uncertainty in mean emission factors ranges from as little as approximately ±10% to as much as −90% to +180%. The wide range of uncertainty in some emission factors emphasizes the importance of recognizing and accounting for uncertainty in emissions estimates. The skewness in some uncertainty estimates illustrates the importance of using numerical simulation approaches that do not impose restrictive symmetry assumptions on the confidence interval for the mean. In this paper, the quantitative method, the analysis results, and key findings are presented.
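The population-weighting step mentioned above, where only a few engine families with known market shares are available, can be sketched as a share-weighted bootstrap of the mean emission factor; the emission factors and market shares below are hypothetical, and this is not the authors' two-stage procedure.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical engine-family emission factors (g/hp-hr) and their market shares.
    ef    = np.array([2.1, 3.4, 1.8, 5.0, 2.7])
    share = np.array([0.35, 0.25, 0.20, 0.15, 0.05])

    # Population-weighted bootstrap: resample engine families in proportion to market share,
    # then summarize the uncertainty in the share-weighted mean emission factor.
    boot_means = [
        rng.choice(ef, size=ef.size, replace=True, p=share).mean()
        for _ in range(5000)
    ]
    lo, hi = np.quantile(boot_means, [0.025, 0.975])
    point = float(np.dot(ef, share))
    print(f"weighted mean EF = {point:.2f} g/hp-hr, 95 % CI ~ [{lo:.2f}, {hi:.2f}]")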

20.
ABSTRACT

Aerosol water content was determined from relative humidity controlled optical particle counter (ASASP-X) size distribution measurements made during the Southeastern Aerosol and Visibility Study (SEAVS) in the Great Smoky Mountains National Park during summer 1995. Since the scattering response function of the ASASP-X is sensitive to particle refractive index, a technique for calibrating the ASASP-X for any real refractive index was developed. A new iterative process was employed to calculate water mass concentration and wet refractive index as functions of relative humidity. Experimental water mass concentrations were compared to theoretically predicted values assuming only ammonium sulfate compounds were hygroscopic. These comparisons agreed within experimental uncertainty. Estimates of particle hygroscopicity using a rural aerosol model of refractive index as a function of relative humidity demonstrated no significant differences from those made with daily varying refractive index estimates. Although aerosol size parameters were affected by the assumed chemical composition, forming ratios of these parameters nearly canceled these effects.

