Similar Literature
1.
Two trajectory source apportionment methods were tested using an artificially generated data set to determine their ability to detect the known sources. The forward-looking step from the multi-receptor trajectory analysis (MURA) method was added to the conditional probability (CP) method of Ashbaugh et al. [1985. A residence time probability analysis of sulfur concentrations at Grand Canyon National Park. Atmospheric Environment 19(8), 1263–1270] to develop the single-receptor forward CP (SIRA) method. The multi-receptor (MURA) and the SIRA methods were tested with three simulations using artificially generated sources. The ability of the methods to detect the sources was quantified for each simulation. The first simulation showed that the SIRA method is an improvement over the original CP method. The MURA trajectory method proved to be superior at identifying sources for the simulation located in the west and comparable to the SIRA method for the two simulations located in the east.
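Both methods build on the residence-time conditional-probability idea of Ashbaugh et al.: grid the domain, count trajectory endpoints per cell, and ask what fraction of those endpoints belong to trajectories that arrived with high receptor concentrations. The sketch below illustrates that general idea in Python; it is not the authors' code, and the array layout and function name are assumptions for illustration.

```python
import numpy as np

def conditional_probability(endpoints, conc, threshold, lon_edges, lat_edges):
    """Ashbaugh-style conditional probability on a lon/lat grid.

    endpoints : array (n_traj, n_points, 2) of trajectory (lon, lat) endpoints
    conc      : array (n_traj,) receptor concentration at each trajectory's arrival
    threshold : concentration above which a trajectory counts as 'polluted'
    Returns m_ij / n_ij, the fraction of endpoints in each cell belonging
    to high-concentration trajectories.
    """
    lons = endpoints[..., 0].ravel()
    lats = endpoints[..., 1].ravel()
    high = np.repeat(conc > threshold, endpoints.shape[1])

    n, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
    m, _, _ = np.histogram2d(lons[high], lats[high], bins=[lon_edges, lat_edges])
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(n > 0, m / n, np.nan)
```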

2.
Trajectory source apportionment (TSA) methods have been used in many research projects to attempt to identify the sources of pollution. Hybrid Single Particle Lagrangian Integrated Trajectories (HYSPLIT) is a popular model for use in various TSA methods. One of the options in this model is the choice of a trajectory starting height, and very little research is available to guide a user in making this choice. This paper evaluates the effect of starting heights of 10, 50, 100, 250, and 500 m on the accuracy of the Multi-Receptor (MURA) method using artificial sources for three different simulations. It was found that using ensembles of trajectories in the MURA method appears to average out most of the biases arising from different trajectory starting heights, at least up to the 500 m tested.

3.
A number of studies have tried to validate various trajectory statistical methods (TSMs), mostly through subjective comparison with known sources. Here, in a more comprehensive and quantitative approach, three trajectory statistical methods (the potential source contribution function (PSCF), the concentration field method (CF), and the redistributed concentration field method (RCF)) were subjected to two validation approaches: validation with virtual and real sources under idealised conditions, where the effects of dispersion and removal of the trace substance are excluded, and comparison with the EMEP SO2 emission inventory under realistic conditions. The best performance was achieved in an idealised situation, with about 78% common spatial variance between the EMEP emission inventory and its trajectory statistical reconstruction with the RCF method, whereas the real-world experiments for SO2 on a European scale resulted in a much lower performance, with 33% common spatial variance between the EMEP SO2 emission inventory and the trajectory statistical reconstruction with the PSCF method. The experiments suggest that the limitations of the accuracy and spatial range of TSMs are rooted in the simplified transport process, described by trajectory paths alone. If these limitations are linked with the concept of the mean residence time of the considered trace substance, a temporal and spatial scope can be deduced within which the effect of this simplification is restricted and useful information can be expected from TSMs. The lower values of the mean residence time for SO2, deduced from the decay approach in which an exponential decay (removal) of SO2 was built into the trajectory statistical procedure, range from 9 to 17 h. The values derived from the optimum real-world validation experiment place the upper range of the mean residence time at about 60 h, or 2.5 days. Both figures are within the range of mean residence times for SO2 cited in the literature. Through the validation experiments of this work, the rule of thumb not to trust TSMs beyond the mean residence time of the substance has become palpable. Nevertheless, TSMs and related methods are computationally fast procedures that deliver first hints on potential source areas if applied within the frame of the mean residence time of the considered substance.
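For reference, the standard textbook definition of PSCF and the exponential removal used to probe the residence-time limit can be written as follows; this is a generic form consistent with the abstract, and the paper's exact notation may differ.

```latex
% Standard PSCF definition (textbook form; the paper's exact variant may differ).
\mathrm{PSCF}_{ij} = \frac{m_{ij}}{n_{ij}},
\qquad
\begin{aligned}
n_{ij} &: \text{number of trajectory endpoints falling in grid cell } (i,j),\\
m_{ij} &: \text{those endpoints associated with receptor concentrations above a criterion value.}
\end{aligned}

% Exponential removal built into the trajectory statistics to probe the residence-time limit,
% with \tau the mean residence time of the trace substance (9--17 h for SO2 here).
c(t) = c_0 \, e^{-t/\tau}
```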

4.
This paper derives the analytical solutions to multi-compartment indoor air quality models for predicting indoor air pollutant concentrations in the home and evaluates the solutions using experimental measurements in the rooms of a single-story residence. The model uses Laplace transform methods to solve the mass balance equations for two interconnected compartments, obtaining analytical solutions that can be applied without a computer. Environmental tobacco smoke (ETS) sources such as the cigarette typically emit pollutants for relatively short times (7-11 min) and are represented mathematically by a "rectangular" source emission time function, or approximated by a short-duration source called an "impulse" time function. Other time-varying indoor sources can also be represented by Laplace transforms. The two-compartment model is more complicated than the single-compartment model and has more parameters, including the cigarette or combustion source emission rate as a function of time, room volumes, compartmental air change rates, and interzonal air flow factors expressed as dimensionless ratios. This paper provides analytical solutions for the impulse, step (Heaviside), and rectangular source emission time functions. It evaluates the indoor model in an unoccupied two-bedroom home using cigars and cigarettes as sources with continuous measurements of carbon monoxide (CO), respirable suspended particles (RSP), and particulate polycyclic aromatic hydrocarbons (PPAH). Fine particle mass concentrations (RSP or PM3.5) are measured using real-time monitors. In our experiments, simultaneous measurements of concentrations at three heights in a bedroom confirm an important assumption of the model: spatial uniformity of mixing. The parameter values of the two-compartment model were obtained using a "grid search" optimization method, and the predicted solutions agreed well with the measured concentration time series in the rooms of the home. The door and window positions in each room had a considerable effect on the pollutant concentrations observed in the home. Because of the small volumes and low air change rates of most homes, indoor pollutant concentrations from smoking activity in a home can be very high and can persist at measurable levels indoors for many hours.
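The paper's closed-form Laplace solutions are not reproduced here, but the underlying two-compartment mass balance with a rectangular source can be sketched and integrated numerically as a check. The parameter values below are placeholders for illustration, not measurements from the home studied.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameter values for illustration only (not from the paper).
V1, V2 = 30.0, 270.0      # compartment volumes, m^3 (smoking room, rest of house)
a1, a2 = 0.5, 0.3         # air change rates with outdoors, 1/h
q12 = 20.0                # interzonal airflow, m^3/h (assumed symmetric)
E = 80.0                  # source emission rate during the smoke, mg/h
T_src = 9.0 / 60.0        # "rectangular" source duration, h (about 9 min)

def source(t):
    return E if t <= T_src else 0.0   # rectangular emission time function

def mass_balance(t, c):
    c1, c2 = c
    dc1 = (source(t) + q12 * (c2 - c1)) / V1 - a1 * c1
    dc2 = (q12 * (c1 - c2)) / V2 - a2 * c2
    return [dc1, dc2]

t_eval = np.linspace(0.0, 6.0, 601)   # simulate 6 h
sol = solve_ivp(mass_balance, (0.0, 6.0), [0.0, 0.0], t_eval=t_eval, max_step=0.01)
c1, c2 = sol.y   # mg/m^3 in the source room and in the rest of the house
```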

5.
Ultraviolet (UV) disinfection is an important method for advanced wastewater treatment; the distribution of radiation intensity inside the disinfection reactor, the residence time of microorganisms, and their motion trajectories are critical to the disinfection performance. This paper presents, for the first time in China, a systematic introduction to the theoretical basis and technical approach of numerical simulation of UV disinfection reactors using computational fluid dynamics (CFD), and points out that the calculation of radiation intensity and the determination of residence time are the key steps. Taking a UV disinfection reactor of a specific geometry as the study object, the internal flow field was simulated, and the changes in the internal velocity field caused by different baffle positions were discussed. The discrete ordinates (DO) radiation model was applied to simulate the UV radiation intensity distribution, and a discrete particle model (DPM) was added to simulate the residence time and motion trajectories of microorganisms inside the reactor, laying a solid foundation for the final calculation of the UV dose received by the microorganisms.
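Once the DPM particle tracks and the DO-model intensity field are available, the UV dose received by a microorganism is the time integral of the local intensity along its trajectory. The sketch below illustrates only that final integration step; the trajectory data, units, and the `intensity` interpolation function are assumptions for illustration, not outputs of the paper.

```python
import numpy as np

def uv_dose(positions, times, intensity):
    """Accumulate UV dose (fluence) along one particle trajectory.

    positions : array (n, 3) of particle positions inside the reactor
    times     : array (n,) of the corresponding times, s
    intensity : callable mapping a position to local UV intensity, mW/cm^2
    Returns the dose in mJ/cm^2 as the time integral of intensity along the path.
    """
    I = np.array([intensity(p) for p in positions])
    return np.trapz(I, times)   # dose = integral of I(x(t)) dt
```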

6.
7.
The Alpine stations Zugspitze, Hohenpeissenberg, Sonnblick, Jungfraujoch and Mt. Krvavec contribute to the Global Atmosphere Watch Programme (GAW) of the World Meteorological Organization (WMO). The aim of GAW is the surveillance of the large-scale chemical composition of the atmosphere, so the detection of air pollutant transport from regional sources is of particular interest. In this paper, the origin of NOx (measured with a photo-converter), CO and O3 at the four Alpine GAW stations is studied by trajectory residence time statistics. Although these methods originated in the early 1980s, no comprehensive study of different atmospheric trace gases measured simultaneously at several background observatories in the Alps had been conducted to date. The main NOx source regions detected by the trajectory statistics are the northwest of Europe and the region covering East Germany, the Czech Republic and southeast Poland, whereas the main CO source areas are the central, north-eastern and eastern parts of Europe, with a gradient from low to high latitudes. Subsiding air masses from the west and southwest are relatively poor in NOx and CO. The statistics for ozone show strong seasonal effects. Near-ground air masses are poor in ozone in winter but rich in ozone in summer. The main source of high ozone concentrations in winter is air masses that subside from higher elevations, often enhanced by foehn effects at Hohenpeissenberg. During summer, the Mediterranean constitutes an important additional source of high ozone concentrations. Especially during winter, large differences between Hohenpeissenberg and the higher-elevated stations are found: Hohenpeissenberg is frequently within the inversion, whereas the higher-elevated stations are above it. Jungfraujoch is the only station where the statistics detect an influence of air rich in CO and NOx from the Po Basin.

8.
We propose a method to evaluate the detection abilities of networks used for protection purposes. Such networks are designed for the detection of nuclear, biological or gaseous emissions, without constraint on the source location. Their assigned goal is to have the best chance of detecting a threatening emission located anywhere in the vicinity of a domain to protect. Two sensor siting applications are addressed: sensors placed in the surroundings of a facility to protect, and sensors carried by people scattered within a small area. A network's protection ability is related both to its detection scope and to its response time. To assess the performance of such networks, two statistical indicators are therefore designed: the detection probability, computed over a large number of possible source locations, and the saturation time, which is the time when the maximum detection probability has been reached. Simulations are then carried out with the Polyphemus air quality modeling system for many emission scenarios, including 961 possible source locations, various emitted species, and a few representative meteorological situations. This makes it possible to assess the performance of single sensors as well as full networks, and their sensitivity to parameters such as meteorological conditions and source characteristics. The emitted quantity and meteorological dispersion are found to be important parameters, whereas the species type does not significantly influence the results. Two network design methods are considered: (1) networks composed of a given number of the “best” sensors according to an indicator, and (2) sensors placed in circles around the protected domain. The networks built with respect to the detection probability show good results with a limited number of sensors, while the saturation time is not reliable enough to build networks. The networks based on circles also show good performance in the studied cases, provided there is a sufficient number of sensors.
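The two indicators can be computed directly from a stack of dispersion simulations. The sketch below assumes a boolean "detected" array produced by thresholding simulated concentrations at each sensor; the array layout and function name are illustrative, not taken from Polyphemus.

```python
import numpy as np

def network_indicators(detected, times):
    """Detection probability and saturation time for a sensor network.

    detected : bool array (n_scenarios, n_sensors, n_times); True where a sensor's
               simulated concentration exceeds its detection threshold at that time
    times    : array (n_times,) of simulation output times, s
    """
    # A scenario counts as detected once at least one sensor has triggered by time t.
    detected_by_t = np.maximum.accumulate(detected.any(axis=1), axis=-1)
    prob_t = detected_by_t.mean(axis=0)        # detection probability as a function of time
    p_max = prob_t[-1]                         # final (maximum) detection probability
    sat_idx = np.argmax(prob_t >= p_max)       # first time the maximum is reached
    return p_max, times[sat_idx], prob_t
```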

9.
The Chihuahuan Desert region of North America is a significant source of mineral aerosols in the Western Hemisphere, and Chihuahuan Desert dust storms frequently impact the Paso del Norte (El Paso, USA/Ciudad Juarez, Mexico) metropolitan area. A statistical analysis of HYSPLIT back trajectory residence times evaluated airflow into El Paso on all days and on days with synoptic (non-convective) dust events in 2001–2005. The incremental probability, a measure of the areas most likely to have been traversed by air masses arriving at El Paso on dusty days, was strongly and positively associated only with the region west-southwest of the city, a zone of known dust source areas. Focused case studies were made of major dust events on 15 April and 15 December 2003. Trajectories approached the surface and MM5 (NCAR/Penn State Mesoscale Model) wind speeds increased at locations consistent with dust sources observed in satellite imagery on those dates. Back trajectory and model analyses suggested that surface cyclones adjacent to the Chihuahuan Desert were associated with the extreme dust events, consistent with previous studies of dust storms in the Southern High Plains to the northeast. The recognition of these meteorological patterns serves as a forecast aid for the prediction of dust events likely to impact the Paso del Norte.
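One common way to form such an incremental probability, assumed here since the abstract does not spell out the formula, is to subtract the residence-time probability field computed from all days from that computed from dust-event days, so that positive cells mark areas preferentially traversed before dusty arrivals.

```python
import numpy as np

def residence_probability(lons, lats, lon_edges, lat_edges):
    """Fraction of all trajectory endpoints falling in each grid cell."""
    h, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
    return h / h.sum()

def incremental_probability(dust_pts, all_pts, lon_edges, lat_edges):
    """Assumed formulation: dust-day residence probability minus all-day probability.

    dust_pts, all_pts : arrays (n, 2) of (lon, lat) trajectory endpoints for
    dust-event days and for all days, respectively.
    """
    p_dust = residence_probability(dust_pts[:, 0], dust_pts[:, 1], lon_edges, lat_edges)
    p_all = residence_probability(all_pts[:, 0], all_pts[:, 1], lon_edges, lat_edges)
    return p_dust - p_all
```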

10.
Standard approaches for computing population exposures due to specific sources of air pollutants are relatively complex. In many cases, simpler and more approximate methods would be useful. This paper develops an approach, based on the concept of exposure efficiency, that may be used for estimating the impact of a source (or source class) on the integrated population exposure. The approach is illustrated by an example, which uses the concept of exposure efficiency to examine the impact of perchloroethylene emissions from dry cleaners in the United States. The paper explores the geographic variability of exposure efficiency by evaluating it for each of 100 randomly selected dry cleaners. For perchloroethylene, which has a long atmospheric residence time, the site-to-site variability in exposure efficiency is found to be relatively small. This suggests that simple exposure assessments, based on generic distributional characterizations of exposure efficiency, may be used in risk assessments without introducing appreciable uncertainty. For many compounds, like perchloroethylene, the uncertainty inherent in the estimation of cancer potency or source emissions would dominate these small errors.
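Exposure efficiency, as described here, relates integrated population exposure to the quantity emitted. A generic formulation consistent with the abstract is shown below; the paper's exact notation and normalization may differ.

```latex
% Generic definition consistent with the abstract (not quoted from the paper):
% exposure efficiency relates integrated population exposure to the amount emitted.
E_{\mathrm{eff}} \;=\; \frac{\displaystyle\sum_{i} P_i \, \Delta C_i}{Q}
\qquad
\begin{aligned}
P_i &: \text{population of receptor region } i,\\
\Delta C_i &: \text{source-attributable concentration increment in region } i,\\
Q &: \text{emission rate of the source (or source class).}
\end{aligned}
```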

11.
The information presented in this paper is directed to air pollution scientists with an interest in applying air quality simulation models. RAM is the three-letter designation for this efficient Gaussian-plume multiple-source air quality algorithm. RAM estimates short-term dispersion using the Gaussian steady-state model. The algorithm can be used for estimating air quality concentrations of relatively stable pollutants, for averaging times from an hour to a day, in urban areas from point and area sources. It is applicable to locations with level or gently rolling terrain where a single wind vector for each hour is a good approximation to the flow over the source area considered. Calculations are performed for each hour. The hourly meteorological data required are wind direction, wind speed, stability class, and mixing height. The emission information required for point sources consists of source coordinates, emission rate, physical height, stack gas volume flow and stack gas temperature. The emission information required for area sources consists of south-west corner coordinates, source area, total area emission rate and effective area source height. Computation time is kept to a minimum by the manner in which concentrations from area sources are estimated, using a narrow plume hypothesis and using the area source squares as given rather than breaking all sources down into uniform elements. Options allow the user to use three different types of receptor locations: 1) those whose coordinates are input by the user, 2) those whose coordinates are determined by the model and are downwind of significant point and area sources where maxima are likely to occur, and 3) those whose coordinates are determined by the model to give good area coverage of a specific portion of the region. Computation time is also decreased by keeping the number of receptors to a minimum.
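The point-source calculation in RAM-type algorithms rests on the standard steady-state Gaussian plume equation; the textbook form for an elevated point source with ground reflection is given below for reference (it is not quoted from the paper).

```latex
% Standard steady-state Gaussian plume equation for an elevated point source
% (textbook form on which RAM-type algorithms are based; not quoted from the paper).
C(x, y, z) \;=\; \frac{Q}{2 \pi u \, \sigma_y(x) \, \sigma_z(x)}
\exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)
\left[
\exp\!\left(-\frac{(z - H)^2}{2\sigma_z^2}\right)
+
\exp\!\left(-\frac{(z + H)^2}{2\sigma_z^2}\right)
\right]
```

Here Q is the emission rate, u the hourly wind speed, H the effective source height, and σ_y, σ_z the stability-dependent dispersion coefficients.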

12.
An innovative and effective method using isentropic trajectory analysis, based on the residence time of air masses over the polluted region of Europe, was successfully applied to categorize surface ozone amounts at Arosa, Switzerland during 1996–1997. The “European representative” background ozone seasonal cycle at Arosa is associated with long-range transport of North Atlantic air masses, and displays a spring maximum and summer minimum with an annual average of 35 ppb. The photochemical ozone production due to intense large-scale anthropogenic emissions over Europe is estimated to be as high as 20 ppb in summer, whereas it is insignificant in winter. European sources contribute an annual net ozone production of 9–12 ppb at Arosa. Comparison with a selected regionally representative site in Western Europe shows similar results, indicating that the ozone data categorized at Arosa by this technique can be regarded as representative of northern hemispheric mid-latitudes.

13.
Standard approaches for computing population exposures due to specific sources of air pollutants are relatively complex. In many cases, simpler and more approximate methods would be useful. This paper develops an approach, based on the concept of exposure efficiency, that may be used for estimating the impact of a source (or source class) on the integrated population exposure. The approach is illustrated by an example, which uses the concept of exposure efficiency to examine the impact of perchloroethylene emissions from dry cleaners in the United States. The paper explores the geographic variability of exposure efficiency by evaluating it for each of 100 randomly selected dry cleaners. For perchloroethylene, which has a long atmospheric residence time, the site-to-site variability in exposure efficiency is found to be relatively small. This suggests that simple exposure assessments, based on generic distributional characterizations of exposure efficiency, may be used in risk assessments without introducing appreciable uncertainty. For many compounds, like perchloroethylene, the uncertainty inherent in the estimation of cancer potency or source emissions would dominate these small errors.

14.
A discrete vortex model of the recirculating flow behind a two-dimensional backward-facing step is used to calculate the trajectories of particles released from a fixed point. By averaging over a large number of such trajectories, an estimate is made of the mean concentration profile associated with a steady source in the wake. These estimates are verified against experimental data for point- and line-sources. The importance of incorporating a ‘random walk’ in calculating the trajectories is demonstrated. The mean flow in the discrete vortex model used appears to be the most critical factor in determining mean concentrations. The poorest predictions appear to be associated with the longest trajectories. Particle ‘recirculation times’ are also briefly examined and it is shown how these are related to the ‘residence times’ of Vincent (1976, Atmospheric Environment 11, 765–774) and others. It is suggested that such residence times may not be an appropriate means of quantifying near-wake dispersion if sources are inside the wake. Advantages of a ‘particle trajectory’ method, as against a diffusion equation method, for dealing with dispersion in inhomogeneous flow are finally presented.
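The 'random walk' contribution can be sketched as a stochastic displacement added to the deterministic (vortex-induced) advection of each particle; mean concentrations then follow from counting particles per cell over many releases. This is a generic random-displacement sketch under an assumed constant eddy diffusivity, not the authors' discrete vortex code.

```python
import numpy as np

rng = np.random.default_rng(0)

def advance_particles(x, u_mean, K, dt):
    """One random-walk step for an ensemble of particle positions.

    x      : array (n, 2) of particle positions
    u_mean : callable returning the deterministic (mean/vortex-induced) velocity at x
    K      : eddy diffusivity, m^2/s (assumed constant here for simplicity)
    dt     : time step, s
    The sqrt(2*K*dt) term is the random-walk displacement; dropping it leaves
    pure advection by the resolved flow, the comparison used to show the
    importance of the random walk.
    """
    return x + u_mean(x) * dt + np.sqrt(2.0 * K * dt) * rng.standard_normal(x.shape)
```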

15.
Motor vehicles are major sources of fine particulate matter (PM2.5), and the PM2.5 from mobile vehicles is associated with adverse health effects. Traditional methods for estimating source impacts that employ receptor models are limited by the availability of observational data. To better estimate temporally and spatially resolved mobile source impacts on PM2.5, we developed an approach based on a method that uses elemental carbon (EC), carbon monoxide (CO), and nitrogen oxide (NOx) measurements as an indicator of mobile source impacts. We extended the original integrated mobile source indicator (IMSI) method in three aspects. First, we generated spatially resolved indicators using 24-hr average concentrations of EC, CO, and NOx estimated at 4 km resolution by applying a method developed to fuse chemical transport model (Community Multiscale Air Quality Model [CMAQ]) simulations and observations. Second, we used spatially resolved emissions instead of county-level emissions in the IMSI formulation. Third, we spatially calibrated the unitless indicators to annually-averaged mobile source impacts estimated by the receptor model Chemical Mass Balance (CMB). Daily total mobile source impacts on PM2.5, as well as separate gasoline and diesel vehicle impacts, were estimated at 12 km resolution from 2002 to 2008 and 4 km resolution from 2008 to 2010 for Georgia. The total mobile and separate vehicle source impacts compared well with daily CMB results, with high temporal correlation (e.g., R ranges from 0.59 to 0.88 for total mobile sources with 4 km resolution at nine locations). The total mobile source impacts had higher correlation and lower error than the separate gasoline and diesel sources when compared with observation-based CMB estimates. Overall, the enhanced approach provides spatially resolved mobile source impacts that are similar to observation-based estimates and can be used to improve assessment of health effects.

Implications: An approach is developed based on an integrated mobile source indicator method to estimate spatiotemporal PM2.5 mobile source impacts. The approach employs three air pollutant concentration fields that are readily simulated at 4 and 12 km resolutions, and is calibrated using PM2.5 source apportionment modeling results to generate daily mobile source impacts in the state of Georgia. The estimated source impacts can be used in investigations of traffic pollution and health.
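The calibration step described above, scaling the unitless indicator to annually-averaged CMB source impacts at monitor locations and then applying the fit gridwide, can be sketched as a simple linear regression. The function name and the linear fit are assumptions for illustration; the paper's calibration may use a different functional form.

```python
import numpy as np

def calibrate_indicator(imsi_at_sites, cmb_at_sites, imsi_grid):
    """Scale a unitless mobile-source indicator to concentration units.

    imsi_at_sites : array (n_sites,) annual-mean indicator values at CMB monitor sites
    cmb_at_sites  : array (n_sites,) annual-mean CMB mobile-source impacts, ug/m^3
    imsi_grid     : array (ny, nx) daily indicator field to be calibrated
    A least-squares fit cmb ~ a * imsi + b is assumed here.
    """
    a, b = np.polyfit(imsi_at_sites, cmb_at_sites, deg=1)
    return a * imsi_grid + b
```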


16.
The multiple nested three-dimensional (3D) mesoscale Eulerian grid point model MM5 is directly coupled with a Lagrangian particle trajectory model in order to perform a four-dimensional source attribution for the area of Berlin, based on the horizontal distribution of the import probability density (IPD). The technical aspects are demonstrated in the companion paper A (Part I), including an illustration of the meteorological situation on the two consecutive days of investigation and a primarily 3D source attribution. We conducted further sensitivity studies concerning the effects of vertical mixing, the static stability of the particles/emissions and the time scale considered on the IPD distribution, which is extended here to four dimensions. The main results are:
  • Heterogeneity and temporal variability of the wind field enhance the contributions of nearby sources (emissions) to the total import at the receptor, in contrast to stationary wind fields, which increase the scope of the IPD distribution in the upstream direction.
  • Regions of static stability, for example morning-hour inversion layers, enhance the contribution of distant sources with longer import times.
  • The import velocities increase for long-distance source-receptor transitions, because these are mostly realised via higher transport paths.
  • The third (vertical) dimension is not negligible for a complete source attribution, as a considerable amount of elevated emissions, preferably from about 300±100 m elevation, reaches the receptor box, which is only 50 m deep. Hence, downward mixing of elevated and far-distance sources is an important process, driven by the diurnal course of turbulence and low-level jets within the PBL.
  • On short time scales (a few days), the source attribution is not independent of the time scale considered (the simulation time), owing to the neglect of older emissions released before the beginning of the simulation.

17.
The potential source contribution function (PSCF) has been used to study the source–receptor relationships for total gaseous mercury (TGM) in air collected at two sites along the St. Lawrence River valley, namely St. Anicet and Mingan. TGM concentrations were measured with high time-resolution analysers (Tekran instrument). The source–receptor analyses were applied with regard to the seasonality of TGM. Median TGM concentrations are significantly lower (χ2: α=0.01) during summertime than during other periods at both sites. A total of 12 225 trajectory end-points for St. Anicet and 4480 trajectory end-points for Mingan were used to create potential source area maps. This study identifies preferred potential sources of TGM at St. Anicet during wintertime, with the strongest probabilities stretching from the Gulf of Mexico to the southern tip of Greenland. This pattern mimics the North American anthropogenic Hg emission inventory. Furthermore, some Eurasian mercury air mass intrusions are suggested at Mingan during wintertime. The summertime period at Mingan points to potential sources stretching from the American Midwest to the St. Lawrence River valley, as well as areas around the southern tip of Hudson Bay.

18.
Air pollution control devices (APCDs) are not compulsory for medical waste incinerators (MWIs) in developing countries. In South Africa, combustion gases are usually vented directly to the atmosphere at temperatures greater than the formation temperature of dioxin, and the possibility of dioxin formation outside the incinerator stack has been hypothesized. A plume model was developed and tested in a wind tunnel with a scale model of an incinerator stack. The plume temperature and trajectory predictions of the plume model were verified to within +/- 3% experimental accuracy. Using South African data, the plume model predicts that the residence time of gases in the temperature range of 150-450 degrees C in a plume is 1.3 sec on average for 5% of a year (18 days), under meteorological conditions resulting in wind speeds of less than 1 m/sec. Two published dioxin formation models were used to assess the probability of dioxin formation in the plume. The formation models predict that the average polychlorinated dibenzodioxins/furans (PCDD/Fs) formed in the plume will exceed the South African stack emission regulation of 0.2 ng/Nm3 toxic equivalent quotient (TEQ) by a factor of 2 to 40. The calculated concentrations do not include additional gaseous PCDD/F compounds that may be formed at high-temperature post-combustion zones through pyrosynthesis mechanisms.

19.
An air quality simulation model that is simple, yet capable of accurately estimating concentrations under unsteady meteorological conditions, has been developed. This trajectory plume model uses the Gaussian plume equation, but has an applicability approximately as wide as that of the Lagrangian puff model. The plume axis is represented by a series of straight-line plume segments. The performance of this model was evaluated by comparing it with other diffusion models. A comparison between simulation results using the present model and those using integrated puff and Eulerian diffusion models for three different metropolitan areas (one in Japan and two in the U.S.) indicated that a simple trajectory plume model performs as well as the two more complex models in simulating pollutant dispersion under complicated meteorological conditions, such as those which occur during the transition period from a sea breeze to a land breeze.

20.
A one-year-long experiment in which two different tracers were simultaneously released from two different locations was used to test various hybrid receptor modeling techniques that estimate the tracer emissions using the measured air concentrations and a meteorological model. Air concentrations were measured over an 8-hour averaging time at three sites 14 to 40 km downwind. When the model was used to estimate emissions at only one tracer source, 6 percent of the short-term (8-h) emission estimates were within a factor of 2 of the actual emissions. Temporal averaging of the 8-h data enhanced the precision of the estimate, such that after 10 days 42 percent of the estimates were within a factor of 2, and after six months all of them (each source-receptor pair) were within a factor of 2. To test the ability of the model to separate two sources, both tracer sources were combined, and a multiple linear regression technique was used to determine the emissions from each source from a time series of air concentration measurements representing the sum of both tracers. In general, 50 percent of the short-term estimates were within a factor of 10, 25 percent were biased low, and in another 25 percent the regression technique failed. The bias and failures are attributed to low or no correlation between measured air concentrations and model-calculated dispersion factors. In the regression method, increased temporal averaging did not consistently improve the emission estimate, since the ability of the model to distinguish emissions between sources was diminished with increased averaging time. However, including progressively longer time periods (more data) in the regression, or spatially averaging the data over all the receptors, was found to be the most effective way to improve the estimated emissions. At best, about 75 percent of the estimated monthly emission data were within a factor of 10 of the measured values. This suggests that the usefulness of meteorological models and statistical methods for addressing questions of source attribution requires many data points to reduce the uncertainty in the emission estimates.
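The regression step described here amounts to solving a small linear system: each measured concentration is modeled as the sum of each source's emission rate multiplied by a model-calculated dispersion factor. A minimal sketch follows, assuming the dispersion factors are already available as an array; the function name and the non-negativity clipping are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

def estimate_emissions(dispersion, concentrations):
    """Estimate emission rates for several sources by multiple linear regression.

    dispersion     : array (n_obs, n_sources) of model-calculated dispersion factors,
                     i.e. the concentration each receptor/time would see per unit emission
    concentrations : array (n_obs,) of measured air concentrations (sum of all tracers)
    Returns least-squares emission-rate estimates, clipped to be non-negative.
    """
    q, *_ = np.linalg.lstsq(dispersion, concentrations, rcond=None)
    return np.clip(q, 0.0, None)
```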
