Similar Literature
20 similar records found.
1.
A mesoscale atmospheric model, PSU/NCAR MM5, is used to provide operational weather forecasts for a nuclear emergency response decision support system on the southeast coast of India. In this study the performance of the MM5 model with assimilation of conventional surface and upper-air observations, along with satellite-derived 2-D surface wind data from QuikSCAT, is examined. Two numerical experiments with MM5 are conducted: one with static initialization using NCEP FNL data, and a second with dynamic initialization by assimilation of observations using four-dimensional data assimilation (FDDA) analysis nudging for a pre-forecast period of 12 h. Dispersion simulations are conducted for a hypothetical source at the Kalpakkam site with the HYSPLIT Lagrangian particle model using the simulated wind fields from the above experiments. The present paper brings out the differences in the atmospheric model predictions and in the dispersion model results between the control and assimilation runs. An improvement is noted in the atmospheric fields from the assimilation experiment, which led to significant changes in trajectory positions, plume orientation and plume distribution pattern. Sensitivity tests using different PBL and surface parameterizations indicated that the simple first-order closure schemes (Blackadar, MRF) coupled with the simple soil model gave better results for the various atmospheric fields. The study illustrates the impact of assimilating the scatterometer winds and automated weather station (AWS) observations on the meteorological model predictions and the dispersion results.
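The FDDA analysis nudging mentioned above is, at its core, a Newtonian relaxation of the model state toward gridded analyses during the pre-forecast window. The sketch below illustrates only that relaxation step; it is not MM5 code, and the nudging coefficient, field names and time step are illustrative assumptions.

```python
import numpy as np

def nudge_wind(u_model, u_analysis, dt, g_nudge=3e-4):
    """One explicit nudging step: relax the model wind toward the analysis.

    u_model, u_analysis : 2-D arrays of a wind component (m/s)
    dt                  : time step (s)
    g_nudge             : nudging coefficient (1/s); an assumed, typical magnitude
    """
    # Newtonian relaxation term added to the model tendency:
    # du/dt = physics + g_nudge * (u_analysis - u_model)
    return u_model + dt * g_nudge * (u_analysis - u_model)

# Illustrative use: a 12 h pre-forecast assimilation window with a 60 s step
u = np.zeros((50, 50))            # model wind component on a small grid
ua = np.full((50, 50), 5.0)       # analysis (e.g. blended QuikSCAT/AWS winds)
for _ in range(int(12 * 3600 / 60)):
    u = nudge_wind(u, ua, dt=60.0)
```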

2.
The evolution of photochemical smog in a power plant plume was investigated with the aid of an instrumented helicopter. Air samples were taken in the plume of the Cumberland Power Plant, located in central Tennessee, during the afternoon of 16 July 1995 as part of the Southern Oxidants Study – Nashville Middle Tennessee Ozone Study. Twelve cross-wind air sampling traverses were made at six distance groups from 35 to 116 km from the source. During the sampling period the winds were from the west–northwest and the plume drifted towards the city of Nashville, TN. Ten of the traverses were made upwind of the city, where the power plant plume was isolated, and two traverses downwind of the city, where the plumes were possibly mixed. The results revealed that even six hours after the release, excess ozone production was limited to the edges of the plume. Only when the plume was sufficiently dispersed, but still upwind of Nashville, was excess ozone (up to 109 ppbv, 50–60 ppbv above background levels) produced in the center of the plume. The concentration images of the plume and a Lagrangian particle model suggest that portions of the power plant plume mixed with the urban plume. The mixed urban–power plant plume began to regenerate O3, which peaked at 120 ppbv a short distance (15–25 km) downwind of Nashville. Ozone productivity (the ratio of excess O3 to NOy and NOz) in the isolated plume was significantly lower than that found in the city plume. The production of nitrate, a chain-termination product, was significantly higher in the power plant plume than in the mixed plume, indicating a shorter chain length of the photochemical smog chain-reaction mechanism.

3.
Following the release of radionuclides from the Chernobyl power plant accident, a long-range transport and deposition model is used to describe the plume dispersion over Europe. The aim of this study is the validation of a fast Lagrangian model and a better understanding of the relative impact of some mechanisms, such as the initial plume rise. Comparisons between model results and measured 137Cs activity are discussed with respect to their spatial and temporal variations. It is shown that many measurements can be explained only if an initial plume rise to the 925, 850 and 700 mb levels is considered.

4.
An aircraft study of air quality in the Hong Kong region during the fall of 1994 has allowed an estimation of the daytime source strengths for CO and NOy from the Hong Kong metropolitan center. Emission rate estimates for the Hong Kong urban plume were 5.4 × 10^25 molecules s−1 for NOy and 1.8 × 10^26 molecules s−1 for CO, as determined for the case study of 18 October. All emission rate estimates have uncertainties of a factor of 2. On one occasion a distinct plume emanating from Shenzhen in the People’s Republic of China was encountered. While plume delimitation was insufficient for source strength calculations, transect integrals did allow a CO/NOy ratio of about 16 to be determined. The CO/NOy ratio for the Hong Kong urban plume was about 3.3. The difference in these ratios indicates differences in the overall combustion processes and efficiencies taking place within Hong Kong and the PRC.
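Source strengths of this kind are typically obtained by integrating the above-background concentration across a crosswind traverse and multiplying by the transport wind speed and an assumed plume depth. The sketch below shows that crosswind-integral arithmetic under a uniform-mixing-depth assumption; the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def emission_rate(conc_excess, y, wind_speed, mixing_depth):
    """Crude crosswind-integral source-strength estimate.

    conc_excess  : above-background concentration along the transect (molecules m^-3)
    y            : crosswind coordinate of each sample (m)
    wind_speed   : mean transport wind speed (m s^-1)
    mixing_depth : assumed depth over which the plume is mixed (m)

    Returns Q = u * H * integral(dC dy) in molecules s^-1.
    """
    conc_excess = np.asarray(conc_excess, float)
    y = np.asarray(y, float)
    # trapezoidal crosswind integral (molecules m^-2)
    crosswind_integral = np.sum(0.5 * (conc_excess[1:] + conc_excess[:-1]) * np.diff(y))
    return wind_speed * mixing_depth * crosswind_integral
```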

5.
Uncertainty factors in atmospheric dispersion models may limit the reliability of model predictions. The ability of a model to assimilate measurement data helps improve its predictions. In this paper, data assimilation based on the ensemble Kalman filter (EnKF) is introduced into a Monte Carlo atmospheric dispersion model (MCADM) designed for assessment of the consequences of an accidental release of radionuclides. A twin experiment has been performed in which simulated ground-level dose rates are assimilated. Uncertainties in the source term and in the turbulence intensity of the wind field are considered separately. Methodologies and preliminary results of the application are described. It is shown that data assimilation can reduce the discrepancy between the model forecast and the true situation: about 80% of the error caused by uncertainty in the source term is removed, and about 50% of that caused by uncertainty in the turbulence intensity.
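For reference, a stochastic ensemble Kalman filter analysis step of the kind described above can be written compactly. The sketch below assumes a linear observation operator mapping a (possibly parameter-augmented) model state to ground-level dose rates, which is a simplification of the actual MCADM coupling; all names are illustrative.

```python
import numpy as np

def enkf_analysis(X, y_obs, H, obs_err_std, rng=None):
    """Stochastic EnKF analysis step with perturbed observations.

    X           : (n_state, n_ens) ensemble of model states; the state may be
                  augmented with uncertain parameters such as a source-term factor
    y_obs       : (n_obs,) observed ground-level dose rates
    H           : (n_obs, n_state) linear observation operator (a simplification)
    obs_err_std : observation error standard deviation
    """
    if rng is None:
        rng = np.random.default_rng(0)
    y_obs = np.asarray(y_obs, float)
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    HA = H @ A
    P_yy = HA @ HA.T / (n_ens - 1) + obs_err_std**2 * np.eye(len(y_obs))
    P_xy = A @ HA.T / (n_ens - 1)
    K = P_xy @ np.linalg.inv(P_yy)                   # Kalman gain
    # Perturbing the observations keeps the analysis ensemble spread consistent
    Y = y_obs[:, None] + obs_err_std * rng.standard_normal((len(y_obs), n_ens))
    return X + K @ (Y - H @ X)
```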

6.
We have developed a modelling system for predicting traffic volumes, emissions from stationary and vehicular sources, and atmospheric dispersion of pollution in an urban area. This paper describes a comparison of the NOx and NO2 concentrations predicted with this modelling system against the results of an urban air quality monitoring network. We performed a statistical analysis to determine the agreement between predicted and measured hourly time series of concentrations at four permanently located and three mobile monitoring stations in the Helsinki Metropolitan Area in 1996–1997 (a total of ten urban and suburban measurement locations). At the stations considered, the so-called index of agreement values for the predicted and measured time series of NO2 concentrations vary between 0.65 and 0.82, while the fractional bias values range from −0.29 to +0.26. In comparison with corresponding results presented in the literature, the agreement between the measured and predicted datasets is good, as indicated by these statistical parameters. The seasonal variations of the NO2 concentrations were analysed in terms of the relevant meteorological parameters. We also analysed the differences between model predictions and measured data diagnostically, in terms of meteorological parameters, including wind speed and direction (the latter separately for two wind speed classes), atmospheric stability and ambient temperature, at two monitoring stations in central Helsinki. The modelling system tends to overpredict the measured NO2 concentrations both at the highest (u ⩾ 6 m s−1) and at the lowest wind speeds (u < 2 m s−1). For higher wind speeds, the modelling system overpredicts the measured NO2 concentrations in certain wind direction intervals; specific ranges were found for both monitoring stations considered. The modelling system tends to underpredict the measured concentrations in convective atmospheric conditions, and to overpredict them in stable conditions. The possible physico-chemical reasons for these differences are discussed.
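The two headline statistics quoted above are standard verification measures. A minimal sketch of how the index of agreement and the fractional bias are usually computed is given below; note that the fractional-bias sign convention varies between studies, so the one used here (positive = overprediction) is an assumption.

```python
import numpy as np

def index_of_agreement(pred, obs):
    """Willmott's index of agreement d (0..1, 1 = perfect agreement)."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    obs_mean = obs.mean()
    num = np.sum((pred - obs) ** 2)
    den = np.sum((np.abs(pred - obs_mean) + np.abs(obs - obs_mean)) ** 2)
    return 1.0 - num / den

def fractional_bias(pred, obs):
    """Fractional bias; sign convention here: positive means overprediction."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return 2.0 * (pred.mean() - obs.mean()) / (pred.mean() + obs.mean())
```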

7.
A quantitative methodology is described for the field-scale performance assessment of natural attenuation using plume-scale electron and carbon balances. This provides a practical framework for the calculation of global mass balances for contaminant plumes, using mass inputs from the plume source, background groundwater and plume residuals in a simplified box model. Biodegradation processes and reactions included in the analysis are identified from the electron acceptors, electron donors and degradation products present in these inputs. Parameter values used in the model are obtained from data acquired during typical site investigation and groundwater monitoring studies for natural attenuation schemes. The approach is evaluated for a UK Permo-Triassic Sandstone aquifer contaminated with a plume of phenolic compounds. Uncertainty in the model predictions and sensitivity to parameter values were assessed by probabilistic modelling using Monte Carlo methods. Sensitivity analyses were compared for different input parameter probability distributions and for a base case using fixed parameter values, with an identical conceptual model and data set. Results show that the consumption of oxidants by biodegradation is approximately balanced by the production of CH4 and total dissolved inorganic carbon (TDIC), which is conserved in the plume. Under this condition, either the plume electron or carbon balance can be used to determine contaminant mass loss, which is equivalent to only 4% of the estimated source term. This corresponds to a first-order, plume-averaged half-life of >800 years. The electron balance is particularly sensitive to uncertainty in the source term and dispersive inputs. Reliable historical information on contaminant spillages and detailed site investigation are necessary to characterise the source term accurately. The dispersive influx is sensitive to variability in the plume mixing zone width. Consumption of aqueous oxidants greatly exceeds that of mineral oxidants in the plume, but the electron acceptor supply is insufficient to meet the electron donor demand and the plume will grow. The aquifer's potential for degradation of these contaminants is limited by high contaminant concentrations and the supply of bioavailable electron acceptors. Natural attenuation will increase only after increased transport and dilution.

8.
In long-term safety assessment models for radioactive waste disposal, uptake of radionuclides by plants is an important process with possible adverse effects in ecosystems. 60Co, 59,63Ni, 93Mo and 210Pb are examples of long-lived radionuclides present in nuclear waste. The soil-to-plant transfer of stable cobalt, nickel, molybdenum and lead and their distribution across plant parts were investigated in blueberry (Vaccinium myrtillus), May lily (Maianthemum bifolium), narrow buckler fern (Dryopteris carthusiana), rowan (Sorbus aucuparia) and Norway spruce (Picea abies) at two boreal forest sites in Eastern Finland. The concentrations of all of the studied elements were higher in roots than in above-ground plant parts, showing that different concentration ratios (CR values) are needed for modelling the transfer to roots and to stems/leaves. Some significant differences in CR values were found in comparisons of different plant species and of the same species grown at different sites. However, the large within-species variation suggests that it is not justified to use different CR values for modelling the soil-to-plant transfer of these elements in the different boreal forest plant species.

9.
It is well known that turbulent dispersion influences chemical reactions, and that computed reactant concentrations or mean chemical reaction rates can suffer from serious errors when the effects of small-scale atmospheric processes on chemical transformations are neglected. A quantity that measures the influence of turbulent dispersion on second-order chemical reaction rates is the intensity of segregation. A nonparametric estimator of the intensity of segregation, based on the kernel method, is proposed. Numerical benchmark tests for a Gaussian plume are performed to study the suitability of this technique. The estimator works well, especially for small and moderate separations from the plume centreline and generally in the smooth parts of the estimated function. The effective reaction rate is computed, and the percentage error is found to be less than 5% in the best estimation intervals and less than 40% in the worst. A method to reduce the percentage error is introduced and improved performance is observed. The proposed estimator turns out to be particularly suitable for Lagrangian air quality modelling because it preserves grid independence.
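The intensity of segregation referred to above is the normalised covariance of the two reactants' concentration fluctuations, which rescales the mean second-order reaction rate. The sketch below shows that definition applied to concentration samples; the kernel-regression machinery of the proposed estimator is not reproduced, and the names are illustrative.

```python
import numpy as np

def segregation_intensity(c_a, c_b):
    """Intensity of segregation I_s = <c_a' c_b'> / (<c_a><c_b>).

    c_a, c_b : concentration samples of the two reactants at a fixed receptor,
               e.g. from Lagrangian particle realisations.
    The effective second-order rate is then k_eff = k * (1 + I_s).
    """
    c_a, c_b = np.asarray(c_a, float), np.asarray(c_b, float)
    cov = np.mean((c_a - c_a.mean()) * (c_b - c_b.mean()))
    return cov / (c_a.mean() * c_b.mean())
```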

10.
In previous work (Zheng et al., 2007, 2009), a data assimilation method based on the ensemble Kalman filter was applied to a Monte Carlo Dispersion Model (MCDM). The results were encouraging when the method was tested in a twin experiment and a short-range field experiment. In this technical note, measured data collected in a wind tunnel experiment are assimilated into the Monte Carlo dispersion model. The uncertain parameters in the dispersion model, including the source term, release height, turbulence intensity and wind direction, are considered. The 3D parameters, i.e. the turbulence intensity and wind direction, are perturbed by 3D random fields. In order to find the factors that may influence the assimilation results, eight tests with different specifications were carried out. Two strategies for constructing the 3D perturbation field of wind direction are proposed, and the results show that the two-level strategy performs better than the one-level strategy. It is also found that a proper standard deviation and correlation radius of the perturbation field play an important role in the data assimilation results.
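One common way to build a spatially correlated perturbation field with a prescribed standard deviation and correlation radius is to factorise an exponential covariance matrix over the grid points. The sketch below illustrates this generic construction only; it is not the paper's two-level strategy, and the parameter names are assumptions.

```python
import numpy as np

def correlated_field(coords, std, corr_radius, rng=None):
    """Draw one Gaussian perturbation field with exponential spatial correlation.

    coords      : (n_points, 3) grid-point coordinates (m)
    std         : standard deviation of the perturbation
                  (e.g. wind-direction error in degrees)
    corr_radius : correlation radius / length scale (m)
    """
    if rng is None:
        rng = np.random.default_rng(0)
    coords = np.asarray(coords, float)
    # pairwise distances and exponential covariance
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    cov = std ** 2 * np.exp(-d / corr_radius)
    # Cholesky factor (small jitter for numerical stability)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(coords)))
    return L @ rng.standard_normal(len(coords))
```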

11.
Airborne measurements were performed in the plume of the Cumberland Power Plant during August 1998 using a highly sensitive SO2 instrument. The measurements confirmed previous suggestions that NOy species are removed from the plume at a faster rate than SO2. The differential removal rate (the difference between the loss rate of NOy and that of SO2) was estimated to be 0.06 h−1. This value implies that the NOy loss rate is in the range of 0.09–0.14 h−1. The application of a mathematical argument, based on the convolution integral, enabled improved synchronization of the data from the SO2 and NOy instruments. Examination of the synchronized data revealed that the concentration ratio of SO2 to NOy varies across the plume. Near the source it is higher at the wings of the plume, while in the core of the plume it is similar to the ratio at the release point. Two possible explanations of the observations are discussed: conversion to non-measurable NOy species, and in-plume loss of NOy (as HNO3) via dry deposition.

12.
Ozone is a harmful air pollutant at ground level, and its concentrations are measured by routine monitoring networks. Because of the heterogeneous nature of ozone fields, the spatial distribution of the ozone concentration measurements is very important. The evaluation of distributed monitoring networks is therefore of both theoretical and practical interest. In this study, we assess the efficiency of the ozone monitoring network over France (BDQA) by investigating a network reduction problem. We examine how well a subset of the BDQA network can represent the full network. The performance of a subnetwork is taken to be the root mean square error (RMSE) of the hourly mean ozone concentration estimates over the whole network, given the observations from that subnetwork. Spatial interpolation is used for the ozone estimation, taking spatial correlations into account. Several interpolation methods, namely ordinary kriging, simple kriging, kriging about the means, and consistent kriging about the means, are compared to obtain a reliable estimate. Exponential models are employed for the spatial correlations. It is found that the statistical information about the means significantly improves the kriging results, and that the correlation model needs to be treated as hourly varying but stationary from day to day. The network reduction problem is solved using a simulated annealing algorithm. Significant improvements can be obtained through these optimizations. For instance, optimally removing half of the stations leads to an estimation error of the order of the standard observational error (10 μg m−3). The resulting optimal subnetworks are dense in the urban agglomerations around Paris (Île-de-France) and Nice (Côte d’Azur), where high ozone concentrations and strong heterogeneity are observed. The optimal subnetworks are also dense near national frontiers, probably because beyond these frontiers there are no observations to reduce the uncertainty of the ozone field. For large rural regions, the stations are uniformly distributed. The proportions of urban, suburban and rural stations are rather constant for optimal subnetworks of larger size (beyond 100 stations). By contrast, for smaller subnetworks, urban stations dominate.
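The network-reduction step described above can be posed as subset selection under a kriging-based error criterion and, as in the study, tackled with simulated annealing. The sketch below is a generic annealing loop over station subsets; the error function is left as a user-supplied callable (e.g. a kriging cross-validation RMSE) and all names are illustrative, not taken from the paper.

```python
import numpy as np

def anneal_subnetwork(stations, k, rmse_of, n_iter=20000,
                      t0=1.0, cooling=0.9995, rng=None):
    """Pick k stations that minimise rmse_of(subset) by simulated annealing.

    stations : list of all station identifiers
    k        : target subnetwork size
    rmse_of  : callable mapping a list of station indices to the network-wide
               estimation RMSE when only those stations are observed
    """
    if rng is None:
        rng = np.random.default_rng(0)
    current = [int(i) for i in rng.choice(len(stations), size=k, replace=False)]
    cost = rmse_of(current)
    best, best_cost, t = current[:], cost, t0
    for _ in range(n_iter):
        cand = current[:]
        out = int(rng.integers(k))                      # station to swap out
        pool = [i for i in range(len(stations)) if i not in current]
        cand[out] = int(rng.choice(pool))               # station to swap in
        c = rmse_of(cand)
        # accept improvements always, worse moves with Metropolis probability
        if c < cost or rng.random() < np.exp(-(c - cost) / t):
            current, cost = cand, c
            if c < best_cost:
                best, best_cost = cand[:], c
        t *= cooling                                    # geometric cooling
    return [stations[i] for i in best], best_cost
```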

13.
A global three-dimensional (3D) transport–dispersion model was used to simulate Krypton-85 (85Kr) background concentrations at five sampling locations along the US east coast during 1982–1983. The samplers were established to monitor the 85Kr plume downwind of the Savannah River Plant (SRP), a nuclear fuel reprocessing facility, and were located 300–1000 km downwind of the SRP. In the original analyses of the measurements, a constant background concentration, representing an upper limit and different for each sampling station, was subtracted from the measurements to obtain the part of the measurement representing the SRP plume. The 3D global model, which includes all major 85Kr sources worldwide, was able to reproduce the day-to-day background concentration variations at the sampling locations with correlation coefficients of 0.36–0.46. These 3D model background predictions, which exclude the nearby SRP source, were then subtracted from the measured concentrations at each sampler, the result representing the portion of the measurement that can be attributed to emissions from the SRP. The revised plume estimates were a factor of 1.3–2.4 higher than those from the old method using a constant background subtraction. The greatest differences in the SRP plume estimates occurred at the most distant sampling stations.

14.
This paper describes the further development and application of the Edinburgh–Lancaster Model for Ozone (ELMO). We replace straight-line back-trajectories with trajectories and associated meteorology supplied by the US National Oceanic and Atmospheric Administration Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) service to allow more realistic modelling of specific UK ozone episodes; we call this version ELMO-2. Model performance is rigorously tested against observed ozone concentrations for two episodes recorded across 14 rural UK monitoring stations during the spring and summer of 1995. For both episodes, the afternoon concentrations (usually coinciding with the daily maxima) are captured well by the model and the diurnal ozone cycle is reproduced, although the amplitude of the concentrations is generally smaller than observed. The summer episode is investigated further through indicator species analysis and source attribution, and is found to be mainly VOC-limited. European emissions account for the majority of the ozone production. We demonstrate how improved modelling leads to a better understanding of regional and local ozone production across the UK under episodic conditions.

15.
The ETEX data set opens new possibilities for developing data assimilation procedures in the area of long-range transport. This paper illustrates these possibilities using a variational approach in which the source term for ETEX-I is reconstructed. The MATCH model (Robertson et al., 1996) is the basis for this attempt. The timing of the derived emission rates is in accordance with the time period of the ETEX-I release, and a cross-validation with observations beyond the selected assimilation period shows that the derived source term holds for the entire ETEX-I experiment. A poor man's variational approach was shown to perform nearly as well as a full variational data assimilation. The issue of quality control has not been considered here but will be an important part of future work.
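Stripped of the variational machinery, reconstructing time-resolved emission rates amounts to a linear inverse problem built from unit-emission transport runs. The sketch below solves such a problem with non-negative least squares as a simple stand-in for the MATCH-based system described above; the source-receptor matrix layout and the use of SciPy are assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def reconstruct_source(source_receptor, obs):
    """Non-negative least-squares emission-rate reconstruction.

    source_receptor : (n_obs, n_times) matrix; column j holds the modelled
                      concentrations at all observation points per unit emission
                      during release interval j (from transport-model runs)
    obs             : (n_obs,) observed concentrations

    Returns the non-negative emission rate for each release interval.
    """
    rates, _residual = nnls(np.asarray(source_receptor, float),
                            np.asarray(obs, float))
    return rates
```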

16.
Source term estimation algorithms compute unknown atmospheric transport and dispersion modeling variables from concentration observations made by sensors in the field. Insufficient spatial and temporal resolution in the meteorological data, as well as inherent uncertainty in the wind field data, make source term estimation and the prediction of subsequent transport and dispersion extremely difficult. This work addresses the question: how many sensors are necessary in order to successfully estimate the source term and the meteorological variables required for atmospheric transport and dispersion modeling? The source term estimation system presented here uses a robust optimization technique – a genetic algorithm (GA) – to find the combination of source location, source height, source strength, surface wind direction, surface wind speed, and time of release that produces a concentration field that best matches the sensor observations. The approach is validated using the Gaussian puff as the dispersion model in identical-twin numerical experiments. The limits of the system are tested by incorporating additive and multiplicative noise into the synthetic data. The minimum requirements for data quantity and quality are determined by an extensive grid sensitivity analysis. Finally, a metric is developed for quantifying the minimum number of sensors necessary to accurately estimate the source term and to obtain the relevant wind information.
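The estimation problem pairs a cheap forward model with a misfit function that the genetic algorithm minimises over the unknown source and wind parameters. The sketch below shows a single ground-level Gaussian puff and an RMSE misfit; the dispersion-coefficient form, angle convention and parameter ordering are illustrative assumptions, and the GA driver itself is omitted.

```python
import numpy as np

def gaussian_puff_conc(q, x0, y0, h, wd, ws, t_rel, sensors, t,
                       sigma_fn=lambda d: 0.08 * d + 1.0):
    """Ground-level concentration of a single Gaussian puff at time t.

    q        : released mass (arbitrary units)
    x0, y0, h: source location and height (m)
    wd, ws   : direction the puff travels TOWARDS (rad) and wind speed (m/s)
    t_rel    : release time (s)
    sensors  : (n, 2) array of sensor x, y positions (m)
    sigma_fn : assumed dispersion coefficient as a function of travel distance
    """
    sensors = np.asarray(sensors, float)
    dt = t - t_rel
    if dt <= 0:
        return np.zeros(len(sensors))
    d = ws * dt                                         # travel distance
    cx, cy = x0 + d * np.cos(wd), y0 + d * np.sin(wd)   # puff centre
    s = sigma_fn(d)                                     # one sigma for all axes
    r2 = (sensors[:, 0] - cx) ** 2 + (sensors[:, 1] - cy) ** 2
    vert = 2.0 * np.exp(-h ** 2 / (2.0 * s ** 2))       # ground reflection term
    return q / ((2.0 * np.pi) ** 1.5 * s ** 3) * np.exp(-r2 / (2.0 * s ** 2)) * vert

def misfit(params, sensors, t_obs, c_obs):
    """RMSE between puff prediction and sensor readings; a GA would minimise
    this over params = (q, x0, y0, h, wd, ws, t_rel)."""
    c_mod = gaussian_puff_conc(*params, sensors=sensors, t=t_obs)
    return float(np.sqrt(np.mean((c_mod - np.asarray(c_obs, float)) ** 2)))
```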

17.
This paper describes an investigation into the behaviour of smoke plumes from pool fires, and the subsequent development of empirical models to predict plume rise and dispersion from such a combustion source. Synchronous video records of plumes were taken from a series of small-scale (0.06–0.25 m2) outdoor methanol/toluene pool fire experiments and used to produce sets of images from which plume dimensions could be derived. Three models were used as a basis for the multiple regression analysis of the data set, in order to produce new equations with improved predictive power. Actual plume observations from a large (20.7 m × 14.2 m) aviation fuel pool fire were also used to test the predictions. The two theoretically based models were found to give a better representation of plume rise and dispersion than the empirical model based on measurements of small-scale fires. It is concluded that theoretical models tested on small-scale fires (heat output ≈ 70 kW) can be used to predict plume behaviour from much larger combustion sources (heat output ≈ 70 MW) under near-neutral atmospheric conditions.
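The abstract does not name its theoretically based models, but the Briggs "2/3 law" for buoyant plume rise in near-neutral conditions is a standard formula of that kind and shows how heat output enters the prediction; treat the sketch below as an assumed example rather than the models actually tested.

```python
import numpy as np

def briggs_buoyant_rise(q_h, x, u, rho=1.2, cp=1004.0, t_amb=288.0, g=9.81):
    """Briggs transitional plume rise for a buoyant source in near-neutral air.

    q_h : sensible heat output of the fire or stack (W)
    x   : downwind distance (m)
    u   : mean wind speed (m/s)
    Returns plume rise above the source (m): dh = 1.6 * F**(1/3) * x**(2/3) / u.
    """
    f_b = g * q_h / (np.pi * rho * cp * t_amb)   # buoyancy flux (m^4 s^-3)
    return 1.6 * f_b ** (1.0 / 3.0) * x ** (2.0 / 3.0) / u

# e.g. a ~70 kW pool fire vs a ~70 MW pool fire at x = 100 m in a 3 m/s wind
print(briggs_buoyant_rise(7e4, 100.0, 3.0), briggs_buoyant_rise(7e7, 100.0, 3.0))
```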

18.
Fine particulate matter (PM) is relevant to human health, and its components are associated with climate effects. The performance of chemistry transport models for PM, its components and its precursor gases is relatively poor. The use of these models to assess the state of the atmosphere can be strengthened by data assimilation. This study focuses on the simultaneous assimilation of sulphate and its precursor gas sulphur dioxide into the regional chemistry transport model LOTOS–EUROS using an ensemble Kalman filter. The process of going from a single-component setup for SO2 or SO4 to an experiment in which both components are assimilated simultaneously is illustrated. In these experiments, either the emissions alone or a combination of the emissions and the conversion rates between SO2 and SO4 were considered uncertain. In general, the use of sequential data assimilation for estimating the sulphur dioxide and sulphate distribution over Europe is shown to be beneficial. However, the single-component experiments gave contradictory results for the direction in which the emissions are adjusted by the filter, showing the limitations of such applications. The estimates of the pollutant concentrations in a multi-component assimilation were found to be more realistic. We discuss the behavior of the assimilation system for this application; the definition of the model uncertainty is shown to be a critical parameter. The increased complexity associated with the simultaneous assimilation of strongly related species requires a very careful specification of the experiment, which will be the main challenge in future data assimilation applications.

19.
A model to simulate the transport of suspended particulate matter by the Rhone River plume has been developed. The model solves the 3D hydrodynamic equations, including baroclinic terms and a one-equation turbulence model, and the suspended matter equations, including advection/diffusion of particles, settling and deposition. Four particle classes are considered simultaneously, according to observations in the Rhone. Computed currents, salinity and particle distributions are, in general, in good agreement with observations or previous calculations. The model also provides sedimentation rates and the distribution of the different particle classes over the sea bed. It has been found that the high sedimentation rates close to the river mouth are due to coarse particles that sink rapidly. Computed sedimentation rates are also similar to those derived from observations. The model has been applied to simulate the transport of radionuclides by the plume, since suspended matter is their main transport vector. The radionuclide transport model, previously described and validated, includes exchanges of radionuclides between water, suspended matter and bottom sediment, described in terms of kinetic rates. A new feature is the explicit inclusion of the dependence of the kinetic rates on salinity. The model has been applied to 137Cs and 239,240Pu. Results are, in general, in good agreement with observations.

20.
A simplified hybrid statistical–deterministic chemistry-transport model is used in real time for the prediction of ozone in the Paris area during summer 1999. We present here a statistical validation of this experiment, distinguishing forecasts in the urban area from forecasts in the pollution plume downwind of the city. The validation of model forecasts, up to 3 days ahead, is performed against ground-based observations within and up to 50 km outside of Paris. In the urban area, ozone levels are fairly well forecast, with correlation coefficients between forecasts and observations ranging between 0.7 and 0.8 and root mean square errors in the range 15–20 μg m−3 at short lead times. While the bias of the urban forecast is very low, the largest peaks are somewhat underestimated. The ozone plume amplitude is generally well reproduced, even at long lead times (root mean square errors of about 20–30 μg m−3), while the direction of the plume is only captured at short lead times (about 70% of the time). The model has difficulty forecasting the direction of the plume under stagnant weather conditions. We estimate the model's ability to forecast concentrations above 180 μg m−3, which are of practical relevance to air quality managers. It is found that about 60% of these events are well forecast, even at long lead times, while the exact monitoring station where the exceedance is observed can only be forecast at short lead times. Finally, we find that about half of the forecast error is due to error in the estimation of the boundary conditions, which are forecast here by a simple linear regression model.
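The "fraction of events well forecast" for the 180 μg m−3 threshold is typically computed from a contingency table of forecast versus observed exceedances. A minimal sketch is given below, with the caveat that the exact verification rules used in the study (spatial tolerance, lead-time handling) are not reproduced and the names are illustrative.

```python
import numpy as np

def exceedance_scores(forecast, observed, threshold=180.0):
    """Contingency-table scores for threshold exceedances (e.g. 180 ug/m3 ozone).

    forecast, observed : daily peak concentrations per station and day (same shape)
    Returns probability of detection and false-alarm ratio.
    """
    f = np.asarray(forecast, float) >= threshold
    o = np.asarray(observed, float) >= threshold
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    pod = hits / max(hits + misses, 1)              # fraction of observed events forecast
    far = false_alarms / max(hits + false_alarms, 1)
    return pod, far
```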
