Similar Literature
20 similar documents found
1.
Integral transform solutions for atmospheric pollutant dispersion
A transient two-dimensional advection–diffusion model describing the turbulent dispersion of pollutants in the atmosphere has been solved via the Generalized Integral Transform Technique (GITT) using two different schemes. The first approach performs numerical integration of the transformed system with available routines for initial value problems with automatic error control. Despite the time-consuming character of such a scheme, its flexibility allows the handling of problems involving time-dependent meteorological parameters such as wind speed and eddy diffusivities. The second approach works fully analytically and is thus intrinsically more robust and economical, although it is not directly applicable to time-dependent parameters. For the test problem used in this work, both methods agree very well with each other, as well as with a known analytical solution for a simpler formulation used as a benchmark. The impact of the longitudinal diffusivity on the stiffness of the ordinary differential equation (ODE) system arising from the integral transformation has been assessed through the processing time required to solve it with the numerical approach. The observed CPU times show that the analytical approach is clearly preferable unless the problem involves time-dependent parameters.
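A minimal sketch of the two GITT solution schemes, reduced to a 1-D diffusion problem so the transformed ODE system decouples (the paper's model is 2-D advection–diffusion; all values here are assumed for illustration): scheme 1 integrates the transformed system numerically with a stiff solver and error control, scheme 2 solves it analytically.

```python
# Sketch only: 1-D diffusion via eigenfunction (sine) expansion.
import numpy as np
from scipy.integrate import solve_ivp

K = 5.0                # eddy diffusivity (assumed constant here)
L = 1000.0             # domain height (assumed)
N = 20                 # number of retained eigenfunctions
lam = np.array([n * np.pi / L for n in range(1, N + 1)])

# Transformed initial condition: project an assumed Gaussian puff
# onto the sine eigenfunctions of the diffusion operator.
z = np.linspace(0.0, L, 2000)
dz = z[1] - z[0]
c0 = np.exp(-0.5 * ((z - 100.0) / 20.0) ** 2)
a0 = np.array([2.0 / L * np.sum(c0 * np.sin(l * z)) * dz for l in lam])

# Scheme 1: numerical integration of the transformed ODE system with
# a stiff solver; the rhs could be generalized to time-dependent K(t).
rhs = lambda t, a: -K * lam**2 * a
num = solve_ivp(rhs, (0.0, 600.0), a0, method="BDF", rtol=1e-8)

# Scheme 2: fully analytical solution of the decoupled system,
# valid only while K is time-independent.
ana = a0 * np.exp(-K * lam**2 * 600.0)

print("max |numerical - analytical|:", np.abs(num.y[:, -1] - ana).max())
```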

2.
Water quality can be evaluated using biomarkers such as the tissue enzymatic activities of endemic species. High-frequency measurement of bivalve mollusc activity (e.g., valvometry) over long periods is another way to record animal behavior and to evaluate perturbations of water quality in real time. Since pollution affects the activity of oysters, we consider valve opening and closing velocities for monitoring water quality. We propose to model the huge volume of velocity data collected through valvometry using a new nonparametric extreme-value statistical model. The objective is to estimate the tail probabilities and the extreme quantiles of the distribution of valve closing velocity. The tail of the distribution function of valve closing velocity is modeled by a Pareto distribution with a parameter θt,τ that depends on the time t of the experiment, beyond a threshold τ. Our modeling approach reveals the dependence between the specific activity of two enzymatic biomarkers (glutathione-S-transferase and acetylcholinesterase) and the continuous recording of oyster valve velocity, demonstrating the suitability of this tool for water quality assessment. Thus, valvometry allows real-time in situ analysis of bivalve behavior and appears to be an effective early-warning tool in ecological risk assessment and marine environment monitoring.
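A minimal sketch of the peaks-over-threshold idea behind the model, on synthetic velocity data (the paper's model additionally lets the Pareto parameter vary with experiment time t, which is not reproduced here):

```python
# Sketch only: fit a generalized Pareto tail above a threshold and
# estimate a tail probability and an extreme quantile.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
velocity = rng.lognormal(mean=0.0, sigma=0.8, size=50_000)  # assumed data

tau = np.quantile(velocity, 0.95)          # threshold
exceed = velocity[velocity > tau] - tau    # exceedances over tau
p_tau = (velocity > tau).mean()            # empirical P(V > tau)

# Fit the generalized Pareto to the exceedances (location fixed at 0).
xi, _, sigma = genpareto.fit(exceed, floc=0.0)

# Tail probability P(V > v) and the 1-in-10,000 closing velocity.
v = 2.0 * tau
p_v = p_tau * genpareto.sf(v - tau, xi, loc=0.0, scale=sigma)
q = tau + genpareto.ppf(1.0 - 1e-4 / p_tau, xi, loc=0.0, scale=sigma)
print(f"P(V > {v:.2f}) ~ {p_v:.2e}, 99.99% quantile ~ {q:.2f}")
```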

3.
The goal of this study is to develop an emission-based indicator for the health impact of air pollution caused by traffic. This indicator must make it possible to compare different situations, for example different Urban Travel Plans or technical innovations. Our work is based on a literature survey of methods for evaluating health impacts, particularly those relating to the atmospheric pollution caused by transport. We then define a health impact indicator based on traffic emissions, named IISCEP (chronic health impact indicator of pollutant emission). Here health is understood in a restricted sense, excluding well-being. Only primary pollutants can be considered, as the inputs are emission data and an indicator must remain simple. The indicator is calculated as the sum of each pollutant's emission multiplied by a dispersion and exposure factor and a substance-specific toxicity factor that accounts for severity. Finally, two examples using the IISCEP are shown: a comparison between petrol and diesel vehicles, and the Nantes urban district in 2008 vs 2002. Even though it could still be improved, IISCEP is a straightforward indicator for gauging the chronic effects of inhaling primary pollutants. It can only be used for comparisons between different scenarios or different technologies. The quality of the emissions data and the choice of the pollutants considered are the two essential factors determining its validity and reliability.
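A minimal sketch of the stated aggregation rule. All pollutant names and factor values below are hypothetical placeholders, not the paper's calibrated factors:

```python
# Sketch only: IISCEP = sum over pollutants of
#   emission x dispersion-and-exposure factor x toxicity factor.
emissions = {"NOx": 1200.0, "PM10": 150.0, "benzene": 4.0}   # kg/year (assumed)
dispersion_exposure = {"NOx": 0.8, "PM10": 1.0, "benzene": 0.9}  # assumed
toxicity = {"NOx": 1.0, "PM10": 40.0, "benzene": 120.0}          # assumed

def iiscep(emissions, dispersion_exposure, toxicity):
    return sum(e * dispersion_exposure[p] * toxicity[p]
               for p, e in emissions.items())

# Values are only meaningful in comparisons, e.g. a 2008 scenario
# vs a 2002 scenario computed with the same factors.
print("IISCEP:", iiscep(emissions, dispersion_exposure, toxicity))
```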

4.
Personal exposure to air pollutants can be substantially higher in close proximity to an active source due to non-instantaneous mixing of emissions. The research presented in this paper quantifies this proximity effect for a non-buoyant source in two naturally ventilated homes in Northern California (CA), assessing its spatial and temporal variation and the influence of factors such as ventilation rate on its magnitude. To quantify how proximity to residential sources of indoor air pollutants affects human exposure, we performed 16 separate monitoring experiments in the living rooms of two detached single-family homes. CO (as a tracer gas) was released from a point source in the center of the room at a controlled emission rate for 5-12 h per experiment, while an array of 30-37 real-time monitors simultaneously measured CO concentrations at 15 s time resolution at radial distances of 0.25-5 m under a range of ventilation conditions. Concentrations measured in close proximity (within 1 m) to the source were highly variable, with 5 min averages that typically varied by more than 100-fold. This variability was due to short-duration (<1 min) pollutant concentration peaks ("microplumes") that were frequently recorded close to the source. We decomposed the random microplume component from the total concentrations by subtracting predicted concentrations that assumed uniform, instantaneous mixing within the room and found that these microplumes can be modeled using a 3-parameter lognormal distribution. Average concentrations measured within 0.25 m of the source were 6-20 times as high as the predicted well-mixed concentrations.
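A minimal sketch of the decomposition step on synthetic data: subtract the well-mixed single-zone prediction from the measured trace and fit a 3-parameter (shifted) lognormal to the positive residuals. The source strength E, airflow Q, and room volume V below are assumed, not the study's values:

```python
# Sketch only: microplume residual = measured - well-mixed prediction.
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(1)
t = np.arange(0, 4 * 3600, 15.0)     # 15 s resolution, 4 h release
E, Q, V = 50.0, 100.0, 40.0          # mg/h source, m3/h airflow, m3 room

# Well-mixed mass balance: C(t) = (E/Q) * (1 - exp(-Q t / V)).
well_mixed = (E / Q) * (1.0 - np.exp(-Q * (t / 3600.0) / V))

# Simulated "measured" trace: baseline plus intermittent short spikes.
spikes = rng.lognormal(0.0, 1.2, size=t.size) * (rng.random(t.size) < 0.05)
measured = well_mixed + spikes

residual = measured - well_mixed
micro = residual[residual > 0]

# 3-parameter lognormal: scipy's lognorm.fit returns (shape, loc, scale).
shape, loc, scale = lognorm.fit(micro)
print(f"shape={shape:.2f}, loc={loc:.3f}, scale={scale:.3f}")
```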

5.
This paper presents an efficient methodology for developing pollutant discharge permit trading in river systems that considers the conflicting interests of the decision-makers and stakeholders involved. In this methodology, a trade-off curve between objectives is developed using a powerful, recently developed multi-objective genetic algorithm known as the Nondominated Sorting Genetic Algorithm-II (NSGA-II). The best non-dominated solution on the trade-off curve is selected using the Young conflict resolution theory, which considers the utility functions of the decision-makers and stakeholders of the system. These utility functions relate to the total treatment cost and a fuzzy risk of violating the water quality standards. The fuzzy risk is evaluated using Monte Carlo analysis. Finally, an optimization model provides the discharge permit trading policies. The practical utility of the proposed methodology in decision-making is illustrated through a realistic example of the Zarjub River in northern Iran.
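A minimal sketch of the fuzzy-risk evaluation step only: Monte Carlo sampling of in-stream concentration for a candidate treatment level, graded by an assumed fuzzy membership function for "violation" of the standard. The distributions, membership shape, and numbers are illustrative placeholders; the paper couples this with NSGA-II and Young's conflict resolution theory, which are not reproduced here:

```python
# Sketch only: Monte Carlo estimate of a fuzzy violation risk.
import numpy as np

rng = np.random.default_rng(2)

def fuzzy_violation(c, std=10.0, width=2.0):
    # Membership: 0 below the standard, 1 well above, linear between.
    return np.clip((c - std) / width, 0.0, 1.0)

def fuzzy_risk(treatment_efficiency, n=100_000):
    load = rng.lognormal(mean=3.2, sigma=0.4, size=n)   # raw load, mg/L
    dilution = rng.uniform(0.05, 0.15, size=n)          # river dilution
    conc = load * (1.0 - treatment_efficiency) * dilution
    return fuzzy_violation(conc).mean()

for eff in (0.5, 0.7, 0.9):   # candidate treatment levels on the front
    print(f"efficiency {eff:.0%}: fuzzy risk ~ {fuzzy_risk(eff):.3f}")
```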

6.
7.
People working in the nickel refining industry are known to have a higher concentration of nickel in lung tissue than the general population. To be able to evaluate potential nickel exposure from other sources, e.g., welding, it is important to have sufficient data on what is normal for a local population. Several local factors, such as the nickel content of air and soil, can have a significant impact on this so-called normal value. As almost all surgical equipment contains nickel, the sampling process can itself be a source of contamination. The scope of this work was to investigate whether there was any measurable contamination from the sampling instruments routinely used in hospitals, and whether the presence of a nickel refinery had any effect on the nickel content in the lungs of the general population. Autopsy lung tissue samples were collected in situ from 50 people who had lived in the county of Vest-Agder in Norway. Two samples were collected from each person: one with a regular scalpel (Swann-Morton) and forceps, and one with a titanium knife and plastic forceps. None of the persons had any known connection to the nickel refinery. The samples were collected at random and no special attention was given to age, sex or place of residence. The autopsies were performed according to Norwegian law and in agreement with the next of kin. The arithmetic mean nickel value ± s was 0.64 ± 0.56 µg/g and 0.29 ± 0.20 µg/g dry weight, respectively, for samples collected with a regular scalpel and a titanium knife (P < 0.0001). For people who had lived within 8 km of the refinery at the time of death, the nickel content was 0.41 ± 0.19 µg/g, and for those who had lived between 8 and 70 km from the refinery it was 0.18 ± 0.13 µg/g (P < 0.015). No statistical difference was established between results for males and females. Previous investigations have shown that the nickel content in lung tissue varies within the so-called normal population. This work has shown that factors such as sampling equipment and place of residence have an impact on the results. It thus demonstrates that reliable background values can presumably only be obtained by collecting samples from individuals not exposed to known environmental nickel sources and by using nickel-free instruments in the sampling process.
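A minimal sketch of one way to obtain P-values for this design: each person contributes one scalpel and one titanium-knife sample, so a paired test on log-transformed concentrations is a natural choice (the abstract does not state which test was used). The data below are synthetic, tuned only to roughly match the reported means:

```python
# Sketch only: paired comparison of scalpel vs titanium-knife samples.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(3)
n = 50
titanium = rng.lognormal(mean=np.log(0.25), sigma=0.6, size=n)
scalpel = titanium + rng.lognormal(mean=np.log(0.3), sigma=0.7, size=n)

t_stat, p = ttest_rel(np.log(scalpel), np.log(titanium))
print(f"mean scalpel {scalpel.mean():.2f}, titanium {titanium.mean():.2f} ug/g; "
      f"paired t on logs: t={t_stat:.2f}, P={p:.2e}")
```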

8.
9.
Downscaling procedures as a tool for integration of multiple air issues
In assessing the risks associated with climate change, downscaling has proven useful in linking surface changes, at scales relevant to decision making, to large-scale atmospheric circulation derived from GCM output. Stochastic downscaling is related to synoptic climatology, weather-typing approaches (classifying circulation patterns) such as the Lamb Weather Types developed for the United Kingdom (UK), the European Grosswetterlagen (Bardossy and Plate, 1992) and the Perfect Prognosis (Perfect Prog) method from numerical weather prediction. The large-scale atmospheric circulation is linked with site-specific observations of atmospheric variables, such as precipitation, wind speed or temperature, within a specified region. Classifying each day by circulation pattern is achieved by clustering algorithms, fuzzy rule bases, neural nets or decision trees. The linkages are extended to GCM output to account for climate change. Stochastic models are developed from the probability distributions for extreme events. Objective analysis can be used to interpolate values of these models to other locations. The concepts and some applications are reviewed to provide a basis for extending the downscaling approach to assessing the integrated risk of the six air issues: climate change, UV-B radiation, acid rain, transport of hazardous air pollutants, smog and suspended particulates.
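A minimal sketch of the day-classification step using one of the mentioned options (clustering): k-means on gridded daily circulation fields, with a site variable then conditioned on the resulting weather type. The fields and precipitation series are synthetic stand-ins for reanalysis/GCM input:

```python
# Sketch only: cluster daily sea-level-pressure anomaly fields into
# weather types, then summarize a site variable per type.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
n_days, grid = 3650, 10 * 12               # 10 years, 10x12 SLP grid
slp = rng.normal(size=(n_days, grid))      # assumed daily anomaly fields

kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(slp)
types = kmeans.labels_                     # one weather type per day

# Site-specific precipitation conditioned on weather type (synthetic).
precip = rng.gamma(shape=2.0, scale=1.5, size=n_days)
for k in range(8):
    print(f"type {k}: mean precip {precip[types == k].mean():.2f} mm")
```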

10.
Rational pollution assessment, and the assessment of the effectiveness of natural attenuation based upon estimating the degree of contamination, critically depend on sound normalization to take account of heterogeneous sedimentary environments. By normalizing the measured contaminant concentration patterns for the sediment characteristics, the inherent variability can be reduced, allowing a more meaningful assessment of both spatial distributions and temporal trends. A brief overview of, and guidance on, the methodology available for choosing an appropriate site-specific normalization approach is presented. This is followed by general recommendations on the choice of normalizer and the necessary geochemical and statistical quality assurance methods, supported by the results of recent international intercomparison exercises within the QUASH (Quality Assurance of Sample Handling) programme, as well as discussions within the International Council for the Exploration of the Sea (ICES) working groups. The most important of these recommendations is the use of a two-tiered normalization approach comprising wet sieving (<63 µm) followed by an additional geochemical co-factor normalization.
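A minimal sketch of the second normalization tier: after sieving to the <63 µm fraction, regress the contaminant on a conservative geochemical co-factor and use the residuals as grain-size/mineralogy-corrected signals. Aluminium is assumed as the co-factor here, and all concentrations are synthetic:

```python
# Sketch only: geochemical co-factor normalization by regression.
import numpy as np

rng = np.random.default_rng(5)
al = rng.uniform(2.0, 9.0, size=60)              # % Al, <63 um fraction
zn = 15.0 * al + rng.normal(0.0, 8.0, size=60)   # synthetic Zn, mg/kg
zn[40:] += 35.0                                   # contaminated subset

# Fit the baseline Zn-Al relation on the reference sites only.
slope, intercept = np.polyfit(al[:40], zn[:40], 1)
residual = zn - (slope * al + intercept)          # normalized excess Zn

print(f"baseline: Zn = {slope:.1f}*Al + {intercept:.1f}")
print("mean excess, reference sites:", residual[:40].mean().round(1))
print("mean excess, suspect sites:  ", residual[40:].mean().round(1))
```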

11.
Weighting is one of the steps in life cycle impact assessment that integrates various characterized environmental impacts into a single index. Weighting factors should be based on society's preferences; however, most previous studies consider only the opinions of a limited group of people. This research therefore proposes a new weighting method that determines the weighting factors of environmental impact categories from public opinion on environmental impacts, using Internet search volumes for relevant terms. To validate the new method, the weighting factors for six environmental impacts calculated by it were compared with existing weighting factors. The resulting Pearson's correlation coefficients between the new and existing weighting factors ranged from 0.8743 to 0.9889, indicating that the new method produces reasonable weighting factors. It also requires less time and lower cost than existing methods, and likewise meets the main requirements of weighting methods such as simplicity, transparency, and reproducibility. The new weighting method is expected to be a good alternative for determining weighting factors.
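A minimal sketch of the proposed idea: normalize relative search volumes for impact-category terms into weighting factors, then check agreement with an existing weight set via Pearson's r. The category names, volumes, and reference weights below are made-up placeholders:

```python
# Sketch only: search-volume-based weighting factors.
import numpy as np
from scipy.stats import pearsonr

categories = ["climate change", "ozone depletion", "acidification",
              "eutrophication", "smog", "resource depletion"]
search_volume = np.array([88.0, 12.0, 18.0, 9.0, 31.0, 22.0])  # assumed
existing = np.array([0.46, 0.07, 0.10, 0.05, 0.18, 0.14])      # assumed

weights = search_volume / search_volume.sum()   # normalize to sum to 1
r, p = pearsonr(weights, existing)
for c, w in zip(categories, weights):
    print(f"{c:20s} {w:.3f}")
print(f"Pearson r = {r:.4f} (p = {p:.3f})")
```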

12.
This report deals with the implementation of Article 10 of the Directive on air quality limit values and guide values for sulphur dioxide and suspended particulates (80/779/EEC). For this purpose the Commission, in cooperation with the Member States, has prepared a ‘Common Measurement Programme’ which basically provides for:
  1. examination and improvement of the reference measurement methods,
  2. examination of the comparability of the measuring procedures and equipment used for monitoring purposes in the Member States,
  3. the provision of guidelines and the carrying out of measurements to determine the corresponding stringency of the limit values laid down in Annexes IV and I of the Directive.

13.
The aim of this investigation was to evaluate a simplified version of an HPLC method for the determination of PAH in suspended particles collected from small air volumes indoors, outdoors or in personal exposure measurements. The simplification consisted of: (a) collecting PAH with low-volume samplers; (b) extracting PAH ultrasonically; and (c) omitting the separation of interfering substances before analysis by HPLC. The results show that these modifications afford a considerable reduction in analysis time and solvent consumption without affecting the quality of measurement.

14.
The use of antineoplastic drugs in health care is steadily increasing. Health care workers can be occupationally exposed to antineoplastic drugs classified as carcinogenic or teratogenic. Monitoring of surface contamination is a common way to assess occupational exposure to antineoplastic drugs, since wipe sampling serves as a surrogate measure of dermal exposure. Since no occupational limits for antineoplastic drugs in work environments exist, 'hygienic guidance values' (HGVs) should be used instead. HGVs are practicable, achievable levels, not health based, and can be calculated from exposure data from representative workplaces with good occupational hygiene practices. So far, guidance values for surface monitoring of antineoplastic drugs exist only for pharmacies where antineoplastic drugs are prepared. The objective was to propose HGVs for surface monitoring of cyclophosphamide (CP) and ifosfamide (IF) in Swedish hospitals where antineoplastic drugs are administered to patients. In total, 17 workplaces located at six hospitals in Sweden were surveyed by wipe sampling. Wipe samples were collected, processed and then analyzed by liquid chromatography tandem mass spectrometry. Surface contamination with CP and IF was found on 80% and 73% of the sampled surfaces, respectively, indicating that health care workers can potentially be exposed to CP and IF via the skin. The median surface load of CP was 3.3 pg/cm2 (range <0.05-10,800 pg/cm2); the corresponding value for IF was 4.2 pg/cm2 (range <0.13-95,000 pg/cm2). The highest surface loads were found on the floors. The proposed HGVs were set at the 90th percentile values and are applicable to hospital workplaces where patients are treated with CP or IF. Surface monitoring combined with HGVs is a useful tool with which health care workers can regularly benchmark their own surface loads, which could control and reduce occupational exposure to CP and IF in hospital workplaces and thus increase the occupational safety of health care workers.
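A minimal sketch of deriving an HGV as the 90th percentile of wipe-sample surface loads. The data are synthetic stand-ins roughly matching the reported CP median (~3.3 pg/cm2), and the simple below-detection-limit substitution is an assumption:

```python
# Sketch only: HGV = 90th percentile of surface loads.
import numpy as np

rng = np.random.default_rng(6)
# Lognormal surface loads in pg/cm2, censored at a detection limit.
loads = rng.lognormal(mean=np.log(3.3), sigma=1.5, size=120)
dl = 0.05
loads = np.where(loads < dl, dl / 2.0, loads)   # simple <DL substitution

hgv = np.percentile(loads, 90)
print(f"median {np.median(loads):.1f} pg/cm2, "
      f"proposed HGV (P90) {hgv:.1f} pg/cm2")
```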

15.
A new algorithm is proposed for retrieving the aerosol fine mode fraction (FMF) from satellite aerosol optical depth (AOD) and the Ångström exponent. The algorithm adopts a more accurate aerosol model and constructs a two-dimensional lookup table with AOD and FMF as variables. It was applied to FMF retrieval over Beijing and Jaipur, and the results were compared with the FMF from AERONET and MODIS. The correlation coefficient between the FMF retrieved by the new algorithm and the AERONET FMF is 0.656, versus 0.436 between MODIS C6 and AERONET. Taking the AERONET FMF as the reference, the root-mean-square error of the new algorithm is 0.156, lower than that of MODIS C6 (0.318); 90% of the new algorithm's retrievals fall within an error envelope of ±0.4, compared with only 57.4% for the MODIS C6 FMF.
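A minimal sketch of a two-dimensional (AOD, FMF) lookup-table retrieval: a forward model tabulates the Ångström exponent at each (AOD, FMF) grid node, and the retrieval finds the FMF whose modeled exponent matches the observation at the observed AOD. The forward model below is a crude placeholder, not the paper's aerosol model:

```python
# Sketch only: invert FMF from a 2-D lookup table.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

aod_grid = np.linspace(0.05, 2.0, 40)
fmf_grid = np.linspace(0.0, 1.0, 50)
A, F = np.meshgrid(aod_grid, fmf_grid, indexing="ij")

# Placeholder forward model: fine particles raise the exponent,
# with a weak AOD dependence.
alpha_lut = 0.2 + 1.6 * F - 0.1 * np.log(A)

lut = RegularGridInterpolator((aod_grid, fmf_grid), alpha_lut)

def retrieve_fmf(aod_obs, alpha_obs):
    # 1-D search along the FMF axis at the observed AOD.
    pts = np.column_stack([np.full_like(fmf_grid, aod_obs), fmf_grid])
    return fmf_grid[np.argmin(np.abs(lut(pts) - alpha_obs))]

print("retrieved FMF:", retrieve_fmf(aod_obs=0.6, alpha_obs=1.2))
```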

16.
One of the difficulties in accurately characterizing unknown groundwater pollution sources is the uncertainty regarding the number and location of such sources. Only when the number of source locations is estimated with some degree of certainty can the characterization of the sources in terms of location, magnitude, and activity duration be meaningful. A fairly good knowledge of source locations can substantially decrease the degree of nonuniqueness in the set of possible aquifer responses to the imposed geochemical stresses. A methodology is developed that uses a sequence of dedicated monitoring network designs and implementations to screen and identify the possible source locations. The proposed methodology combines spatial interpolation of concentration measurements with simulated annealing as the optimization algorithm for optimal design of the monitoring network. These monitoring networks are designed and implemented sequentially, based on iterative pollutant concentration measurement information from the previously designed networks. The optimal monitoring network design uses concentration gradient information from the monitoring network of the previous iteration to define the objective function. This feedback-based iterative methodology is shown to be effective in estimating the source locations when no such information is initially available. The methodology should be very useful as a screening model for subsequent accurate estimation of the unknown pollution sources in terms of location, magnitude, and activity duration.
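A minimal sketch of a single design iteration: choose k monitoring locations from candidate grid nodes by simulated annealing, scoring a design by the total concentration-gradient magnitude it covers (a stand-in for the paper's gradient-based objective). The field, cooling schedule, and constants are all assumed:

```python
# Sketch only: simulated annealing for monitoring network design.
import numpy as np

rng = np.random.default_rng(7)
nx, ny, k = 20, 20, 8
conc = rng.gamma(2.0, 1.0, size=(nx, ny))      # interpolated plume field
gy, gx = np.gradient(conc)
score_map = np.hypot(gx, gy).ravel()           # gradient magnitude

def score(sites):
    return score_map[list(sites)].sum()

current = set(rng.choice(nx * ny, size=k, replace=False))
best, best_s, T = set(current), score(current), 1.0
for step in range(5000):
    # Propose swapping one selected node for an unselected one.
    out = rng.choice(list(current))
    inn = rng.integers(nx * ny)
    if inn in current:
        continue
    cand = (current - {out}) | {inn}
    d = score(cand) - score(current)
    if d > 0 or rng.random() < np.exp(d / T):  # Metropolis acceptance
        current = cand
        if score(current) > best_s:
            best, best_s = set(current), score(current)
    T *= 0.999                                 # geometric cooling
print("best design score:", round(best_s, 2), "sites:", sorted(best))
```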

17.
An interesting alternative to wall-to-wall mapping approaches for the estimation of landscape metrics is to use sampling. Sample-based approaches are cost-efficient, and measurement errors can be reduced considerably. Previous efforts at sample-based estimation of landscape metrics have mainly focused on data collection methods, but in this study we consider two estimation procedures. In the first, landscape metrics of interest are calculated separately for each sampled image and the image values are then averaged to obtain an estimate for the entire landscape (separated procedure, SP). In the second, metric components are calculated in all sampled images and the aggregated values are then inserted into the landscape metric formulas (aggregated procedure, AP). The national land cover map (NLCM) of Sweden, reflecting the status of land cover in the year 2000, was used as population information to investigate the statistical performance of the two estimation procedures, using sampling simulation with a large number of replications. For all three landscape metrics studied, the second procedure (AP) produced a lower relative RMSE and bias than the first (SP). A smaller sample unit size (50 ha) produced larger bias but lower variance than a larger one (100 ha). The efficiency of a metric estimator is highly related to the degree of landscape fragmentation and the selected procedure. Incorporating information from all of the sampled images into a single estimate (the aggregated procedure, AP) is one way to improve the statistical performance of the estimators.
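A minimal sketch contrasting the two procedures for a simple landscape metric (edge density, edge length per unit area); the per-image edge lengths and areas are synthetic:

```python
# Sketch only: separated (SP) vs aggregated (AP) estimation.
import numpy as np

rng = np.random.default_rng(8)
edges = rng.gamma(3.0, 50.0, size=30)      # edge length per image (m)
areas = rng.uniform(50.0, 100.0, size=30)  # image area (ha)

# SP: compute the metric in each image, then average the image values.
sp = np.mean(edges / areas)

# AP: aggregate the components over all images, then apply the formula.
ap = edges.sum() / areas.sum()

print(f"SP estimate {sp:.2f} m/ha, AP estimate {ap:.2f} m/ha")
```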

18.
To date, the majority of empirical approaches used to derive sediment quality values (SQVs) have focused on metal concentrations in sediment associated with adverse effects on benthic invertebrate communities. Here, we propose the no-effect (NE) approach. This SQV derivation methodology uses metal concentrations in sediment associated with unaffected benthic communities (i.e., from reference sites and lightly contaminated no-effect sites) and accounts for local benthic invertebrate tolerance and potential chemical interactions at no-effect exposure sites. This NE approach was used to propose alternative regional SQVs for uranium operations in northern Saskatchewan. Three different sets of NE values were derived using different combinations of benthic invertebrate community effects criteria (abundance, richness, evenness, Bray–Curtis index). Additionally, reference values were derived based solely on sediment metal concentrations from reference sites. In general, NE values derived using abundance, richness, and evenness (NE1 and NE2 values) were found to be higher than the NE values derived using all four metrics (NE3 values). Derived NE values for Cr, Cu, Pb, and V did not change with the incorporation of additional effects criteria due to a lack of influence from the uranium operations on the concentrations of these metals in sediment. However, a gradient of exposure concentrations was apparent for As, Mo, Ni, Se, and U in sediment which allowed for tolerable exposure levels of these metals in sediment to be defined. The findings from this assessment have suggested a range of new, alternate metal SQVs for use at uranium operations in northern Saskatchewan.

19.
Currently, there is an urgent need to develop electro-analytical techniques that can detect and monitor environmental pollutants in a sensitive, specific, and selective manner. Traditional pollutant detection techniques suffer from drawbacks such as the need for calibration, sample preparation, blank determination, and skilled operators, as well as lengthy and time-consuming procedures, high cost, and the lack of a universal approach. The objective of this review article is to provide details regarding the fabrication of conducting polymer-nanoparticle (nanocomposite) based electrochemical biosensors for continuous environmental monitoring. Emphasis is placed on the principles, development, classification, and use of electrochemical biosensors comprising nanocomposites of conducting polymers (CP) and metal oxide nanoparticles for the estimation of environmental pollutants present in air, water, food, and soil.

20.
The aim of this study was to elucidate the amount of metal released at each step when using different extractants in three sequential extraction schemes for partitioning the metal contents of dust deposited in car parks. For this purpose, three different sequential extraction procedures (SEPs) were employed for metal fractionation in car park dust samples collected from the campus of Erciyes University, Kayseri, Turkey. While two of the sequential extraction procedures comprise five steps, the third, the BCR sequential extraction scheme, has three. The first two methods fractionate metals as exchangeable, bound to carbonates, bound to Mn oxides, bound to Fe oxides, and bound to organic matter, while the BCR protocol fractionates metals as acid soluble and exchangeable, reducible, and oxidisable. Determination of the metals Cd, Co, Cr, Cu, Fe, Mn, Ni, Pb and Zn was performed by flame atomic absorption spectrometry (FAAS). Comparison of the results obtained by the three methods showed that the amount of metal released at each step of the leaching procedure depended both on the type of reagents used and on the sequence in which they were applied. The most mobile elements were Cd, Pb and Zn, metals that are potentially toxic to the environment and are also known to originate from traffic. The calculated enrichment factors were substantially high for Cd and Pb (73.5-187 and 18.4-27.5, respectively) and somewhat lower for Zn (5.1-6.8). These results confirm that they are important metal pollutants in car parks. Detection limits and recoveries were in the range of 0.01-1.39 µg/mL and 68-126%, respectively, for the metals studied and the three sequential extraction procedures.
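A minimal sketch of the standard enrichment-factor calculation behind such values: EF = (C_metal/C_ref) in the sample divided by (C_metal/C_ref) in the background, with Fe as the assumed reference element and made-up concentrations (mg/kg):

```python
# Sketch only: enrichment factors relative to a reference element.
sample = {"Cd": 3.1, "Pb": 310.0, "Zn": 870.0, "Fe": 21000.0}      # assumed
background = {"Cd": 0.3, "Pb": 20.0, "Zn": 95.0, "Fe": 35000.0}    # assumed

def enrichment_factor(metal, ref="Fe"):
    return ((sample[metal] / sample[ref])
            / (background[metal] / background[ref]))

for m in ("Cd", "Pb", "Zn"):
    print(f"EF({m}) = {enrichment_factor(m):.1f}")
```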
