Probabilistic modelling using Monte Carlo simulation has been proposed as a more scientifically valid method of estimating soil contaminant exposures than the conservative deterministic methods currently used by regulatory agencies. A retrospective application of probabilistic modelling to an exposure scenario involving arsenic-contaminated residential soil near the former ASARCO smelter near Tacoma, Washington is presented. The population of interest is children, aged 2–6 years, living within one-half mile (0.8 km) of the smelter site. Models that predict urinary arsenic levels based on unintentional soil ingestion and inhalation exposure pathways are used. Distributions of exposure variables are based on site-specific data and previous exposure studies. Simulated urinary arsenic levels are compared with data from two biomonitoring studies performed during the late 1980s. Arsenic distributions produced by simulation and biomonitoring are significantly different, and likely contributors to this difference are discussed. However, the probabilistic model provides closer estimates of urinary arsenic levels than conservative deterministic models similar to those used by regulatory agencies, and provides useful information regarding parameter uncertainty. Soil ingestion rate was a driving variable in the probabilistic models, and further quantification of soil ingestion rates is warranted.
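The contrast the abstract draws between a Monte Carlo exposure distribution and a conservative deterministic point estimate can be sketched as follows. Every distribution and parameter value below is a hypothetical placeholder for illustration, not a site-specific input from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo iterations

# Hypothetical exposure-variable distributions (placeholders only)
soil_conc = rng.lognormal(mean=np.log(200), sigma=0.5, size=n)  # mg As/kg soil
ingestion = rng.lognormal(mean=np.log(50), sigma=1.0, size=n)   # mg soil/day
body_weight = rng.normal(16, 2, size=n).clip(10, 25)            # kg, ages 2-6

# Average daily dose (mg/kg-day); 1e-6 converts mg soil to kg soil
dose = soil_conc * ingestion * 1e-6 / body_weight

# Conservative deterministic estimate: upper-bound point values throughout
dose_det = 200 * 200 * 1e-6 / 16

print(f"MC median dose: {np.median(dose):.2e} mg/kg-day")
print(f"MC 95th pctile: {np.percentile(dose, 95):.2e} mg/kg-day")
print(f"Deterministic:  {dose_det:.2e} mg/kg-day")
```

Because the deterministic calculation stacks upper-bound values for every input, it typically lands well above the simulated median, which is the kind of overestimation the probabilistic approach is meant to expose.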
Objectives: The uncertainties of pedestrian mobility are important factors affecting the accuracy and robustness of an active pedestrian protection system. This study provides a means of probabilistic risk evaluation for pedestrian–vehicle collisions by accounting for the uncertainties in pedestrian motion.
Method: The pedestrian is modeled with a first-order Markov model that characterizes the stochastic properties of pedestrian mobility, based on field experiments of pedestrians crossing an uncontrolled road. Under an assumption of Gaussian distributions, the unscented transformation (UT) is employed to predict the collision risk probability using a symmetric σ-set constructed from discrete trajectory simulation. Simulation experiments were carried out with 10,000 Monte Carlo (MC) runs as the reference.
Results: The probability density distributions of time-to-collision, minimal distance, and collision probability estimated by UT coincide with the reference distributions under various vehicle–pedestrian conflict scenarios, with a maximal deviation in collision probability of 5.33% from the reference. The UT method is about 600 times faster than the MC method (10,000 runs), indicating that the proposed method has potential for online application.
Conclusions: This article presents an effective and efficient algorithm to estimate the collision probability by using a UT method to solve the nonlinear transformation of uncertainties in pedestrian motion. Simulation results show that the UT-based method achieves accurate collision probability estimation and higher computational efficiency than MC, and provides more valuable information concerning collision avoidance than deterministic methods in the design of a pedestrian collision avoidance system.
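The core idea of the unscented transformation contrasted with Monte Carlo above can be illustrated with a minimal sketch: a small symmetric set of sigma points is propagated through a nonlinear function instead of thousands of random samples. This uses the classic symmetric sigma-point formulation with a hypothetical distance-to-pedestrian function; it is not the paper's actual pedestrian model:

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate a Gaussian (mean, cov) through nonlinear f via 2n+1 sigma points."""
    n = mean.size
    # Columns of the matrix square root of (n + kappa) * cov
    S = np.linalg.cholesky((n + kappa) * cov)
    sigma = np.vstack([mean, mean + S.T, mean - S.T])  # (2n+1, n) symmetric set
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    y = np.array([f(s) for s in sigma])
    y_mean = w @ y
    y_var = w @ (y - y_mean) ** 2
    return y_mean, y_var

# Hypothetical example: distance from a vehicle at the origin to an
# uncertain pedestrian position (illustrative only)
mean = np.array([10.0, 2.0])   # m
cov = np.diag([1.0, 0.25])     # m^2
dist = lambda p: np.hypot(p[0], p[1])

ut_mean, ut_var = unscented_transform(mean, cov, dist)

# Monte Carlo reference, analogous to the paper's 10,000-run baseline
rng = np.random.default_rng(1)
mc = np.array([dist(p) for p in rng.multivariate_normal(mean, cov, 10_000)])
print(f"UT mean/var: {ut_mean:.3f} / {ut_var:.3f}")
print(f"MC mean/var: {mc.mean():.3f} / {mc.var():.3f}")
```

With only 2n+1 = 5 function evaluations versus 10,000 samples, the two estimates agree closely, which is the source of the roughly 600x speedup the abstract reports.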
The quality of science for policy depends as much on the robustness of available scientific knowledge as it does on the procedural settings and working procedures in safety agencies. Using a report on Bisphenol A as a case study, and a set of original criteria, we provide an understanding of procedural influences on the results of scientific advisory committees and of literature reviews for chemical hazard characterization. Expert elicitation revealed that three aspects are critically important for the results of the advisory activity and for the selected case study: the method used to combine different studies, the interpretation of the review results in terms of level of evidence and conclusiveness, and the choice of uncertainty factors. Our results also show how procedural settings and working procedures can promote the invisible influence of values and policy on scientific advisory activities.
This study investigates the impact of climate and land use change on the magnitude and timing of streamflow and sediment yield in a snow‐dominated mountainous watershed in Salt Lake County, Utah, using a scenario approach and the Hydrological Simulation Program-FORTRAN (HSPF) model for the 2040s (2035–2044) and 2090s (2085–2094). The climate scenarios were statistically and dynamically downscaled from global climate models. Land use and land cover (LULC) changes were estimated in two ways: from a regional planning scenario and from a deterministic model. Results indicate the mean daily streamflow in the Jordan River watershed will increase by 11.2% to 14.5% in the 2040s and by 6.8% to 15.3% in the 2090s. The respective increases in sediment load in the 2040s and 2090s are projected to be 6.7% and 39.7% in the canyons and about 7.4% to 14.2% in the Jordan valley. The historical 50th percentile timing of streamflow and sediment load is projected to shift three to four weeks earlier by mid‐century and four to eight weeks earlier by late‐century. The projected streamflow and sediment load exhibit a nonlinear relationship with each other and are highly sensitive to projected climate change. The predicted changes in streamflow and sediment yield will have implications for water supply, flood control, and stormwater management.
Stedinger, Jery R. and Veronica W. Griffis, 2011. Getting From Here to Where? Flood Frequency Analysis and Climate. Journal of the American Water Resources Association (JAWRA) 47(3):506‐513. DOI: 10.1111/j.1752‐1688.2011.00545.x Abstract: Modeling variations in flood risk due to climate change and climate variability is a challenge to our profession. Flood‐risk computations by United States (U.S.) federal agencies follow guidelines in Bulletin 17, whose latest update, 17B, was published in 1982. Efforts are underway to update that remarkable document. Additional guidance in the Bulletin as to how to address variation in flood risk over time would be welcome. Extensions of the log‐Pearson type 3 model to include changes in flood risk over time would be relatively easy mathematically. Here an example of the use of a sea surface temperature anomaly to anticipate changes in flood risk from year to year in the U.S. illustrates this opportunity. Efforts to project the trend in the Mississippi River flood series beg the question as to whether an observed trend will continue unabated, has reached its maximum, or is really nothing other than climate variability. We are challenged with the question raised by Milly and others: Is stationarity dead? Overall, we do not know the present flood risk at a site because of limited flood records. If we allow for historical climate variability and climate change, we know even less. But the issue is not whether stationarity is dead – the issue is how to use all the information available to reliably forecast flood risk in the future: "Where do we go from here?"
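The log-Pearson type 3 model referred to above can be sketched in the Bulletin 17B style: fit the mean, standard deviation, and skew of the log-transformed annual peaks, then apply a Pearson III frequency factor to recover flood quantiles. The flow record below is synthetic, generated purely for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic annual peak flows (cfs); a real analysis would use a gauged record
peaks = rng.lognormal(mean=np.log(5000), sigma=0.4, size=60)

# Method-of-moments fit on base-10 logs of the peaks
logs = np.log10(peaks)
m, s, g = logs.mean(), logs.std(ddof=1), stats.skew(logs, bias=False)

def lp3_quantile(aep):
    """Flood magnitude with annual exceedance probability `aep`."""
    # Standardized Pearson III deviate (the frequency factor K)
    K = stats.pearson3.ppf(1 - aep, skew=g)
    return 10 ** (m + K * s)

print(f"100-year flood: {lp3_quantile(0.01):,.0f} cfs")
```

The extensions the authors describe would let parameters such as `m` vary with time or with a climate covariate like a sea surface temperature anomaly, rather than being fixed constants as in this stationary sketch.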