Deep learning (DL) models are increasingly used to make accurate hindcasts of management-relevant variables, but they are less commonly used in forecasting applications. Data assimilation (DA) can leverage real-time observations for forecasting: the difference between model predictions and observations today is used to adjust the model so it makes better predictions tomorrow. In this use case, we developed a process-guided DL and DA approach to make 7-day probabilistic forecasts of daily maximum water temperature in the Delaware River Basin in support of water management decisions. Our modeling system produced forecasts of daily maximum water temperature with an average root mean squared error (RMSE) of 1.1 to 1.4°C for 1-day-ahead and 1.4 to 1.9°C for 7-day-ahead forecasts across all sites. The DA algorithm marginally improved forecast performance compared with forecasts produced using the process-guided DL model alone (0%–14% lower RMSE with the DA algorithm). Across all sites and lead times, 65%–82% of observations fell within the 90% forecast confidence intervals, which allowed managers to anticipate the probability of exceedance of ecologically relevant thresholds and aided decisions about releasing reservoir water downstream. The flexibility of DL models shows promise for forecasting other important environmental variables and aiding decision-making.
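The abstract does not specify which DA algorithm was used, but the update it describes (today's model-observation mismatch corrects tomorrow's prediction) can be sketched as a minimal scalar Kalman-style step. The function name, temperatures, and variances below are hypothetical illustrations, not the paper's method.

```python
def assimilate(forecast_state, observation, forecast_var, obs_var):
    """One scalar Kalman-style update: blend the model's forecast with a
    new observation, weighting each by its uncertainty (variance)."""
    gain = forecast_var / (forecast_var + obs_var)   # gain in [0, 1]
    analysis = forecast_state + gain * (observation - forecast_state)
    analysis_var = (1.0 - gain) * forecast_var       # assimilation shrinks uncertainty
    return analysis, analysis_var

# Hypothetical numbers: model predicts 22.0 °C; the sensor reads 20.5 °C
# and is trusted more (smaller variance), so the analysis moves toward it.
state, var = assimilate(22.0, 20.5, forecast_var=1.0, obs_var=0.25)
```

A trusted observation (small `obs_var`) pulls the state strongly toward the measurement; a noisy one barely moves it, which is the qualitative behavior the abstract relies on.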
Small island developing states (SIDS) face multiple threats from anthropogenic climate change, including potential changes in freshwater resource availability. Due to a mismatch in spatial scale between SIDS landforms and the horizontal resolution of global climate models (GCMs), SIDS are mostly unaccounted for in GCMs that are used to make future projections of global climate change and its regional impacts. Specific approaches are required to address this gap between broad-scale model projections and regional, policy-relevant outcomes. Here, we apply a recently developed methodology that circumvents the GCM limitation of coarse resolution in order to project future changes in aridity on small islands. These climate projections are combined with independent population projections associated with shared socioeconomic pathways (SSPs) to evaluate overall changes in freshwater stress in SIDS at warming levels of 1.5 and 2 °C above pre-industrial levels. While we find that future population growth will dominate changes in projected freshwater stress, especially toward the end of the century, projected changes in aridity are found to compound freshwater stress for the vast majority of SIDS. For several SIDS, particularly across the Caribbean region, a substantial fraction (~25%) of the large overall freshwater stress projected under 2 °C at 2030 can be avoided by limiting global warming to 1.5 °C. Our findings add to a growing body of literature on the difference in climate impacts between 1.5 and 2 °C and underscore the need for regionally specific analysis.
Abundance estimates are essential for assessing the viability of populations and the risks posed by alternative management actions. An effort to estimate abundance via a repeated mark‐recapture experiment may fail to recapture marked individuals. We devised a method for obtaining lower bounds on abundance in the absence of recaptures for both panmictic and spatially structured populations. The method assumes that the expected number of recaptures was small enough that all of them could plausibly have been missed by random chance. The upper Bayesian credible limit on expected recaptures allows probabilistic statements about the minimum number of individuals present in the population. We applied this method to data from a 12‐year survey of pallid sturgeon (Scaphirhynchus albus) in the lower and middle Mississippi River (U.S.A.). None of the 241 marked individuals was recaptured in the survey. After accounting for survival and movement, our model‐averaged estimate of the total abundance of pallid sturgeon ≥3 years old in the study area had a 1%, 5%, or 25% chance of being <4,600, 7,000, or 15,000, respectively. When we assumed fish were distributed in proportion to survey catch per unit effort, the farthest downstream reach in the survey hosted at least 4.5–15 fish per river kilometer (rkm), whereas the remainder of the reaches in the lower and middle Mississippi River hosted at least 2.6–8.5 fish/rkm for all model variations examined. The lower Mississippi River had an average density of pallid sturgeon ≥3 years old of at least 3.0–9.8 fish/rkm. The choice of Bayesian prior was the largest source of uncertainty we considered but did not alter the order of magnitude of the lower bounds. Nil‐recapture estimates of abundance are highly uncertain and require careful communication but can deliver insights from experiments that might otherwise be considered a failure.
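The core of a nil-recapture bound can be sketched with a much simpler frequentist analogue of the paper's Bayesian approach: with zero recaptures observed, the (1 − α) upper bound on the Poisson-expected number of recaptures is −ln(α), and under a basic Lincoln-Petersen model an upper bound on expected recaptures implies a lower bound on abundance. Only the 241 marked fish come from the abstract; the second-sample size is hypothetical, and the paper's actual model additionally accounts for survival, movement, and prior choice.

```python
import math

def abundance_lower_bound(marked, second_sample, alpha):
    """With zero recaptures observed, the (1 - alpha) upper bound on the
    Poisson mean number of recaptures is -ln(alpha) (~3.0 at alpha=0.05).
    Under Lincoln-Petersen, E[recaptures] = second_sample * marked / N,
    so bounding E[recaptures] above bounds abundance N below."""
    mu_upper = -math.log(alpha)
    return second_sample * marked / mu_upper

# 241 marked fish (from the abstract); a hypothetical 500 fish examined
# later with no marks found gives a 95% lower bound on abundance.
n_min = abundance_lower_bound(marked=241, second_sample=500, alpha=0.05)
```

The key intuition survives the simplification: seeing no recaptures does not pin down abundance, but it does make very small populations statistically implausible.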
Objective: The objective of this article is to provide empirical evidence for safe speed limits that will meet the objectives of the Safe System by examining the relationship between speed limit and injury severity for different crash types, using police-reported crash data.
Method: Police-reported crashes from 2 Australian jurisdictions were used to calculate a fatal crash rate by speed limit and crash type. Example safe speed limits were defined using threshold risk levels.
Results: A positive exponential relationship between speed limit and fatality rate was found. For an example fatality rate threshold of 1 in 100 crashes it was found that safe speed limits are 40 km/h for pedestrian crashes; 50 km/h for head-on crashes; 60 km/h for hit fixed object crashes; 80 km/h for right angle, right turn, and left road/rollover crashes; and 110 km/h or more for rear-end crashes.
Conclusions: The positive exponential relationship between speed limit and fatal crash rate is consistent with prior research into speed and crash risk. The results indicate that speed zones of 100 km/h or more meet the objectives of the Safe System, with regard to fatal crashes, only where all crash types except rear-end crashes are exceedingly rare, such as on a high-standard restricted-access highway with a safe roadside design.
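The exponential relationship described above can be inverted to derive a safe speed limit from a chosen risk threshold. The coefficients below are made up for illustration (the paper fits separate curves per crash type from police-reported data); the sketch only shows the algebra of the inversion.

```python
import math

# Hypothetical exponential fatality-rate model: rate(v) = a * exp(b * v),
# where v is the speed limit in km/h.  These coefficients are invented
# for illustration and are not the paper's fitted values.
a, b = 1e-4, 0.05

def safe_speed_limit(threshold):
    """Invert rate(v) = threshold: the highest speed limit whose modeled
    fatal crash rate stays at or below the chosen threshold."""
    return math.log(threshold / a) / b

# Threshold of 1 fatal crash per 100 crashes, as in the abstract's example.
v = safe_speed_limit(0.01)
```

Because the rate grows exponentially, the safe limit moves only logarithmically with the threshold: accepting ten times the risk raises the limit by a fixed increment (ln 10 / b), not by a factor of ten.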
Annual CO2 emission tallies for 210 coal-fired power plants during 2009 were more accurately calculated from fuel consumption records reported by the U.S. Energy Information Administration (EIA) than from Continuous Emissions Monitoring System (CEMS) measurements reported by the U.S. Environmental Protection Agency. Results from these two accounting methods for individual plants vary by ±10.8%. Although the differences vary systematically with the method used to certify flue-gas flow instruments in CEMS, additional sources of CEMS measurement error remain to be identified. Limitations of the EIA fuel consumption data are also discussed. Consideration of weighing, sample collection, laboratory analysis, emission factor, and stock adjustment errors showed that the minimum error for CO2 emissions calculated from the fuel consumption data ranged from ±1.3% to ±7.2%, with a plant average of ±1.6%. This error might be reduced by 50% if the carbon content of coal delivered to U.S. power plants were reported.
Implications:
This study might inform efforts to regulate CO2 emissions (such as CO2 performance standards or taxes) and, more immediately, the U.S. Greenhouse Gas Reporting Rule, under which large coal-fired power plants currently use CEMS to measure CO2 emissions. Moreover, if, as suggested here, the flue-gas flow measurement limits the accuracy of CO2 emission tallies from CEMS, then the accuracy of other emission tallies from CEMS (such as SO2, NOx, and Hg) would be similarly affected. Consequently, improved flue-gas flow measurements are needed to increase the reliability of emission measurements from CEMS.
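The abstract lists several independent error components (weighing, sample collection, laboratory analysis, emission factor, stock adjustment) feeding into the fuel-based error range. A standard way to combine such components, assuming they are independent, is root-sum-square (quadrature); the sketch below uses that convention with invented component values, since the abstract does not state either the combination rule or the per-component magnitudes.

```python
import math

# Hypothetical relative error components (percent) for a fuel-based CO2
# tally -- invented values, chosen only to land in the ranges discussed.
components = {
    "weighing": 0.5,
    "sample_collection": 1.0,
    "lab_analysis": 0.8,
    "emission_factor": 0.7,
    "stock_adjustment": 0.3,
}

def combined_error(errors):
    """Combine independent relative errors in quadrature (root-sum-square)."""
    return math.sqrt(sum(e * e for e in errors))

total = combined_error(components.values())
```

Under quadrature the largest component dominates, which is why improving a single input (such as reported coal carbon content) can cut the combined error substantially.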