31.
K.J. Riebschleager, R. Karthikeyan, R. Srinivasan, and K. McKee 《Journal of the American Water Resources Association》2012, 48(4): 745-761
Riebschleager, K.J., R. Karthikeyan, R. Srinivasan, and K. McKee, 2012. Estimating Potential E. coli Sources in a Watershed Using Spatially Explicit Modeling Techniques. Journal of the American Water Resources Association (JAWRA) 48(4): 745-761. DOI: 10.1111/j.1752-1688.2012.00649.x Abstract: The Spatially Explicit Load Enrichment Calculation Tool (SELECT) was automated to characterize waste and the associated pathogens from various sources within a mixed land use watershed. Potential Escherichia coli loads in the Lake Granbury watershed were estimated using spatially variable governing factors, such as land use, soil condition, and distance to streams. A new approach for characterizing E. coli loads resulting from malfunctioning on-site wastewater treatment systems (OWTSs) was incorporated into SELECT along with the Pollutant Connectivity Factor (PCF) module. The PCF component was applied to identify areas contributing E. coli loads during runoff events by incorporating the influence of potential E. coli loading, runoff potential, and travel distance to waterbodies. Simulation results indicated that livestock and wildlife are potential E. coli contributing sources in the watershed. The areas in which these sources are potentially contributing are not currently monitored for E. coli. The bacterial water quality violations seen around Lake Granbury are most likely the result of malfunctioning OWTSs and pet wastes. The SELECT results demonstrate the need to evaluate each contributing source separately to effectively allocate site-specific best management practices (BMPs) utilizing stakeholder inputs. SELECT also serves as a powerful screening tool for determining areas where detailed investigation is merited.
32.
Herbert C. McKee, John H. Margeson, and Thomas W. Stanley 《Journal of the Air & Waste Management Association (1995)》2013, 63(10): 870-875
The Methods Standardization Branch of the Environmental Protection Agency, National Environmental Research Center, has undertaken a program to standardize methods used in measuring air pollutants covered by the national primary and secondary air quality standards. This paper presents the results of a collaborative test of the method specified for carbon monoxide. The test involved analysis of CO in air samples (in cylinders) by participating laboratories. Three concentrations, covering the method's range of 0 to 58 mg/m3, were analyzed dry and humidified on each of three days by 15 collaborators. The method of analysis, nondispersive infrared spectrometry (NDIR), involved an NDIR instrument in combination with different procedures for eliminating water vapor interference. A statistical analysis of the data produced the following results: 1. The checking limit for duplicates (replication error) is 0.5 mg/m3. 2. The repeatability (variation within a laboratory) is 1.6 mg/m3. 3. The reproducibility (variation between laboratories) varies nonlinearly with concentration, reaching a minimum of 2.3 mg/m3 at a concentration of 20 mg/m3 and ranging as high as 4.3 mg/m3 over the 0 to 58 mg/m3 range. 4. The reproducibility at the level of the national primary ambient air quality standard, 10 mg/m3 (8-hour average), is 2.5 mg/m3, or 25%. 5. The minimum detectable sensitivity is estimated to be 0.3 mg/m3. 6. Compensation for water vapor interference is satisfactorily accomplished using drying agents and refrigeration methods; the use of narrow-band optical filters alone may not provide adequate compensation. 7. The accuracy obtained depends upon the availability of reliable calibration gases. Based on the results of this study, the method produces results that average 2.5% high. Future papers will contain test results for methods to measure other air pollutants.
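The repeatability and reproducibility figures quoted above come from separating within-laboratory scatter from between-laboratory scatter in the collaborative-test data. A minimal sketch of that standard one-way variance decomposition (in the style of ISO 5725; the paper's exact statistical treatment is not reproduced here, and the data below are illustrative, not from the study):

```python
import statistics

def precision_estimates(lab_results):
    """Estimate the repeatability (within-lab) and reproducibility
    (within- plus between-lab) standard deviations from collaborative
    test data: one list of replicate readings per laboratory.
    Uses the usual one-way variance decomposition (cf. ISO 5725);
    the paper's exact statistical treatment may differ."""
    n = len(lab_results[0])  # replicates per laboratory
    lab_means = [statistics.mean(r) for r in lab_results]
    # repeatability variance: within-lab variance pooled across labs
    s_r2 = statistics.mean(statistics.variance(r) for r in lab_results)
    # between-lab variance component, from the spread of lab means
    # (clipped at zero, since the ANOVA estimator can go negative)
    s_L2 = max(statistics.variance(lab_means) - s_r2 / n, 0.0)
    s_r = s_r2 ** 0.5
    s_R = (s_r2 + s_L2) ** 0.5  # reproducibility standard deviation
    return s_r, s_R
```

For example, three hypothetical laboratories each reporting duplicate CO readings of (10.0, 10.2), (10.4, 10.6), and (9.8, 10.0) mg/m3 give s_r ≈ 0.14 mg/m3 and s_R ≈ 0.32 mg/m3, illustrating how reproducibility exceeds repeatability whenever laboratories disagree systematically.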
33.
Frances M. McKee-Ryan, Meghna Virick, Gregory E. Prussia, Jaron Harvey, and Juliana D. Lilly 《Journal of Organizational Behavior》2009, 30(4): 561-580
The competitive environment of business today makes corporate layoffs an organizational reality, and losing one's job can be a highly stressful experience. We propose and test a model that places objective underemployment and subjective underemployment in a causal sequence between organizational actions and employees' restoration of equilibrium by obtaining jobs worth keeping. We longitudinally examine relationships between layoff fairness, workers' stress symptoms and appraisal, and subsequent employment outcomes among 149 laid-off technical employees over the course of one year. Structural equation model results support seven of nine hypothesized paths, and demonstrate discriminant validity between and mediational properties of objective and subjective underemployment. Findings also reveal the important role that employees' perceptions and subjective assessments play in successfully returning to pre-job loss equilibrium following displacement. Copyright © 2008 John Wiley & Sons, Ltd.
34.
35.
Herbert C. McKee 《Journal of the Air & Waste Management Association (1995)》2013, 63(3): 271-272
Under Title III of SARA, companies must provide information about chemicals that they manufacture, store, or process. Communities will use data about potential accidental releases to develop local emergency plans. Data about routine chemical releases will be made available to the public on a computer data base. Simply having such data available does not ensure consensus about reducing potential chemical risks. Laboratory and field research are summarized, indicating that people tend to edit small risks to zero as being too small to worry about, or to adjust them imperfectly from an anchor equal to the potential loss. These results suggest recommendations for communicating about the risks posed by accidental or routine releases of chemicals.
36.
37.
M. Kashif Gill, Tirusew Asefa, Mariush W. Kemblowski, and Mac McKee 《Journal of the American Water Resources Association》2006, 42(4): 1033-1046
ABSTRACT: Herein, a recently developed methodology, Support Vector Machines (SVMs), is presented and applied to the challenge of soil moisture prediction. Support Vector Machines are derived from statistical learning theory and can be used to predict a quantity forward in time based on training that uses past data, hence providing a statistically sound approach to solving inverse problems. The principal strength of SVMs lies in the fact that they employ Structural Risk Minimization (SRM) instead of Empirical Risk Minimization (ERM). The SVMs formulate a quadratic optimization problem that ensures a global optimum, which makes them superior to traditional learning algorithms such as Artificial Neural Networks (ANNs). The resulting model is sparse and not characterized by the “curse of dimensionality.” Soil moisture distribution and variation are helpful in predicting and understanding various hydrologic processes, including weather changes, energy and moisture fluxes, drought, irrigation scheduling, and rainfall/runoff generation. Soil moisture and meteorological data are used to generate SVM predictions for four and seven days ahead. Predictions show good agreement with actual soil moisture measurements. Results from the SVM modeling are compared with predictions obtained from ANN models and show that the SVM models performed better for soil moisture forecasting than the ANN models.
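The four- and seven-day-ahead forecasts described above imply a sliding-window framing: a vector of past observations becomes the input, and the value several steps later becomes the target. A minimal sketch of that windowing, with the window size and horizon as illustrative parameters (not values taken from the paper); the resulting (X, y) pairs could then be fed to any SVM regression implementation, e.g. scikit-learn's sklearn.svm.SVR:

```python
def lag_features(series, n_lags, horizon):
    """Build (X, y) pairs for horizon-step-ahead forecasting:
    each row of X holds the n_lags most recent observations, and
    y is the value observed 'horizon' steps later. Names and
    window sizes are illustrative, not taken from the paper."""
    X, y = [], []
    for t in range(n_lags - 1, len(series) - horizon):
        X.append(series[t - n_lags + 1 : t + 1])  # trailing window
        y.append(series[t + horizon])             # future target
    return X, y
```

For a seven-day-ahead model in the spirit of the abstract, horizon would be 7 and the series would interleave soil moisture with meteorological inputs; the sketch above shows only the univariate case.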
38.
39.
The activity of a chemical in solution determines its tendency to move into other media. At low concentrations (<0.01 M) activity is generally considered to be linearly related to concentration. A hypothetical model based on the structure of liquid water is discussed which could cause deviations from this linearity in the ppb region, a concentration much lower than that normally investigated thermodynamically, but one of great environmental importance. Headspace experiments with carbon tetrachloride and chloroform in water are reported at concentrations down to ~10⁻³ ppb, but no such deviations were discerned.
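Under the linear (Henry's-law) partitioning the abstract assumes, the fraction of a volatile solute found in the headspace of a closed vial is independent of the starting concentration; a concentration-dependent headspace fraction would therefore signal exactly the kind of deviation the experiments looked for. A minimal sketch of that equilibrium mass balance, with the dimensionless air-water partition coefficient H_cc as an illustrative input (not a value from the reported experiments):

```python
def headspace_fraction(h_cc, v_liquid, v_gas):
    """Fraction of a volatile solute in the headspace of a sealed
    vial at equilibrium, assuming ideal Henry's-law partitioning:
    C_gas = H_cc * C_aq. From the mass balance
    M = C_aq * V_l + C_gas * V_g = C_aq * (V_l + H_cc * V_g),
    the headspace share is H_cc*V_g / (V_l + H_cc*V_g)."""
    return h_cc * v_gas / (v_liquid + h_cc * v_gas)
```

Note that the result depends only on H_cc and the phase volumes, never on the total amount added, which is why repeating the measurement down toward ~10⁻³ ppb is a direct test of linearity.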
40.