Objective: The ability to detect changing visual information is a vital component of safe driving. In addition to detecting changing visual information, drivers must also interpret its relevance to safety. Environmental changes considered to have high safety relevance will likely demand greater attention and more timely responses than those considered to have lower safety relevance. The aim of this study was to explore factors that are likely to influence perceptions of risk and safety regarding changing visual information in the driving environment. Factors explored were the environment in which the change occurs (i.e., urban vs. rural), the type of object that changes, and the driver's age, experience, and risk sensitivity.
Methods: Sixty-three licensed drivers aged 18–70 years completed a hazard rating task, which required them to rate the perceived hazardousness of changing specific elements within urban and rural driving environments. Three attributes of potential hazards were systematically manipulated: the environment (urban, rural); the type of object changed (road sign, car, motorcycle, pedestrian, traffic light, animal, tree); and its inherent safety risk (low risk, high risk). Inherent safety risk was manipulated by either varying the object's placement, on/near or away from the road, or altering an infrastructure element that would require a change to driver behavior. Participants also completed two driving-related risk perception tasks, rating their relative crash risk and perceived risk of aberrant driving behaviors.
Results: Driver age was not significantly associated with hazard ratings, but individual differences in perceived risk of aberrant driving behaviors predicted hazard ratings, suggesting that general driving-related risk sensitivity plays a strong role in safety perception. In both urban and rural scenes, there were significant associations between hazard ratings and inherent safety risk, with low-risk changes perceived as consistently less hazardous than high-risk changes; however, the effect was larger for urban environments. There were also effects of object type, with certain objects rated as consistently more safety relevant. In urban scenes, changes involving pedestrians were rated significantly more hazardous than all other objects, and in rural scenes, changes involving animals were rated as significantly more hazardous. Notably, hazard ratings were higher in urban than in rural driving environments, even when changes were matched between environments.
Conclusion: This study demonstrates that drivers perceive rural roads as less risky than urban roads, even when similar scenarios occur in both environments. Age did not affect hazard ratings. Instead, the findings suggest that the assessment of risk posed by hazards is influenced more by individual differences in risk sensitivity. This highlights the need for driver education to account for appraisal of hazards' risk and relevance, in addition to hazard detection, when considering factors that promote road safety.
Land use change and the expansion of dairying are perceived as the cause of poor water quality in the 1881 km2 Pomahaka catchment in Otago, New Zealand. A study was conducted to determine the long-term trend at four sites, and the current state in 13 sub-catchments, of water quality. Drains in two dairy-farmed sub-catchments were also sampled to determine their potential as a point source of stream contamination. Data highlighted an overall increase in the concentration of phosphorus (P) fractions at long-term sites. Loads of contaminants (nitrogen (N) and P fractions, sediment and Escherichia coli) were greatest in those sub-catchments with the most dairying. Baseline (without human influence) contaminant concentrations suggested that there was considerable scope for decreasing losses: at most sites, baseline concentrations were <20% of current median concentrations. Contaminant losses via drainage were recorded despite no rainfall on the sampling day, and were attributed to applying too much effluent onto wet soil. Modelling of P concentrations in one dairy-farmed sub-catchment suggested that up to 58% of P losses came from point sources, such as poor effluent practice and stock access to streams. A statistical test to detect "contaminated" drainage was developed from historical data. If this test had been applied to remove contaminated drainage from samples of the two dairy-farmed sub-catchments, median contaminant concentrations and loads would have decreased by up to 58% (greater decreases were found for E. coli, ammoniacal-N and total P than for other contaminants). This suggests that better uptake of strategies to mitigate contamination, such as deferred effluent irrigation (and low rate application), could decrease drainage losses from dairy-farmed land and thereby improve water quality in the Pomahaka catchment.
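The abstract does not specify the form of the statistical test developed to detect "contaminated" drainage. One plausible sketch, assuming a simple robust threshold derived from historical baseline concentrations (all concentration values below are hypothetical placeholders, not the study's data):

```python
import statistics

def contamination_threshold(historical_concs, k=3.0):
    """Flag threshold: historical median plus k median-absolute-deviations.

    A hypothetical stand-in for the (unspecified) test developed in the
    study; a robust threshold resists distortion by past contamination
    events in the historical record.
    """
    med = statistics.median(historical_concs)
    mad = statistics.median([abs(x - med) for x in historical_concs])
    return med + k * mad

def is_contaminated(sample_conc, historical_concs, k=3.0):
    """Return True if a drainage sample exceeds the historical threshold."""
    return sample_conc > contamination_threshold(historical_concs, k)

# Hypothetical historical E. coli concentrations (cfu/100 mL) for a drain,
# including one past contamination event, and a new sample to classify.
history = [50, 60, 55, 70, 65, 58, 62, 400]
print(is_contaminated(900, history))  # True under these placeholder values
```

A median-plus-MAD rule is only one of many reasonable choices; the study's actual test may differ in form and in the contaminants it considers.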
In order to assist an integrated development of ionic liquids (ILs), a study on the sorption, distribution, and cytotoxicity of a series of 1-alkyl-3-methyl imidazolium tetrafluoroborates with C6 rat glioma cells has been performed. Cellular sorption and distribution among three cellular fractions (cytosol, nuclei, and membranes) were analysed by reversed-phase HPLC (RP-HPLC). Compounds with longer 1-alkyl substituents were sorbed with higher enrichment factors and sorption coefficients per protein than those with shorter 1-alkyl chains. The 1-octyl-3-methyl imidazolium cation (C8MIM) was enriched 17-fold, whereas C6MIM and C4MIM were enriched by factors of 3.5 and 2.3, respectively. After fractionation of cells by centrifugation, about 8% of C8MIM was found in the nuclear fractions. The cytotoxicity, as estimated by the tetrazolium reductase assay, increased with the length of the 1-alkyl chain from C4MIM to C10MIM. Consistently, cell proliferation rates decreased with increasing 1-alkyl chain length. The results reveal the correlations between lipophilicity, cellular sorption, and cytotoxicity.
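A fold-enrichment of the kind reported here (e.g., 17-fold for C8MIM) is conventionally the ratio of the compound's concentration in cells to its concentration in the exposure medium. A minimal sketch, assuming that definition (the concentration values are illustrative placeholders chosen to mirror the reported ordering, not measured data):

```python
def enrichment_factor(conc_cells, conc_medium):
    """Fold-enrichment of a compound in cells relative to the medium,
    assuming the conventional definition EF = C_cells / C_medium."""
    if conc_medium <= 0:
        raise ValueError("medium concentration must be positive")
    return conc_cells / conc_medium

# Hypothetical concentrations (arbitrary units) reproducing the reported
# ordering C8MIM (17-fold) > C6MIM (3.5-fold) > C4MIM (2.3-fold).
for cation, c_cells, c_medium in [("C8MIM", 17.0, 1.0),
                                  ("C6MIM", 3.5, 1.0),
                                  ("C4MIM", 2.3, 1.0)]:
    print(cation, enrichment_factor(c_cells, c_medium))
```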
In situ trampling occurred under experimental conditions to quantify the differences in the responses to anthropogenic trampling in four dominant species of Hawaiian corals: Porites compressa, Porites lobata, Montipora capitata, and Pocillopora meandrina. Trampling was simulated daily for a period of nine days, at which time further breakage was minimal. Forty treatment colonies produced 559 fragments. Trampling was followed by an 11-month recovery period. Coral colony and fragment mortality was low. All four species were highly tolerant of the inflicted damage, suggesting that some species of corals can withstand limited pulse events that allow time for recovery. Growth rates following trampling were significantly lower in the treatment groups for three of the four species. This study demonstrated that very few trampling events can produce significant changes in growth even after a long recovery period. Survivorship of fragments was clearly size- and species-dependent in M. capitata and P. compressa: smaller fragments (<5 cm) had higher mortality than larger fragments (>5 cm). High breakage rates for M. capitata and P. compressa are consistent with the nearshore, low-energy regions they inhabit, the same environment frequented by skin divers and waders. Mechanical tests were conducted to determine tensile and compressive strengths. Pocillopora meandrina exhibited the strongest skeleton, followed in decreasing order by Porites lobata, Porites compressa, and Montipora capitata. The skeletal strengths obtained from these tests correlate with the wave energy of the regions each species inhabits, suggesting that structural strength in corals is an adaptive response to hydraulic stress.
Little is known about the microbial communities carried in wind-eroded sediments from various soil types and land management systems. The novel technique of pyrosequencing promises to expand our understanding of the microbial diversity of soils and eroded sediments because it can sequence 10 to 100 times more DNA fragments than previous techniques, providing enhanced exploration into which microbes are being lost from soil due to wind erosion. Our study evaluated the bacterial diversity of two types of wind-eroded sediments collected from three different organic-rich soils in Michigan using a portable field wind tunnel. The wind-eroded sediments evaluated were a coarse fraction with 66% of particles >106 μm (coarse eroded sediment) and a finer eroded sediment (fine dust) with 72% of particles <106 μm. Our findings suggested that (i) bacteria carried in the coarser sediment and fine dust were effective fingerprints of the source soil, although their distribution may vary depending on the soil characteristics because certain bacteria may be more protected in soil surfaces than others; (ii) coarser wind-eroded sediment showed higher bacterial diversity than fine dust in two of the three soils evaluated; and (iii) certain bacterial taxa were more predominant in fine dust than in coarse sediment, revealing different locations and niches of bacteria in soil, which, depending on wind erosion processes, can have important implications for soil sustainability and functioning. Infrared spectroscopy showed that wind erosion preferentially removes particular kinds of C from the soil that are lost via fine dust. Our study shows that eroded sediments remove the active labile organic soil particulates containing key microorganisms involved in soil biogeochemical processes, which can have a negative impact on the quality and functioning of the source soil.
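Bacterial diversity comparisons between sediment fractions like those in finding (ii) are commonly summarized with an index such as Shannon's H'. A minimal sketch with hypothetical OTU count tables (not the study's data), illustrating how a more even community scores higher:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln(p_i)) over relative abundances
    of operational taxonomic units (OTUs); zero counts are skipped."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical OTU counts for the two eroded-sediment fractions:
# a relatively even community vs. one dominated by a few taxa.
coarse_sediment = [120, 95, 80, 60, 45, 30, 20, 10]
fine_dust = [300, 40, 10, 5, 3, 2]

print(round(shannon_index(coarse_sediment), 3))
print(round(shannon_index(fine_dust), 3))
```

With these placeholder counts the coarse fraction scores higher, consistent with the direction reported for two of the three soils; the study itself may have used other diversity metrics alongside or instead of H'.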
Risk management of food-animal antibiotics has reached a crucial juncture for public health officials worldwide. While withdrawals of animal antibiotics previously used to control animal bacterial illnesses are being encouraged in many countries, the human health impacts of such withdrawals are only starting to be understood. Increases in animal and human bacterial illness rates and antibiotic resistance levels in humans in Europe, despite bans on animal antibiotics there, have raised questions about how animal antibiotic use affects human health. This paper presents a quantitative human health risk and benefit assessment for virginiamycin (VM), a streptogramin antibiotic recommended for withdrawal from use in food animals in several countries. It applies a new quantitative Rapid Risk Rating Technique (RRRT) that estimates and multiplies data-driven exposure, dose-response, and consequence factors, as suggested by WHO (2003), to estimate human health impacts from withdrawing virginiamycin. Increased human health risks from more pathogens reaching consumers if VM use is terminated (6660 estimated excess campylobacteriosis cases per year in the base case) are predicted to far outweigh benefits from reduced streptogramin-resistant vancomycin-resistant Enterococcus faecium (VREF) infections in human patients (0.27 estimated excess cases per year in the base case). While lack of information about the impact of VM withdrawal on average human illnesses per serving of food-animal meat precludes a deterministic conclusion, it appears very probable that such a withdrawal would cause many times more human illnesses than it would prevent. This qualitative conclusion appears to be robust to several scientific and modeling uncertainties.
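The RRRT's multiplicative structure, estimating and multiplying exposure, dose-response, and consequence factors, can be sketched as below. All numeric inputs are hypothetical placeholders for illustration only; they are not the study's base-case values and do not reproduce its 6660-case estimate:

```python
def rrrt_excess_cases(servings_per_year, illness_prob_per_serving,
                      fraction_attributable):
    """Multiplicative risk estimate in the spirit of the RRRT:
    excess cases = exposure x dose-response x consequence.

    servings_per_year: exposure factor (servings consumed per year).
    illness_prob_per_serving: dose-response factor (illness probability
        per contaminated serving).
    fraction_attributable: consequence factor (share of illnesses
        attributable to the scenario being assessed).
    """
    return servings_per_year * illness_prob_per_serving * fraction_attributable

# Hypothetical example: 1e9 servings/year, 1e-5 illnesses per serving,
# 2% attributable to the withdrawal scenario.
print(rrrt_excess_cases(1e9, 1e-5, 0.02))  # 200.0 excess cases/year
```

The point of the multiplicative form is that each factor can be estimated from separate data sources and uncertainty can be propagated factor by factor; the study's actual factor definitions and values are given in the paper itself.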