This paper reports a qualitative study of 54 police drivers who were interviewed about their views on police driver training, driving strategies, and their accident involvement. Analysis of the transcribed interviews indicated that officers constructed narratives of themselves as highly aware of hazards presented by other road users, and they used a variety of discursive devices to minimise their own culpability and attribute risk elsewhere. Rather than maintaining a straightforward ‘illusion of invulnerability’, they formulated a ‘topography of risk’ in which they were responding to hazards presented by suspects or other road users. Their meticulously detailed accounts of the circumstances surrounding accidents serve to position them as knowledgeable and impartial participants and to create a sense of expertise and authority. Training initiatives could profitably seek to challenge this ‘topography of risk’ and sense of authority so that drivers more fully appreciate the hazard they may present to themselves and the public.
Developing a relationship between pest abundance and crop damage is essential for calculating economic injury levels (EILs) that lead to informed management decisions. The crop modelling framework APSIM was used to simulate the impact of mouse damage on wheat yield where a long-term dataset on the density of mice was available (1983–2003). The model was calibrated using results from field trials in which wheat plants were hand-clipped to imitate mouse damage. The grazing effect of mice was estimated from the population density, the daily intake per mouse, and the proportion of wheat grain and plant tissue in the diet to determine yield loss. The mean yield loss caused by mice was 12.4% (±5.4 S.E.; range −0.5 to 96%). Yield loss exceeded 5% in 7 of 21 years. A damage/abundance relationship was constructed, and a sigmoidal curve explained 97% of the variation when accounting for different trajectories of mouse densities from sowing to harvest. The majority of damage occurred around emergence of the crop when mouse densities were >100 mice ha−1. This is the first time that field data on mouse density and a crop simulation model have been combined to estimate yield loss. The model examines the efficacy of baiting and how to estimate EILs. Because the broadscale application of zinc phosphide is cheap and effective, the EIL is very low (<1% yield loss). The APSIM model is highly flexible and could be used for other vertebrate pests in a range of crops or pastures to develop density/damage relationships and to assist with management.
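The grazing-effect calculation described in the abstract (population density × daily intake per mouse × dietary proportion of wheat) can be sketched as follows. This is a minimal illustration, not the APSIM implementation; all parameter values (intake of 3 g per mouse per day, a 50% grain fraction, a 2 t/ha potential yield) are assumptions for the example, not values from the study.

```python
# Minimal sketch of the grazing-loss calculation described above.
# All parameter values are illustrative, not those used in the study.

def daily_grain_offtake(mice_per_ha: float,
                        intake_g_per_mouse: float = 3.0,
                        grain_fraction: float = 0.5) -> float:
    """Grain removed per hectare per day (g) by a mouse population."""
    return mice_per_ha * intake_g_per_mouse * grain_fraction

def percent_yield_loss(total_offtake_g_per_ha: float,
                       potential_yield_g_per_ha: float) -> float:
    """Express cumulative offtake as a percentage of potential yield."""
    return 100.0 * total_offtake_g_per_ha / potential_yield_g_per_ha

# e.g. 150 mice/ha grazing for 30 days around crop emergence,
# against a potential yield of 2 t/ha (2,000,000 g/ha)
offtake = daily_grain_offtake(150) * 30
loss = percent_yield_loss(offtake, potential_yield_g_per_ha=2_000_000)
```

In the actual study this offtake would be fed into the crop model day by day, so that damage around emergence (when plants are small) translates into much larger yield loss than the same offtake later in the season.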
Canadian and US marine conservation law, and other related law, was analyzed to determine if it reflected ecological criteria needed to implement connectivity among marine protected areas of the northeast Pacific in the proposed trilateral Baja to Bering Sea (B2B) initiative. The analysis included both nations’ federal laws and those of California, Oregon, Washington, Alaska, and British Columbia. While legal provisions exist already to implement marine protected areas for varying reasons, there is little capacity in most laws to create connectivity among them for conservation purposes. Only California's legislation contained explicit provisions for all the criteria. Other federal, state, and provincial laws, while containing provisions for species at risk and vulnerable habitats, generally lacked explicit provisions for the vital criteria of area size, migratory patterns, and recruitment patterns. Implementation, future management, and protection of the proposed B2B marine network would be facilitated by amendment of both Canadian and US laws. Some of the ecological criteria are already implicit or vaguely stated, but they need to be made explicit in the amended law. The legislative model of California could serve as a template for amending the laws of other jurisdictions in the B2B venture.
Objectives: This study reports the results of a pilot program in Kenosha County that used a combination of direct biomarkers extracted from blood spots and nails to monitor repeat intoxicated drivers for their use of alcohol and drugs with a detection window spanning from 3 weeks to several months. The objectives were to test whether the direct biomarkers phosphatidylethanol (PEth), ethylglucuronide (EtG), and 5 drug metabolites would (1) help assessors obtain a more objective evaluation of repeat offenders during the assessment interview, (2) allow for timely identification of relapses and improve classification of drivers into risk categories, and (3) predict recidivism by identifying offenders most likely to obtain a subsequent operating while intoxicated (OWI) offense within 4 years of enrollment in the program.
Methods: All (N = 261) repeat offenders were tested using PEth obtained from blood spots and EtG obtained from fingernails; 159 participants were also tested with a nail panel for 5 drugs of abuse. Drivers were tested immediately after the assessment interview (baseline) and at 3, 6, 9, and 12 months after baseline. Based on biomarker results and self-reports of abstinence, offenders were classified into different risk categories and required to follow specific testing timelines based on the program's decision tree.
Results: The baseline analysis shows that 60% of drivers tested positive for alcohol biomarkers (EtG, PEth, or both) at the assessment interview, with lower detection rates (0–11%) for the 5 drug metabolites. The comparison of biomarker results to self-reports of abstinence identified 28% of all offenders as high risk and assigned them to more frequent testing and more intense monitoring. The longitudinal analysis shows that 56% of participants (completers) completed the program successfully and the remaining 44% (noncompliant) terminated prematurely. Two thirds (68%) of the completers were able to reduce or control their drinking, and one third relapsed at least once during their mandated monitoring periods. After a brief intervention by the assessors, 79% of relapsers tested negative for biomarkers in their repeat tests. The rearrest analysis showed that offenders classified in the noncompliant and relapser groups were 7 times more likely to receive a new OWI within 4 years of enrollment compared to drivers classified as abstainers or controllers. Refractory drivers were monitored the longest and reported no subsequent rearrests.
Conclusion: These findings demonstrate the benefits of more individualized interventions with repeat OWI offenders and call for further development of multimodal approaches in traffic medicine, including those that use direct alcohol biomarkers as evidence-based practices to reduce recidivism.
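The core of the program's decision tree, as described in the Methods, is a comparison of objective biomarker results against self-reported abstinence. The sketch below illustrates that logic only; the thresholds, category labels, and monitoring actions are illustrative assumptions, not the program's actual rules.

```python
# Hedged sketch of the risk-classification logic implied by the
# program's decision tree: a positive direct biomarker (PEth or EtG)
# combined with a self-report of abstinence flags a high-risk driver.
# Category names and follow-up actions are illustrative assumptions.

def classify_offender(peth_positive: bool,
                      etg_positive: bool,
                      self_reports_abstinence: bool) -> str:
    biomarker_positive = peth_positive or etg_positive
    if biomarker_positive and self_reports_abstinence:
        # Objective evidence contradicts the self-report.
        return "high risk: more frequent testing, intense monitoring"
    if biomarker_positive:
        return "elevated risk: standard testing timeline"
    return "low risk: routine retests at 3, 6, 9, and 12 months"

print(classify_offender(peth_positive=True, etg_positive=False,
                        self_reports_abstinence=True))
```

The value of the biomarkers in this design is precisely the first branch: without an objective measure, a driver who denies drinking cannot be distinguished from one who is genuinely abstinent.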
International negotiations on the inclusion of land use activities into an emissions reduction system for the UN Framework Convention on Climate Change (UNFCCC) have been partially hindered by the technical challenges of measuring, reporting, and verifying greenhouse gas (GHG) emissions and the policy issues of leakage, additionality, and permanence. This paper outlines a five-part plan for estimating forest carbon stocks and emissions with the accuracy and certainty needed to support a policy for Reducing Emissions from Deforestation and forest Degradation, forest conservation, sustainable management of forests, and enhancement of forest carbon stocks (the REDD-plus framework considered at the UNFCCC COP-15) in developing countries. The plan is aimed at UNFCCC non-Annex 1 developing countries, but the principles outlined are also applicable to developed (Annex 1) countries. The parts of the plan are: (1) Expand the number of national forest carbon Measuring, Reporting, and Verification (MRV) systems with a priority on tropical developing countries; (2) Implement continuous global forest carbon assessments through the network of national systems; (3) Achieve commitments from national space agencies for the necessary satellite data; (4) Establish agreed-on standards and independent verification processes to ensure robust reporting; and (5) Enhance coordination among international and multilateral organizations.
Although forest conservation activities, particularly in the tropics, offer significant potential for mitigating carbon (C) emissions, these types of activities have faced obstacles in the policy arena caused by the difficulty in determining key elements of the project cycle, particularly the baseline. A baseline for forest conservation has two main components: the projected land-use change and the corresponding carbon stocks in applicable pools in vegetation and soil, with land-use change being the most difficult to address analytically. In this paper we focus on developing and comparing three models, ranging from relatively simple extrapolations of past trends in land use based on simple drivers such as population growth to more complex extrapolations of past trends using spatially explicit models of land-use change driven by biophysical and socioeconomic factors. The three models used for making baseline projections of tropical deforestation at the regional scale are: the Forest Area Change (FAC) model, the Land Use and Carbon Sequestration (LUCS) model, and the Geographical Modeling (GEOMOD) model. The models were used to project deforestation in six tropical regions that featured different ecological and socioeconomic conditions, population dynamics, and uses of the land: (1) northern Belize; (2) Santa Cruz State, Bolivia; (3) Paraná State, Brazil; (4) Campeche, Mexico; (5) Chiapas, Mexico; and (6) Michoacán, Mexico.
A comparison of all model outputs across all six regions shows that each model produced quite different deforestation baselines. In general, the simplest FAC model, applied at the national administrative-unit scale, projected the highest amount of forest loss (four out of six regions) and the LUCS model the least amount of loss (four out of five regions). Based on simulations of GEOMOD, we found that readily observable physical and biological factors, as well as distance to areas of past disturbance, were each about twice as important as either sociological/demographic or economic/infrastructure factors (less observable) in explaining empirical land-use patterns.
Based on the lessons learned, we propose a methodology of three main steps and six tasks that can be used to begin developing credible baselines. We also propose that the baselines be projected over a 10-year period because, although projections beyond 10 years are feasible, they are likely to be unrealistic for policy purposes. In the first step, a historic land-use change and deforestation estimate is made by determining the analytic domain (size of the region relative to the size of the proposed project), obtaining historic data, analyzing candidate baseline drivers, and identifying three to four major drivers. In the second step, a baseline of where deforestation is likely to occur, a potential land-use change (PLUC) map, is produced using a spatial model such as GEOMOD that uses the key drivers from step one. Rates of deforestation are then projected over a 10-year baseline period based on one of the three models. Using the PLUC maps, projected rates of deforestation, and carbon stock estimates, baseline projections are developed that can be used for project GHG accounting and crediting purposes. The final step proposes that, at an agreed interval (e.g., about 10 years), the assumptions about baseline drivers be reassessed. This step reviews the viability of the 10-year baseline in light of changes in one or more key baseline drivers (e.g., new roads, new communities, new protected areas). The potential land-use change map and estimates of rates of deforestation could be redone at the agreed interval, allowing the deforestation rates and changes in spatial drivers to be incorporated into a defense of the existing baseline or the derivation of a new baseline projection.