The tiger shark (Galeocerdo cuvier Peron and Lesueur 1822) is a widely distributed predator with a broad diet and the potential to affect marine community structure, yet information on local patterns of abundance for this species is lacking. Tiger shark catch data were gathered over 7 years of tag-and-release research fishing (1991–2000, 2002–2004) in Shark Bay, Western Australia (25°45′S, 113°44′E). Sharks were caught using drumlines deployed in six permanent zones (~3 km² in area). Fishing effort was standardized across days and months, and catch rates on hooks were expressed as the number of sharks caught per hour. A total of 449 individual tiger sharks was captured; 29 were recaptured. Tiger shark catch rate showed seasonal periodicity, being higher during the warm season (Sep–May) than during the cold season (Jun–Aug), and was marked by inter-annual variability. The most striking feature of the catch data was a consistent pattern of slow, continuous variation within each year, from a peak at the height of the warm season (February) to a trough in the cold season (July). Annual growth rates of recaptured individuals were generally consistent with estimates from other regions, but exceeded those for populations elsewhere for sharks >275 cm fork length (FL), perhaps because mature sharks in the study area rely heavily on large prey. The data suggest that (1) the threat of predation faced by animals consumed by tiger sharks fluctuates dramatically within and between years, and (2) efforts to monitor large shark abundance should be extensive enough to detect inter-annual variation and sufficiently intensive to account for intra-annual trends.
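The catch-rate standardization described above can be illustrated with a short sketch. The hook counts, soak times, and catches below are hypothetical, and the pooling of hooks into hook-hours is one common convention, not necessarily the study's exact procedure:

```python
# Catch per unit effort (CPUE): sharks caught per hour of fishing effort.
# All numbers below are invented for illustration; they are not study data.

def cpue(sharks_caught, hooks, soak_hours):
    """Sharks caught per hook-hour of standardized effort."""
    effort_hours = hooks * soak_hours
    return sharks_caught / effort_hours

# Example: comparing a warm-season day with a cold-season day.
warm = cpue(sharks_caught=3, hooks=10, soak_hours=6)   # 0.05 sharks per hour
cold = cpue(sharks_caught=1, hooks=10, soak_hours=6)
```

Expressing catches this way makes days with different numbers of hooks or soak durations directly comparable, which is what allows the seasonal peaks and troughs described above to be detected.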
Although forest conservation activities, particularly in the tropics, offer significant potential for mitigating carbon (C)
emissions, these types of activities have faced obstacles in the policy arena caused by the difficulty in determining key
elements of the project cycle, particularly the baseline. A baseline for forest conservation has two main components: the
projected land-use change and the corresponding carbon stocks in applicable pools in vegetation and soil, with land-use change
being the most difficult to address analytically. In this paper we focus on developing and comparing three models, ranging
from relatively simple extrapolations of past trends in land use based on simple drivers such as population growth to more
complex extrapolations of past trends using spatially explicit models of land-use change driven by biophysical and socioeconomic
factors. The three models used for making baseline projections of tropical deforestation at the regional scale are: the Forest
Area Change (FAC) model, the Land Use and Carbon Sequestration (LUCS) model, and the Geographical Modeling (GEOMOD) model.
The models were used to project deforestation in six tropical regions that featured different ecological and socioeconomic
conditions, population dynamics, and uses of the land: (1) northern Belize; (2) Santa Cruz State, Bolivia; (3) Paraná State,
Brazil; (4) Campeche, Mexico; (5) Chiapas, Mexico; and (6) Michoacán, Mexico.
A comparison of all model outputs across all six regions shows that each model produced quite different deforestation baselines.
In general, the simplest FAC model, applied at the national administrative-unit scale, projected the highest amount of forest
loss (four out of six regions) and the LUCS model the least amount of loss (four out of five regions). Based on simulations
of GEOMOD, we found that readily observable physical and biological factors as well as distance to areas of past disturbance
were each about twice as important as either sociological/demographic or economic/infrastructure factors (less observable)
in explaining empirical land-use patterns.
Based on the lessons learned, we propose a methodology comprising three main steps and six tasks that can be used to begin developing
credible baselines. We also propose that baselines be projected over a 10-year period because, although projections beyond
10 years are feasible, they are likely to be unrealistic for policy purposes. In the first step, an historic land-use change
and deforestation estimate is made by determining the analytic domain (size of the region relative to the size of proposed
project), obtaining historic data, analyzing candidate baseline drivers, and identifying three to four major drivers. In the
second step, a baseline map of where deforestation is likely to occur, a potential land-use change (PLUC) map, is produced using
a spatial model such as GEOMOD that uses the key drivers from step one. Then rates of deforestation are projected over a 10-year
baseline period based on one of the three models. Using the PLUC maps, projected rates of deforestation, and carbon stock
estimates, baseline projections are developed that can be used for project GHG accounting and crediting purposes. In the final
step, at an agreed interval (e.g., about 10 years), the baseline assumptions about baseline drivers are re-assessed.
This step reviews the viability of the 10-year baseline in light of changes in one or more key baseline drivers (e.g., new
roads, new communities, new protected area, etc.). The potential land-use change map and estimates of rates of deforestation
could be re-done at the agreed interval, allowing the deforestation rates and changes in spatial drivers to be incorporated
into a defense of the existing baseline, or the derivation of a new baseline projection.
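The rate-extrapolation step can be sketched in a few lines. This is a minimal, FAC-style illustration under assumed inputs (initial forest area, a constant annual deforestation rate, and a single carbon stock value, with the full stock assumed released on clearing); it is not the actual formulation of the FAC, LUCS, or GEOMOD models:

```python
# Minimal sketch of a 10-year deforestation baseline: extrapolate a constant
# annual deforestation rate and convert projected forest loss to carbon
# emissions. All inputs are hypothetical, for illustration only.

def project_baseline(forest_area_ha, annual_rate, carbon_t_per_ha, years=10):
    """Return (forest area by year, cumulative carbon emitted in tonnes)."""
    areas, emitted = [forest_area_ha], 0.0
    for _ in range(years):
        loss = areas[-1] * annual_rate        # hectares cleared this year
        emitted += loss * carbon_t_per_ha     # assume full stock released
        areas.append(areas[-1] - loss)
    return areas, emitted

# 100,000 ha, 2% annual loss, 150 t C/ha (all assumed values).
areas, c_emitted = project_baseline(100_000, 0.02, 150)
```

A spatially explicit model such as GEOMOD adds the "where" on top of this "how much": the projected area loss is allocated to the grid cells ranked most vulnerable by the driver maps, producing the PLUC map described above.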
Twenty-two pesticides and metabolites, selected on the basis of a regional priority list, were surveyed in surface river waters by high-performance liquid chromatography coupled in tandem with UV diode-array detection and mass spectrometry, after an off-line pre-concentration step. Pesticide concentrations ranged between 0.07 and 4.8 µg/l, depending on the compound and sampling period. Analytical results were linked to the environmental risk of the pesticides, evaluated by their System Investigation of Risk by Integration of Score (SIRIS) rank.
This research focused on the use of sonication to destroy surfactants in industrial wastewaters, whose surface-tension properties interfere with traditional water treatment processes. We investigated the sonochemical destruction of surfactants and a chelating agent to understand the release of metals from surfactants during sonication. In addition, the effects of the physical properties of surfactants and of ultrasonic frequency were investigated to gain an understanding of the factors affecting degradation. Sonochemical degradation of surfactants was observed to be more effective than that of nonsurfactant compounds. In addition, as concentration is increased, the degradation rate constant does not decrease as significantly as with nonsurfactant compounds in the near-field acoustical processor reactor. Degradation of metal complexes is not as effective as degradation in the absence of the metal; however, this is likely an artifact of the model complexing agent used. Degradation of surfactant metal complexes is expected to be faster, as they will accumulate at the hot bubble interface, significantly increasing ligand-exchange kinetics and thus degradation of the complex.
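Degradation rate constants like those compared above are typically obtained from pseudo-first-order fits to concentration-time data. A minimal sketch follows; the concentrations and sonication times are invented for illustration and are not measurements from this study:

```python
import math

# Pseudo-first-order kinetics: ln(C/C0) = -k*t, so k is the negative
# least-squares slope of ln(C/C0) versus t. Data below are hypothetical.
times = [0.0, 10.0, 20.0, 30.0]     # minutes of sonication
conc  = [100.0, 74.0, 55.0, 41.0]   # surfactant concentration, arbitrary units

def first_order_k(times, conc):
    """Least-squares slope of ln(C/C0) vs t, returned as a positive k."""
    y = [math.log(c / conc[0]) for c in conc]
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(y) / n
    slope = (sum((t - t_mean) * (yi - y_mean) for t, yi in zip(times, y))
             / sum((t - t_mean) ** 2 for t in times))
    return -slope

k = first_order_k(times, conc)      # roughly 0.03 per minute for these data
```

Comparing such fitted k values across surfactant concentrations, frequencies, and the presence or absence of a complexed metal is how statements like "the rate constant does not decrease as significantly" are quantified.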
The Clean Air Act identifies 189 hazardous air pollutants (HAPs), or "air toxics," associated with a wide range of adverse human health effects. The U.S. Environmental Protection Agency has conducted a modeling study with the Assessment System for Population Exposure Nationwide (ASPEN) to gain a greater understanding of the spatial distribution of concentrations of these HAPs resulting from contributions of multiple emission sources. The study estimates year 1990 long-term outdoor concentrations of 148 air toxics for each census tract in the continental United States, utilizing a Gaussian air dispersion modeling approach. Ratios of median national modeled concentrations to estimated emissions indicate that emission totals without consideration of emission source type can be a misleading indicator of air quality. The results also indicate priorities for improvements in modeling methodology and emissions identification. Model performance evaluation suggests a tendency for underprediction of observed concentrations, which is likely due, at least in part, to a number of limitations of the Gaussian modeling formulation. Emissions estimates for HAPs have a high degree of uncertainty and contribute to discrepancies between modeled and monitored concentration estimates. The model's ranking of concentrations among monitoring sites is reasonably good for most of the gaseous HAPs evaluated, with ranking accuracy ranging from 66 to 100%.
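The Gaussian dispersion approach referenced above can be sketched with the standard steady-state plume equation for a ground-level receptor on the plume centerline. The emission rate, wind speed, stack height, and dispersion coefficients below are assumed values for illustration, not ASPEN's inputs or formulation:

```python
import math

def plume_centerline_conc(Q, u, sigma_y, sigma_z, H):
    """Ground-level, centerline concentration (g/m^3) from a point source.

    Standard Gaussian plume form (ground reflection included):
        C = Q / (pi * u * sy * sz) * exp(-H**2 / (2 * sz**2))
    Q: emission rate (g/s); u: wind speed (m/s); H: effective stack
    height (m); sigma_y, sigma_z: lateral and vertical dispersion
    coefficients (m) at the receptor's downwind distance.
    """
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-H**2 / (2 * sigma_z**2)))

# Illustrative values only: 10 g/s source, 4 m/s wind, 50 m stack.
c = plume_centerline_conc(Q=10.0, u=4.0, sigma_y=80.0, sigma_z=40.0, H=50.0)
```

Summing such contributions over many sources per receptor, with stability-dependent dispersion coefficients, is the general shape of the multi-source Gaussian approach; the underprediction tendency noted above reflects limitations of this formulation (e.g., terrain and calm-wind conditions).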
Objective: Despite advances in vehicle safety systems, motor vehicle crashes continue to cause ankle fractures. This study attempts to provide insight into the mechanisms of injury and to identify the at-risk population groups.
Methods: A study was made of ankle fracture patients treated at an urban level 1 trauma center following motor vehicle crashes, with a concurrent analysis of a nationally representative crash data set. The national data set analysis focused on ankle fractures in drivers involved in frontal crashes. Statistical analysis was applied to the national data set to identify factors associated with fracture risk.
Results: Malleolar fractures occurred most frequently in the driver's right foot due to pedal interaction. The majority of complex/open fractures occurred in the left foot due to interaction with the vehicle floor. These fractures occurred in association with a femoral fracture, but their broad injury pattern suggests a range of fracture causation mechanisms. The statistical analysis indicated that the risk of fracture increased with increasing driver body mass index (BMI) and age.
Conclusions: Efforts to reduce the risk of driver ankle injury should focus on right-foot and pedal interaction. The range of injury patterns identified here suggests that efforts to minimize driver ankle fracture risk will likely need to consider injury tolerances for flexion, pronation/supination, and axial loading in order to capture the full range of injury mechanisms. In the clinical environment, physicians examining drivers after a frontal crash should consider those who are older or obese, or who have a severe femoral injury without concurrent head injury, as highly suspicious for an ankle injury.
Information on flood inundation extent is important for understanding societal exposure, water storage volumes, flood wave attenuation, future flood hazard, and other variables. A number of organizations now provide flood inundation maps based on satellite remote sensing. These data products can efficiently and accurately provide the areal extent of a flood event, but they do not provide floodwater depth, an important attribute for first responders and damage assessment. Here we present a new methodology and a GIS-based tool, the Floodwater Depth Estimation Tool (FwDET), for estimating floodwater depth based solely on an inundation map and a digital elevation model (DEM). We compare the FwDET results against water depth maps derived from hydraulic simulation of two flood events: a large-scale event for which we use medium-resolution (10 m) input layers, and a small-scale event for which we use high-resolution (LiDAR; 1 m) input. Further testing is performed for two inundation maps with a number of challenging features, including a narrow valley, a large reservoir, and an urban setting. The results show that FwDET can accurately calculate floodwater depth for diverse flooding scenarios, but that it exhibits considerable bias in locations where the inundation extent does not align well with the DEM. In these locations, manual adjustment or higher-spatial-resolution input is required.
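The core idea, taking the DEM elevation of the nearest flood-boundary cell as the local water-surface elevation and subtracting the ground elevation, can be sketched on a toy grid. This is a simplified illustration of the depth-from-boundary concept under invented data, not FwDET's actual implementation:

```python
import math

# Toy grid: for each flooded cell, find the nearest cell on the inundation
# boundary, treat its DEM elevation as the water surface, and compute
# depth = water surface - ground. DEM and flood mask are invented.
dem = [
    [5.0, 4.0, 3.0, 4.0],
    [5.0, 3.5, 2.5, 3.5],
    [5.0, 4.0, 3.0, 4.0],
]
flooded = [
    [False, True, True, True],
    [False, True, True, True],
    [False, True, True, True],
]

def boundary_cells(flooded):
    """Flooded cells with at least one dry 4-neighbour (or a grid edge)."""
    rows, cols = len(flooded), len(flooded[0])
    cells = []
    for r in range(rows):
        for c in range(cols):
            if not flooded[r][c]:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if not (0 <= rr < rows and 0 <= cc < cols) or not flooded[rr][cc]:
                    cells.append((r, c))
                    break
    return cells

def depth_map(dem, flooded):
    """Depth for every flooded cell; boundary cells get depth ~0."""
    bcells = boundary_cells(flooded)
    depths = [[0.0] * len(row) for row in dem]
    for r in range(len(dem)):
        for c in range(len(dem[0])):
            if flooded[r][c]:
                br, bc = min(bcells, key=lambda p: math.hypot(p[0] - r, p[1] - c))
                depths[r][c] = max(dem[br][bc] - dem[r][c], 0.0)
    return depths

depths = depth_map(dem, flooded)   # interior low spot gets positive depth
```

The bias mode described above also falls out of this sketch: if the mapped flood edge sits on DEM cells whose elevations do not represent the true water edge, every depth referenced to that boundary inherits the error.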
Knowledge of appropriate behaviour during an earthquake is crucial for preventing injury and loss of life. The Israeli Home Front Command conducts a yearly earthquake education programme in all Israeli schools, using three types of educational interventions: lectures, drills, and a combination of the two. The aim of this study was to evaluate the effectiveness of these interventions in providing students with knowledge. We distributed a questionnaire to 2,648 children from the 5th and 6th grades in 120 schools nationwide. Knowledge scores for both the 5th and 6th grades were higher, regardless of the type of intervention, than in the non-exposure group. A combined intervention of lectures and drills resulted in the highest knowledge scores. Our findings suggest that, for the age group studied, a combination of lectures and drills is likely to best prepare students for how to behave in the event of an earthquake.
Not as much abatement as has been presumed. Smog check programs aim to curb tailpipe emissions from in-use vehicles by requiring repairs whenever emissions, measured at regular time intervals, exceed a certain threshold. Using data from California, we estimate that on average 41% of the initial emissions abatement from repairs is lost by the time of the subsequent inspection, normally two years later. Our estimates imply that the cost per pound of pollution avoided is an order of magnitude greater for smog check repairs than alternative policies such as new-vehicle standards or emissions trading among industrial point sources.
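The 41% decay figure translates directly into an effective cost per pound of abatement. A back-of-the-envelope sketch follows; the repair cost and per-repair abatement are invented, and the linear-decay assumption is ours, with only the 41% share taken from the abstract:

```python
# If a share of the initial emissions abatement decays away by the next
# inspection, the abatement actually delivered over the cycle is smaller
# than the initial reduction, and cost per pound rises accordingly.
# Repair cost and initial abatement below are hypothetical.

def effective_cost_per_lb(repair_cost, initial_abatement_lb, share_lost):
    """Cost per pound, assuming abatement decays linearly to
    (1 - share_lost) of its initial value over the inspection cycle."""
    avg_retained = 1.0 - share_lost / 2.0   # mean retention under linear decay
    return repair_cost / (initial_abatement_lb * avg_retained)

naive = effective_cost_per_lb(300.0, 100.0, 0.0)        # no decay assumed
with_decay = effective_cost_per_lb(300.0, 100.0, 0.41)  # 41% lost by next test
```

Even this simple adjustment shows why ignoring repair decay overstates a program's cost-effectiveness; the order-of-magnitude gap reported above comes from the paper's full estimates, not from this sketch.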