Similar Articles
A total of 20 similar articles were retrieved (search time: 421 ms).
1.
Forshay KJ, Johnson PT, Stock M, Peñalva C, Dodson SI. Ecology, 2008, 89(10): 2692-2699
When parasitic infections are severe or highly prevalent among prey, a significant component of the predator's diet may consist of parasitized hosts. However, despite the ubiquity of parasites in most food webs, comparisons of the nutritional quality of prey as a function of infection status are largely absent. We measured the nutritional consequences of chytridiomycete infections in Daphnia, which achieve high prevalence in lake ecosystems (>80%), and tested the hypothesis that Daphnia pulicaria infected with Polycaryum laeve are diminished in food quality relative to uninfected hosts. Compared with uninfected adults, infected individuals were smaller, contained less nitrogen and phosphorus, and were lower in several important fatty acids. Infected zooplankton had significantly shorter carapace lengths (8%) and lower mass (8-20%) than uninfected individuals. Parasitized animals contained significantly less phosphorus (16-18% less by dry mass) and nitrogen (4-6% less) than did healthy individuals. Infected individuals also contained 26-34% less saturated fatty acid and 31-42% less docosahexaenoic acid, an essential fatty acid that is typically low in cladocerans but critical to fish growth. Our results suggest that naturally occurring levels of chytrid infections in D. pulicaria populations reduce the quality of food available to secondary consumers, including planktivorous fishes, with potentially important effects for lake food webs.

2.
Threat-sensitive decision-making might be changed in response to a parasitic infection that impairs future reproduction. Infected animals should take more risk to gain energy to speed up their growth to achieve early reproduction and/or to strengthen their immune response. To avoid correlational evidence, we experimentally infected and sham-infected randomly selected immature three-spined sticklebacks with the cestode Schistocephalus solidus. For 7 weeks we determined the threat-sensitive foraging decisions and growth of individual sticklebacks in the presence of a live pike (Esox lucius). The experimenters were blind with respect to the infection status of the fish. In contrast to previous studies, our recently infected fish should have been almost unconstrained by the parasite and thus have been able to adopt an appropriate life history strategy. We found a strong predator effect for both infected and uninfected fish: the sticklebacks’ risk-sensitive foraging strategy resulted in significantly reduced growth under predation risk. Infected fish did not grow significantly faster under predation risk than uninfected fish. Since infected fish consumed much less prey in the presence of the predator than did infected fish in its absence, they obviously did not use the opportunity to maximize their growth rate to reach reproduction before the parasite impairs it. Received: 21 June 1999 / Revised: 27 November 1999 / Accepted: 5 September 2000

3.
The distribution of organisms at small spatial scales and their use of microhabitats are important determinants of species-level interactions. In many ubiquitous rocky shore invertebrates, use of intertidal microhabitats has previously been studied in relation to thermal and desiccation stress, ontogenetic changes and predation. Here, the effects of parasitism on the microhabitat use and movement of two New Zealand littorinid hosts, Austrolittorina antipodum and A. cincta, were investigated by examining the effect of infection by a philophthalmid trematode parasite. Alterations in microhabitat use and movement of infected versus uninfected individuals were found during both field mark-recapture and laboratory experiments, carried out from August 2012 to March 2013 in Otago Harbour, New Zealand (45.83°S, 170.64°E). Specifically, a trend towards increased use of rock surface habitats and a reduction in the distance moved by infected snails was observed. In addition, decreased downward movement was observed for some infected individuals. This alteration in individual distribution is likely to increase the availability of infected individuals to predators, hence aiding the successful transmission of the trematode parasite. These results highlight the importance of including parasitism as a biotic factor in studies of gastropod movement and spatial distribution.

4.
Animals face trade-offs between predation risk and foraging success depending on their location in the landscape; for example, individuals that remain near a common shelter may be safe from predation but incur stronger competition for resources. Despite a long tradition of theoretical exploration of the relationships among foraging success, conspecific competition, predation risk, and population distribution in a heterogeneous environment, the scenario we describe here has not been explored theoretically. We construct a model of habitat use rules to predict the distribution of a local population (prey sharing a common shelter and foraging across surrounding habitats). Our model describes realized habitat quality as a ratio of density- and location-dependent mortality to density-dependent growth. We explore how the prey distribution around a shelter is expected to change as the parameters governing the strength of density dependence, landscape characteristics, and local abundance vary. Within the range of parameters where prey spend some time away from shelter but remain site-attached, the prey density decreases away from shelter. As the distance at which prey react to predators increases, the population range generally increases. At intermediate reaction distances, however, increases in the reaction distance lead to decreases in the maximum foraging distance because of increased evenness in the population distribution. As total abundance increases, the population range increases, average population density increases, and realized quality decreases. The magnitude of these changes differs in, for example, ‘high-’ and ‘low-visibility’ landscapes where prey can detect predators at different distances.
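The central quantity in this model, realized habitat quality as a ratio of density- and location-dependent mortality to density-dependent growth, can be illustrated with a small numerical sketch. The functional forms and parameter values below are illustrative assumptions only, not those of the original model.

```python
import numpy as np

# Minimal sketch of the habitat-quality ratio described above: realized
# quality = (density- and location-dependent mortality) / (density-dependent
# growth). All functions and parameter values are assumed for illustration.

def mortality(density, distance, base=0.05, risk_slope=0.02, crowding=0.01):
    """Per-capita mortality rises with distance from the shelter and with density."""
    return base + risk_slope * distance + crowding * density

def growth(density, g_max=1.0, half_sat=5.0):
    """Per-capita growth declines with local density (competition for food)."""
    return g_max / (1.0 + density / half_sat)

def realized_quality(density, distance):
    """Higher ratio = worse habitat (more mortality per unit growth)."""
    return mortality(density, distance) / growth(density)

# Evaluate quality along a transect away from the shelter for a declining
# density profile, mimicking prey that concentrate near the refuge.
distances = np.linspace(0.0, 20.0, 11)
densities = 10.0 * np.exp(-0.15 * distances)
for d, n in zip(distances, densities):
    print(f"distance {d:5.1f}  density {n:5.2f}  quality {realized_quality(n, d):.3f}")
```

Varying the risk slope, crowding term, or total abundance in such a sketch is the kind of parameter exploration the abstract describes.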

5.
Shelter competition is uncommon among social animals, as is the case among normally gregarious Caribbean spiny lobsters (Panulirus argus). However, healthy lobsters avoid sheltering with conspecifics infected by a lethal pathogenic virus, PaV1. These contradictory behaviors have implications for shelter use and survival, especially in areas where shelter is limited. In laboratory experiments, we tested shelter competition between paired healthy and diseased juvenile lobsters in shelter-limited mesocosms. Neither healthy nor diseased lobsters dominated access to shelters, but lobsters shared shelter less often when diseased lobsters were present relative to controls with two healthy lobsters. We hypothesized that exclusion of juvenile lobsters from shelter results in increased mortality from predation, especially for the more lethargic, infected individuals. Field tethering trials revealed that predation was indeed higher on infected individuals and on all tethered lobsters deprived of shelter. We then tested in mesocosm experiments how the contrasting risks of predation versus infection by a lethal pathogen influence shelter use. Lobsters were offered a choice of an empty shelter or one containing a diseased lobster in the presence of a predator (i.e., caged octopus) whose presence normally elicits shelter-seeking behavior, and these data were compared with a previous study where the predator was absent. Lobsters selected the empty shelter significantly more often despite the threat of predation, foregoing the protection of group defense in favor of reduced infection risk. These results offer striking evidence of how pathogenic diseases shape not only the behavior of social animals but also their use of shelters and risk of predation.

6.
Although there is ample evidence for the generality of foraging and predation trade-offs in aquatic systems, the evidence for terrestrial systems is less comprehensive. In this review, meta-analysis was used to analyze experiments on giving-up densities in terrestrial systems to evaluate the overall magnitude of the effect of predation risk on foraging behavior and the experimental conditions mediating that effect. Results indicate a large and significant decrease in foraging effort as a consequence of increased predation risk. The overall effect of predation on foraging did not differ between experiments conducted under natural and artificial conditions. Odor and live predators as correlates of predation risk had weaker and nonsignificant effects compared to habitat characteristics. The meta-analysis suggests that the effect of predation risk on foraging behavior in terrestrial systems is strongly dependent on the type of predation-risk cue used.
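For readers unfamiliar with how such a synthesis is computed, the sketch below runs a generic random-effects meta-analysis on standardized mean differences (Hedges' g). The per-study numbers are invented, and the abstract does not state which effect-size metric or estimator the review actually used.

```python
import numpy as np

# Generic random-effects meta-analysis on standardized mean differences
# (Hedges' g), the kind of summary an analysis of giving-up-density
# experiments might use. Study data below are hypothetical placeholders.

def hedges_g(mean_risk, mean_safe, sd_risk, sd_safe, n_risk, n_safe):
    """Bias-corrected standardized mean difference (risk vs. safe) and its variance."""
    sp = np.sqrt(((n_risk - 1) * sd_risk**2 + (n_safe - 1) * sd_safe**2)
                 / (n_risk + n_safe - 2))
    d = (mean_risk - mean_safe) / sp
    j = 1 - 3 / (4 * (n_risk + n_safe) - 9)          # small-sample correction
    var = (n_risk + n_safe) / (n_risk * n_safe) + d**2 / (2 * (n_risk + n_safe))
    return j * d, (j**2) * var

# Hypothetical per-study summaries:
# (mean_risk, mean_safe, sd_risk, sd_safe, n_risk, n_safe)
studies = [(2.1, 4.0, 1.0, 1.2, 12, 12),
           (1.5, 2.8, 0.8, 0.9, 20, 20),
           (3.0, 3.6, 1.1, 1.0, 15, 15)]
effects = np.array([hedges_g(*s) for s in studies])
g, v = effects[:, 0], effects[:, 1]

# DerSimonian-Laird between-study variance and the random-effects mean.
w = 1 / v
q = np.sum(w * (g - np.sum(w * g) / np.sum(w))**2)
tau2 = max(0.0, (q - (len(g) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (v + tau2)
print("random-effects mean g:", np.sum(w_re * g) / np.sum(w_re))
```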

7.
Although predators can affect the foraging behaviors of floral visitors, it is rarely known whether these top-down effects of predators cascade to plant fitness through trait-mediated interactions. In this study we manipulated artificial crab spiders on flowers of Rubus rosifolius to test the effects of predation risk on flower-visiting insects and the strength of trait-mediated indirect effects on plant fitness. In addition, we tested which predator traits (e.g., forelimbs, abdomen) are recognized and avoided by pollinators. Total visitation rate was higher for control flowers than for flowers with an artificial crab spider. In addition, flowers with a sphere (simulating a spider abdomen) were more frequently visited than those with forelimbs or the entire spider model. Furthermore, the presence of artificial spiders decreased individual seed set by 42% and fruit biomass by 50%. Our findings indicate that pollinators, mostly bees, recognize and avoid flowers with predation risk; forelimbs seem to be the predator trait recognized and avoided by hymenopterans. Additionally, predator avoidance by pollinators resulted in pollen limitation, thereby affecting some components of plant fitness (fruit biomass and seed number). Because most pollinator species that recognized predation risk visited many other plant species, trait-mediated indirect effects of spiders cascading down to plant fitness may be a common phenomenon in the Atlantic rainforest ecosystem.

8.
Behavior in eusocial insects likely reflects a long history of selection imposed by parasites and pathogens because the conditions of group living often favor the transmission of infection among nestmates. Yet relatively few studies have quantified the effects of parasites at both the level of individual colony members and that of colony success, making it difficult to assess the relative importance of different parasites to the behavioral ecology of their social insect hosts. Colonies of Polybia occidentalis, a Neotropical social wasp, are commonly infected by gregarines (Phylum Apicomplexa; Order Eugregarinida) during the wet season in Guanacaste, Costa Rica. To determine the effect of gregarine infection on individual workers in P. occidentalis, we measured foraging rates of marked wasps from colonies comprising both infected and uninfected individuals. To assess the effect of gregarines on colony success, we measured productivity and adult mortality rates in colonies with different levels of infection prevalence (proportion of adults infected). Foraging rates in marked individuals were negatively correlated with the intensity of gregarine infection. Infected colonies with high gregarine prevalence constructed nests with fewer brood cells per capita, produced less brood biomass per capita, and, surprisingly, experienced lower adult mortality rates than did uninfected or lightly infected colonies. These data strongly suggest that gregarine infection lowers foraging rates, thus reducing risk to foragers and, consequently, reducing adult mortality rates, while at the same time lowering per-capita input of materials and colony productivity. In infected colonies, queen populations were infected with a lower prevalence than were workers. Intra-colony infection prevalence decreased dramatically in the P. occidentalis population during the wet season. An erratum to this article can be found at

9.
Insect larvae increase in size by several orders of magnitude during development, making them more conspicuous to visually hunting predators. This change in predation pressure is likely to impose selection on larval anti-predator behaviour, and since the risk of detection is likely to decrease in darkness, the night may offer safer foraging opportunities to large individuals. However, forsaking day foraging reduces development rate and could be especially costly if prey are subjected to seasonal time stress. Here we test whether size-dependent risk and time constraints on feeding affect the foraging–predation risk trade-off expressed in the use of the diurnal–nocturnal period. We exposed larvae of one seasonal and one non-seasonal butterfly to different levels of seasonal time stress and time for diurnal–nocturnal feeding by rearing them in two photoperiods. In both species, diurnal foraging ceased at large sizes while nocturnal foraging remained constant or increased; thus larvae showed ontogenetic shifts in behaviour. Short night lengths forced small individuals to take higher risks and forage more during daytime, postponing the shift to strict night foraging until later in development. In the non-seasonal species, seasonal time stress had a small effect on development and the diurnal–nocturnal foraging mode. In contrast, in the seasonal species, time to pupation and the timing of the foraging shift were strongly affected. We argue that a large part of the observed variation in larval diurnal–nocturnal activity and resulting growth rates is explained by changes in the cost/benefit ratio of foraging mediated by size-dependent predation and time stress.

10.
Developmental instability, measured as fluctuating asymmetry (FA), is often used as a tool to measure stress and the overall quality of organisms. The use of FA assumes that control of symmetry during development is costly and that under stress the trajectory of development is disturbed, resulting in asymmetric morphologies. Amphibian emerging infectious diseases (EIDs), such as Ranavirus and chytrid fungus, have been implicated in several mortality events, which makes them stressors and allows for the study of FA. We analyzed nine populations of green frogs (Rana clamitans) for the presence or absence of Ranavirus and chytrid fungus. Individuals were measured to determine levels of FA in seven traits under the hypothesis that FA is more likely to be observed in individuals infected by the pathogens. Significantly higher levels of FA were found in individuals with Ranavirus compared with uninfected individuals across all populations and traits. We did not observe elevated FA in individuals infected with chytrid fungus for any of the traits measured. Additionally, we observed a significant association between Ranavirus infection and levels of FA in both males and females, which may indicate this viral disease affects both sexes during development. Altogether, our results indicate that some EIDs may have far-reaching and nonlethal effects on individual development and on populations harboring such diseases, and that FA can be used as a conservation tool to identify populations subject to such stress.
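Fluctuating asymmetry is commonly quantified from paired left-right measurements, for example as the mean absolute difference |R − L|, often divided by mean trait size to correct for scale. The sketch below shows one such index on invented measurements; the exact FA index used in the study is not specified in this abstract.

```python
import numpy as np

# One common fluctuating-asymmetry index: mean |R - L|, optionally
# size-corrected by dividing by trait size ((R + L) / 2). The trait values
# below are hypothetical, purely for illustration.

def fa_index(right, left, size_correct=True):
    right, left = np.asarray(right, float), np.asarray(left, float)
    asym = np.abs(right - left)
    if size_correct:
        asym = asym / ((right + left) / 2.0)
    return asym.mean()

# Hypothetical paired trait measurements (mm) for one infected and one
# uninfected frog across three bilateral traits.
infected   = fa_index(right=[21.4, 21.9, 22.1], left=[20.6, 21.0, 21.5])
uninfected = fa_index(right=[21.2, 21.7, 22.0], left=[21.1, 21.8, 21.9])
print(f"FA infected:   {infected:.4f}")
print(f"FA uninfected: {uninfected:.4f}")
```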

11.
The simultaneous presence of predators and a limited time for development imposes a conflict: accelerating growth under time constraints comes at the cost of higher predation risk mediated by increased foraging. The few studies that have addressed this tradeoff have dealt only with life history traits such as age and size at maturity. Physiological traits have largely been ignored in studies assessing the impact of environmental stressors, and it is largely unknown whether they respond independently of life history traits. Here, we studied the simultaneous effects of time constraints (as imposed by seasonality) and predation risk on immune defense, energy storage, and life history in lestid damselflies. As predicted by theory, larvae accelerated growth and development under time constraints, while the opposite occurred under predation risk. The activity of phenoloxidase, an important component of insect immunity, and investment in fat storage were reduced both under time constraints and in the presence of predators. These reductions were smaller when time constraints and predation risk were combined. This indicates that predators can induce sublethal costs linked to both life history and physiology in their prey, and that time constraints can independently reduce the impact of predator-induced changes in life history and physiology.

12.
Creel S. Ecology, 2011, 92(12): 2190-2195
Risk effects, or the costs of antipredator behavior, can comprise a large proportion of the total effect of predators on their prey. While empirical studies are accumulating to demonstrate the importance of risk effects, there is no general theory that predicts the relative importance of risk effects and direct predation. Working toward this general theory, researchers have shown that functional traits of predators (e.g., hunting modes) help to predict the importance of risk effects for ecosystem function. Here, I note that attributes of the predator, the prey, and the environment are all important in determining the strength of antipredator responses, and I develop hypotheses for the ways that prey functional traits might influence the magnitude of risk effects. In particular, I consider the following attributes of prey: group size and dilution of direct predation risk, the degree of foraging specialization, body mass, and the degree to which direct predation is additive vs. compensatory. Strong tests of these hypotheses will require continued development of methods to identify and quantify the fitness costs of antipredator responses in wild populations.

13.
In the absence of predators, pollinators can often maximize their foraging success by visiting the most rewarding flowers. However, if predators use those highly rewarding flowers to locate their prey, pollinators may benefit from changing their foraging preferences to accept less rewarding flowers. Previous studies have shown that some predators, such as crab spiders, indeed hunt preferentially on the most pollinator-attractive flowers. In order to determine whether predation risk can alter pollinator preferences, we conducted laboratory experiments on the foraging behavior of bumble bees (Bombus impatiens) when predation risk was associated with a particular reward level (measured here as sugar concentration). Bees foraged in arenas containing a choice of a high-reward and a low-reward artificial flower. On a bee’s first foraging trip, it was either lightly squeezed with forceps, to simulate a crab spider attack, or was allowed to forage safely. The foragers’ subsequent visits were recorded for between 1 and 4 h without any further simulated attacks. Compared to bees that foraged safely, bees that experienced a simulated attack on a low-reward artificial flower had reduced foraging activity. However, bees attacked on a high-reward artificial flower were more likely to visit low-reward artificial flowers on subsequent foraging trips. Forager body size, which is thought to affect vulnerability to capture by predators, did not have an effect on response to an attack. Predation risk can thus alter pollinator foraging behavior in ways that influence the number and reward level of flowers that are visited.

14.
Urban MC. Ecology, 2007, 88(10): 2587-2597
Growth is a critical ecological trait because it can determine population demography, evolution, and community interactions. Predation risk frequently induces decreased foraging and slow growth in prey. However, such strategies may not always be favored when prey can outgrow a predator's hunting ability. At the same time, a growing gape-limited predator broadens its hunting ability through time by expanding its gape and thereby creates a moving size refuge for susceptible prey. Here, I explore the ramifications of growing gape-limited predators for adaptive prey growth. A discrete demographic model for optimal foraging/growth strategies was derived under the realistic scenario of gape-limited and gape-unconstrained predation threats. Analytic and numerical results demonstrate a novel fitness minimum just above the growth rate of the gape-limited predator. This local fitness minimum separates a slow growth strategy that forages infrequently and accumulates low but constant predation risk from a fast growth strategy that forages frequently and experiences a high early predation risk in return for lower future predation risk and enhanced fecundity. Slow strategies generally were advantageous in communities dominated by gape-unconstrained predators whereas fast strategies were advantageous in gape-limited predator communities. Results were sensitive to the assumed relationships between prey size and fecundity and between prey growth and predation risk. Predator growth increased the parameter space favoring fast prey strategies. The model makes the testable predictions that prey should not grow at the same rate as their gape-limited predator and generally should grow faster than the fastest growing gape-limited predator. By focusing on predator constraints on prey capture, these results integrate the ecological and evolutionary implications of prey growth in diverse predator communities and offer an explanation for empirical growth patterns previously viewed as anomalies.
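A toy discrete model can reproduce the qualitative result described here: a local fitness minimum for prey growth rates just above the gape growth of the predator. The sketch below is a deliberately simplified stand-in, with all functional forms and parameter values assumed for illustration rather than taken from the published model.

```python
import numpy as np

# Toy discrete growth model: prey fitness = survival to maturity x fecundity,
# with one gape-unconstrained predator and one gape-limited predator whose
# gape widens each time step. Per-step risk scales with foraging effort
# (equated here with growth rate). Every function and parameter value is an
# illustrative assumption, not the published model.

def fitness(prey_growth, size0=0.5, size_mature=10.0,
            gape0=1.0, gape_growth=0.4,
            risk_unconstrained=0.01, risk_gape=0.3):
    size, gape, survival, t = size0, gape0, 1.0, 0
    while size < size_mature and t < 500:
        vulnerable = size <= gape                    # prey still fits the predator's gape
        risk = risk_unconstrained * prey_growth      # gape-unconstrained predators
        if vulnerable:
            risk += risk_gape * prey_growth ** 2     # gape-limited predator, foraging-dependent
        survival *= 1.0 - min(risk, 0.99)
        size += prey_growth
        gape += gape_growth
        t += 1
    fecundity = size_mature - 0.02 * t               # earlier maturity -> slightly more reproduction
    return survival * max(fecundity, 0.0)

# Fitness dips for growth rates just above the predator's gape growth (0.4).
for g in [0.2, 0.3, 0.42, 0.5, 0.7, 1.0]:
    print(f"prey growth {g:4.2f}  fitness {fitness(g):6.3f}")
```

With these assumed parameters, fitness is moderate for slow growers (low foraging exposure), lowest near a growth rate just above the gape growth of 0.4, and highest for fast growers that quickly outgrow the gape, mirroring the pattern the abstract describes.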

15.
Chytridiomycosis is linked to the worldwide decline of amphibians, yet little is known about the demographic effects of the disease. We collected capture–recapture data on three populations of boreal toads (Bufo boreas [Bufo = Anaxyrus]) in the Rocky Mountains (U.S.A.). Two of the populations were infected with chytridiomycosis and one was not. We examined the effect of the presence of amphibian chytrid fungus (Batrachochytrium dendrobatidis [Bd]; the agent of chytridiomycosis) on survival probability and population growth rate. Toads that were infected with Bd had lower average annual survival probability than uninfected individuals at sites where Bd was detected, which suggests chytridiomycosis may reduce survival by 31–42% in wild boreal toads. Toads that were negative for Bd at infected sites had survival probabilities comparable to toads at the uninfected site. Evidence that environmental covariates (particularly cold temperatures during the breeding season) influenced toad survival was weak. The number of individuals in diseased populations declined by 5–7%/year over the 6 years of the study, whereas the uninfected population had comparatively stable population growth. Our data suggest that the presence of Bd in these toad populations is not causing rapid population declines. Rather, chytridiomycosis appears to be functioning as a low-level, chronic disease whereby some infected individuals survive but the overall population effects are still negative. Our results show that some amphibian populations may be coexisting with Bd and highlight the importance of quantitative assessments of survival in diseased animal populations.
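The reported ranges can be translated into rough numbers with a little arithmetic. In the sketch below, the uninfected survival probability is an invented placeholder used only to show how a 31–42% survival reduction and a 5–7% annual decline play out; the study's actual estimates are in the paper, not this abstract.

```python
# Back-of-the-envelope arithmetic using the ranges reported above. The
# uninfected survival value is an assumed placeholder, not a published estimate.

phi_uninfected = 0.60                                   # assumed annual survival, Bd-negative toads
phi_infected_low = phi_uninfected * (1 - 0.42)          # 42% relative reduction
phi_infected_high = phi_uninfected * (1 - 0.31)         # 31% relative reduction
print(f"implied infected survival: {phi_infected_low:.2f}-{phi_infected_high:.2f}")

# Projected abundance after 6 years of a 5-7% annual decline (lambda = 0.93-0.95).
n0 = 100
for lam in (0.93, 0.95):
    print(f"lambda {lam}: N after 6 yr = {n0 * lam**6:.1f}")
```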

16.
Malaria and risk of predation: a comparative study of birds
Møller AP, Nielsen JT. Ecology, 2007, 88(4): 871-881
Predators have been hypothesized to prey on individuals in a poor state of health, although this hypothesis has only rarely been examined. We used extensive data on prey abundance and availability from two long-term studies of the European Sparrowhawk (Accipiter nisus) and the Eurasian Goshawk (Accipiter gentilis) to quantify the relationship between the predation risk of different prey species and infection with malaria and other protozoan blood parasites. Using a total of 31 745 prey individuals of 65 species of birds from 1709 nests during 1977-1997 for the Sparrowhawk and a total of 21 818 prey individuals of 76 species of birds from 1480 nests during 1977-2004 for the Goshawk, we show that prey species with a high prevalence of blood parasites had higher risks of predation than species with a low prevalence. This was also the case when a number of confounding variables, such as body mass, breeding sociality, sexual dichromatism, and similarity among species in risk of predation due to common descent, were controlled for in comparative analyses of standardized linear contrasts. Prevalences of the genera Haemoproteus, Leucocytozoon, Plasmodium, and Trypanosoma were correlated with each other, and we partitioned out the independent effects of the different protozoan genera on predation risk in comparative analyses. Prevalence of Haemoproteus, Leucocytozoon, and Plasmodium accounted for interspecific variation in predation risk for the two raptors. These findings suggest that predation is an important factor affecting parasite-host dynamics because predators tend to prey on hosts that are more likely to be infected, thereby reducing the transmission success of parasites. Furthermore, this study demonstrates that protozoan infections are a common cause of death for hosts, mediated by increased risk of predation.

17.
Predation risk is amongst the most pervasive selective pressures influencing behaviour, and animals have been repeatedly shown to trade off foraging success for safety. We examined the nature of this trade-off in cleaning symbioses amongst Caribbean coral reef fishes. We predicted that cleaning gobies (Elacatinus evelynae and Elacatinus prochilos) should prefer fish clients that pose a low risk of predation (e.g. herbivores) over clients that may have more ectoparasites but pose a higher risk (e.g. piscivores). Our field observations revealed that cleaners did preferentially clean client species with more parasites, but predatory and non-predatory clients had similar ectoparasite loads. Despite the lack of a foraging advantage for inspecting predators, cleaners did not avoid risky clients. On the contrary, a larger proportion of visiting predators than non-predators was inspected, gobies initiated more interactions with predatory clients, and predators were attended to immediately upon arrival at cleaning stations. This preferential treatment of dangerous clients may allow the rapid identification of cleaners as non-prey items, or may be due to the effect of predators on the rest of the cleaners’ clientele, which avoided cleaning stations whilst predators were present. Dealing with potentially risky clients may allow gobies to regain access to their main food source: non-predatory clients.

18.
Studies that focus on single predator-prey interactions can be inadequate for understanding antipredator responses in multi-predator systems. Yet there is still a general lack of information about the strategies prey use to minimize predation risk from multiple predators at the landscape level. Here we examined the distribution of seven African ungulate species in the fenced Karongwe Game Reserve (KGR), South Africa, as a function of predation risk from all large carnivore species (lion, leopard, cheetah, African wild dog, and spotted hyena). Using observed kill data, we generated ungulate-specific predictions of relative predation risk and of the riskiness of habitats. To determine how ungulates minimize predation risk at the landscape level, we explicitly tested five hypotheses consisting of strategies that reduce the probability of encountering predators and the probability of being killed. All ungulate species avoided risky habitats, and most selected safer habitats, thus reducing their probability of being killed. To reduce the probability of encountering predators, most of the smaller prey species (impala, warthog, waterbuck, kudu) avoided the space use of all predators, while the larger species (wildebeest, zebra, giraffe) only avoided areas where lion and leopard space use were high. The strength of avoidance of predator space use generally did not correspond to the relative predation threat from those predators. Instead, ungulates used a simpler behavioral rule of avoiding the activity areas of sit-and-pursue predators (lion and leopard) but not those of cursorial predators (cheetah and African wild dog). In general, selection and avoidance of habitats was stronger than avoidance of predator activity areas. We expect similar decision rules to drive the distribution patterns of ungulates in other African savannas and in other multi-predator systems, especially where predators differ in their hunting modes.
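One simple way to convert kill locations into habitat "riskiness" is a selection-ratio style calculation: the proportion of kills in a habitat divided by that habitat's proportional availability. The sketch below uses invented counts; the study's actual risk model may be more elaborate.

```python
# Minimal sketch of a Manly-type selection ratio applied to kill data:
# (proportion of kills in a habitat) / (proportional availability of that
# habitat). Values > 1 flag habitats riskier than expected by chance.
# Counts and habitat categories below are hypothetical.

kills_by_habitat = {"open grassland": 12, "thicket": 30, "riverine": 18}
area_fraction    = {"open grassland": 0.50, "thicket": 0.20, "riverine": 0.30}

total_kills = sum(kills_by_habitat.values())
for habitat, kills in kills_by_habitat.items():
    risk_ratio = (kills / total_kills) / area_fraction[habitat]
    print(f"{habitat:15s} relative risk = {risk_ratio:.2f}")
```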

19.
Carnivore predation on livestock is a complex management and policy challenge, yet it is also intrinsically an ecological interaction between predators and prey. Human–wildlife interactions occur in socioecological systems in which human and environmental processes are closely linked. However, underlying human–wildlife conflict and key to unpacking its complexity are concrete and identifiable ecological mechanisms that lead to predation events. To better understand how ecological theory accords with interactions between wild predators and domestic prey, we developed a framework to describe ecological drivers of predation on livestock. We based this framework on foundational ecological theory and current research on interactions between predators and domestic prey. We used this framework to examine ecological mechanisms (e.g., density-mediated effects, behaviorally mediated effects, and optimal foraging theory) through which specific management interventions operate, and we analyzed the ecological determinants of failure and success of management interventions in 3 case studies: snow leopards (Panthera uncia), wolves (Canis lupus), and cougars (Puma concolor). The varied, context-dependent successes and failures of the management interventions in these case studies demonstrated the utility of using an ecological framework to ground research and management of carnivore–livestock conflict. Mitigation of human–wildlife conflict appears to require an understanding of how fundamental ecological theories work within domestic predator–prey systems.

20.
Urbanization decreases species diversity, but it increases the abundance of certain species with high tolerance to human activities. The safe-habitat hypothesis explains this pattern through a decrease in the abundance of native predators, which reduces predation risk in urban habitats. However, this hypothesis does not consider the potential negative effects of human-associated disturbance (e.g., pedestrians, dogs, cats). Our goal was to assess the degree of perceived predation risk in house finches (Carpodacus mexicanus) through field studies and semi-natural experiments in areas with different levels of urbanization, using multiple indicators of risk (flock size, flight initiation distance, vigilance, and foraging behavior). Field studies showed that house finches in more urbanized habitats had a greater tendency to flock as population density increased and flushed at larger distances than in less urbanized habitats. In the semi-natural experiment, we found that individuals spent a greater proportion of time in the refuge patch and increased their instantaneous pecking rate in the more urbanized habitat with pedestrians, probably to compensate for the reduced foraging time. Vigilance parameters were influenced in different ways depending on habitat type and distance to flock mates. Our results suggest that house finches may perceive highly urbanized habitats as more dangerous, despite the lower number of native predators. This could be due to the presence of human activities, which could increase risk or modify the ability to detect predators. House finches seem to adapt to the urban environment through different behavioral strategies that minimize risk.

