Similar Articles (20 results)
1.
Theory predicts that individuals at the periphery of a group should be at higher risk than their more central conspecifics, since they would be the first to be encountered by an approaching terrestrial predator. As a result, peripheral individuals are expected to display higher vigilance levels. However, the role of conspecifics in this “edge effect” may have been previously overlooked, and the possible role of within-group competition needs to be taken into account. Vigilance behavior in relation to within-group spatial position was studied in impalas (Aepyceros melampus) feeding on standardized patches. We also controlled for food distribution in order to accurately define a “central” as opposed to a “peripheral” position. Our data clearly supported an edge effect, with peripheral individuals spending more time vigilant than their central conspecifics. Data on social interactions suggest that it was easier for a foraging individual to defend its feeding patch with its head lowered, and that more interactions occurred at the center of the group. Together, these results indicate that central foragers may reduce their vigilance rates in response to increased competition. Disentangling how the effects of competition and predation risk contribute to the edge effect requires further investigation.

2.
Predation risk has been shown to alter various behaviours in prey. Risk alters activity, habitat use, and foraging, and weight loss can be a consequence. In mammals, studies of physiological measures other than weight that are affected by predation risk are rare. In two separate laboratory experiments, we studied foraging, hoarding behaviour, and stress, measured non-invasively from faeces, in the bank vole (Clethrionomys glareolus), a common boreal rodent. Voles were exposed to predation risk using odour of the least weasel (Mustela nivalis nivalis); distilled water served as a control. In the first experiment, we found that foraging effort, measured as sunflower seeds taken from seed trays filled with sand, was significantly lower in trays scented with weasel odour. Both immediate consumption of seeds and hoarding were affected negatively by the weasel odour. Females hoarded significantly more than males in autumn. In the second experiment, the negative effect of weasel odour on foraging was consistent over a 3-day experiment, but the strongest effect was observed in the first night. Foraging increased over the course of the experiment, which might reflect either energetic compensation during a longer period of risk, as predicted by the predation risk allocation hypothesis, or habituation to the odour-simulated risk. Despite decreased foraging under predation risk, stress, measured as corticosteroid metabolite concentration in vole faeces, was not affected by the weasel odour treatment. In conclusion, we were able to verify predation-risk-mediated changes in the foraging effort of bank voles, but no physiological stress response was detected non-invasively, probably due to great individual variation in the secretion of stress hormones.

3.
It is well known that the risk of predation affects prey decision making. However, few studies have examined the cues used by prey to assess this risk. Prey animals may use indirect environmental cues to assess predation hazard, since direct evaluation may be dangerous. I studied the assessment of predation risk, manipulated via environmental illumination level, and the trade-off between foraging and predation hazard avoidance in the nocturnal rodent Phyllotis darwini (Rodentia: Cricetidae). In experimental arenas I simulated dark and full-moon nights (which in nature correlate with low and high predation risk, respectively) and measured the immediate responses of animals to flyovers of a raptor model. Second, varying illumination only, I evaluated patch use, food consumption, central place foraging, and nocturnal variation in body weight. During flyover experiments, animals showed significantly more evasive reactions under full-moon illumination than in moonless conditions. In the patch use experiments, rodents significantly increased their giving-up density and decreased their total food consumption under moonlight. On dark nights, rodents normally fed in the food patch, but when illumination was high a large proportion became central place foragers. Moreover, the body weight of individuals decreased proportionately more during bright nights. These results strongly suggest that P. darwini uses the level of environmental illumination as a cue to the risk of being preyed upon and may sacrifice part of its energy return to avoid risky situations.

4.
Insect larvae increase in size by several orders of magnitude throughout development, making them more conspicuous to visually hunting predators. This change in predation pressure is likely to impose selection on larval anti-predator behaviour, and since the risk of detection is likely to decrease in darkness, the night may offer safer foraging opportunities to large individuals. However, forsaking day foraging reduces development rate and could be extra costly if prey are subject to seasonal time stress. Here we test whether size-dependent risk and time constraints on feeding affect the foraging–predation risk trade-off expressed through use of the diurnal–nocturnal period. We exposed larvae of one seasonal and one non-seasonal butterfly to different levels of seasonal time stress and time for diurnal–nocturnal feeding by rearing them in two photoperiods. In both species, diurnal foraging ceased at large sizes while nocturnal foraging remained constant or increased; thus larvae showed ontogenetic shifts in behaviour. Short night lengths forced small individuals to take higher risks and forage more during daytime, postponing the shift to strict night foraging until later in development. In the non-seasonal species, seasonal time stress had a small effect on development and on the diurnal–nocturnal foraging mode. In contrast, in the seasonal species, time to pupation and the timing of the foraging shift were strongly affected. We argue that a large part of the observed variation in larval diurnal–nocturnal activity and resulting growth rates is explained by changes in the cost/benefit ratio of foraging mediated by size-dependent predation and time stress.

5.
An important goal in ecology is developing general theory on how the species composition of ecosystems is related to ecosystem properties and functions. Progress on this front is limited partly because of the need to identify mechanisms controlling functions that are common to a wide range of ecosystem types. We propose that one general mechanism, rooted in the evolutionary ecology of all species, is adaptive foraging behavior in response to predation risk. To support our claim, we present two kinds of empirical evidence from plant-based and detritus-based food chains of terrestrial and aquatic ecosystems. The first kind comes from experiments that explicitly trace how adaptive foraging influences ecosystem properties and functions. The second kind comes from a synthesis of studies that individually examine complementary components of particular ecosystems that together provide an integrated perspective on the link between adaptive foraging and ecosystem function. We show that the indirect effects of predators on plant diversity, plant productivity, nutrient cycling, trophic transfer efficiencies, and energy flux caused by consumer foraging shifts in response to risk are qualitatively different from effects caused by reductions in prey density due to direct predation. We argue that a perspective of ecosystem function that considers effects of consumer behavior in response to predation risk will broaden our capacity to explain the range of outcomes and contingencies in trophic control of ecosystems. This perspective also provides an operational way to integrate evolutionary and ecosystem ecology, which is an important challenge in ecology.

6.
Intraguild predation is a widespread interaction occurring across different taxa, trophic positions, and ecosystems, and its endogenous dynamical properties have been shown to affect the abundance and persistence of the populations involved, as well as those connected with them within food webs. Although optimal foraging decisions by predators are known to exert a stabilizing influence on the dynamics of intraguild predation systems, little is known about the corresponding influence of adaptive prey decisions, despite their commonness in nature. In this study, we analyze the effect that adaptive antipredator behavior exerts on the stability and persistence of the populations involved in intraguild predation systems. Our results indicate that adaptive prey behavior in the form of inducible defenses acts as a stabilizing mechanism and show that, like adaptive foraging, it enlarges the region of parameter space in which species can coexist by promoting persistence of the IG-prey. At high levels of enrichment, the intraguild predation system exhibits unstable dynamics and zones of multiple attractors. In addition, we show that the equilibrium density of the IG-predator can be increased at intermediate values of defense effectiveness. Finally, we conclude that adaptive prey behavior is an important mechanism leading to species coexistence in intraguild predation systems, consequently enhancing the stability of food webs.
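The qualitative behavior this abstract describes can be explored with a minimal simulation. The sketch below is an illustrative three-species Lotka-Volterra-style intraguild predation model, not the authors' actual model: all parameter values, and the specific way `defense` trades reduced predation against reduced foraging, are assumptions for demonstration only.

```python
def igp_step(R, N, P, dt=0.01, defense=0.5):
    """One Euler step of a toy intraguild predation (IGP) model.

    R: shared resource, N: IG-prey, P: IG-predator.
    `defense` in [0, 1] scales down the predator's attack rate on the
    IG-prey (an inducible defense) at a proportional cost to the
    IG-prey's own foraging rate. All parameters are illustrative.
    """
    r, K = 1.0, 10.0                  # resource growth rate, carrying capacity
    aP = 0.3                          # predator attack rate on the resource
    aNP = 0.4 * (1.0 - defense)       # predation on IG-prey, reduced by defense
    fN = 0.5 * (1.0 - 0.3 * defense)  # IG-prey foraging rate, cost of defense
    eN = eP = eNP = 0.6               # conversion efficiencies
    mN = mP = 0.2                     # background mortality
    dR = r * R * (1.0 - R / K) - fN * R * N - aP * R * P
    dN = eN * fN * R * N - aNP * N * P - mN * N
    dP = eP * aP * R * P + eNP * aNP * N * P - mP * P
    return R + dR * dt, N + dN * dt, P + dP * dt

def simulate(defense, steps=20000):
    """Iterate the map from fixed initial densities; return final (R, N, P)."""
    R, N, P = 5.0, 1.0, 1.0
    for _ in range(steps):
        R, N, P = igp_step(R, N, P, defense=defense)
    return R, N, P
```

Sweeping `defense` from 0 to 1 and checking whether N persists is one way to probe, qualitatively, the enlarged coexistence region the abstract reports; a production analysis would use a proper ODE solver with adaptive step size rather than fixed-step Euler.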

7.
In the absence of predators, pollinators can often maximize their foraging success by visiting the most rewarding flowers. However, if predators use those highly rewarding flowers to locate their prey, pollinators may benefit from changing their foraging preferences to accept less rewarding flowers. Previous studies have shown that some predators, such as crab spiders, indeed hunt preferentially on the most pollinator-attractive flowers. In order to determine whether predation risk can alter pollinator preferences, we conducted laboratory experiments on the foraging behavior of bumble bees (Bombus impatiens) when predation risk was associated with a particular reward level (measured here as sugar concentration). Bees foraged in arenas containing a choice of a high-reward and a low-reward artificial flower. On a bee’s first foraging trip, it was either lightly squeezed with forceps, to simulate a crab spider attack, or was allowed to forage safely. The foragers’ subsequent visits were recorded for between 1 and 4 h without any further simulated attacks. Compared to bees that foraged safely, bees that experienced a simulated attack on a low-reward artificial flower had reduced foraging activity. However, bees attacked on a high-reward artificial flower were more likely to visit low-reward artificial flowers on subsequent foraging trips. Forager body size, which is thought to affect vulnerability to capture by predators, did not have an effect on response to an attack. Predation risk can thus alter pollinator foraging behavior in ways that influence the number and reward level of flowers that are visited.

8.
Prey animals often face a dynamic tradeoff between the costs of antipredator behavior and the benefits of other fitness-related activities such as foraging and reproduction. According to the threat-sensitive predator avoidance hypothesis, prey animals should match the intensity of their antipredator behavior to the degree of immediate threat posed by the predator. Moreover, longer-term temporal variability in predation risk (over days to weeks) can shape the intensity of antipredator behavior. According to the risk allocation hypothesis, changing the background level of risk for several days is often enough to change the intensity of the prey's response to a given stimulus: as the background level of risk increases, the response intensity of the prey decreases. In this study, we tested for possible interactions between immediate threat-sensitive responses to varying levels of current perceived risk and temporal variability in background risk experienced over the preceding 3 days. Juvenile convict cichlids were preexposed to either low or high frequencies of predation risk (using conspecific chemical alarm cues) for 3 days and were then tested for a response to one of five stimuli (alarm cue at 100, 50, 25, or 12.5% concentration, or a distilled water control). Consistent with the threat-sensitive predator avoidance hypothesis, we found greater intensity responses to greater concentrations of alarm cues. Moreover, in accordance with the risk allocation hypothesis, cichlids previously exposed to the high background level of risk exhibited a lower overall intensity response to each alarm cue concentration than those exposed to the low background level of risk. Interestingly, the background level of risk over the preceding 3 days influenced the threshold level of response to varying concentrations of alarm cues: the minimum stimulus concentration that evoked a behavioral response was lower for fish exposed to high background levels of predation than for those exposed to low background levels. These results illustrate a remarkable interplay between immediate (current) risk and background risk in shaping the intensity of antipredator responses.
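The two hypotheses tested in this study combine naturally into a simple quantitative sketch. The function below is a toy model, not the study's analysis: the saturating threat term, the damping background term, and the half-saturation constant `k` are all illustrative assumptions. It reproduces the two main effects reported (response intensity rising with cue concentration and falling with background risk), though not the lowered response threshold.

```python
def response_intensity(cue, background, k=1.0):
    """Toy combination of threat sensitivity and risk allocation.

    cue: current alarm-cue concentration (arbitrary units).
    background: background predation risk over preceding days.
    Returns a response intensity in [0, 1). Functional forms and the
    half-saturation constant k are illustrative assumptions.
    """
    threat = cue / (cue + k)               # threat-sensitive: rises with cue
    allocation = 1.0 / (1.0 + background)  # risk allocation: damped by background
    return threat * allocation
```

For example, a full-strength cue (cue = 1.0) under no background risk gives 0.5, while the same cue after high background risk (background = 1.0) gives 0.25, mirroring the lower overall response intensity observed in the high-background cichlids.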

9.
It has been argued that the body mass levels achieved by birds are determined by the trade-off between the risks of starvation and predation. Birds have also been found to reduce body mass in response to increased predation risk. During migration, the need for extra fuel for flights is obvious and crucial. In this study, migratory blackcaps (Sylvia atricapilla) were subjected to an experimental stopover situation in which predation risk was manipulated by exposure to a stuffed predator. Blackcaps that perceived an imminent risk of predation increased their food intake and fuel deposition rate during the first period of stopover compared with a control group. The pattern of night activity indicates that birds exposed to the predator also chose to leave earlier than birds in the control group. Since there was no cover present at the stopover site, birds might have perceived the risk of predation as high regardless of whether they were foraging or not. Under such circumstances it has been predicted that birds should increase their foraging activity. The findings in this study clearly indicate that birds are able to adjust their stopover behaviour to perceived predation risk. Received: 8 January 1997 / Accepted after revision: 11 April 1997

10.
Predation and hunger are threats for most organisms, and appropriate behavioural responses to both factors should be shaped by natural selection. In combination, however, the behavioural demands of predation avoidance and effective foraging often cannot be satisfied at the same time, leading to a conflict within organisms. We examined the behavioural responses to simulated fish predation and hunger of two closely related species of tadpoles, Rana lessonae and R. esculenta. Tadpoles, hatched and reared in the laboratory, were tested in a three-way factorial (predation risk × hunger × species) experiment with four predation levels and four hunger levels. Both species decreased their swimming activity with increasing predation risk. Predation risk did not influence the amount of activity time invested in feeding but caused the tadpoles to spend less time in patches with food. Refuges were not used to avoid predation. R. esculenta was more sensitive to predation risk than R. lessonae. Hunger increased both the activity of tadpoles and the amount of activity time invested in feeding, indicating increased energy intake. No interactions were observed between predation risk and hunger. These results show that tadpoles possess genetically based behavioural mechanisms that allow them to respond in a graded manner to predation and hunger. However, they did not balance the two conflicting demands of predation avoidance and effective foraging; the two mechanisms appeared to act independently. Correspondence to: R.D. Semlitsch

11.
Winnie JA, Cross P, Getz W. Ecology, 2008, 89(5): 1457-1468
Top-down effects of predators on prey behavior and population dynamics have been extensively studied. However, some populations of very large herbivores appear to be regulated primarily from the bottom up. Given the importance of food resources to these large herbivores, it is reasonable to expect that forage heterogeneity (variation in quality and quantity) affects individual and group behaviors as well as distribution on the landscape. Forage heterogeneity is often strongly driven by underlying soils, so substrate characteristics may indirectly drive herbivore behavior and distribution. Forage heterogeneity may further interact with predation risk to influence prey behavior and distribution. Here we examine differences in spatial distribution, home range size, and grouping behaviors of African buffalo as they relate to geologic substrate (granite and basalt) and variation in food quality and quantity. In this study, we use satellite imagery, forage quantity data, and three years of radio-tracking data to assess how forage quality, quantity, and heterogeneity affect the distribution and individual and herd behavior of African buffalo. We found that buffalo in an overall poorer foraging environment keyed-in on exceptionally high-quality areas, whereas those foraging in a more uniform, higher-quality area used areas of below-average quality. Buffalo foraging in the poorer-quality environment had smaller home range sizes, were in smaller groups, and tended to be farther from water sources than those foraging in the higher-quality environment. These differences may be due to buffalo creating or maintaining nutrient hotspots (small, high-quality foraging areas) in otherwise low-quality foraging areas, and the location of these hotspots may in part be determined by patterns of predation risk.

12.
Matassa CM, Trussell GC. Ecology, 2011, 92(12): 2258-2266
Predators can initiate trophic cascades by consuming and/or scaring their prey. Although both forms of predator effect can increase the overall abundance of the prey's resources, nonconsumptive effects may be more important to the spatial and temporal distribution of resources because predation risk often determines where and when prey choose to forage. Our experiment characterized temporal and spatial variation in the strength of consumptive and nonconsumptive predator effects in a rocky intertidal food chain consisting of the predatory green crab (Carcinus maenas), an intermediate consumer (the dogwhelk, Nucella lapillus), and barnacles (Semibalanus balanoides) as a resource. We tracked the survival of individual barnacles through time to map the strength of predator effects in experimental communities. These maps revealed striking spatiotemporal patterns in Nucella foraging behavior in response to each predator effect. However, only the nonconsumptive effect of green crabs produced strong spatial patterns in barnacle survivorship. Predation risk may play a pivotal role in determining the small-scale distribution patterns of this important rocky intertidal foundation species. We suggest that the effects of predation risk on individual foraging behavior may scale up to shape community structure and dynamics at a landscape level.

13.
Predation and competition are both strong structuring forces in community dynamics, but their relative importance is disputed. In a laboratory experiment, we evaluated the relative importance of competition and predation from juvenile and adult brown trout, respectively, on the foraging performance of groups of three stone loaches. We observed loach consumption rate, time spent inactive, and aggressive interactions between juvenile trout and loach in artificial stream sections. The controlled experiments were complemented by examining stone loach population densities in natural systems as functions of juvenile and adult trout densities. In the laboratory experiments, increasing numbers of competitors decreased prey availability, which ultimately led to lower consumption rates for loach. Loach responded to predation risk by increasing time spent inactive, thereby decreasing consumption rates. However, there were no effects of juvenile trout competitors on loach consumption rates in treatments with adult trout present, suggesting no additive effect of predation and competition on loach foraging success. Partial regressions of loach and trout densities in natural streams revealed a positive relationship between juvenile trout and loach, and a negative relationship between adult trout and loach. Our laboratory and field data thus suggest that predation is a limiting factor for loach success, and predator presence could mediate species coexistence at high interspecific densities.

14.
To detect threats and reduce predation risk, prey animals need to be alert. Early predator detection and rapid anti-predatory action increase the likelihood of survival. We investigated how foraging affects predator detection and time to take-off in blue tits (Parus caeruleus) by subjecting them to a simulated raptor attack. To investigate the impact of body posture we compared birds feeding head-down with birds feeding head-up, but found no effect of posture on either time to detection or time to take-off. To investigate the impact of orientation we compared birds with their side towards the attacking predator with birds with their back towards it. Predator detection, but not time to take-off, was delayed when the back was oriented towards the predator. We also investigated the impact of foraging task by comparing birds that were either not foraging, foraging on chopped mealworms, or foraging on whole ones. Foraging on chopped mealworms did not delay detection compared to not foraging, showing that foraging does not always restrict vigilance. However, detection was delayed by more than 150% when the birds were foraging on whole, live mealworms, which apparently demanded much attention and handling skill. Time to take-off was affected by foraging task in the same way as detection. We show that studies of foraging and vigilance must take into account the difficulty of the foraging task and prey orientation. Communicated by P.A. Bednekoff

15.
Animals balance feeding and anti-predator behaviors at various temporal scales. When risk is infrequent or brief, prey can postpone feeding in the short term and temporally allocate feeding behavior to less risky periods. If risk is frequent or lengthy, however, prey must eventually resume feeding to avoid fitness consequences. Species may exhibit different behavioral strategies, depending on the fitness tradeoffs that exist in their environment or across their life histories. North Pacific flatfishes that share juvenile rearing habitat exhibit a variety of responses to predation risk, but their response to risk frequency has not been examined. We observed the feeding and anti-predator behaviors of young-of-the-year English sole (Parophrys vetulus), northern rock sole (Lepidopsetta polyxystra), and Pacific halibut (Hippoglossus stenolepis), three species that exhibit divergent anti-predator strategies, following exposure to three levels of predation risk: no risk, infrequent (two exposures/day), and frequent (five exposures/day). The English sole responded to the frequent risk treatment with higher feeding rates than during infrequent risk, following a pattern of behavioral response that is predicted by the risk allocation hypothesis; rock sole and halibut did not follow the predicted pattern, but this may be due to the limited range of treatments. Our observations of unique anti-predator strategies, along with differences in foraging and species-specific ecologies, suggest divergent trajectories of risk allocation for the three species.

16.
Foraging theory predicts that animals will adjust their foraging behavior in order to maximize net energy intake and that trade-offs may exist that can influence their behavior. Although substantial advances have been made with respect to the foraging ecology of large marine predators, there is still a limited understanding of how predators respond to temporal and spatial variability in prey resources, primarily due to a lack of empirical studies that quantify foraging and diving behavior concurrently with characteristics of prey fields. Such information is important because changes in prey availability can influence the foraging success and ultimately fitness of marine predators. We assessed the diving behavior of juvenile female harbor seals (Phoca vitulina richardii) and prey fields near glacial ice and terrestrial haulout sites in Glacier Bay (58°40′N, 136°05′W), Alaska. Harbor seals captured at glacial ice sites dived deeper, had longer dive durations, lower percent bottom time, and generally traveled further to forage. The increased diving effort for seals from the glacial ice site corresponded to lower prey densities and prey at deeper depths at the glacial ice site. In contrast, seals captured at terrestrial sites dived shallower, had shorter dive durations, higher percent bottom time, and traveled shorter distances to access foraging areas with much higher prey densities at shallower depths. The increased diving effort for seals from glacial ice sites suggests that the lower relative availability of prey may be offset by other factors, such as the stability of the glacial ice as a resting platform and as a refuge from predation. We provide evidence of differences in prey accessibility for seals associated with glacial ice and terrestrial habitats and suggest that seals may balance trade-offs between the costs and benefits of using these habitats.

17.
Despite growing interest in ecological interactions between predators and pathogens, few studies have experimentally examined the consequences of infection for host predation risk or how environmental conditions affect this relationship. Here we combined mesocosm experiments, in situ foraging data, and broad-scale lake surveys to evaluate (1) the effects of chytrid infection (Polycaryum laeve) on susceptibility of Daphnia to fish predators and (2) how environmental characteristics moderate the strength of this interaction. In mesocosms, bluegill preferred infected Daphnia 2-5 times over uninfected individuals. Among infected Daphnia, infection intensity was a positive predictor of predation risk, whereas carapace size and fecundity increased predation on uninfected individuals. Wild-caught yellow perch and bluegill from in situ foraging trials exhibited strong selectivity for infected Daphnia (3-10 times over uninfected individuals). In mesocosms containing water high in dissolved organic carbon (DOC), however, selective predation on infected Daphnia was eliminated. Correspondingly, lakes that supported chytrid infections had higher DOC levels and lower light penetration. Our results emphasize the strength of interactions between parasitism and predation while highlighting the moderating influence of water color. P. laeve increases the conspicuousness and predation risk of Daphnia; as a result, infected Daphnia occur predominantly in environments with characteristics that conceal their elevated visibility.

18.
The activity level of prey reflects a trade-off between predation risk and foraging gain. A number of theoretical and empirical studies have shown that a prey's energetic state or the level of its resource should influence this trade-off (i.e., what the optimal activity level at a level of predation risk is). Here, I show that the energetic state of prey may also influence the duration of their antipredator behavioral response. Green frog tadpoles (Rana clamitans) reduced their activity level for a shorter time during exposure to the chemical cue of predatory larval dragonflies (Anax spp.) as their time since last feeding increased (i.e., as their energetic state decreased). Interestingly, the tadpoles strongly reduced their activity level upon cue exposure in all treatments. Thus, the relative activity level of tadpoles at different energetic states varied over time.

19.
Ideal free distributions under predation risk
We examine the trade-off between gathering food and avoiding predation in the context of patch use by a group of animals. Often a forager will have to choose between feeding sites that differ in both energetic gain rate and predation risk. The ideal site will have a high gain rate and low risk of predation. However, intake rate will often decrease when the patch is shared with other foragers and it may be optimal for some individuals to feed elsewhere. Within the framework of ideal free theory, we investigate the distribution of foragers that will equalise individual fitness gains. We focus on a two-patch environment with continuous inputs of food. With reference to existing experimental studies, we examine the effects of risk dilution, food input rates and an animal’s expectations of the future. We identify the effect of total animal numbers when one patch is subject to predation risk and the other is safe. Conditions under which the difference in intake rate in the two patches is constant are identified, as are conditions in which the ratio of animals in the two patches is constant. If current conditions do not alter future expectations an increase in input rates to the patches promotes increased use of the risky patch. Yet, if conditions are assumed to persist indefinitely the opposite effect is seen. When both patches are subject to predation risk, dilution of risk favours more extreme distributions, and may lead to more than one stable distribution. The results of these models are used to critically analyse previous work on the energetic equivalence of risk. This paper is intended to help guide the development of new experimental studies into the energy-risk trade-off. Received: 10 February 1995/Accepted after revision: 1 October 1995
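The equal-fitness distribution this abstract investigates can be found numerically for a discrete group. The sketch below assumes a deliberately simple fitness form: per-capita intake `input / n` in each continuous-input patch, with an additive predation cost in the risky patch that is diluted among its occupants. That functional form, and the parameter values in the usage note, are illustrative assumptions, not the paper's models.

```python
def ifd_with_risk(N, input_safe, input_risky, risk_cost):
    """Return (n_safe, n_risky): the split of N foragers over a safe and
    a risky continuous-input patch that best equalises per-capita fitness.

    Per-capita intake in a patch is input / n; the risky patch carries a
    predation cost diluted among its occupants (risk_cost / n_risky).
    The additive fitness form is an illustrative assumption.
    """
    best, best_gap = (N, 0), float("inf")
    for n_risky in range(N + 1):
        n_safe = N - n_risky
        w_safe = input_safe / n_safe if n_safe else float("inf")
        w_risky = (input_risky - risk_cost) / n_risky if n_risky else float("inf")
        gap = abs(w_safe - w_risky)  # at the IFD, no forager gains by switching
        if gap < best_gap:
            best, best_gap = (n_safe, n_risky), gap
    return best
```

With 10 foragers, a safe patch receiving 5 units of food, a risky patch receiving 10, and a risk cost of 2, the split is 4 safe / 6 risky; raising the risk cost to 8 shifts it to 7 / 3, illustrating how a higher (less diluted) cost empties the risky patch.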

20.
The effect of predation risk and male-male competition on male courtship behaviour and attractiveness to females was studied in the threespine stickleback (Gasterosteus aculeatus) by presenting dummy or live females to solitary and competing males under different predation risks. In the presence of a predator, males decreased courtship activity. Different courtship components were, however, adjusted to different extents and in opposing directions under predation risk, probably because the single components may have varied in riskiness. The presence of a competing male decreased overall courtship activity but increased the frequency of zigzags, suggesting that zigzagging is a competitive strategy against other males. In the presence of a predator, male courtship activity was not affected by a competitor. Female mate choice correlated with a male's previous frequency of zigzags towards a dummy female. However, when a live female paid attention to a male, he decreased zigzagging and instead increased leading and fanning behaviours, probably trying to attract the female to the nest to mate. Predation risk affected the attractiveness of males, as females reduced their attention to a male when he faced a predator and reduced his courtship activity. As females instead increased their attention to a competing male that had increased his courtship activity, owing to decreased competition, males clearly balance mating opportunities against predator avoidance. When males vary in their susceptibility to predators, predation risk may thus affect the mating success of competing males. Received: 31 January 1997 / Accepted after revision: 15 April 1997
