Similar Articles
20 similar articles found.
1.
Intraguild predation (IGP) has been explained in terms of competitor-removal, food-stress and predator-removal hypotheses. Only the first two hypotheses have been fairly well studied. To test the predator-removal hypothesis as a force determining IGP in avian predators, we performed a field experiment to simulate the presence of an IG predator (an eagle owl Bubo bubo dummy) in the surroundings of the nests of four potential IG prey (black kite Milvus migrans, red kite Milvus milvus, booted eagle Aquila pennata and common buzzard Buteo buteo). To rule out the possibility that an aggressive reaction towards the eagle owl was unrelated to its role as an IG predator, we also presented a stuffed tawny owl Strix aluco, a potential competitor that cannot be considered an IG predator of the diurnal raptors studied in the experiment. While almost always ignoring the tawny owl, raptors chiefly directed interspecific aggressive behaviour towards their IG predator. Our results seem to support the predator-removal hypothesis, as the IG prey may take advantage of the diurnal inactivity of the IG predator to remove it from their territory. However, the recorded behaviour may also be considered a special variety of mobbing (i.e. a prey's counter-strategy against its predator), in which the mobber is sufficiently powerful to escalate predator harassment into deliberate killing attempts. In turn, eagle owls can respond with IG predatory behaviour aimed at removing IG prey species that are highly aggressive mobbers.

2.
Attributes of the recipient community may affect the invasion success of arriving non-indigenous organisms. In particular, biotic interactions may enhance the resistance of communities to invasion. Invading organisms typically encounter a novel suite of competitors and predators, and thus their invasiveness may be affected by how they cope with these interactions. Behavioral plasticity may help invaders respond appropriately to novelty. We examined the behavioral responses of highly invasive mosquitofish to representative novel competitors and predators they might encounter as they spread through North America. We compared the behavior of the invasive Gambusia holbrooki and G. affinis to that of two close relatives of lower invasive potential (G. geiseri and G. hispaniolae) in order to elucidate whether responses to novelty were related to invasiveness. In short-term assays, female Gambusia were paired with a novel competitor, Pimephales promelas, and a novel predator, Micropterus dolomieu. Behavioral responses were measured in terms of foraging success and efficiency, activity, refuge use, predator inspections, and interspecific aggression. Contrary to a priori predictions, invasive and non-invasive responses to novel interactions did not differ consistently. In response to novel competition, both invasive species increased foraging efficiency, but so did G. geiseri. In response to novel predation, only G. holbrooki decreased consumption and activity and increased refuge use. No antipredator response was observed in G. affinis. We found consistent differences, however, between invasives and non-invasives in foraging behavior. Both in the presence and absence of the competitor and the predator, invasives foraged more efficiently and consumed more prey than non-invasives. Communicated by P. Bednekoff

3.
Cues for detecting and responding to perceived predation risk may be indirect, i.e., correlated with the probability of encountering a predator, or direct, i.e., produced by or related to the actual presence of a predator. Research shows that each type of cue can independently influence anti-predator and foraging behaviours in prey species. However, since animals naturally encounter indirect and direct cues simultaneously, we were interested in quantifying their cumulative effect. Our aim was to evaluate the food intake and behaviours (patch use, feeding rate and time, vigilance) of a nocturnal mammalian herbivore in response to indirect (open vs. covered microhabitats; illumination) and direct (fox/owl odours) predator cues. We ran a preference trial with four paired treatments using a covered Safe food patch and an open Risk food patch, with one of four combinations of indirect and direct predator cues. Predation risk had a significant effect on both intake and behaviour (including feeding time, feeding rate, and vigilance), but these effects differed depending on the cues. No two combinations of cues produced exactly the same effects, illustrating the complexity of the interactions that occur between cues. Covered patches were always perceived as less risky than open patches, but unexpectedly, open patches were perceived as riskier when dark rather than light. The strongest suite of (negative) responses to risk was associated with combined indirect and direct cues. These results highlight the importance, when quantifying predation risk, of jointly considering intake from a patch, intake rate, and behaviours such as the proportion of time spent vigilant, rather than intake alone.

4.
Because environments can vary over space and time in non-predictable ways, foragers must rely on estimates of resource availability and distribution to make decisions. Optimal foraging theory assumes that foraging behavior has evolved to maximize fitness and provides a conceptual framework in which environmental quality is often assumed to be fixed. Another more mechanistic conceptual framework comes from the successive contrast effects (SCE) approach in which the conditions that an individual has experienced in the recent past alter its response to current conditions. By regarding foragers’ estimation of resource patches as subjective future value assessments, SCE may be integrated into an optimal foraging framework to generate novel predictions. We released Allenby’s gerbils (Gerbillus andersoni allenbyi) into an enclosure containing rich patches with equal amounts of food and manipulated the quality of the environment over time by reducing the amount of food in most (but not all) food patches and then increasing it again. We found that, as predicted by optimal foraging models, gerbils increased their foraging activity in the rich patch when the environment became poor. However, when the environment became rich again, the gerbils significantly altered their behavior compared to the first identical rich period. Specifically, in the second rich period, the gerbils spent more time foraging and harvested more food from the patches. Thus, seemingly identical environments can be treated as strikingly different by foragers as a function of their past experiences and future expectations.

5.
Raptor–prey encounters were studied to evaluate the strategies and success rates of both predator attack and prey defense. We compared the success of barn owls in catching stationary simulated prey (a food item) with that of moving prey (a food item pulled in various directions). We also tracked real encounters between barn owls and spiny mice in a captive environment. Owls had higher success in attacking stationary prey, and they seemed to attack the prey as soon as it became motionless. When attacked, only a few spiny mice remained immobile (freeze response), whereas most fled and usually avoided capture by the owls. Spiny mice also displayed a preference to escape in those directions in which owls had demonstrated lower success in catching the simulated prey. Escape initiation dichotomized into a short or long (but rarely intermediate) distance between the spiny mouse and the owl, with more successful avoidance in short-distance (last-moment) escapes. The best predictor of escape success was the velocity of the spiny mouse, and the second-best predictor was its flight initiation distance (FID). We present an update of Ydenberg and Dill’s model for optimal FID in close encounters, suggesting that fleeing at the last moment is advantageous. However, a last-moment attempt to escape is also more risky, with a split second separating life and death, and is therefore appropriate mainly for agile prey under close-distance attack.
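The economic logic behind optimal FID can be sketched numerically. This is a toy illustration, not the authors' updated model: the two cost functions below are invented stand-ins for the Ydenberg and Dill (1986) idea that prey should flee at the distance where the risk of staying begins to exceed the (foraging-opportunity) cost of fleeing.

```python
# Toy sketch of the optimal flight-initiation-distance (FID) trade-off.
# The functional forms and constants are hypothetical illustrations.
import math

def risk_of_staying(d, prey_speed):
    # Risk grows as the predator closes (small d); agile (fast) prey
    # discount that risk because they escape more reliably.
    return math.exp(-0.1 * d) / prey_speed

def cost_of_fleeing(d):
    # Fleeing earlier (larger d) forfeits more foraging opportunity.
    return 0.02 * d

def optimal_fid(prey_speed):
    """Largest distance at which staying becomes costlier than fleeing."""
    for d in range(100, 0, -1):  # predator approaches from 100 to 1
        if risk_of_staying(d, prey_speed) > cost_of_fleeing(d):
            return d
    return 1

# Faster (more agile) prey can afford a last-moment escape: smaller FID.
print(optimal_fid(1.0), optimal_fid(5.0))  # → 13 5
```

Under these assumptions the model reproduces the abstract's qualitative point: last-moment fleeing pays only for prey agile enough to keep the risk of a close approach low.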

6.
Patch use as an indicator of habitat preference, predation risk, and competition (cited by 34)
Summary: A technique for using patch giving-up densities to investigate habitat preferences, predation risk, and interspecific competitive relationships is theoretically analyzed and empirically investigated. Giving-up densities, the density of resources within a patch at which an individual ceases foraging, provide considerably more information than simply the amount of resources harvested. The giving-up density of a forager that is behaving optimally should correspond to a harvest rate that just balances the metabolic costs of foraging, the predation cost of foraging, and the missed opportunity cost of not engaging in alternative activities. In addition, changes in giving-up densities in response to climatic factors, predation risk, and missed opportunities can be used to test the model and to examine the consistency of the foragers' behavior. The technique was applied to a community of four Arizonan granivorous rodents (Perognathus amplus, Dipodomys merriami, Ammospermophilus harrisii, and Spermophilus tereticaudus). Aluminum trays filled with 3 grams of millet seeds mixed into 3 liters of sifted soil provided resource patches. The seeds remaining following a night or day of foraging were used to determine the giving-up density, and footprints in the sifted sand indicated the identity of the forager. Giving-up densities consistently differed in response to forager species, microhabitat (bush versus open), date, and station. The data also provide useful information regarding the relative foraging efficiencies and microhabitat preferences of the coexisting rodent species.
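The quitting rule summarized above has a simple closed form. The sketch below, with entirely hypothetical parameter values, shows how the giving-up density (GUD) follows from balancing the harvest rate against the three foraging costs, and why a riskier microhabitat leaves more seeds behind:

```python
# Minimal sketch of the quitting-harvest-rate rule: an optimal forager
# leaves a depletable patch when its harvest rate H falls to
#   H(GUD) = C + P + MOC
# With a linear harvest rate H(R) = a * R, this gives
#   GUD = (C + P + MOC) / a.
# All numeric values below are hypothetical.

def giving_up_density(a, metabolic_cost, predation_cost, missed_opportunity):
    """Resource density at which an optimal forager should quit the patch."""
    return (metabolic_cost + predation_cost + missed_opportunity) / a

a = 0.5    # harvest-rate coefficient in H(R) = a * R (hypothetical)
C = 1.0    # metabolic cost of foraging
MOC = 0.5  # missed opportunity cost of not doing something else

# A riskier microhabitat (open vs. bush) raises the predation cost P,
# so more seeds should remain in the open tray at quitting time.
gud_bush = giving_up_density(a, C, predation_cost=0.5, missed_opportunity=MOC)
gud_open = giving_up_density(a, C, predation_cost=2.0, missed_opportunity=MOC)
print(gud_bush, gud_open)  # → 4.0 7.0
```

The comparison mirrors the field result: seeds left in a tray measure the sum of costs the forager perceives there, not just how much food it ate.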

7.
The influence of predation risk and food deprivation on the behavior and activity of juvenile American lobsters, Homarus americanus Milne Edwards, was examined in single and paired individuals in laboratory experiments performed during 1988 and in the winter of 1991/92. In the presence of a predator (the tautog Tautoga onitis Linnaeus) restrained behind a barrier, single lobsters significantly reduced the time spent feeding at night, consumed fewer mussels, and quickly brought them back to shelter. Single lobsters did not forage during the day in any treatment. If deprived of food for 60 h, they consumed more mussels and spent more time walking than recently fed (12-h food-deprived) lobsters. Paired lobsters did forage during the day in the presence of a predator. The smaller (subdominant) lobsters in the pairs foraged for a longer time in the presence than in the absence of a predator, and significantly longer than single individuals. Shelter occupancy was significantly shorter in single, recently fed lobsters in the presence of a predator compared to the time spent sheltering in its absence. Among food-deprived lobsters, paired individuals spent a significantly shorter time within the shelter than single lobsters in the absence of a predator. Larger (dominant) lobsters, however, spent more time than subdominant lobsters within the shelter during all periods of the day. Without a predator, paired lobsters spent significantly more time than single ones in shelter-related activities. Under predation risk, subdominant lobsters concentrated shelter-building time during the day and built a higher percentage of alternative shelters than either single or dominant lobsters. In the absence of a predator, paired lobsters walked in the open area for a significantly longer time than single ones. This apparently was associated with fighting between dominant and subdominant lobsters and the attempts of the larger lobster to drive the smaller one from its shelter. During the day, lobsters fought for a significantly longer time in the presence than in the absence of a predator. When the tautog was not constrained, the mortality rate was similar in single and paired lobsters. The mortality rate among subdominant lobsters, however, was seven times higher than among dominant lobsters. We suggest that the risk of predation interferes with the ability of single juvenile lobsters to acquire and consume food. They appear to trade off energetic considerations against the risk of predation when foraging away from the shelter. The introduction of a conspecific competitor to the system may further increase the subdominant's risk of predation. Intraspecific interactions tend to increase the risk of predation for smaller lobsters but increase the survival rate among larger lobsters. Received: 6 February 1995 / Accepted: 2 September 1997

8.
Animals commonly choose between microhabitats that differ in foraging return and mortality hazard. I studied the influence of autotomy, the amputation of a body part, on the way larvae of the damselfly Lestes sponsa deal with the trade-off between foraging and seeking cover. Survival of Lestes larvae when confronted with the odonate predator Aeshna cyanea was higher in a complex than in a simple microhabitat, indicating that the more complex microhabitat was safer. Within the simple microhabitat, larvae without lamellae had a higher risk of mortality from predation than larvae with lamellae, showing a long-term cost of autotomy. When the foraging value (food present or absent) and predation risk (encaged predator or no predator) were varied in the simple microhabitat, larvae with and without lamellae responded differently to the imposed trade-off. All larvae spent more time in the simple microhabitat when food was present than when it was absent. Larvae without lamellae, however, only sporadically left the safe microhabitat, irrespective of the presence of the predator. In contrast, larvae with lamellae shifted more frequently towards the risky microhabitat than those without lamellae, and more often in the absence than in the presence of the predator. These decisions affected the foraging rates of the animals. I show for the first time that refuge use is higher after autotomy and that this is associated with the cost of reduced foraging success. The different microhabitat preferences of larvae with and without lamellae are consistent with their different vulnerabilities to predation and demonstrate the importance of intrinsic factors in establishing trade-offs. Received: 4 June 1999 / Received in revised form: 18 August 1999 / Accepted: 18 August 1999

9.
Predation risk and foraging behavior of the hoary marmot in Alaska (cited by 2)
Summary: I observed hoary marmots for three field seasons to determine how the distribution of food and the risk of predation influenced the marmots' foraging behavior. I quantified the amount of time Marmota caligata foraged in different patches of alpine meadows and assessed the distribution and abundance of vegetation eaten by marmots in these meadows. Because marmots dig burrows and run to them when attacked by predators, marmot-to-burrow distance provided an index of predation risk that could be specified for different meadow patches. Patch use correlated positively with food abundance and negatively with predation risk. However, these significant relationships disappeared when partial correlations were calculated, because food abundance and risk were intercorrelated. In a multiple regression, 77.0% of the variance in patch use was explained by a combination of food abundance, refuge burrow density, and a patch's distance from the talus where sleeping burrows were located. Variations in vigilance behavior (look-ups to search for predators while feeding) according to marmots' ages, the presence of conspecifics, and animals' proximity to their sleeping burrows all indicated that predation risk influenced foraging. In a forage-manipulation experiment, the use of forage-enhanced patches increased six-fold, directly verifying the role of food availability in patch use. Concomitant with increased feeding, however, was the intense construction of refuge burrows in experimental patches, which presumably reduced the risk of feeding. Thus, I suggest that food and predation risk jointly influence patch use by hoary marmots and that both factors must be considered when modeling the foraging behavior of species that can be predator and prey simultaneously.
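The kind of multiple-regression analysis described above can be sketched with simulated data. Everything here is hypothetical (the predictor names echo the abstract, but the data and coefficients are invented); the point is only to show how patch use is regressed on food abundance, refuge burrow density, and distance from the talus, and how a variance-explained figure like 77.0% arises:

```python
# Illustrative multiple regression on simulated (not the study's) data.
import numpy as np

rng = np.random.default_rng(0)
n = 40
food = rng.uniform(0, 10, n)         # food abundance per patch
burrows = rng.uniform(0, 5, n)       # refuge burrow density
dist_talus = rng.uniform(0, 100, n)  # distance (m) to sleeping burrows

# Simulated foraging time: food and refuges increase use, distance reduces it.
patch_use = 2.0 * food + 1.5 * burrows - 0.05 * dist_talus + rng.normal(0, 1, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), food, burrows, dist_talus])
coef, *_ = np.linalg.lstsq(X, patch_use, rcond=None)

pred = X @ coef
r2 = 1 - ((patch_use - pred) ** 2).sum() / ((patch_use - patch_use.mean()) ** 2).sum()
print(coef)  # fitted coefficients, close to the simulated [b0, 2.0, 1.5, -0.05]
print(r2)    # proportion of variance explained, analogous to the reported 77.0%
```

With intercorrelated predictors (as in the marmot data), the simple correlations can vanish while the joint regression still explains most of the variance, which is why the author reports the combined model.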

10.
Predation risk has been shown to alter various behaviours in prey. Risk alters activity, habitat use and foraging, and weight loss may be one consequence. In mammals, studies on physiological measures affected by predation risk, other than weight, are rare. In two separate laboratory experiments, we studied foraging, hoarding behaviour and stress, measured non-invasively from faeces, in the bank vole (Clethrionomys glareolus), a common boreal rodent. Voles were exposed to predation risk using odours of the least weasel (Mustela nivalis nivalis); distilled water served as the control. In the first experiment, we found that foraging effort, measured as sunflower seeds taken from seed trays filled with sand, was significantly lower in trays scented with weasel odour. Both immediate consumption of seeds and hoarding were affected negatively by the weasel odour. Females hoarded significantly more than males in autumn. In the second experiment, the negative effect of weasel odour on foraging was consistent over a 3-day experiment, but the strongest effect was observed on the first night. Foraging increased over the course of the experiment, which might reflect either energetic compensation during a longer period of risk, as predicted by the predation risk allocation hypothesis, or habituation to the odour-simulated risk. Despite decreased foraging under predation risk, stress, measured as corticosteroid metabolite concentration in vole faeces, was not affected by the weasel odour treatment. In conclusion, we were able to verify predation-risk-mediated changes in the foraging effort of bank voles, but no physiological stress response was detected non-invasively, probably due to great individual variation in the secretion of stress hormones.

11.
Matassa CM, Trussell GC (2011) Ecology 92(12):2258–2266
Predators can initiate trophic cascades by consuming and/or scaring their prey. Although both forms of predator effect can increase the overall abundance of prey's resources, nonconsumptive effects may be more important to the spatial and temporal distribution of resources because predation risk often determines where and when prey choose to forage. Our experiment characterized temporal and spatial variation in the strength of consumptive and nonconsumptive predator effects in a rocky intertidal food chain consisting of the predatory green crab (Carcinus maenas), an intermediate consumer (the dogwhelk, Nucella lapillus), and barnacles (Semibalanus balanoides) as a resource. We tracked the survival of individual barnacles through time to map the strength of predator effects in experimental communities. These maps revealed striking spatiotemporal patterns in Nucella foraging behavior in response to each predator effect. However, only the nonconsumptive effect of green crabs produced strong spatial patterns in barnacle survivorship. Predation risk may play a pivotal role in determining the small-scale distribution patterns of this important rocky intertidal foundation species. We suggest that the effects of predation risk on individual foraging behavior may scale up to shape community structure and dynamics at a landscape level.

12.
We sought to understand why a social desert rodent, the great gerbil Rhombomys opimus, expends energy and incurs possible predation risk by footdrumming and vocalizing in the presence of a diversity of terrestrial predators: snakes, monitor lizards, polecats, foxes, and humans. Behavioral observations, human approaches, and experiments with tethered predators revealed that both male and female gerbils called and footdrummed in the presence of offspring, close relatives, and potential mates. Because adults called more often when pups were present, and solitary gerbils seldom gave an alarm, the alarm behavior probably warns conspecifics, especially vulnerable offspring, of potential danger. We also found that gerbils altered alarm behavior with the type of predator. They drummed more in the burrow when a dog that could not enter the burrow was present, and they drummed more out of the burrow in response to a snake that could enter the burrow. Gerbils vocalized and stood in an alert posture in response to all stimuli. The different footdrumming responses of gerbils to terrestrial predators seem related to the hunting style and type of risk posed by the predator, especially its ability to enter the burrow system. Received: 23 August 1999 / Received in revised form: 6 December 1999 / Accepted: 25 February 2000

13.
Antarctic fur seals Arctocephalus gazella and macaroni penguins Eudyptes chrysolophus are the two main land-based krill Euphausia superba consumers in the northern Scotia Sea. Using a combination of concurrent at-sea (predator observations, net hauls and multi-frequency acoustics), and land-based (animal tracking and diet analysis) techniques, we examined variability in the foraging ecology of these sympatric top predators during the austral summer and autumn of 2004. Krill availability derived from acoustic surveys was low during summer, increasing in autumn. During the breeding season, krill occurred in 80% of fur seal diet samples, with fish remains in 37% of samples. Penguin diets contained the highest proportion of fish in over 20 years of routine monitoring (46% by mass; particularly the myctophid Electrona antarctica), with krill (33%) and amphipods (Themisto gaudichaudii; 21%) also occurring. When constrained by the need to return and feed their offspring both predator species foraged to the northwest of South Georgia, consistent with an area of high macrozooplankton biomass, but fur seals were apparently more successful at exploiting krill. When unconstrained by chick-rearing (during March) penguins foraged close to the Shag Rocks shelf-break, probably exploiting the high daytime biomass of fish in this area. Penguins and seals are able to respond differently to periods of reduced krill abundance (in terms of variability in diet and foraging behaviour), without detriment to the breeding success of either species. This highlights the importance of myctophid fish as an alternative trophic pathway for land-based predators in the Scotia Sea ecosystem.

14.
Antlion larvae are sand-dwelling insect predators that ambush small arthropod prey while buried in the sand. In some species the larvae construct conical pits and are considered sit-and-wait predators that seldom relocate, while in other species they ambush prey without a pit but change their ambush site much more frequently (i.e., sit-and-pursue predators). The ability of antlion larvae to evade some of their own predators, which hunt them on the sand surface, is strongly constrained by the degree of sand stabilization and by sand depth. We studied the effect of predator presence, predator type (active predatory beetle vs. sit-and-pursue wolf spider), and sand depth (shallow vs. deep sand) on the behavioral responses of the pit-building Myrmeleon hyalinus larvae and the sit-and-pursue Lopezus fedtschenkoi larvae. Predator presence had a negative effect on the activity of both antlion species. The sit-and-wait M. hyalinus larvae showed reduced pit-building activity, whereas the sit-and-pursue L. fedtschenkoi larvae decreased relocation activity. The proportion of relocating M. hyalinus was negatively affected by sand depth, whereas that of L. fedtschenkoi was also negatively affected by predator type. Specifically, the proportion of individual L. fedtschenkoi that relocated in deeper sand was lower when facing the active predator rather than the sit-and-pursue predator. The proportion of M. hyalinus that constructed pits decreased in the presence of a predator, and this pattern was stronger when they were exposed to the active predator. We suggest that these differences between the two antlion species are strongly linked to their distinct foraging modes and to the foraging modes of their predators. Reut Loria and Inon Scharf contributed equally to the paper.

15.
Energy intake and expenditure on natural foraging trips were estimated for the seed-harvester ants, Pogonomyrmex maricopa and P. rugosus. During seed collection, P. maricopa foraged individually, whereas P. rugosus employed a trunk-trail foraging system. Energy gain per trip and per minute were not significantly different between species. There was also no interspecific difference in energy cost per trip, but energy cost per minute was lower for P. maricopa foragers because they spent on average 7 min longer searching for a load on each trip. Including both unsuccessful and successful foraging trips, average energy gain per trip was more than 100 times the energy cost per trip for both species. Based on this result, we suggest that time cost incurred during individual foraging trips is much more important than energy cost in terms of maximizing net resource intake over time. In addition, because energy costs are so small relative to gains, we propose that energy costs associated with foraging may be safely ignored in future tests of foraging theory with seed-harvesting ant species.
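The gain-versus-cost argument above is a back-of-envelope calculation. The sketch below uses hypothetical numbers (not the study's data) to show why, once gains exceed costs by two orders of magnitude, time rather than energy becomes the currency worth optimizing:

```python
# Hypothetical foraging energy budget for a seed-harvester ant trip.
gain_per_trip = 10.0   # J, energy content of one harvested seed (assumed)
cost_per_trip = 0.08   # J, locomotion + search cost of one trip (assumed)
trip_time = 12.0       # min, duration of a successful trip (assumed)

print(gain_per_trip / cost_per_trip)  # → 125.0 (gain is >100x the cost)

# Because energy costs are negligible, the currency that matters is net
# gain per unit TIME: spending 7 min longer searching, as P. maricopa
# did, depresses the intake rate far more than any energy cost could.
net_gain = gain_per_trip - cost_per_trip
rate_fast = net_gain / trip_time        # J/min with the shorter search
rate_slow = net_gain / (trip_time + 7)  # J/min with a 7-min-longer search
print(rate_fast > rate_slow)  # → True
```

Dropping `cost_per_trip` entirely changes the rates by less than 1% here, which is the authors' justification for ignoring energy costs in future tests of foraging theory with these ants.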

16.
In the absence of predators, pollinators can often maximize their foraging success by visiting the most rewarding flowers. However, if predators use those highly rewarding flowers to locate their prey, pollinators may benefit from changing their foraging preferences to accept less rewarding flowers. Previous studies have shown that some predators, such as crab spiders, indeed hunt preferentially on the most pollinator-attractive flowers. In order to determine whether predation risk can alter pollinator preferences, we conducted laboratory experiments on the foraging behavior of bumble bees (Bombus impatiens) when predation risk was associated with a particular reward level (measured here as sugar concentration). Bees foraged in arenas containing a choice of a high-reward and a low-reward artificial flower. On a bee’s first foraging trip, it was either lightly squeezed with forceps, to simulate a crab spider attack, or was allowed to forage safely. The foragers’ subsequent visits were recorded for between 1 and 4 h without any further simulated attacks. Compared to bees that foraged safely, bees that experienced a simulated attack on a low-reward artificial flower had reduced foraging activity. However, bees attacked on a high-reward artificial flower were more likely to visit low-reward artificial flowers on subsequent foraging trips. Forager body size, which is thought to affect vulnerability to capture by predators, did not have an effect on response to an attack. Predation risk can thus alter pollinator foraging behavior in ways that influence the number and reward level of flowers that are visited.

17.
Threat-sensitive decision-making might change in response to a parasitic infection that impairs future reproduction. Infected animals should take more risks to gain energy, to speed up their growth to achieve early reproduction and/or to strengthen their immune response. To avoid merely correlational evidence, we experimentally infected and sham-infected randomly selected immature three-spined sticklebacks with the cestode Schistocephalus solidus. For 7 weeks we determined the threat-sensitive foraging decisions and growth of individual sticklebacks in the presence of a live pike (Esox lucius). The experimenters were blind to the infection status of the fish. In contrast to previous studies, our recently infected fish should have been almost unconstrained by the parasite and thus able to adopt an appropriate life history strategy. We found a strong predator effect for both infected and uninfected fish: the sticklebacks' risk-sensitive foraging strategy resulted in significantly reduced growth under predation risk. Infected fish did not grow significantly faster under predation risk than uninfected fish. Since infected fish consumed much less prey in the presence of the predator than infected fish did in its absence, they evidently did not use the opportunity to maximize their growth rate to reach reproduction before the parasite impairs it. Received: 21 June 1999 / Revised: 27 November 1999 / Accepted: 5 September 2000

18.
Summary: The threat-sensitive predator avoidance hypothesis predicts that prey can assess the relative threat posed by a predator and adjust their behaviour to reflect the magnitude of the threat. We tested the ability of larval threespine sticklebacks to adjust their foraging in the presence of predators by exposing them to conspecific predators of various sizes and recording their foraging and predator avoidance behaviours. Larvae (<30 days post-hatch) displayed predator escape behaviours only towards attacking predators. At 3 weeks post-hatch, larvae approached the predator after fleeing, a behaviour which may be the precursor to predator inspection. Larvae reduced foraging and spent less time in the proximity of large and medium-sized predators compared to small predators. The reduction in foraging was negatively correlated with the predator/larva size ratio, indicating that larvae increased their foraging as they increased in size relative to the predator. We conclude that larval sticklebacks can assess the threat of predation early in their ontogeny and adjust their behaviour accordingly. Correspondence to: J.A. Brown

19.
Prey animals have evolved a variety of behavioural strategies to avoid predation. Many fish species form shoals in the open water or seek refuge in structurally complex habitats. Since anti-predator strategies bear costs and are energy-demanding, we hypothesised that the nutritional state of prey should modify the performance level and efficiency of such strategies. In aquaria either containing or lacking a structured refuge habitat, well-fed or food-deprived juvenile roach (Rutilus rutilus) were exposed to an open-water predator (pikeperch, Sander lucioperca). Controls were run without predators. In the presence of the predator, roach enhanced the performance of the anti-predator strategy and increased their use of the refuge habitat, with food-deprived roach encountered in the structure more often than well-fed roach. Nonetheless, more starved than well-fed roach were consumed by the predator. In the treatments offering only open-water areas, roach always formed dense shoals in the presence of the predator. Shoal density, however, was lower in starved roach. Starving fish in shoals experienced the highest predation mortality across all experimental treatments. The experiment confirmed the plasticity of anti-predator behaviour in roach and demonstrated that food deprivation diminished the efficiency of shoaling more strongly than the efficiency of hiding. The findings may be relevant to the spatial distribution of prey and to predator–prey interactions under natural conditions, because when prey are confronted with phases of reduced resource availability, flexible anti-predator strategies may lead to dynamic habitat use patterns.

20.
One hypothesis for the maintenance of genetic variation states that alternative genotypes are adapted to different environmental conditions (i.e., genotype-by-environment interaction, G×E) that vary in space and time. Although G×E has been demonstrated for morphological traits, there is little evidence on whether such G×E are associated with traits used as signals in mate choice. In three wild bird species, we investigated whether the degree of melanin-based coloration, a heritable trait, covaries with nestling growth rate in rich and poor environments. Variation in the degree of reddish-brown phaeomelanism is pronounced in the barn owl (Tyto alba) and tawny owl (Strix aluco), and variation in black eumelanism in the barn owl and Alpine swift (Apus melba). Melanin-based coloration has been shown to be a criterion in mate choice in the barn owl. We cross-fostered hatchlings to test whether nestlings sired by parents displaying melanin-based coloration to different extents exhibit alternative growth trajectories when raised by foster parents in poor (experimentally enlarged broods) and rich (experimentally reduced broods) environments. With respect to phaeomelanism, barn owl and tawny owl offspring sired by redder parents grew more rapidly in body mass only in experimentally reduced broods. With respect to eumelanism, Alpine swift offspring of darker fathers grew their wings more rapidly only in experimentally enlarged broods, a difference that was not detected in reduced broods. These interactions between parental melanism and offspring growth rate indicate that individuals display substantial plasticity in response to the rearing environment, associated with their degree of melanism: at least with respect to nestling growth, phaeomelanic and eumelanic individuals are best adapted to rich and poor environments, respectively. It now remains to be investigated why eumelanism and phaeomelanism have different signaling functions and what the lifelong consequences of these melanism-dependent allocation strategies are. This is important to fully appraise the role played by environmental heterogeneity in maintaining variation in the degree of melanin-based coloration.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)