ABS 2022
Cross-modal Cues and the Cocktail Party
A. Leonie Baier1,2, Logan S. James1,2, Gregg W. Cohen1, Rachel A. Page1, Paul Clements3, Kimberly L. Hunter4, Ryan C. Taylor1,4, Michael J. Ryan1,2. 1Smithsonian Tropical Research Institute, Balboa, Ancon, Panama; 2Department of Integrative Biology, University of Texas, Austin, TX, United States; 3Henson School of Technology, Salisbury University, Salisbury, MD, United States; 4Department of Biological Sciences, Salisbury University, Salisbury, MD, United States

Who should I mate with? What should I eat? To make decisions, animals need information, and this information is often communicated via multiple sensory modalities. Male túngara frogs call out to females in the auditory modality, but additional modalities also relay information on male quality and location: the male’s vocal sac inflates and deflates, and in doing so creates detectable water ripples. Such an elaborate mating display attracts eavesdroppers: the frog-eating bat exploits the different display components to home in for a meal. In this model system we study interactions between sensory modalities and their influence on decision-making. Here we investigate cross-modal facilitation, a process in which sensory performance in one modality is improved by stimulation in another. In either ambient noise or masking frog-chorus playback, frog-eating bats must choose between two robotic frogs presenting either a (differing) unimodal acoustic display or a multimodal display that additionally includes (identical) echoacoustic cues, i.e., the dynamically inflating vocal sac and the water ripples it creates. Our study illuminates the complexity associated with multimodal signalling.