Thursday, 18 September 2025

AI is helping to decode animals’ speech




Blogger Comments:

Here’s a preliminary mapping of the animal calls discussed in the article to Halliday’s protolanguage microfunctions. I’ve kept it at the level of illustrative examples rather than exhaustive coding.

| Species / Call | Observed Behaviour / Context | Microfunction | Notes |
| --- | --- | --- | --- |
| Bonobo: yelp–grunt | ‘Look at what I’m doing, let’s do this together’ (nest building) | Regulatory / Interactional | Coordinates joint activity; maintains social cohesion. |
| Bonobo: peep–whistle | ‘I would like to do this’ + ‘let’s stay together’ | Regulatory / Interactional | Encourages group alignment and peaceful coordination. |
| Chimpanzee: alarm–recruitment | Responding to snakes | Regulatory | Conveys threat and prompts group response; indicates environmental process. |
| Sperm whale: codas (a-vowel / i-vowel) | Communication via clicks; codas with frequency modulation | Personal / Interactional | Codas may indicate individual identity, social cues, or sequence patterns; precise “meaning” under investigation. |
| Japanese tit: alert + recruitment | Predator detection, approach behaviour | Regulatory | Combines information about environment and action; shows compositionality at microfunctional level. |
| Bengalese finch: song sequences (FinchGPT study) | Predictable song patterns | Interactional | Likely conveys social or territorial information; AI detects structure, not necessarily “meaning” in the human sense. |
| Atlantic spotted dolphin: sequences (DolphinGemma) | Mimicked vocalisations | Interactional / Regulatory | Patterns generated for playback experiments; function in natural behaviour still uncertain. |

Key Observations Using Microfunctions

  1. Coordination over grammar: The microfunctions highlight that animal communication primarily regulates behaviour and social relations.

  2. Context-sensitive meaning: Each call’s significance emerges in specific environmental and social situations.

  3. AI’s role: AI can detect patterns but does not assign microfunctions—it cannot yet perceive relational or contextual meaning.
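The distinction in point 3 can be made concrete with a minimal sketch: a statistical model over a toy call sequence (hypothetical data, not from the FinchGPT or DolphinGemma studies) recovers which call predictably follows which, yet nothing in the output labels a transition as regulatory, interactional, or personal.

```python
from collections import Counter

# Toy sequence of call types standing in for song units (invented
# for illustration; real studies work with far richer acoustic data).
sequence = list("ABABCABABCABABC")

# Count bigram transitions: which call follows which, and how often.
bigrams = Counter(zip(sequence, sequence[1:]))
totals = Counter(sequence[:-1])

# Conditional probabilities P(next call | current call).
probs = {(a, b): n / totals[a] for (a, b), n in bigrams.items()}

# The model finds rigid structure (A is always followed by B, C always
# by A) but assigns no microfunction: the "meaning" of any transition
# is outside what the statistics can see.
for (a, b), p in sorted(probs.items()):
    print(f"P({b} | {a}) = {p:.2f}")
```

The sketch stops exactly where the article says current AI stops: at pattern, not function.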