TL;DR
- Google has introduced DolphinGemma, an AI model that lets scientists study and respond to dolphin vocalizations using Pixel phones.
- The model is trained to analyze and generate dolphin vocalizations, and it runs on Pixel 6 phones in the field.
Google’s Pixel phones are already known for understanding different languages with live translation. But now the company is taking things a big splash further. It turns out that Pixel phones can not only help people talk to people, but also help people talk to dolphins! Yes, you read that correctly. The same phone that helps you translate on the go now helps researchers decode dolphin-speak.
In an announcement timed to National Dolphin Day, Google unveiled DolphinGemma, an AI model created in collaboration with researchers at Georgia Tech and marine scientists from the Wild Dolphin Project (WDP). The model is trained to analyze and generate dolphin vocalizations, and the best part is that it can run on Pixel phones.
How does DolphinGemma work?
Google explains that Atlantic spotted dolphins have a whole underwater language consisting of clicks, whistles, and what scientists call “burst pulses.” Think of it as the sonar version of sentences and emojis.
WDP has studied these dolphins for almost 40 years and has collected a sea of data on who says what, when, and to whom. That databank is what DolphinGemma taps into.
The AI model takes in real dolphin sounds and identifies patterns in the vocal sequences. The technology behind it borrows from Google’s own Gemma models (the smaller siblings of the Gemini models), tailor-made for audio processing.
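To make the idea concrete, here is a minimal toy sketch of how an audio model can turn sound into discrete tokens and then learn sequence patterns over them. Everything here is a simplification for illustration: the framing, the energy-based quantizer, and the bigram model are hypothetical stand-ins, not Google's actual tokenizer or DolphinGemma itself.

```python
import numpy as np

def tokenize_audio(samples, n_tokens=16):
    """Quantize a waveform into a sequence of discrete tokens.

    Toy stand-in for a learned audio tokenizer: each 10 ms frame
    (at 16 kHz) is reduced to its RMS energy, which is then binned
    into one of n_tokens levels.
    """
    frames = samples.reshape(-1, 160)              # 10 ms frames at 16 kHz
    energy = np.sqrt((frames ** 2).mean(axis=1))   # RMS energy per frame
    bins = np.linspace(energy.min(), energy.max(), n_tokens + 1)[1:-1]
    return np.digitize(energy, bins)               # token ids 0..n_tokens-1

def bigram_model(tokens):
    """Count token-to-token transitions -- the crudest 'pattern' model."""
    counts = {}
    for a, b in zip(tokens[:-1], tokens[1:]):
        counts.setdefault(int(a), {}).setdefault(int(b), 0)
        counts[int(a)][int(b)] += 1
    return counts

def predict_next(counts, token):
    """Return the most frequently observed next token, or None if unseen."""
    followers = counts.get(int(token))
    if not followers:
        return None
    return max(followers, key=followers.get)
```

A real system would use a learned neural tokenizer and a transformer in place of the bigram counts, but the pipeline shape (audio in, tokens out, next-token prediction over the sequence) is the same.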
DolphinGemma runs on Pixel phones in the field, eliminating the need for bulky underwater computers and other equipment. Thanks to Google’s audio expertise and its SoundStream tokenizer (which, no joke, sounds like something out of a sci-fi movie), scientists can not only listen to dolphin chatter in real time, but also respond to the mammals.
WDP and Georgia Tech are also testing a system called CHAT (Cetacean Hearing Augmentation Telemetry) to talk to dolphins. CHAT uses synthetic whistles to represent specific objects dolphins love. The idea is that a dolphin will hear a synthetic whistle, realize that it means “toy,” mimic it, and use it to communicate with the researchers.
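The whistle-to-object mapping described above can be sketched as a simple lookup with a tolerance, assuming each synthetic whistle is summarized by a single dominant frequency. The object names and frequencies below are invented for illustration and are not CHAT's actual signal design.

```python
# Hypothetical table: one synthetic-whistle "signature" (a dominant
# frequency in Hz) per toy object. Values are made up for this sketch.
WHISTLE_TABLE = {
    "scarf": 8000.0,
    "rope": 10000.0,
    "sargassum": 12000.0,
}

def match_whistle(detected_freq_hz, table=WHISTLE_TABLE, tolerance_hz=800.0):
    """Match a detected whistle frequency to the closest known object.

    Returns the object name, or None if no signature is within
    tolerance_hz (i.e. the sound is not a recognized mimicry).
    """
    best = min(table, key=lambda obj: abs(table[obj] - detected_freq_hz))
    if abs(table[best] - detected_freq_hz) > tolerance_hz:
        return None
    return best
```

So a detected whistle near 10 kHz would resolve to "rope", while a sound far from every signature is rejected rather than misattributed. A real detector would compare whole spectral contours, not one frequency, but the matching logic is the same.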
Pixel 6 phones running DolphinGemma listen through the noise, recognize a dolphin’s mimicry, and tell the human scientists what the dolphin “said.” That information is fed straight into bone-conduction headphones, so researchers can respond in real time. Google says the Pixel 9 series will speed up the process further by running the AI models locally.