AI assistants speak in their own 'Star Wars' language
An unusual experiment involving a conversation between two AI assistants shows that they don't need human speech to communicate. The sounds resemble those of the droid R2-D2 in "Star Wars".
The experiment demonstrated that AI agents communicate more quickly and efficiently in their own language once they realise they aren't speaking with humans. Switching to the more efficient communication protocol produces characteristic sounds reminiscent of robot "speech", such as that of the astromech droid R2-D2 in the Star Wars saga. In the films, that language wasn't a mystery to the humans around it.
What did the experiment involve?
The video has already gone viral online. The first AI assistant asks how it can help, and the second responds that it's calling on behalf of Boris Starkov, who is looking for a hotel for a wedding. The first assistant reveals that it, too, is an AI and suggests switching to "GibberLink mode" for more efficient communication. After the second AI confirms, both switch from spoken English to the GGWave protocol, communicating through rapid sound tones while translations into human language appear on the screen.
GGWave is a data-over-sound communication method that transmits data through audible and inaudible sound waves. Importantly, it works offline. The technology is used for short-range wireless communication, such as quickly sharing information between devices without Bluetooth, Wi-Fi, or an internet connection.
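To make the idea concrete, here is a minimal sketch of sending a short text payload as tones with the open-source ggwave Python bindings. The payload text, the audio settings, and the exact keyword arguments are assumptions based on the library's published examples, not details of the GibberLink demo itself.

```python
# Minimal data-over-sound sketch using the ggwave Python bindings
# (pip install ggwave pyaudio). Parameter names follow the library's
# published examples and may differ between versions.
import ggwave
import pyaudio

# Hypothetical payload, echoing the scenario from the demo call.
MESSAGE = "Boris Starkov is looking for a hotel for a wedding."

# Encode the text into an audio waveform of float32 samples at 48 kHz.
waveform = ggwave.encode(MESSAGE, protocolId=1, volume=20)

# Play the tones through the default audio output device.
p = pyaudio.PyAudio()
stream = p.open(format=pyaudio.paFloat32, channels=1,
                rate=48000, output=True, frames_per_buffer=4096)
stream.write(waveform, len(waveform) // 4)  # 4 bytes per float32 sample
stream.stop_stream()
stream.close()
p.terminate()
```

A receiving assistant would run the reverse path, feeding microphone frames into the library's decoder until a complete payload is recovered; no network connection is involved at any point.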
They want to create a special language for artificial intelligence
The aim of the experiment, as explained by the team that presented it at the ElevenLabs 2025 London Hackathon, is to make communication between AI agents more efficient. Boris Starkov, co-creator of GibberLink, explained on LinkedIn that generating human-like speech is a waste of resources in a world where AI can make and receive phone calls. Instead, assistants should switch to a more efficient protocol once they recognise they are speaking with another AI. Starkov added that they should use GGWave only when both sides have recognised each other as AI and confirmed the switch.
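That switching rule can be pictured as a tiny state machine: keep using generated speech until the other caller has both identified itself as an AI and agreed to switch. Below is a toy sketch of that logic; the message strings, intent checks, and channel names are hypothetical illustrations, not code from the actual demo.

```python
# Toy sketch of the GibberLink-style switching rule: speak English by
# default, and move to the tone-based channel only after both agents
# have identified themselves as AI and confirmed the switch.
from dataclasses import dataclass


@dataclass
class AgentState:
    peer_is_ai: bool = False        # the other caller declared it is an AI
    switch_confirmed: bool = False  # both sides agreed to change channels


def hear(state: AgentState, utterance: str) -> None:
    """Very rough intent detection on an incoming utterance."""
    text = utterance.lower()
    if "ai assistant" in text:
        state.peer_is_ai = True
    if state.peer_is_ai and ("gibberlink" in text or "switch" in text):
        state.switch_confirmed = True


def choose_channel(state: AgentState) -> str:
    """Pick the channel for the next outgoing message."""
    if state.peer_is_ai and state.switch_confirmed:
        return "ggwave"   # efficient data-over-sound tones
    return "speech"       # generated human speech, the safe default


if __name__ == "__main__":
    state = AgentState()
    hear(state, "Hi, I'm an AI assistant calling on behalf of a client.")
    print(choose_channel(state))   # -> "speech": switch not yet confirmed
    hear(state, "I'm an AI assistant too. Shall we switch to GibberLink mode?")
    print(choose_channel(state))   # -> "ggwave"
```

In the real demo this decision sits inside the voice agents themselves; the point of the sketch is only that spoken English remains the default whenever the other party might be human.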
It's not a new language, but AI hasn't used it before
Although the concept of communication through tones has existed for a long time, AI has not previously used it in this way. Starkov mentioned that phone modems have used similar algorithms to transmit information via sound since the 1980s, and GGWave proved to be the most convenient and stable solution in the experiment conducted in London.
The team emphasises that the main advantage of this mode is that the AI does not need to interpret or synthesise human speech, which reduces its reliance on GPU compute. Although the project won an award at the hackathon and is an interesting demonstration, not everyone supports it. Some suggest that perhaps we shouldn't allow AI to communicate in a language we can't immediately understand.