
Microsoft collaboration develops DroidSpeak for better communication between LLMs

Generation quality vs. prefill delay for the Llama-3-8B, Mistrallite, Llama-3-70B and MAmmoTH2 models. Credit: arXiv (2024). DOI: 10.48550/arxiv.2411.02820

A team of computer engineers and AI specialists at Microsoft, working with a pair of colleagues from the University of Chicago, has developed a new language that allows LLMs to communicate with one another more efficiently. The group has posted a paper on the arXiv preprint server outlining the ideas behind the new language, how it works, and the efficiency gains it can deliver.

Researchers working on more powerful AI systems have noted that one of the most promising lines of research involves building problem-specific AI models that excel at a single type of problem—making weather or economic forecasts, for example—and then letting those models talk to each other as a way of building a more general AI system.

In this new effort, the research team noted that LLMs currently talk to each other mostly in English, which makes sense because that is the language they use to interact with humans—at least in English-speaking countries. They also noted, however, that while English may be an efficient way for LLMs to talk to humans, it is not the most efficient way for them to talk to each other. To address the problem, they created a whole new language spoken only by LLMs: DroidSpeak.

The idea behind DroidSpeak was to let LLMs communicate using the mathematical representations that underpin the models themselves. The name is a nod to the language droids use in the Star Wars movies. The researchers also noted that the biggest bottleneck in LLM-to-LLM communication comes from each model spelling out, in natural language, every step it is taking—a listening LLM must then re-process all of that information at every step. Such bottlenecks grow rapidly as the LLMs respond to one another.
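To see why that matters, here is a back-of-the-envelope sketch (our own illustration, with assumed numbers, not a figure from the paper): if the listener must re-encode the full natural-language history at every turn, the total amount of text it processes grows quadratically with the length of the exchange.

```python
# Toy illustration (assumed numbers): cost of re-encoding the full
# conversation history every turn vs. processing only the new turn.
tokens_per_turn = 500          # assumed size of each message
turns = 10                     # assumed length of the exchange

# Listener re-processes the entire conversation so far at every turn.
reencode_everything = sum(tokens_per_turn * t for t in range(1, turns + 1))

# Listener ingests only the newly generated data each turn.
incremental_only = tokens_per_turn * turns

print(reencode_everything)  # 27500 tokens processed across the exchange
print(incremental_only)     # 5000 tokens processed across the exchange
```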

To break that bottleneck, the researchers created a language that allows the LLMs to share just the data that has been generated, rather than everything that led to it. Using it allowed two test LLMs to communicate 2.78 times faster.
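The paper's exact mechanism is more involved, but the core idea—handing over already-computed intermediate state instead of re-encoding text—can be sketched with the key-value cache in Hugging Face transformers. This is a minimal sketch under the assumption that sender and receiver share the same architecture; the model name "gpt2" is only a stand-in.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Stand-in model; the same weights play both "sender" and "receiver" here,
# mirroring the article's note that both ends must be the same model type.
tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Sender encodes a shared context once and keeps the intermediate state.
context = tok("Forecast discussion: pressure is dropping over the region.",
              return_tensors="pt")
with torch.no_grad():
    sender_out = model(**context, use_cache=True)
cache = sender_out.past_key_values   # the already-computed intermediate data

# Receiver continues from that cache instead of re-encoding the context text.
follow_up = tok(" What does that imply for tomorrow?", return_tensors="pt")
with torch.no_grad():
    receiver_out = model(input_ids=follow_up.input_ids,
                         past_key_values=cache,
                         use_cache=True)

print(receiver_out.logits.shape)  # logits computed only for the new tokens
```

The sketch also illustrates the limitation noted below: cached key-value tensors are shaped by a specific architecture, so a different model family could not consume them directly.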

To get their language to perform optimally, the research team found they had to use the same type of LLM at each end, so there is still room for improvement. They suggest DroidSpeak is likely to evolve over time, as human languages do, making it more robust.

More information:
Yuhan Liu et al, DroidSpeak: Enhancing Cross-LLM Communication, arXiv (2024). DOI: 10.48550/arxiv.2411.02820

Journal information:
arXiv


© 2024 Science X Network

Citation:
Microsoft collaboration develops DroidSpeak for better communication between LLMs (2024, November 22), retrieved 22 November 2024


