December 27, 2024
Watch Wayve’s Lingo-2 AI Model 'Drive' a Car While Following Instructions

Wayve unveiled its artificial intelligence (AI)-based vision-language-action driving model (VLAM), Lingo-2, on Wednesday. Lingo-2 is the successor to the Lingo-1 AI model and adds multiple new capabilities. The autonomous driving AI can now offer commentary on its actions while driving, as well as adapt its behaviour based on a passenger's instructions. It can also answer queries about its surroundings that are not directly related to driving. The AI firm said Lingo-2 was designed as a path towards building trustworthy autonomous driving technology.

Showcasing the capabilities of Lingo-2 in a demo video on X (formerly known as Twitter), the company demonstrated the model navigating roads while taking instructions from passengers. The post includes a video of a Lingo-2 drive through Central London, in which the model drives the car while generating real-time driving commentary.

The AI model combines three different components, namely computer vision, a large language model (LLM), and an action model, into a single VLAM that can perform several complex tasks together in real time. Based on the demo, Lingo-2 can see what is happening on the road, make decisions on that basis, and inform the passenger about those decisions. It can also adapt its behaviour to instructions from the passenger and answer non-driving queries, such as questions about the weather.
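To make the idea of a vision-language-action model more concrete, here is a minimal, hypothetical PyTorch sketch of how a vision encoder, a language component, and an action head could be wired together. The module names, sizes, and wiring are illustrative assumptions only and do not describe Wayve's actual Lingo-2 architecture.

```python
# A toy vision-language-action model (VLAM): camera frames and instruction
# tokens are fused, then the model emits both commentary and driving actions.
# This is an illustrative sketch, not Wayve's Lingo-2 implementation.
import torch
import torch.nn as nn

class ToyVLAM(nn.Module):
    def __init__(self, vocab_size=1000, d_model=256, n_actions=3):
        super().__init__()
        # Vision branch: turns a camera frame into a grid of visual tokens.
        self.vision_encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, d_model, kernel_size=4, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        # Language branch: embeds the passenger's tokenised instruction.
        self.token_embedding = nn.Embedding(vocab_size, d_model)
        # A small transformer fuses visual tokens and language tokens.
        encoder_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.fusion = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # Two heads: next-word logits (commentary) and driving actions
        # (e.g. steering, throttle, brake).
        self.language_head = nn.Linear(d_model, vocab_size)
        self.action_head = nn.Linear(d_model, n_actions)

    def forward(self, frames, instruction_ids):
        visual = self.vision_encoder(frames)              # (B, d_model, 4, 4)
        visual = visual.flatten(2).transpose(1, 2)        # (B, 16, d_model)
        text = self.token_embedding(instruction_ids)      # (B, T, d_model)
        fused = self.fusion(torch.cat([visual, text], dim=1))
        commentary_logits = self.language_head(fused[:, -1])  # next commentary token
        actions = self.action_head(fused.mean(dim=1))          # pooled driving action
        return commentary_logits, actions

# Example: one 128x128 camera frame plus a short tokenised instruction.
model = ToyVLAM()
frames = torch.randn(1, 3, 128, 128)
instruction_ids = torch.randint(0, 1000, (1, 8))
logits, actions = model(frames, instruction_ids)
print(logits.shape, actions.shape)  # torch.Size([1, 1000]) torch.Size([1, 3])
```

The key design point the sketch tries to convey is that one shared representation feeds both a language output (the running commentary) and a driving output, rather than bolting a chatbot onto a separate driving stack.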

Wayve says that performing these actions consistently and reliably is an important step towards building autonomous driving technology. “It opens up new possibilities for accelerating learning with natural language by incorporating a description of driving actions and causal reasoning into the model’s training. Natural language interfaces could, even in the future, allow users to engage in conversations with the driving model, making it easier for people to understand these systems and build trust,” the company said on its website.

It is important to note that Lingo-2 does not actually drive a vehicle on its own; it is an AI model that has not been integrated with the hardware needed to control one. It is trained and tested in Wayve’s in-house closed-loop simulator, called Ghost Gym.

Because Ghost Gym is a closed-loop simulation, the company can test how other vehicles and pedestrians realistically react to the controlled vehicle’s behaviour. As a next step, the AI firm plans to begin limited testing of the model in real-world environments to analyse its decision-making in more unpredictable situations.
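For readers unfamiliar with the term, the toy Python snippet below illustrates what “closed-loop” means in this context: the simulated traffic reacts to the ego vehicle at every step rather than replaying a fixed recording. It is a simplified, assumption-laden illustration and says nothing about how Ghost Gym is actually implemented.

```python
# A toy, hypothetical closed-loop rollout on a 1-D road: the lead vehicle's
# behaviour depends on the ego vehicle's actions at every step, instead of
# being replayed from a fixed log. Not representative of Ghost Gym.

def closed_loop_rollout(ego_policy, steps=10):
    ego_pos, lead_pos = 0.0, 20.0  # positions along the road (metres)
    for _ in range(steps):
        ego_speed = ego_policy(ego_pos, lead_pos)
        ego_pos += ego_speed
        # The lead vehicle *reacts* to the ego: it speeds up if tailgated.
        gap = lead_pos - ego_pos
        lead_speed = 5.0 if gap > 10.0 else 7.0
        lead_pos += lead_speed
        print(f"gap={gap:5.1f} m  ego_speed={ego_speed:.1f}  lead_speed={lead_speed:.1f}")

# A simple ego policy: drive faster when the gap is large, ease off when close.
closed_loop_rollout(lambda ego, lead: 6.0 if (lead - ego) > 12.0 else 4.0)
```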

