
Google made the Gemini 2.5 family of artificial intelligence (AI) models generally available on Tuesday. With this, users of the chatbot can now access the stable versions of the Gemini 2.5 Pro and Gemini 2.5 Flash models. Interestingly, the Mountain View-based tech giant has also made the Pro model available to users on the free tier of the Gemini platform. The company has additionally released Gemini 2.5 Flash-Lite, which is claimed to be its fastest and most cost-efficient AI model.
Gemini 2.5 Pro Is Now Generally Available to All Users
In a blog post, the tech giant announced the rollout of the stable versions of the Gemini 2.5 Pro and Flash models. These large language models (LLMs) were until now available only in preview, meaning their full range of capabilities could not be used. Models in preview also tend to be prone to errors and glitches, which are typically ironed out in the stable release.
While Google AI Pro and Ultra subscribers will continue to get access to the Gemini 2.5 Pro model, those on the free tier can now use it as well. However, the daily limit for free users is expected to be lower than for paid users; Google AI Pro users get expanded access with 100 prompts per day, and Ultra users get an even higher rate limit. Notably, this version of the Pro model is the same as the one released earlier this month, with no significant changes.
The change also means that the model selector menu on the Gemini website and app will no longer show the preview versions of these models. Those on the free tier will now see Gemini 2.5 Flash, Gemini 2.5 Pro, and the Personalisation Preview model, which can access the user's Google Search history and answer queries based on it.
The tech giant has also released the Gemini 2.5 Flash-Lite model. Google says it outperforms 2.0 Flash-Lite and fares better in areas such as coding, mathematics, science, reasoning, and multimodal tasks. The low-latency model is aimed at near real-time tasks such as translation and classification. It also gets the other features of the 2.5 family, such as reasoning at adjustable token budgets, connections to Google Search and code execution tools, multimodal input support, and a one-million-token context window.
Gemini 2.5 Flash-Lite is currently available via Google AI Studio and Vertex AI, which also host the stable versions of 2.5 Pro and Flash. Additionally, Google is integrating the 2.5 Flash-Lite and Flash models into Search.
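For developers trying the new model, the sketch below shows roughly what a request with a capped reasoning budget could look like. It is a minimal sketch, assuming the google-genai Python SDK, a placeholder API key, and the model identifier "gemini-2.5-flash-lite", which may differ from the exact ID listed in Google AI Studio or Vertex AI.

# Minimal sketch: calling Gemini 2.5 Flash-Lite with a capped thinking budget.
# Assumes the google-genai Python SDK; the model ID and budget value are placeholders.
from google import genai
from google.genai import types

client = genai.Client(api_key="YOUR_API_KEY")  # replace with your own key

response = client.models.generate_content(
    model="gemini-2.5-flash-lite",  # assumed model identifier
    contents="Classify the sentiment of this review: 'The battery barely lasts a day.'",
    config=types.GenerateContentConfig(
        # Limit how many tokens the model may spend on internal reasoning;
        # a lower (or zero) budget trades reasoning depth for lower latency.
        thinking_config=types.ThinkingConfig(thinking_budget=256),
    ),
)

print(response.text)

The adjustable thinking budget is what makes the model attractive for the near real-time classification and translation tasks Google highlights, since callers can dial reasoning down when speed matters more than depth.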