Google Introduces Open TranslateGemma Models
17 January 2026
Google's new TranslateGemma models provide developers worldwide with robust, open, and multilingual AI translation – faster, smarter, and more accessible than ever – from smartphones to data centers.
In 2026, Google's vigorous pursuit of artificial intelligence (AI) has not slowed. Beyond releasing new shopping tools and a protocol, the company has previously announced a partnership with Apple, brought its chatbot to the Trends website, and offered Personal Intelligence in Gemini. With the introduction of the TranslateGemma models, the company has now turned its attention to the open model community. These multilingual AI models are designed to translate between a wide range of languages and accept both text and image inputs (images are input-only).
The Mountain View-based company announced three versions of the TranslateGemma AI models in a blog post. The models can be downloaded from Kaggle and from Google's Hugging Face listing, and they are also available to developers and enterprises through Vertex AI, the company's cloud-based AI platform. They ship under a permissive license that allows both commercial and academic use.
TranslateGemma comes in three sizes: 4B, 12B, and 27B (where 4B denotes four billion parameters). The smallest model is reportedly aimed at mobile and edge deployment, while the 12B model is intended for consumer hardware. The largest 27B model delivers the highest fidelity and can run locally on a single Nvidia H100 GPU or TPU.
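For developers who want to experiment once a checkpoint is downloaded, the sketch below shows how a model of this kind is typically loaded and prompted through the Hugging Face transformers library. The repository ID, prompt format, and generation settings are assumptions for illustration only and are not confirmed by Google's announcement.

```python
# Minimal sketch: loading a TranslateGemma-style checkpoint via Hugging Face
# transformers. The repository ID below is a placeholder; check Google's
# Hugging Face listing for the actual model names.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/translategemma-4b-it"  # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the 4B variant within consumer GPU memory
    device_map="auto",           # requires the `accelerate` package
)

# Plain instruction-style prompt; the exact prompt format TranslateGemma
# expects is not described in the announcement and is assumed here.
prompt = "Translate the following English sentence into Hindi:\nGood morning, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

The larger 12B and 27B variants would follow the same pattern, with the 27B generally calling for data-center hardware such as the single H100 GPU or TPU mentioned above.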
Building on the Gemma 3 models, the researchers applied supervised fine-tuning (SFT) across a variety of datasets. According to the post, this allowed the models to achieve wide language coverage even for low-resource (i.e., data-poor) languages. Reinforcement learning (RL) was then used to further refine translation quality.
According to the company, the 12B TranslateGemma model outperforms Gemma 3 27B on the WMT24++ machine translation benchmark. In other words, Google says developers can obtain the same quality as Gemma 3 with less than half the baseline model's parameters.
Google says its latest translation-focused AI models were trained and evaluated on 55 language pairs spanning Spanish, French, Chinese, Hindi, and other languages, and that training was extended to nearly 500 additional language pairs. Notably, beyond translating text directly, the models can recognize and translate text embedded in images.
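Since the announcement highlights image input, the sketch below shows how a multimodal checkpoint like this might be queried through transformers' chat-template interface to translate text found in an image. The repository ID, message format, and image URL are placeholders and assumptions based on how Gemma 3 multimodal checkpoints are commonly served; they are not confirmed details of TranslateGemma's API.

```python
# Minimal sketch of the image-input path: asking the model to translate text
# that appears inside an image. Model ID and chat format are assumptions.
import torch
from transformers import AutoModelForImageTextToText, AutoProcessor

model_id = "google/translategemma-4b-it"  # hypothetical repo ID

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForImageTextToText.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/street-sign.jpg"},  # placeholder image
            {"type": "text", "text": "Translate the text in this image into English."},
        ],
    }
]

inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=256)
print(processor.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```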