Aqil Shihab
3 min read · Oct 26, 2021
Photo by Pietro Jeng on Unsplash

Hello everyone! In this blog we will look at how to use the google_ml_kit package in Flutter for language identification and on-device translation. google_ml_kit is Google's standalone on-device machine learning SDK. I hope this post will be helpful to those trying to implement NLP services with Flutter and google_ml_kit.

First, let's discuss how google_ml_kit works under the hood for language identification and language translation.

google_ml_kit has four natural language processing features: Language Identification, On-Device Translation, Smart Reply, and Entity Extraction. In this post we will only look at language identification and on-device translation.

Using language identification, we can determine the language of a string of text. This feature supports a variety of languages; for the full list of available languages, visit https://developers.google.com/ml-kit/language/identification/langid-support

With the on-device translation feature we can dynamically translate text between more than 50 languages. Because this is an on-device service, translations are performed quickly and require neither a remote server nor an internet connection.

Enough talking. Let's dive into the code.

First, let's add the google_ml_kit package to our project's pubspec.yaml.
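For example (the version shown here is only indicative; check pub.dev for the latest release):

```yaml
dependencies:
  flutter:
    sdk: flutter
  # On-device ML Kit wrapper for Flutter.
  google_ml_kit: ^0.7.3
```

Then run `flutter pub get` to fetch the package.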

Follow the steps below to configure ML Kit in your Flutter project.

For Android

Go to project/android/app/src/main/AndroidManifest.xml and add the code below inside the <application> tag.

WARNING: DO NOT ADD IT INSIDE THE <activity> TAG

<application ....>
    <meta-data
        android:name="com.google.mlkit.vision.DEPENDENCIES"
        android:value="ica,ocr,face" />
</application>

Also check the minSdkVersion of your project: the google_ml_kit SDK requires a minSdkVersion of at least 21.

Check the minSdkVersion in android/app/build.gradle:

defaultConfig {
    minSdkVersion 21
}

For iOS

In Xcode, go to Project > Runner > Build Settings > Excluded Architectures > Any SDK and add armv7. Then add the following to your ios/Podfile:

# add this line:
$iOSVersion = '10.0'

post_install do |installer|
  # add these lines:
  installer.pods_project.build_configurations.each do |config|
    config.build_settings["EXCLUDED_ARCHS[sdk=*]"] = "armv7"
    config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = $iOSVersion
  end

  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)

    # add these lines:
    target.build_configurations.each do |config|
      if Gem::Version.new($iOSVersion) > Gem::Version.new(config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'])
        config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = $iOSVersion
      end
    end
  end
end

Now you are all set to use the package.

I will use the MVVM pattern for this application. The view layer will have all UI-related code, the service layer will contain all the logic to perform translations, and the provider layer will communicate between the view layer and the service layer.
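As a rough sketch of this layering (all class and method names here are illustrative, not the exact ones from the final app):

```dart
import 'package:flutter/foundation.dart';

// Service layer: wraps the google_ml_kit calls.
class TranslationService {
  Future<String> identifyLanguage(String text) async {
    // Calls google_ml_kit's language identifier.
    throw UnimplementedError();
  }

  Future<String> translate(String text, String source, String target) async {
    // Calls google_ml_kit's on-device translator.
    throw UnimplementedError();
  }
}

// Provider layer: mediates between the view and the service.
class TranslationProvider extends ChangeNotifier {
  final TranslationService _service = TranslationService();
  String translatedText = '';

  Future<void> translateInput(String text, String targetLanguage) async {
    final source = await _service.identifyLanguage(text);
    translatedText = await _service.translate(text, source, targetLanguage);
    notifyListeners(); // The view layer rebuilds on this notification.
  }
}
```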

First we need to identify the input language before proceeding to translation. So we use google_ml_kit's identifyLanguage method to identify the language and verify that the given input belongs to a supported language.
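A minimal sketch of that step, based on the google_ml_kit API at the time of writing ('und' is the code ML Kit returns when the language cannot be determined):

```dart
import 'package:google_ml_kit/google_ml_kit.dart';

Future<String> identifySourceLanguage(String text) async {
  final languageIdentifier = GoogleMlKit.nlp.languageIdentifier();
  try {
    // Returns a BCP-47 language code such as 'en', or 'und' if undetermined.
    final String code = await languageIdentifier.identifyLanguage(text);
    if (code == 'und') {
      throw Exception('Could not identify the input language');
    }
    return code;
  } finally {
    languageIdentifier.close(); // Release native resources.
  }
}
```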

After getting the source/input language, we can proceed to translate it into the destination language.

The following code will translate the text from the source language to the destination language and return the translated text.
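Something along these lines, again following the package README of that era (sourceLanguage and targetLanguage are BCP-47 codes such as 'en' and 'es'):

```dart
import 'package:google_ml_kit/google_ml_kit.dart';

Future<String> translate(
    String text, String sourceLanguage, String targetLanguage) async {
  final translator = GoogleMlKit.nlp.onDeviceTranslator(
    sourceLanguage: sourceLanguage,
    targetLanguage: targetLanguage,
  );
  try {
    // Throws if the required language model has not been downloaded yet.
    return await translator.translateText(text);
  } finally {
    translator.close(); // Release native resources.
  }
}
```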

If a model-not-found or model-not-downloaded exception arises, we need to download the language model onto the device. We can download the model with the code below.
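A hedged sketch using the package's translate-model manager (method names per the README at the time of writing):

```dart
import 'package:google_ml_kit/google_ml_kit.dart';

Future<void> ensureModelDownloaded(String languageCode) async {
  final modelManager = GoogleMlKit.nlp.translateLanguageModelManager();
  final bool downloaded = await modelManager.isModelDownloaded(languageCode);
  if (!downloaded) {
    // Downloads the translation model for the given language code.
    await modelManager.downloadModel(languageCode);
  }
}
```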

For custom exceptions, define them like the one below.
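For example (a hypothetical exception for the undetermined-language case):

```dart
/// Thrown when the input text's language cannot be identified.
class LanguageNotIdentifiedException implements Exception {
  final String message;

  LanguageNotIdentifiedException(
      [this.message = 'Could not identify the input language']);

  @override
  String toString() => 'LanguageNotIdentifiedException: $message';
}
```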

And that's how we can translate text from one language to another in Flutter, without the need for an internet connection, using the Google ML Kit SDK.

For full code go to https://github.com/Aaq007/Translate-App.

The completed application includes image translation (identifying and translating text from images) and dynamic text translation.

If this article was helpful, a few claps would be appreciated.
