Try Out On-Device Conversational Modeling with Google’s TensorFlow Lite

Google has been stepping further into the machine learning game, especially when it comes to messaging. Earlier this year, Google launched a machine learning algorithm for smart replies on Android Wear 2.0, allowing previously cloud-based machine learning technology to run entirely offline in applications such as Gmail, Inbox, and Google Allo. Google is now expanding those efforts with a similar technology dubbed “TensorFlow Lite”: a lightweight, fast version of TensorFlow, its machine learning and neural network framework.

Designed with mobile and Internet of Things (IoT) devices in mind, TensorFlow Lite is built to run across a wide range of hardware. When hardware acceleration is unavailable, its calculations fall back to the device’s regular CPU; thankfully, chips such as the Snapdragon 835 support it fully. The technology can also generate a wide variety of responses in an intelligent manner, not unlike what Google originally launched with Android Wear 2.0.

What’s more, the model is capable of understanding when extra words do not change the meaning of a sentence. For example, “how’s it going?” and “how’s it going buddy?” should evoke the same response. Given the limited processing power available on a phone, the trick that makes such a system work is to avoid heavy computation altogether: inputs are “projected” into a compact bit vector, so that similar sentences end up with similar vectors, and the AI then suggests an appropriate response based on that projection.
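To make the idea concrete, here is a minimal, illustrative sketch of projecting text into a compact bit vector. This is not Google’s actual implementation; it uses generic feature hashing plus random-hyperplane (locality-sensitive hashing) projection, and all function names, dimensions, and example sentences are assumptions chosen for the demo.

```python
import hashlib
import random

def features(text, dim=256):
    # Hash each token into a fixed-size count vector (feature hashing).
    # md5 is used only as a stable, process-independent hash.
    vec = [0.0] * dim
    for tok in text.lower().split():
        h = int(hashlib.md5(tok.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    return vec

def project_to_bits(vec, n_bits=64, seed=42):
    # Random-hyperplane projection: each bit records which side of a
    # random hyperplane the feature vector falls on. Similar vectors
    # tend to agree on most bits.
    rng = random.Random(seed)
    bits = []
    for _ in range(n_bits):
        plane = [rng.gauss(0.0, 1.0) for _ in vec]
        dot = sum(p * v for p, v in zip(plane, vec))
        bits.append(1 if dot >= 0 else 0)
    return bits

def hamming(a, b):
    # Number of differing bits; a proxy for semantic distance here.
    return sum(x != y for x, y in zip(a, b))
```

With this sketch, “how’s it going” and “how’s it going buddy” share most of their tokens, so their bit vectors agree on most bits, while an unrelated sentence lands much farther away in Hamming distance; the real system would map nearby vectors to the same suggested reply.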

If you’re interested, Google has even linked a demo application for testing one-touch smart replies. The future of AI looks bright when a fully fledged reply system can run entirely offline! Take a look at the blog post linked below for more in-depth information, and be sure to give the demo application a try!

Source: Google Research Blog