Google Introduces Machine Learning to Mobile with TensorFlow Lite

Sampriti Sarkar

Artificial intelligence (AI) has embedded itself in a wide array of technologies, often without most consumers realizing it. One important branch of AI is machine learning, which deals with recognizing patterns in data. The adoption of machine learning models has grown rapidly over the last few years, creating a need to run them on mobile and embedded devices. Google has now launched a lightweight version of its open-source TensorFlow machine learning library for mobile platforms: TensorFlow Lite gives app developers the ability to deploy AI on mobile devices.

More about TensorFlow and TensorFlow Lite 

Google's TensorFlow has been a popular framework since its release in 2015, and its adoption spans everything from enormous server racks to tiny IoT (Internet of Things) devices. At its core, it is a library of machine learning algorithms that, on Google's servers, is accelerated by Cloud Tensor Processing Units (TPUs).

TensorFlow has made designing, training and deploying deep learning models easier, and it is widely used by researchers and developers for machine learning applications; the deep learning it enables is expected to enhance many technologies in the future. With the growing use of machine learning models, there was an increasing need to redesign the library for mobile devices.

TensorFlow Lite is an evolved version of Google's TensorFlow open source library, designed specifically for mobile devices.

The existing TensorFlow Mobile API will remain operational, but it is no longer the preferred solution for mobile AI: Google advises developers to start using TensorFlow Lite instead.

Areas of focus

TensorFlow Lite was redesigned to focus mainly on three areas:

•  It was rebuilt to be as lightweight as possible, enabling inference of on-device machine learning models with a small binary size.

•  It has a cross-platform runtime designed to run on multiple platforms, starting with Android and iOS.

•  It is fast: optimized for mobile devices, it offers improved model loading times and support for hardware acceleration.

Mechanism

TensorFlow Lite runs AI models in a highly optimized form on the device's CPU. It initially supports a handful of pre-trained and tested models, such as the following:

•  MobileNet: A class of vision models able to identify objects across 1,000 different classes, specifically designed for efficient execution on mobile and embedded devices.

•  Inception v3: It is an image recognition model.

•  Smart Reply: It provides one-touch replies to incoming conversational chat messages.

The company said that more models and features will be added in the future according to users' requirements.
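
As a concrete illustration of how an app might run one of these pre-trained models, here is a minimal Python sketch using the TensorFlow Lite interpreter. It assumes a recent TensorFlow release that exposes the tf.lite.Interpreter API and an already downloaded .tflite model; the mobilenet_v1.tflite file name and the random dummy input are illustrative placeholders, not part of Google's announcement.

# Minimal sketch: running inference with the TensorFlow Lite interpreter.
# Assumes TensorFlow is installed and a .tflite model file is available;
# the file name and dummy input below are illustrative placeholders.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="mobilenet_v1.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Create a fake image batch matching the model's expected input shape.
input_shape = input_details[0]["shape"]
dummy_image = np.random.random_sample(input_shape).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], dummy_image)
interpreter.invoke()

# For a classifier such as MobileNet, the output is a vector of class scores.
scores = interpreter.get_tensor(output_details[0]["index"])
print("Top class index:", int(np.argmax(scores)))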

Advantages of TensorFlow Lite

•  It is a lightweight, easy-to-use solution for mobile and embedded devices. TensorFlow Lite is also much smaller in size: it occupies less than 300KB when all supported operators are linked, and less than 200KB when only the operators required for the InceptionV3 and MobileNet models are used.

•  It enables on-device machine learning inference with low latency and a small binary size. TensorFlow Lite also supports custom operations: developers can write their own operators and use them in their models.

•  TensorFlow Lite supports hardware acceleration through the Android Neural Networks API, which helps developers run neural networks efficiently on low-power devices.

•  It uses techniques such as mobile-optimized kernels to keep apps small and models fast. TensorFlow Lite ships with a new mobile-optimized interpreter, whose key goals are keeping apps lean and fast; the interpreter's optimized kernels enable faster execution on mobile.

•  A new FlatBuffers-based model file format.

•  A converter can translate TensorFlow-trained models into the TensorFlow Lite format (see the sketch after this list).

•  The platform allows developers to deploy AI on mobile devices, and it works on both Android and iOS.

•  It makes developing machine learning apps on mobile devices much easier.
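
To illustrate the converter mentioned in the list above, the following minimal Python sketch converts a trained model into the FlatBuffers-based .tflite format. It assumes a recent TensorFlow release exposing the tf.lite.TFLiteConverter API; the tiny Keras model stands in for a real trained network and is purely illustrative.

# Minimal sketch: converting a trained Keras model to the TensorFlow Lite
# FlatBuffer format. The toy model below is a stand-in for a real network.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1000, activation="softmax"),
])

# Convert the model into the TensorFlow Lite flat-buffer representation.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Write the .tflite file that ships inside the mobile app and is loaded
# by the mobile-optimized interpreter at runtime.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)

The resulting model.tflite file is what a mobile app bundles and then loads with the interpreter shown earlier.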

Conclusion

Google, which started off as a search engine and then expanded into machine learning and AI, envisions a future revolving around these technologies. The company may seem to have diversified in recent years, exploring everything from self-driving cars to smartphones, but machine learning sits at the core of almost every one of its operations.

TensorFlow Lite is still under active development. With further development, it should simplify the experience of targeting a model for small devices, and once the complete library is available, it will help developers build apps equipped with much smarter machine learning.
