TensorFlow yesterday (14th Nov) announced the developer preview of TensorFlow Lite, a lightweight version of TensorFlow for mobile and embedded devices, aimed at low-latency, on-device inference of machine learning models.
TensorFlow Lite Logo
TensorFlow Lite is an evolution of TensorFlow Mobile, and is designed to be lightweight, cross-platform (Android and iOS to start), and fast.
Through the Android Neural Networks API, TensorFlow Lite will be able to take advantage of purpose-built machine learning hardware in devices as it becomes available.
A trained TensorFlow model can be converted to the TensorFlow Lite format (.tflite) using the provided converter, then deployed to a mobile app (Android or iOS), where the converted model is executed by the TF Lite Interpreter.
TensorFlow Lite contains a C++ API with a Java API wrapper on Android.
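To make the convert-then-run flow above concrete, here is a minimal sketch in Python. It assumes a current TensorFlow release, where the converter is exposed as `tf.lite.TFLiteConverter` (the preview originally shipped a standalone converter tool); the tiny Keras model is purely illustrative, and the Python `tf.lite.Interpreter` stands in on the desktop for the C++/Java interpreter that would run on the device.

```python
import numpy as np
import tensorflow as tf

# A hypothetical tiny model, just so we have something to convert.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(2),
])

# Convert the trained model to the TensorFlow Lite flatbuffer format (.tflite).
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()  # bytes; would normally be written to a file

# Run the converted model with the TF Lite Interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

interpreter.set_tensor(inp["index"], np.random.rand(1, 8).astype(np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)
```

On Android the same .tflite file would instead be bundled as an asset and fed to the Java `Interpreter` wrapper over the C++ runtime.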
It has out-of-the-box support for MobileNet, Inception V3, and Smart Reply models.
Read more about TensorFlow Lite at the following links:
Related Links:
Build Deeper: The Path to Deep Learning
Learn the bleeding edge of AI in the most practical way: By getting hands-on with Python, TensorFlow, Keras, and OpenCV. Go a little deeper...
Get your copy now!