How to Serve ML Models with TensorFlow Serving and Docker

In this tutorial, we show you how to serve machine learning models using TensorFlow Serving, an efficient, flexible, high-performance serving system for machine learning models, designed for production environments.

Specifically, you will learn:

  • How to install TensorFlow Serving with Docker
  • How to train and save a simple image classifier with TensorFlow
  • How to serve the saved model using TensorFlow Serving
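To preview where we are headed, here is a minimal sketch of the second step: building a tiny classifier and exporting it in the SavedModel format that TensorFlow Serving loads. The architecture, the model name `my_model`, and the export path are illustrative placeholders, not the exact model used later in the tutorial; training with `model.fit` is omitted for brevity.

```python
import tensorflow as tf

# A tiny image classifier for 28x28 grayscale inputs (placeholder architecture).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# TensorFlow Serving expects numbered version subdirectories under the
# model's base directory, e.g. my_model/1/, my_model/2/, ...
tf.saved_model.save(model, "my_model/1")
```

After this runs, `my_model/1/` contains a `saved_model.pb` graph plus a `variables/` directory, which is exactly the layout TensorFlow Serving scans for model versions.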

Here’s what you can find in it:

  • Before you start (prerequisites)
  • Introduction to TensorFlow Serving
  • TensorFlow Serving architecture
  • A brief introduction to Docker and an installation guide
  • Building, training, and saving an image classification model
  • Serving the saved model with TensorFlow Serving
  • Best practices for using TensorFlow Serving
  • Conclusion
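As a taste of the serving step covered later, the saved model can be served with the official Docker image roughly as follows. This is a sketch: the model name `my_model` and the host path are placeholders, and port 8501 is TensorFlow Serving's default REST port.

```shell
# Pull the official TensorFlow Serving image.
docker pull tensorflow/serving

# Mount the model's base directory (which contains version subdirectories
# like my_model/1/) into the container and expose the REST API port.
docker run -p 8501:8501 \
  --mount type=bind,source="$(pwd)/my_model",target=/models/my_model \
  -e MODEL_NAME=my_model -t tensorflow/serving

# Once the server is up, predictions are a POST away, e.g.:
# curl -d '{"instances": [...]}' \
#   -X POST http://localhost:8501/v1/models/my_model:predict
```

The container watches the mounted directory, so dropping a new version subdirectory (e.g. `my_model/2/`) triggers a hot reload without restarting the server.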

