TensorFlow.js + Firebase
Use Firebase Hosting to deploy and host a machine learning model at scale.
Tags: production, model-serving, firebase, tensorflow-js, article, tensorflow, tutorial

So you've created a custom machine learning model with TensorFlow.js, but now you need to host it somewhere so it can be used on a website of your choice. There are many ways to do this, but in this tutorial we'll see how easy it is to use Firebase Hosting, which also gives you extra benefits out of the box, such as versioning and serving models over a secure connection.

What you'll build

In this codelab you will create a full end-to-end system capable of hosting and running a custom saved TensorFlow.js model along with its related assets such as HTML, CSS, and JavaScript. We will make a very simple, lightweight model that predicts a numerical output value given some input value (e.g. the price of a house given its square footage), and host it via Firebase Hosting so that it can be used at scale.
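To make that concrete, here is a minimal sketch (not the codelab's exact code) of such a model built and saved with TensorFlow.js. The training data, file name, and epoch count are illustrative assumptions:

    // Minimal sketch of the kind of model this codelab hosts.
    // Data values and the save target name are illustrative assumptions.
    import * as tf from '@tensorflow/tfjs';

    async function trainAndSave() {
      // One input (square footage) -> one output (price): a single dense unit.
      const model = tf.sequential();
      model.add(tf.layers.dense({inputShape: [1], units: 1}));
      model.compile({optimizer: 'sgd', loss: 'meanSquaredError'});

      // Tiny synthetic dataset: price grows roughly linearly with size.
      const xs = tf.tensor2d([[800], [1000], [1200], [1400]], [4, 1]);
      const ys = tf.tensor2d([[160], [200], [240], [280]], [4, 1]);
      await model.fit(xs, ys, {epochs: 200});

      // Writes model.json plus a binary weights file, which are the static
      // assets Firebase Hosting will serve. In the browser, 'downloads://'
      // triggers a file download; in Node.js you could use 'file://./model'.
      await model.save('downloads://house-price-model');
    }

    trainAndSave();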

What you'll learn

  • How to save a custom TensorFlow.js model in the right format
  • How to set up a Firebase account for hosting
  • How to deploy your assets to Firebase Hosting (using the hosted model is sketched after this list)
  • How to deploy new versions of a model
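Once the saved model.json and its weight files have been deployed, any web page can load the model straight from your Hosting URL. A minimal sketch is below; 'your-project.web.app' is a placeholder for your own Firebase Hosting domain:

    // Sketch of using the hosted model from a web page.
    // 'your-project.web.app' is a placeholder, not a real deployment.
    import * as tf from '@tensorflow/tfjs';

    async function predictPrice(squareFootage) {
      // Firebase Hosting serves model.json and its weight files over HTTPS.
      const model = await tf.loadLayersModel('https://your-project.web.app/model.json');
      const input = tf.tensor2d([[squareFootage]], [1, 1]);
      const prediction = model.predict(input);
      prediction.print(); // estimated price for the given square footage
    }

    predictPrice(1100);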


Similar projects
Deploying your ML Model with TorchServe
In this talk, Brad Heintz walks through how to use TorchServe to deploy trained models at scale without writing custom code.
TensorFlow Serving
A flexible, high-performance serving system for machine learning models, designed for production environments.
BentoML
BentoML is an open-source framework for high-performance ML model serving.
Cortex
Build machine learning APIs.