Model Serving using FastAPI and Streamlit
A simple example of using Streamlit and FastAPI for ML model serving.
fastapi streamlit deeplabv3 semantic-segmentation computer-vision segmentation code article tutorial

The setup is described in detail in the accompanying blog post: https://davidefiocco.github.io/2020/06/27/streamlit-fastapi-ml-serving.html.

When developing simple APIs that serve machine learning models, it can be useful to have both a backend (with API documentation) for other applications to call and a frontend for users to experiment with the functionality.
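FastAPI covers the API-documentation part essentially for free: any app it serves exposes interactive OpenAPI docs at /docs. As a rough, generic sketch (not code from this repository):

```python
# Minimal FastAPI app: serving it with uvicorn exposes interactive API docs
# at http://localhost:8000/docs without any extra configuration.
from fastapi import FastAPI

app = FastAPI(title="Example backend")


@app.get("/health")
def health():
    """Trivial endpoint so the generated documentation has something to show."""
    return {"status": "ok"}
```

Run it with `uvicorn main:app` (assuming the file is saved as main.py) and open /docs in a browser to explore the generated documentation.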

In this example, we serve an image semantic segmentation model (DeepLabV3) using FastAPI for the backend service and Streamlit for the frontend service; docker-compose orchestrates the two services and allows communication between them.
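To make this concrete, below are hedged sketches of the three pieces, not the repository's exact code: endpoint names, file layout, and ports are assumptions, and the model is torchvision's pre-trained DeepLabV3 (matching the project tags).

The backend accepts an uploaded image, runs the model, and returns the predicted class mask as a PNG:

```python
# Sketch of the backend service (e.g. backend/main.py, a hypothetical path):
# a single endpoint that segments an uploaded image with DeepLabV3.
# Requires: fastapi, uvicorn, python-multipart, torch, torchvision, Pillow.
import io

import torch
from fastapi import FastAPI, File, Response
from PIL import Image
from torchvision import models, transforms

app = FastAPI(title="DeepLabV3 image segmentation")

# Pre-trained DeepLabV3; newer torchvision versions prefer the `weights=` argument.
model = models.segmentation.deeplabv3_resnet101(pretrained=True)
model.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])


@app.post("/segmentation")
def get_segmentation(file: bytes = File(...)):
    """Run the model on the uploaded image and return the class-index mask as a PNG."""
    image = Image.open(io.BytesIO(file)).convert("RGB")
    batch = preprocess(image).unsqueeze(0)
    with torch.no_grad():
        output = model(batch)["out"][0]
    mask = output.argmax(0).byte().cpu().numpy()  # one class index per pixel
    buffer = io.BytesIO()
    Image.fromarray(mask).save(buffer, format="PNG")
    return Response(buffer.getvalue(), media_type="image/png")
```

The Streamlit frontend then only needs to collect an image from the user, POST it to the backend, and display the result. The hostname "backend" assumes that is the docker-compose service name:

```python
# Sketch of the frontend service (e.g. frontend/app.py, a hypothetical path):
# upload an image, send it to the backend, and show the returned mask.
import io

import requests
import streamlit as st
from PIL import Image

# "backend" resolves to the backend container on the docker-compose network;
# use http://localhost:8000/segmentation when running both services locally.
BACKEND_URL = "http://backend:8000/segmentation"

st.title("Semantic segmentation with DeepLabV3")

uploaded = st.file_uploader("Choose an image", type=["jpg", "jpeg", "png"])
if uploaded is not None and st.button("Segment"):
    response = requests.post(BACKEND_URL, files={"file": uploaded.getvalue()}, timeout=60)
    mask = Image.open(io.BytesIO(response.content))
    col1, col2 = st.columns(2)
    col1.image(uploaded, caption="Input image")
    col2.image(mask, caption="Predicted mask (class indices)")
```

Finally, a docker-compose file along these lines builds the two images and puts the containers on a shared network, so the frontend can reach the backend by service name:

```yaml
# Sketch of docker-compose.yml: one container per service, each built from its
# own directory with a Dockerfile (backend/ and frontend/ are assumed names).
version: "3"
services:
  backend:
    build: backend
    ports:
      - "8000:8000"     # FastAPI + uvicorn; interactive docs at localhost:8000/docs
  frontend:
    build: frontend
    ports:
      - "8501:8501"     # Streamlit's default port
    depends_on:
      - backend
```

With both containers running (docker-compose up), the Streamlit UI is available at localhost:8501 and the FastAPI docs at localhost:8000/docs.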


Similar projects
Insight
Project Insight is designed to provide NLP as a service, with a code base for both the frontend GUI (Streamlit) and the backend server (FastAPI) ...
Real Python Recommendation Engine
A full-stack data science project that performs document similarity on RealPython.com content. Content recommendations are implemented via a Chrome ...
Deploying FastAPI app to Heroku
A detailed walkthrough on creating a simple FastAPI app and deploying it to Heroku.
FastAPI for Flask Users
A comprehensive guide to FastAPI, with a side-by-side code comparison with Flask.