Model Serving using FastAPI and Streamlit
A simple example of using Streamlit and FastAPI for ML model serving.

When developing simple APIs that serve machine learning models, it can be useful to have both a backend (with API documentation) for other applications to call and a frontend for users to experiment with the functionality.

In this example, we serve an image semantic segmentation model using FastAPI for the backend service and Streamlit for the frontend service. docker-compose orchestrates the two services and enables communication between them.
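A compose file for such a setup might look like the sketch below; the service names, ports, and build contexts are assumptions, not taken from the repository:

```yaml
# Hypothetical docker-compose.yml: one backend and one frontend service.
version: "3"

services:
  backend:
    build: backend        # folder with the FastAPI Dockerfile
    ports:
      - "8000:8000"
  frontend:
    build: frontend       # folder with the Streamlit Dockerfile
    ports:
      - "8501:8501"
    depends_on:
      - backend
```

Because docker-compose puts both services on a shared network with service-name DNS, the Streamlit frontend can call the API at `http://backend:8000` instead of a hard-coded host IP.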

Author: @davidefiocco
