Guide 04: The Challenges of Online Inference - But when you need predictions in real time, you need online inference. There are many gotchas in online inference: querying data from multiple sources in real time, A/B testing models, managing rollout strategies…
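The first gotcha above, querying multiple sources in real time, is mostly a latency problem. A minimal sketch of the usual mitigation, fetching sources concurrently so request latency is the slowest source rather than the sum of all of them (the sources here are hypothetical stubs, not a real feature store API):

```python
import asyncio

# Hypothetical feature sources; in practice these might be a feature store,
# a cache, and a production database, each with its own latency.
async def fetch_user_features(user_id):
    await asyncio.sleep(0.01)  # simulate network latency
    return {"age": 31}

async def fetch_item_features(item_id):
    await asyncio.sleep(0.01)  # simulate network latency
    return {"price": 9.99}

async def gather_features(user_id, item_id):
    # Query the sources concurrently: total latency is the max, not the sum.
    user, item = await asyncio.gather(
        fetch_user_features(user_id),
        fetch_item_features(item_id),
    )
    return {**user, **item}

features = asyncio.run(gather_features("u1", "i9"))
```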
Guide 05: Online Inference for ML Deployment - If, after learning about those challenges, you decide you still need online inference, bless your heart. There are plenty of posts on building Flask APIs, but that’s the easiest part. You also need versioning, autoscaling, and the ability to A/B test models.
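For reference, here is roughly how small that "easiest part" is. A minimal Flask prediction endpoint, with a dummy stand-in where a real trained model would be loaded (the route name, payload shape, and `DummyModel` are illustrative assumptions, not from any particular guide):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

class DummyModel:
    """Stand-in for a real trained model; in practice you would load one
    at startup, e.g. from a model registry or artifact store."""
    def predict(self, rows):
        return [sum(row) for row in rows]

model = DummyModel()

@app.route("/predict", methods=["POST"])
def predict():
    # Expects a JSON body like {"features": [[1.0, 2.0], [3.0, 4.0]]}
    payload = request.get_json(force=True)
    preds = model.predict(payload["features"])
    return jsonify({"predictions": preds})

# To serve locally: app.run(port=5000)
```

Everything the endpoint leaves out, which model version is loaded, how many replicas serve it, and which users see it, is exactly the versioning, autoscaling, and A/B testing work the guide is about.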
Guide 06: Model Registries for ML Deployment - Where do you store all these trained models? Where do you track metadata and lineage? How do you retrieve models at inference time? That’s where you’ll need a model registry.
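The three questions above map to three operations: store an artifact, record metadata, and retrieve by name and version. A toy file-based sketch of that idea (the class and layout are invented for illustration; real registries such as MLflow add stages, lineage tracking, and access control on top of the same pattern):

```python
import json
import pickle
import time
from pathlib import Path

class ModelRegistry:
    """A minimal file-based registry: one directory per model name,
    one numbered subdirectory per version, each holding the pickled
    artifact and a metadata sidecar."""

    def __init__(self, root):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def _versions(self, name):
        vdir = self.root / name
        return sorted(int(p.name) for p in vdir.iterdir()) if vdir.exists() else []

    def register(self, name, model, metadata=None):
        # Next version number follows the latest existing one.
        version = (self._versions(name) or [0])[-1] + 1
        vdir = self.root / name / str(version)
        vdir.mkdir(parents=True)
        (vdir / "model.pkl").write_bytes(pickle.dumps(model))
        meta = {"version": version, "registered_at": time.time(), **(metadata or {})}
        (vdir / "meta.json").write_text(json.dumps(meta))
        return version

    def load(self, name, version=None):
        # Default to the latest version, as an inference service would.
        version = version or self._versions(name)[-1]
        vdir = self.root / name / str(version)
        model = pickle.loads((vdir / "model.pkl").read_bytes())
        meta = json.loads((vdir / "meta.json").read_text())
        return model, meta
```

Usage: `reg.register("churn", model, {"auc": 0.81})` returns a version number, and `reg.load("churn")` fetches the latest artifact plus its metadata at inference time.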
Guide 08: A/B Testing Machine Learning Models - Just because a model passes its unit tests doesn’t mean it will move the product metrics. The only way to establish causality is through online validation. Like any other feature, models need to be A/B tested.
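The mechanical core of an A/B test is deterministic assignment: the same user must always see the same model within an experiment. A common sketch using a hash-based split (the function name and parameters are illustrative assumptions):

```python
import hashlib

def assign_variant(user_id, experiment, treatment_share=0.1):
    """Deterministically assign a user to 'control' or 'treatment'.

    Hashing (experiment, user_id) yields a stable, roughly uniform
    bucket in [0, 1], so assignment is sticky per user per experiment
    and independent across experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    return "treatment" if bucket < treatment_share else "control"
```

At serving time the variant picks which model handles the request; the experiment analysis then compares product metrics between the two groups to establish the causal effect the guide describes.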