Batch ML inference and monitoring with Evidently and Prefect

Hi everyone! We put together an MLOps tutorial on running batch inference and deploying a monitoring dashboard for production ML models using Evidently and Prefect. It walks through an end-to-end flow on a toy dataset, from model training to monitoring. https://www.evidentlyai.com/blog/batch-inference-monitoring
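To give a feel for how the pieces fit together before you open the tutorial, here is a minimal sketch of a Prefect flow that scores a batch of data and produces an Evidently drift report. It assumes the Prefect 2 @flow/@task decorators and the Evidently Report API with DataDriftPreset; the dataset, model, column names, and file name are illustrative placeholders, not the tutorial's actual code.

```python
# Minimal sketch: batch inference + monitoring with Prefect and Evidently.
# Assumes Prefect 2 (@flow/@task) and the Evidently Report API.
# Dataset, model, and output path are hypothetical placeholders.
import pandas as pd
from prefect import flow, task
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from evidently.report import Report
from evidently.metric_preset import DataDriftPreset


@task
def load_data() -> tuple[pd.DataFrame, pd.DataFrame]:
    # Toy dataset split into a "reference" batch (training period)
    # and a "current" batch (new data arriving for scoring).
    data = fetch_california_housing(as_frame=True).frame
    reference = data.iloc[:10_000]
    current = data.iloc[10_000:12_000]
    return reference, current


@task
def score_batch(reference: pd.DataFrame, current: pd.DataFrame) -> pd.DataFrame:
    # Train on the reference batch, then run batch inference on the current one.
    features = [c for c in reference.columns if c != "MedHouseVal"]
    model = RandomForestRegressor(n_estimators=50, random_state=42)
    model.fit(reference[features], reference["MedHouseVal"])
    scored = current.copy()
    scored["prediction"] = model.predict(current[features])
    return scored


@task
def build_drift_report(reference: pd.DataFrame, current: pd.DataFrame) -> None:
    # Compare feature distributions between the reference and current batches
    # and save an HTML dashboard that can be archived or served.
    report = Report(metrics=[DataDriftPreset()])
    report.run(reference_data=reference, current_data=current)
    report.save_html("data_drift_report.html")


@flow(name="batch-inference-monitoring")
def batch_inference_monitoring():
    reference, current = load_data()
    scored = score_batch(reference, current)
    build_drift_report(reference, scored.drop(columns=["prediction"]))


if __name__ == "__main__":
    batch_inference_monitoring()
```

Running the script executes the flow locally and writes data_drift_report.html, which you can open in a browser; the tutorial goes further and shows how to schedule the flow and serve the reports as a monitoring dashboard.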