In a production environment, models are promoted based on automated tests, not manual approval.
1. Automated Quality Gates¶
You can write a script that checks a model’s metrics before promoting it to production.
```python
import mlflow
from mlflow.tracking import MlflowClient

client = MlflowClient()
model_name = "Iris_Classifier"

# Grab the most recent unpromoted version of the model
latest_version = client.get_latest_versions(model_name, stages=["None"])[0]

# Fetch the metrics logged by the run that produced this version
run_id = latest_version.run_id
metrics = client.get_run(run_id).data.metrics

if metrics["accuracy"] > 0.95:
    print("Model passed quality gate. Promoting to Production.")
    # Point the "champion" alias at this version so downstream
    # consumers loading "models:/Iris_Classifier@champion" pick it up
    client.set_registered_model_alias(model_name, "champion", latest_version.version)
else:
    print("Model failed quality gate.")
```

2. CI/CD Integration¶
This logic is typically run inside a GitHub Action or an Airflow DAG every time a new model is trained.
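For a CI runner or an Airflow task to react to the gate, the script's verdict has to surface as a process exit status. A minimal sketch of that pattern is below; the `QUALITY_GATES` thresholds and the stubbed `metrics` dict are hypothetical, and in a real pipeline the metrics would come from `MlflowClient` as shown above:

```python
import sys

# Hypothetical minimum thresholds; tune these per project.
QUALITY_GATES = {"accuracy": 0.95, "f1_score": 0.90}

def passes_quality_gates(metrics, gates=QUALITY_GATES):
    """Compare run metrics against minimum thresholds.

    Returns (passed, failures), where failures maps each gate that was
    missed (or missing from the run) to the observed value.
    """
    failures = {
        name: metrics.get(name)
        for name, minimum in gates.items()
        if metrics.get(name) is None or metrics[name] <= minimum
    }
    return (not failures, failures)

if __name__ == "__main__":
    # Stub: in CI, fetch these from the candidate version's run instead.
    metrics = {"accuracy": 0.97, "f1_score": 0.92}
    passed, failures = passes_quality_gates(metrics)
    if not passed:
        print(f"Model failed quality gates: {failures}")
        sys.exit(1)  # nonzero exit marks the CI job / Airflow task as failed
    print("Model passed all quality gates.")
```

Exiting nonzero on failure means a plain `python check_gates.py` step in a workflow blocks the promotion job without any extra wiring.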