
Setting Up a Professional MLOps Environment

# Install dependencies if running in Google Colab
try:
    import google.colab  # noqa: F401 — only importable inside Colab
    %pip install mlflow scikit-learn pandas matplotlib
except ImportError:
    pass  # Not in Colab; dependencies are managed locally (e.g. with uv, below)


Welcome to the first step of your MLflow journey! In this lesson, we will move beyond simple scripts and set up a robust, reproducible environment using uv and configure MLflow for real-world usage.

1. Why uv for MLflow?

Traditional pip can be slow and sometimes leads to dependency hell. uv is a modern replacement that is:

  • Fast: Written in Rust, it’s 10-100x faster than pip.

  • Reproducible: It uses a uv.lock file to ensure everyone on your team has the exact same environment.

  • Integrated: It handles virtual environments and package resolution in one tool.

Installation

If you haven’t installed it yet, run the command for your platform:

# macOS / Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows (PowerShell)
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"

2. Initializing the MLflow Project

Let’s create a workspace and add our core dependencies. Note how we include ipykernel so we can use this environment inside Jupyter.

# In your terminal:
# uv init mlflow-pro-tutorial
# uv add mlflow scikit-learn pandas matplotlib ipykernel
# uv sync
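
After uv sync, it is worth confirming that the core packages actually resolved into the environment. The helper below is a hypothetical convenience (not part of uv or MLflow) and uses only the standard library, so it runs even in a broken environment:

```python
from importlib import metadata

def check_deps(packages):
    """Return {package: installed version, or None if missing} for each name."""
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            versions[pkg] = None
    return versions

if __name__ == "__main__":
    deps = check_deps(["mlflow", "scikit-learn", "pandas", "matplotlib", "ipykernel"])
    for pkg, ver in deps.items():
        print(f"{pkg:>14}: {ver or 'MISSING'}")
```

Run it with `uv run python check_deps.py`; any line printing MISSING means the package did not make it into the locked environment.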

3. Configuring the Tracking Server

By default, MLflow logs to a local directory (./mlruns). In a production setting, you’ll want a persistent database.

Local SQLite Backend

Using a SQLite backend enables the Model Registry, which plain file-based logging does not support.

mlflow server --backend-store-uri sqlite:///mlflow.db --default-artifact-root ./artifacts --host 0.0.0.0 --port 5000
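
The URI format trips people up: SQLAlchemy-style SQLite URLs use three slashes before a relative path (sqlite:///mlflow.db) and four before an absolute one (sqlite:////data/mlflow.db). A small hypothetical helper (not part of MLflow's API) that assembles the command above:

```python
def sqlite_backend_uri(db_path="mlflow.db"):
    """Build the --backend-store-uri value for a SQLite file.

    An absolute path already starts with "/", so sqlite:/// plus the path
    naturally yields the four-slash form (sqlite:////data/mlflow.db).
    """
    return f"sqlite:///{db_path}"

def server_command(db_path="mlflow.db", artifact_root="./artifacts", port=5000):
    """Assemble an `mlflow server` invocation like the one shown above."""
    return (
        f"mlflow server --backend-store-uri {sqlite_backend_uri(db_path)} "
        f"--default-artifact-root {artifact_root} --host 0.0.0.0 --port {port}"
    )

print(server_command())
```

The same URI also works directly in a script via `mlflow.set_tracking_uri("sqlite:///mlflow.db")` if you want registry support without running a server.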

Connecting to a Remote Server

If your team has a shared MLflow server, point your client at it, either via an environment variable or programmatically:

import os
import mlflow

os.environ["MLFLOW_TRACKING_URI"] = "http://your-remote-server:5000"
# Equivalent, and usually cleaner inside scripts:
# mlflow.set_tracking_uri("http://your-remote-server:5000")
# Or set it in your shell before launching Python:
#   export MLFLOW_TRACKING_URI=http://localhost:5000    (macOS/Linux)
#   set MLFLOW_TRACKING_URI=http://localhost:5000       (Windows cmd)

4. Verification

Let’s verify that MLflow can talk to our backend.

import mlflow

print(f"Tracking URI: {mlflow.get_tracking_uri()}")
print(f"MLflow Version: {mlflow.__version__}")
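
If you also want the check to report what kind of backend the URI points at, a small hypothetical helper (not an MLflow API) can classify it by scheme:

```python
from urllib.parse import urlparse

def describe_tracking_uri(uri):
    """Classify a tracking URI by its scheme, mirroring how MLflow routes it."""
    scheme = urlparse(uri).scheme
    if scheme in ("http", "https"):
        return "remote tracking server"
    if scheme == "sqlite":
        return "local SQLite database"
    if scheme in ("", "file"):
        return "local file store (./mlruns-style)"
    return f"other backend ({scheme})"
```

Pass `mlflow.get_tracking_uri()` into it after configuration; if it reports a file store when you expected SQLite or a remote server, your environment variable was not picked up.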