mlflow
CLI for MLflow, an open-source platform for the machine learning lifecycle
TLDR
Start MLflow tracking server
SYNOPSIS
mlflow command [options]
DESCRIPTION
mlflow is the CLI for MLflow, an open-source platform for machine learning lifecycle management. It tracks experiments, packages code, and deploys models.
The tracking server stores experiment metadata, parameters, metrics, and artifacts. Use mlflow ui for local development or mlflow server for team deployment.
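As a sketch of the two deployment styles above (the PostgreSQL URI and S3 bucket below are placeholder values, not defaults):

```shell
# Local development: UI backed by file storage in ./mlruns
mlflow ui --port 5000

# Team deployment: database-backed store plus a shared artifact root
mlflow server \
  --host 0.0.0.0 --port 5000 \
  --backend-store-uri postgresql://mlflow:secret@db.internal/mlflow \
  --default-artifact-root s3://example-bucket/mlflow-artifacts
```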
mlflow run executes MLflow Projects: directories or Git repositories containing an MLproject file that defines entry points, parameters, and environments. This makes runs reproducible.
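For example, a project can be run from a local directory or straight from a Git URL (the repository below is MLflow's public example project; the alpha parameter is defined in its MLproject file):

```shell
# Run the project in the current directory
mlflow run .

# Run directly from a Git repository, overriding a parameter
mlflow run https://github.com/mlflow/mlflow-example -P alpha=0.4
```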
Model serving with models serve creates REST endpoints for predictions. models build-docker packages models as containers. The Models component supports multiple ML frameworks.
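A minimal serving sketch, assuming a model was logged under a run (replace <run_id>; the JSON input schema depends on the model):

```shell
# Serve a logged model as a REST endpoint on port 5001
mlflow models serve -m runs:/<run_id>/model -p 5001

# Query the /invocations endpoint
curl -X POST http://127.0.0.1:5001/invocations \
  -H "Content-Type: application/json" \
  -d '{"inputs": [[1.0, 2.0, 3.0]]}'

# Package the same model as a Docker image
mlflow models build-docker -m runs:/<run_id>/model -n my-model-image
```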
Artifacts include datasets, models, and outputs. The tracking server stores references; actual files go to configured storage (local, S3, GCS, Azure Blob).
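Artifacts logged under a run can be inspected and retrieved through the CLI (replace <run_id> with a real run ID):

```shell
# List artifacts recorded under a run
mlflow artifacts list --run-id <run_id>

# Download all of a run's artifacts to a local directory
mlflow artifacts download --run-id <run_id> --dst-path ./artifacts
```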
COMMANDS
server
Start tracking server.
ui
Start local tracking UI.
run uri
Run MLflow project.
experiments create|search|delete|rename
Manage experiments.
runs list|describe|delete
Manage runs.
models serve|build-docker|predict
Model deployment.
artifacts download|list|log-artifacts
Manage artifacts.
recipes run
Run ML recipes.
deployments create|update|delete|list|predict
Manage model deployments.
doctor
Diagnose MLflow installation.
SERVER OPTIONS
--host address
Bind address. Default: 127.0.0.1.
--port port
Server port. Default: 5000.
--backend-store-uri uri
Database URI for experiment/run data.
--default-artifact-root path
Default artifact storage location.
--workers count
Number of gunicorn workers.
RUN OPTIONS
-P, --param key=value
Project parameters.
-e, --entry-point name
Entry point. Default: main.
--experiment-name name
Experiment name for the run.
--env-manager type
Environment manager: local, conda, virtualenv.
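Combining these options (the entry point and parameter names below are illustrative and must match the project's MLproject file):

```shell
# Run the "main" entry point with overridden parameters,
# logging to a named experiment and reusing the current environment
mlflow run . \
  -e main \
  -P alpha=0.1 -P l1_ratio=0.5 \
  --experiment-name elasticnet-demo \
  --env-manager local
```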
CAVEATS
The tracking server's default file-based backend store is not suitable for production; use a database such as PostgreSQL or MySQL. Large artifacts require object storage. Some features require extra Python packages. The built-in model server is intended for local testing, not production traffic.
HISTORY
MLflow was created at Databricks and open-sourced in June 2018. It became an LF AI & Data Foundation project in 2020. The platform grew from Databricks' internal tools for managing ML workflows. Version 2.0 (2022) added MLflow Recipes and improved model registry. MLflow is widely adopted for experiment tracking and model management.
