kaggle-kernels
TLDR
List all kernels
List kernel output files
Initialize metadata file for a kernel (defaults to current directory)
Push new code to a kernel and run the kernel
Pull a kernel
Get data output from the latest kernel run
Display the status of the latest kernel run
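Illustrative invocations for the items above (kernel identifiers such as username/kernel-slug and all paths are placeholders; exact flags can vary by CLI version):
    kaggle kernels list
    kaggle kernels init -p path/to/directory
    kaggle kernels push -p path/to/directory
    kaggle kernels pull username/kernel-slug -p path/to/directory
    kaggle kernels output username/kernel-slug -p path/to/directory
    kaggle kernels status username/kernel-slug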
SYNOPSIS
kaggle kernels <subcommand> [options] [arguments]
PARAMETERS
list
List kernels, with optional filters for competition (-c), user, page, sort order (hotness, dateRun, dateCreated), and kernel type (script or notebook); example invocations follow this parameter list
-m, --mine
Filter to only your own kernels
-p, --public
Filter to only public kernels
--page <page>
Page number for paginated results
--user <username>
Filter by specific user
--sort-by <sort>
Sort list by hotness/dateRun/dateCreated
--kernel-type <type>
Filter by script or notebook
pull <kernel>
Download a kernel's source code to a local directory
push [-p <path>] [-q]
Upload a local notebook or script to Kaggle and run it
-c <competition>
Push to specific competition
--wait
Wait for kernel to complete before returning
--run-all
Run all cells during push
-m <message>
Commit message for push
status <kernel>
Check kernel execution status
--path <path>
Local path for status check
output <kernel>
Download kernel outputs
-h, --help
Show help for subcommand
-v, --version
Show Kaggle CLI version (global)
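Example invocations for this parameter list (the user name, kernel slug, and paths are placeholders; available flags may differ between CLI versions):
    kaggle kernels list --user someuser --sort-by dateRun --kernel-type notebook
    kaggle kernels push -p ./my-kernel
    kaggle kernels status someuser/my-kernel
    kaggle kernels output someuser/my-kernel -p ./results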
DESCRIPTION
The kaggle kernels command is part of the official Kaggle CLI. It lets users interact with Kaggle Kernels (now called Kaggle Notebooks): cloud-hosted Jupyter/IPython notebooks that run on Kaggle's platform alongside its competitions, datasets, and models.
It supports key operations such as listing kernels with filters (e.g., your own, public, or by competition), pulling kernel source code to a local directory, pushing local notebooks to Kaggle for execution and sharing, monitoring execution status, and retrieving outputs.
Essential for developers and data scientists, it streamlines workflows by integrating with local IDEs like VS Code or JupyterLab, allowing version control via Git, automated submissions, and collaboration without relying solely on the web UI. Usage requires a Kaggle account, API credentials (kaggle.json), and the Python-based CLI installed via pip.
For example, list your own kernels with kaggle kernels list -m, update a notebook with kaggle kernels push, or track progress with kaggle kernels status mykernel. The CLI handles large datasets efficiently, but it depends on an internet connection and is subject to Kaggle's compute limits.
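Pushing relies on a kernel-metadata.json file in the target directory (created by kaggle kernels init). A minimal sketch of that file, assuming the commonly documented fields; all values are placeholders and the exact field set may vary by CLI version:
    {
      "id": "your-username/your-kernel-slug",
      "title": "Your Kernel Title",
      "code_file": "notebook.ipynb",
      "language": "python",
      "kernel_type": "notebook",
      "is_private": "true",
      "enable_gpu": "false",
      "enable_internet": "true",
      "dataset_sources": [],
      "competition_sources": [],
      "kernel_sources": []
    }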
CAVEATS
Requires a kaggle.json API key in ~/.kaggle; an internet connection is required; operations are subject to Kaggle compute quotas and privacy rules; long-running kernels may time out.
INSTALLATION
Install with pip install kaggle; verify with kaggle --version.
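For example:
    pip install --upgrade kaggle
    kaggle --version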
AUTHENTICATION
Download kaggle.json from the API section of your Kaggle account settings; place it at ~/.kaggle/kaggle.json; set permissions: chmod 600 ~/.kaggle/kaggle.json.
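On Linux or macOS this typically amounts to (the download location of kaggle.json is a placeholder):
    mkdir -p ~/.kaggle
    mv ~/Downloads/kaggle.json ~/.kaggle/kaggle.json
    chmod 600 ~/.kaggle/kaggle.json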
HISTORY
Introduced in Kaggle API v1.0 around 2018 as Kaggle expanded its notebook features; following Google's 2017 acquisition of Kaggle, the tool evolved to add GPU/TPU support and push/status enhancements by 2022.


