JupyterLab

JupyterLab server for interactive data science and ML development with GPU support.

Overview

Attribute   Value
Image       ghcr.io/atrawog/bazzite-ai-pod-jupyter:stable
Size        ~17GB
GPU         NVIDIA, AMD, Intel (auto-detected)
Port        8888 (default)
Inherits    nvidia-python (PyTorch, CUDA)

Quick Start

Step  Command                Description
1     ujust jupyter config   Configure server
2     ujust jupyter start    Start server
3     ujust jupyter status   Check status

Access JupyterLab at http://localhost:8888 after starting.
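
If you want to confirm the server is reachable from a script rather than a browser, a minimal check like the following works from any Python environment on the host. It only verifies that something answers on port 8888; the address is the default shown above, and an authenticated session is not required just to get a response:

import urllib.request

# Hit the default JupyterLab address; any HTTP response means the server is up.
try:
    with urllib.request.urlopen("http://localhost:8888", timeout=5) as resp:
        print(f"JupyterLab responded with HTTP {resp.status}")
except OSError as err:
    print(f"Server not reachable: {err}")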

Lifecycle Commands

Command                  Description
ujust jupyter config     Configure settings
ujust jupyter start      Start server
ujust jupyter status     Check status
ujust jupyter logs       View logs
ujust jupyter shell      Open shell
ujust jupyter restart    Restart server
ujust jupyter stop       Stop server
ujust jupyter delete     Remove config

Workspace Directory

Mount your project files with the --workspace-dir option:

# Default workspace (~/jupyter)
ujust jupyter config
ujust jupyter start

# Custom workspace
ujust jupyter config --workspace-dir=/path/to/projects
ujust jupyter start
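
To confirm the mount worked, you can list the working directory from inside a notebook. This is only a sketch and assumes the notebook server starts in the mounted workspace directory:

from pathlib import Path

# The notebook's working directory should be the mounted workspace.
cwd = Path.cwd()
print(f"Working directory: {cwd}")
for entry in sorted(cwd.iterdir()):
    print(entry.name)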

Multiple Instances

Run multiple JupyterLab servers for different projects:

# First instance (port 8888, ~/jupyter)
ujust jupyter config
ujust jupyter start

# Second instance (port 8889, different workspace)
ujust jupyter config -n 2 --port=8889 --workspace-dir=~/ml-project
ujust jupyter start -n 2

# Third instance (port 8890)
ujust jupyter config -n 3 --port=8890 --workspace-dir=~/data-analysis
ujust jupyter start -n 3
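
A quick way to see which instances are actually listening is a small socket probe from the host; the ports below are just the ones used in the example above:

import socket

# Probe the example ports; connect_ex returns 0 when something is listening.
for port in (8888, 8889, 8890):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1)
        state = "listening" if s.connect_ex(("localhost", port)) == 0 else "closed"
        print(f"Port {port}: {state}")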

Pre-installed Libraries

The Jupyter pod inherits from nvidia-python and includes the following libraries (a quick version check follows the list):

  • PyTorch with CUDA support
  • NumPy, Pandas, Scikit-learn
  • Matplotlib, Seaborn, Plotly
  • Transformers, Accelerate
  • JupyterLab with extensions
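
To confirm these imports resolve inside a notebook, a simple version printout is enough; the exact versions depend on the image build:

import matplotlib
import numpy
import pandas
import sklearn
import torch
import transformers

# Print the installed version of each core library.
for mod in (torch, numpy, pandas, sklearn, matplotlib, transformers):
    print(f"{mod.__name__} {mod.__version__}")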

GPU Verification

Inside a notebook:

import torch
print(f"CUDA available: {torch.cuda.is_available()}")
if torch.cuda.is_available():
    print(f"GPU: {torch.cuda.get_device_name(0)}")

See Also