
# ujust localai

OpenAI-compatible local inference API

## Quick Start

Follow the standard service lifecycle:

| Step | Command |
| --- | --- |
| 1. Config | `ujust localai config` |
| 2. Start | `ujust localai start` |
| 3. Status | `ujust localai status` |
| 4. Logs | `ujust localai logs` |
| 5. Stop | `ujust localai stop` |
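
A typical first session runs these subcommands in order:

```bash
# One-time setup: create the instance configuration
ujust localai config

# Start the server and confirm it came up
ujust localai start
ujust localai status

# Watch the container logs, then stop the server when finished
ujust localai logs
ujust localai stop
```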

## Subcommands

### Configuration

| Subcommand | Arguments | Description |
| --- | --- | --- |
| `config` | | Configure LocalAI |

### Lifecycle

| Subcommand | Arguments | Description |
| --- | --- | --- |
| `restart` | | Restart server |
| `start` | | Start LocalAI server |
| `stop` | | Stop LocalAI server |
| `delete` | | Remove instance config and container |
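
For example, after changing the configuration you would typically restart the running server, while `delete` removes the instance entirely once you no longer need it:

```bash
# Pick up configuration changes by restarting the running server
ujust localai restart

# Remove the instance's config and its container when it is no longer needed
ujust localai delete
```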

### Monitoring

| Subcommand | Arguments | Description |
| --- | --- | --- |
| `status` | | Show instance status |
| `logs` | `[--lines=N]` | View container logs |
| `url` | | Show OpenAI-compatible API URL |
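
Because the server exposes an OpenAI-compatible API, the URL printed by `url` works with any OpenAI client. A minimal sketch, assuming a default localhost URL and port; substitute whatever `ujust localai url` actually prints, and replace `MODEL_NAME` with an installed model:

```bash
# Print the API base URL for this instance
ujust localai url

# List available models (standard OpenAI-compatible endpoint)
curl http://localhost:8080/v1/models

# Send a chat completion request against the same endpoint family
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "MODEL_NAME", "messages": [{"role": "user", "content": "Hello"}]}'
```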

### Shell

| Subcommand | Arguments | Description |
| --- | --- | --- |
| `shell` | `[-- CMD]` | Open shell or execute command in container |
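
For example, to poke around the container interactively or run a one-off command inside it (the command after `--` is only illustrative):

```bash
# Open an interactive shell inside the LocalAI container
ujust localai shell

# Run a single command in the container without an interactive session
ujust localai shell -- ls /
```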

### Other

| Subcommand | Arguments | Description |
| --- | --- | --- |
| `help` | | Show help |

## Flags

| Flag | Short | Default | Values | Description |
| --- | --- | --- | --- | --- |
| `--bind` | `-b` | | | |
| `--config-dir` | `-c` | | | |
| `--gpu-type` | `-g` | | | |
| `--image` | `-i` | | | |
| `--instance` | `-n` | | | |
| `--lines` | `-l` | | | |
| `--port` | `-p` | | | |
| `--tag` | `-t` | | | |
| `--workspace-dir` | `-w` | | | |
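
A hedged sketch of passing flags alongside the subcommands; the instance name, port, GPU type, and line count below are placeholders, and the exact accepted values are described by `ujust localai help`:

```bash
# Configure a named instance on a chosen port (all values are placeholders)
ujust localai config --instance my-localai --port 8080 --gpu-type nvidia

# Operate on that instance by name and limit how many log lines are shown
ujust localai start --instance my-localai
ujust localai logs --instance my-localai --lines=100
```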

## See Also

View all 9 command recordings for this service.


Source: `just/bazzite-ai/localai.just`