CLI Reference
perspt [OPTIONS] [COMMAND]
Global Options

| Flag | Description |
|---|---|
| `--config <FILE>` | Path to configuration file |
| `--api-key <KEY>` | API key for the LLM provider |
| `--provider-type <TYPE>` | Provider type |
| `--provider <NAME>` | Provider name (equivalent to `--provider-type`) |
| `--model <MODEL>` | Model identifier |
| `--list-models` | List available models and exit |
| `--help` | Print help |
| `--version` | Print version |
Commands
chat (default)
Launch the TUI chat interface.
perspt chat [--model MODEL] [--provider-type TYPE]
simple-chat
Launch the plain-text CLI chat.
perspt simple-chat [--log-file FILE]
dashboard
Launch the real-time web monitoring dashboard.
perspt dashboard [--port PORT] [--bind ADDR] [--db-path PATH]
- `--port` — HTTP port (default `3000`)
- `--bind` — Bind address (default `127.0.0.1`)
- `--db-path` — Path to DuckDB database file (default: platform data directory)
See Dashboard Setup for configuration details.
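For example, the documented flags can be combined to serve the dashboard on all interfaces with a custom port and database file; the port, address, and path values below are illustrative placeholders, not defaults:

```shell
# Illustrative invocation using the documented dashboard flags.
# 8080, 0.0.0.0, and ./perspt-metrics.duckdb are example values.
perspt dashboard --port 8080 --bind 0.0.0.0 --db-path ./perspt-metrics.duckdb
```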
agent
Run the SRBN autonomous coding agent.
perspt agent [OPTIONS] <TASK>
- `--dashboard` — Start the web monitoring dashboard alongside the agent
- `--dashboard-port <PORT>` — Port for the embedded dashboard (default `3000`)
See Agent Options Reference for full agent options.
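As an illustration, the agent can be started with its embedded dashboard enabled; the task string and port below are placeholders:

```shell
# Run the agent on a task while serving the monitoring dashboard on port 8080.
perspt agent --dashboard --dashboard-port 8080 "Fix the failing unit tests"
```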
init
Initialize a new project with Perspt configuration.
perspt init [--workdir DIR]
config
View or edit Perspt configuration.
perspt config [show|edit|reset]
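For instance, using the subcommands listed above (comments describe the listed behavior in general terms):

```shell
perspt config show    # view the current configuration
perspt config edit    # edit the configuration
perspt config reset   # reset the configuration
```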
ledger
Query the Merkle ledger.
perspt ledger [--recent] [--stats] [--node NODE_ID]
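Example queries combining the documented flags; `<NODE_ID>` is a placeholder for an actual node identifier:

```shell
perspt ledger --recent           # recent ledger entries
perspt ledger --stats            # ledger statistics
perspt ledger --node <NODE_ID>   # entries for a specific node
```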
status
Show current session status.
perspt status
Displays:
- Per-node lifecycle counts (queued, running, verifying, retrying, completed, failed, escalated)
- Latest energy breakdown
- Total retry count
- Recent escalation reports
- Step timeline summary (per-step-type counts, total step time)
- Correction attempt summaries (accepted/rejected counts per node)
abort
Abort the current agent session.
perspt abort
resume
Resume an interrupted session.
perspt resume [--last]
Displays the trust context before resuming: escalation count, last energy
state, and total retries. The BudgetEnvelope (step/cost/revision caps) is
restored from the database, so limits carry over from the interrupted session.
logs
View LLM call logs and token metrics. Full prompt/response text is only
available when `--log-llm` was active during the session; basic token
usage, latency, and cost data are always recorded.
perspt logs [--tui] [--last] [--stats]
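For example, using the documented flags (comments restate the flag names in general terms):

```shell
perspt logs --last    # logs from the most recent session
perspt logs --stats   # aggregate log statistics
perspt logs --tui     # browse logs in the TUI
```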