---
sidebar_position: 8
title: OpenAPI Spec
description: Live OpenAPI 3.1 schema for the Lium platform API — the single source of truth for every REST endpoint.
---

> ## Documentation Index
> Fetch the complete documentation index at: https://docs.lium.io/llms.txt
> Use this file to discover all available pages before exploring further.

# OpenAPI Spec

The Lium platform API publishes a live [OpenAPI 3.1](https://spec.openapis.org/oas/v3.1.0) schema. Point any code-generator, agent runtime, or HTTP client at it to discover every endpoint, payload, and response.

| Resource | URL |
|----------|-----|
| Raw JSON spec | [`https://lium.io/api/openapi.json`](https://lium.io/api/openapi.json) |
| Swagger UI | [`https://lium.io/documents`](https://lium.io/documents) |

The schema is generated directly from the FastAPI app at request time, so it always matches the live deployment.

## Fetch the spec

```bash
curl -sSL https://lium.io/api/openapi.json -o lium-openapi.json
```

```python
import httpx

spec = httpx.get("https://lium.io/api/openapi.json").json()
print(spec["info"]["title"], spec["info"]["version"])
```

## Use it from an AI agent

### Ground Claude with the spec

Pass the spec to Claude as cached context, then let the model reason about which endpoint to call. The `cache_control` block keeps you from re-billing input tokens on every request — the spec is large but stable.

```python
import anthropic
import httpx

spec_text = httpx.get("https://lium.io/api/openapi.json").text
client = anthropic.Anthropic()
resp = client.messages.create(
    model="claude-sonnet-4-5",
    max_tokens=1024,
    system=[
        {"type": "text", "text": "You manage Lium GPU pods. Use the OpenAPI spec to choose the right endpoint."},
        {"type": "text", "text": f"<openapi>\n{spec_text}\n</openapi>", "cache_control": {"type": "ephemeral"}},
    ],
    messages=[{"role": "user", "content": "List my running pods."}],
)
print(resp.content[0].text)
```

For real tool calling (model invokes endpoints directly), walk `spec["paths"]` and emit one Anthropic [tool definition](https://docs.anthropic.com/en/docs/build-with-claude/tool-use) per operation — `name = operationId`, `input_schema = requestBody.content["application/json"].schema`.
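That walk can be sketched as a small helper. This is a minimal, hedged version: it assumes each operation you care about declares an `operationId` and an optional JSON request body, and it substitutes an empty object schema (and a synthesized name) where those are missing — adjust to taste for the real spec.

```python
def tools_from_spec(spec: dict) -> list[dict]:
    """Convert each OpenAPI operation into an Anthropic tool definition."""
    tools = []
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            if method not in {"get", "post", "put", "patch", "delete"}:
                continue  # skip non-operation keys such as path-level "parameters"
            body = (
                op.get("requestBody", {})
                .get("content", {})
                .get("application/json", {})
            )
            tools.append({
                # Fallback name is an assumption for operations lacking operationId.
                "name": op.get("operationId")
                or f"{method}_{path.strip('/').replace('/', '_')}",
                "description": op.get("summary", ""),
                # Operations without a JSON body get a no-argument schema.
                "input_schema": body.get(
                    "schema", {"type": "object", "properties": {}}
                ),
            })
    return tools
```

Feed the fetched spec through this helper and pass the result as `tools=` to `client.messages.create(...)`.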

### MCP-aware agents

If your agent already speaks [MCP](./mcp), point it at the docs MCP endpoint for prose questions and at this OpenAPI URL for direct API calls. Both are stable and require no extra configuration.

### Code generation

Generate a typed client in any language with the standard OpenAPI tooling:

```bash
# TypeScript / fetch
npx openapi-typescript https://lium.io/api/openapi.json -o lium.d.ts

# Python (httpx)
pip install openapi-python-client
openapi-python-client generate --url https://lium.io/api/openapi.json
```

## Authentication

Authenticated endpoints accept your API key in the **`X-API-Key`** request header. Get a key from **Settings → API Keys** at [lium.io](https://lium.io). See the [Developer Quickstart](./quickstart) for a complete first-call walkthrough.

```bash
curl https://lium.io/api/pods -H "X-API-Key: $LIUM_API_KEY"
```

The OpenAPI spec also advertises a `JwtAccessBearer` security scheme — that one is for JWT session tokens used by the lium.io frontend. For server-to-server calls with an API key, use `X-API-Key`.
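If you'd rather have a client read the header name from the spec than hard-code it, here is a sketch under the assumption that the spec declares an `apiKey`-type scheme under `components.securitySchemes` (it falls back to the documented `X-API-Key` header otherwise):

```python
def api_key_headers(spec: dict, key: str) -> dict[str, str]:
    """Build the auth header from whatever apiKey scheme the spec declares."""
    schemes = spec.get("components", {}).get("securitySchemes", {})
    for scheme in schemes.values():
        # Pick the first header-based apiKey scheme; ignore bearer/JWT schemes.
        if scheme.get("type") == "apiKey" and scheme.get("in") == "header":
            return {scheme["name"]: key}
    # Fallback to the documented header if no apiKey scheme is found.
    return {"X-API-Key": key}
```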

## Related

- [Developer Quickstart](./quickstart) — first authenticated request in 5 minutes
- [SDK](./sdk) — typed Python client built on top of the same API
- [MCP endpoint](./mcp) — search and read the docs over JSON-RPC
- [llms.txt](./llms-txt) — bulk markdown bundle for LLM context
