---
sidebar_position: 10
---

> ## Documentation Index
> Fetch the complete documentation index at: https://docs.lium.io/llms.txt
> Use this file to discover all available pages before exploring further.

# llms.txt

The Lium docs site publishes `llms.txt` and `llms-full.txt` at the site root, conforming to the [llmstxt.org specification](https://llmstxt.org).

## Endpoints

| URL | Description |
|-----|-------------|
| `https://docs.lium.io/llms.txt` | Index: page titles + 1-line summaries + `.md` URLs |
| `https://docs.lium.io/llms-full.txt` | Full bundle: all pages concatenated with H1 separators |

## Format

`llms.txt` consists of an H1 title, a blockquoted one-line description, and an audience-grouped link list:

```
# Lium Documentation

> Decentralized GPU rental marketplace on Bittensor Subnet 51.

## Providers
- [Provider Quickstart](/providers/quickstart.md): Register and launch your first provider node in 5 minutes.
- [Architecture](/providers/architecture.md): Self-hosted provider + node architecture.
...
```
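The link list above is regular enough to parse mechanically. A minimal sketch (the regex and helper below are illustrative, not part of the spec) that turns each `## Section` and `- [Title](path): summary` line into tuples:

```python
import re

# Sample text mirroring the llms.txt format shown above.
SAMPLE = """\
# Lium Documentation

> Decentralized GPU rental marketplace on Bittensor Subnet 51.

## Providers
- [Provider Quickstart](/providers/quickstart.md): Register and launch your first provider node in 5 minutes.
- [Architecture](/providers/architecture.md): Self-hosted provider + node architecture.
"""

# Matches "- [Title](path): summary"; the summary is optional per llmstxt.org.
LINK = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<path>[^)]+)\)(?:: (?P<summary>.*))?$")

def parse_index(text: str):
    """Return (section, title, path, summary) tuples from an llms.txt index."""
    section, entries = None, []
    for line in text.splitlines():
        if line.startswith("## "):
            section = line[3:].strip()
        elif (m := LINK.match(line)):
            entries.append((section, m["title"], m["path"], m["summary"] or ""))
    return entries

entries = parse_index(SAMPLE)
# entries[0] → ("Providers", "Provider Quickstart", "/providers/quickstart.md", ...)
```

The resulting paths can be joined onto `https://docs.lium.io` to fetch individual `.md` pages on demand instead of pulling the full bundle.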

`llms-full.txt` concatenates every page's source markdown; each page begins with its H1 title, and pages are separated by horizontal rules:

```
# Provider Quickstart

<full page content>

---

# Architecture

<full page content>
...
```

If `llms-full.txt` exceeds 5 MB, it is split into `llms-full.part-1.txt`, `llms-full.part-2.txt`, etc., referenced from the main `llms-full.txt`.
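To recover per-page chunks from the bundle (for embedding or selective retrieval), you can split on the separator pattern shown above. A minimal sketch, assuming the `---` rule followed by an H1 is the only separator; a literal `---` inside page content would need more careful handling:

```python
import re

# Sample text mirroring the llms-full.txt layout shown above.
BUNDLE = """\
# Provider Quickstart

<full page content>

---

# Architecture

<full page content>
"""

def split_bundle(text: str):
    """Split llms-full.txt into pages at '---' rules that precede an H1."""
    pages = re.split(r"\n---\n+(?=# )", text)
    return [p.strip() for p in pages if p.strip()]

pages = split_bundle(BUNDLE)
# pages[0] starts with "# Provider Quickstart", pages[1] with "# Architecture"
```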

## Usage with LLMs

Pass the full bundle to an LLM as context to enable zero-shot Q&A over the entire Lium documentation:

```python
import anthropic
import httpx

llms_full = httpx.get("https://docs.lium.io/llms-full.txt").text

client = anthropic.Anthropic()
response = client.messages.create(
    model="claude-sonnet-4-5",
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": f"<lium_docs>\n{llms_full}\n</lium_docs>\n\nHow do I register a provider node?"
    }]
)
print(response.content[0].text)
```

For programmatic access with tool use, prefer the [MCP endpoint](./mcp). To call the platform API itself (not the docs), use the live [OpenAPI spec](./openapi).
