---
sidebar_position: 2.5
title: SDK
---

> ## Documentation Index
> Fetch the complete documentation index at: https://docs.lium.io/llms.txt
> Use this file to discover all available pages before exploring further.

# Lium SDK

The `lium.io` package ships both the [CLI](./cli/overview) and a **Python SDK** for managing GPU pods programmatically. Install it once and use whichever interface fits the job.

:::info Full SDK Reference
The complete API reference, guides, and advanced examples are hosted on Read the Docs:

👉 **[Lium SDK Documentation →](https://lium-docs.readthedocs.io/en/latest/index.html)**
:::

## Installation

```bash
pip install lium.io
```

## Two Entry Points

The SDK exposes two ways to run work on Lium GPUs:

- **`@lium.machine` decorator** — annotate a Python function and offload it to a GPU pod. Best for quickly running isolated workloads.
- **`Lium()` client** — a direct client for long-lived orchestration code that manages pod lifecycles.

### `@lium.machine` decorator

Annotate a function with the machine type and dependencies, then call it like a normal Python function. The SDK handles provisioning, code upload, execution, and teardown.

```python
import lium

@lium.machine(machine="A100", requirements=["torch", "transformers", "accelerate"])
def infer(prompt: str) -> str:
    from transformers import AutoTokenizer, AutoModelForCausalLM
    tokenizer = AutoTokenizer.from_pretrained("sshleifer/tiny-gpt2")
    model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2", device_map="cuda")
    tokens = tokenizer(prompt, return_tensors="pt").to("cuda")
    out = model.generate(**tokens, max_new_tokens=50)
    return tokenizer.decode(out[0], skip_special_tokens=True)

print(infer("Who discovered penicillin?"))
```
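Conceptually, a remote-execution decorator like this follows a provision → upload → execute → tear down lifecycle. The sketch below is a purely local stand-in, not the real `lium.machine` implementation, that mimics that flow so the shape of the pattern is visible:

```python
import functools

def fake_machine(machine: str, requirements: list[str]):
    """Toy stand-in for a remote-execution decorator (illustrative only)."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            # 1. Provision: the real SDK would start a GPU pod here.
            print(f"provisioning {machine} with {requirements}")
            try:
                # 2. Execute: the real SDK uploads the function and runs it remotely.
                return fn(*args, **kwargs)
            finally:
                # 3. Teardown: the pod is released even if the function raises.
                print("tearing down pod")
        return wrapper
    return decorator

@fake_machine(machine="A100", requirements=["torch"])
def double(x: int) -> int:
    return x * 2

print(double(21))  # the decorated function is called like any other
```

The real decorator adds serialization and network transport on top, but the wrap-and-clean-up structure is the same.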

### `Lium()` client

The client mirrors the CLI's pod lifecycle — list nodes, bring a pod up, wait until it's ready, execute commands, and tear it down.

```python
from lium.sdk import Lium

lium = Lium()
executor = lium.ls(gpu_type="A100")[0]
pod = lium.up(executor=executor.id, name="demo")
ready = lium.wait_ready(pod, timeout=600)
print(lium.exec(ready, command="nvidia-smi")["stdout"])
lium.down(ready)
```
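Because `up` allocates a billable pod, orchestration code typically guards teardown with `try`/`finally` so the pod is released even when a command fails. The sketch below demonstrates the pattern with a minimal local stub standing in for the `Lium` client (the live client needs valid credentials, so this is illustrative only):

```python
class StubLium:
    """Minimal stand-in for the Lium client (illustrative only)."""
    def __init__(self):
        self.torn_down = False

    def up(self, executor, name):
        return {"name": name}

    def exec(self, pod, command):
        raise RuntimeError("command failed")  # simulate a failing command

    def down(self, pod):
        self.torn_down = True

lium = StubLium()
pod = lium.up(executor="exec-1", name="demo")
try:
    lium.exec(pod, command="nvidia-smi")
except RuntimeError:
    pass  # log or handle the failure as needed
finally:
    lium.down(pod)  # runs whether exec succeeded or raised

print(lium.torn_down)
```

With the real client, the same `try`/`finally` shape around `wait_ready`/`exec` ensures `down` always runs.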

## Next Steps

Ready to go deeper? The **[Lium SDK Documentation](https://lium-docs.readthedocs.io/en/latest/index.html)** covers the full API reference, authentication, volumes, backups, and advanced usage patterns.

## Related

- [CLI Installation](./cli/installation) — install the `lium.io` package
- [CLI Reference](./cli/reference) — per-command reference, grouped by category
- [CLI Quickstart](./cli/quickstart) — get started with the CLI
