Build deterministic agent workflows with static types, first-class AI primitives, and markdown-native source files.

## Markdown-Native

Write code and documentation together in `.lm.md` files, or use raw `.lm` files when you want source-only modules. Both are first-class inputs.
## Install

```sh
# Install (one-liner)
curl -fsSL https://raw.githubusercontent.com/alliecatowo/lumen/main/scripts/install.sh | sh

# Or via Cargo
cargo install lumen-lang
```

See the Editor Support section to set up VS Code.

Then write and run your first program:

```sh
cat > hello.lm.md << 'EOF'
cell main() -> String
  return "Hello, World!"
end
EOF

lumen run hello.lm.md
```
## Why Lumen?
Building AI systems today means juggling Python notebooks, API clients, prompt templates, and orchestration frameworks. Lumen unifies this into one language:
- **Tools** are typed interfaces with policy constraints
- **Grants** enforce safety limits (tokens, timeouts, domains)
- **Agents** encapsulate behavior with scoped capabilities
- **Processes** provide structured workflows (pipelines, state machines, memory)
- **Effects** make side effects explicit and auditable
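As a rough sketch of how these pieces fit together (adapted from the chat example on this page — treat the exact grant keys and effect names as illustrative, not a definitive API):

```lumen
use tool llm.chat as Chat

grant Chat
  model "gpt-4o"
  max_tokens 256

cell summarize(text: String) -> String / {llm}
  return Chat(prompt: text)
end
```

The grant caps what the `Chat` tool may spend, and the `/ {llm}` annotation surfaces the model call in the cell's signature, so every side effect is visible where the cell is declared.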
## Try It Now
Head to the [Playground](./playground) to run Lumen code directly in your browser—no installation required.
## Example: AI Chat Agent
```lumen
use tool llm.chat as Chat

grant Chat
  model "gpt-4o"
  max_tokens 1024
  temperature 0.7

agent Assistant
  cell respond(message: String) -> String / {llm}
    role system: You are a helpful assistant.
    role user: {message}
    return Chat(prompt: message)
  end
end

cell main() -> String / {llm}
  let bot = Assistant()
  return bot.respond(message: "What is Lumen?")
end
```
## Community
- <img src="https://img.shields.io/github/actions/workflow/status/alliecatowo/lumen/pages.yml?branch=main&label=Docs&style=flat-square" alt="Docs Status" />
- <a href="https://open-vsx.org/extension/lumen-lang/lumen-lang"><img src="https://img.shields.io/open-vsx/v/lumen-lang/lumen-lang?style=flat-square&label=Open%20VSX" alt="Open VSX" /></a>
- <img src="https://img.shields.io/crates/v/lumen-lang?style=flat-square&label=Crates.io" alt="Crates.io" />