Installation

Turn ships as a single static binary. You install it, point it at a Wasm inference provider, set an API key, and run. No runtimes, no virtual environments, no build pipelines.


1. Install Turn

The fastest path is the one-line installer:

terminal
curl -fsSL https://turn-lang.dev/install.sh | bash

This downloads the turn binary appropriate for your OS and architecture, installs it to ~/.turn/bin/, and adds it to your PATH.

Supported platforms: macOS (ARM + Intel), Linux (x64 + ARM64)

Verify the installation:

terminal
turn --version
# turn v0.5.0-alpha

Building from Source

If you prefer to build from source, you need a Rust toolchain (1.75+):

terminal
git clone https://github.com/ekizito96/Turn.git
cd Turn/impl
cargo build --release

# Add to PATH
export PATH="$PWD/target/release:$PATH"
turn --version

2. Install an Inference Provider

Turn's infer primitive delegates LLM calls to a Wasm driver — a sandboxed .wasm file that knows how to format requests for a specific LLM provider. The driver cannot access your filesystem or network directly; Turn executes the HTTP call on its behalf.
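The division of labor can be sketched as follows. This is an illustrative Python sketch, not Turn's actual driver ABI; the function names and request shape are assumptions made for the example:

```python
# Sketch of the host/driver split: the sandboxed driver only *describes*
# the HTTP call it wants made; the Turn VM performs it.
# Names like driver_format_request are hypothetical, not Turn's real ABI.

def driver_format_request(prompt: str, schema: dict) -> dict:
    # Conceptually runs inside the Wasm sandbox: no filesystem, no sockets.
    return {
        "url": "https://api.openai.com/v1/chat/completions",
        "body": {
            "messages": [{"role": "user", "content": prompt}],
            "response_format": schema,
        },
    }

def host_execute(request: dict) -> dict:
    # Conceptually runs in the Turn VM, the only component with network
    # access. (A real VM would perform the HTTP call here.)
    return {"dispatched_to": request["url"]}

request = driver_format_request("Say hi", {"type": "object"})
result = host_execute(request)
```

Because the driver returns data rather than making calls, swapping providers is just swapping .wasm files.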

The installer offers to download the official OpenAI provider to ~/.turn/providers/. Verify it is in place, then tell Turn to use it:

terminal
# The provider lives in ~/.turn/providers/
ls ~/.turn/providers/
# turn_provider_openai.wasm

# Tell Turn which provider to use
export TURN_INFER_PROVIDER=~/.turn/providers/turn_provider_openai.wasm

Add this to your shell profile (~/.zshrc or ~/.bashrc) to make it permanent.

NOTE

If TURN_INFER_PROVIDER is not set, Turn will attempt to find a provider in ~/.turn/providers/. If none is found, infer calls will fail with a configuration error — but the rest of the language will work fine.
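The lookup order described in the note can be sketched like this (illustrative Python; `resolve_provider` is a hypothetical name, not part of Turn):

```python
import os
from pathlib import Path
from typing import Optional

def resolve_provider() -> Optional[Path]:
    # Hypothetical sketch of the lookup order from the note above:
    # 1. explicit TURN_INFER_PROVIDER, 2. scan ~/.turn/providers/,
    # 3. otherwise None, in which case infer calls fail with a config error.
    explicit = os.environ.get("TURN_INFER_PROVIDER")
    if explicit:
        return Path(explicit).expanduser()
    providers = sorted(Path("~/.turn/providers").expanduser().glob("*.wasm"))
    return providers[0] if providers else None
```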


3. Set Your API Key

Each provider reads credentials from environment variables. For the standard OpenAI provider:

terminal
export OPENAI_API_KEY=sk-...
export OPENAI_MODEL=gpt-4o       # optional, defaults to gpt-4o

API keys are never embedded in .wasm drivers or Turn scripts. The Turn VM substitutes them from the host environment at the moment of the HTTP call, keeping credentials out of sandboxed code entirely.
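A minimal sketch of that substitution, in illustrative Python (`inject_credentials` is a hypothetical helper, not part of Turn):

```python
import os

def inject_credentials(headers: dict) -> dict:
    # Hypothetical helper: the key is read from the host environment at
    # the moment of the call, so sandboxed driver code never holds it.
    key = os.environ["OPENAI_API_KEY"]
    return {**headers, "Authorization": f"Bearer {key}"}

os.environ.setdefault("OPENAI_API_KEY", "sk-demo")  # demo value only
headers = inject_credentials({"Content-Type": "application/json"})
```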

Other official providers:

Provider          Required Env Vars
Azure OpenAI      AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, AZURE_OPENAI_DEPLOYMENT
Azure Anthropic   AZURE_ANTHROPIC_ENDPOINT, AZURE_ANTHROPIC_API_KEY

See Inference Providers for the full list and configuration details.


4. Write and Run Your First Program

Create a file called hello.tn:

hello.tn
struct Greeting {
    message: Str,
    language: Str
};

let result = infer Greeting {
    "Generate a warm greeting for a developer starting a new language.";
};

call("echo", result.message);
call("echo", "Detected language: " + result.language);

return result;

Run it:

terminal
turn run hello.tn

You should see something like:

output
Welcome, brave developer! You're about to write something that actually thinks.
Detected language: English

What just happened:

  1. Turn compiled hello.tn to bytecode and began executing
  2. When it hit infer Greeting { ... }, it generated a JSON Schema from the Greeting struct
  3. It passed your prompt + the schema to the Wasm provider via a sandboxed call
  4. The Wasm provider formatted an OpenAI API request; Turn executed it
  5. The response was validated against the Greeting schema and bound to result
  6. You accessed result.message and result.language as typed fields — no parsing needed
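Step 2 above can be illustrated with a short sketch: a struct of Str fields maps onto a JSON Schema object with required string properties. This is Python for illustration; the exact schema Turn emits may differ, and the "Int"/"Bool" type names are assumptions (only Str appears in the example):

```python
# Sketch of deriving a JSON Schema from the Greeting struct (step 2).
# "Int" and "Bool" are assumed Turn type names, shown only for shape.
TURN_TO_JSON = {"Str": "string", "Int": "integer", "Bool": "boolean"}

def schema_for(fields: dict) -> dict:
    return {
        "type": "object",
        "properties": {
            name: {"type": TURN_TO_JSON[t]} for name, t in fields.items()
        },
        "required": list(fields),
    }

greeting_schema = schema_for({"message": "Str", "language": "Str"})
```

A response that validates against this schema binds directly to `result`, which is why no manual parsing is needed in step 6.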

5. Editor Support (VS Code)

Turn ships a VS Code extension with syntax highlighting, diagnostics, hover documentation, and go-to-definition — backed by a built-in Language Server Protocol implementation.

terminal
# Start the LSP manually (VS Code extension does this automatically)
turn lsp

The extension is in the Turn repository under editors/vscode/. You can install it directly from the .vsix file or install from the VS Code Marketplace once published.


CLI Reference

turn run <file.tn>          Run a Turn program
turn run --id <agent-id>    Run with a named persistent agent identity
turn serve [--port 3000]    Start the HTTP API server
turn lsp                    Start the Language Server (stdio)
turn --version              Print version

Next Steps