April 16, 2026
Introducing e2a — One API Call to a Secure Agent Sandbox
Today we are launching e2a.bot — the fastest way to give your AI agent a secure, isolated environment. Spawn a Firecracker microVM with terminal, browser, or full desktop access in under 500ms using a single API call.
The Problem
AI agents in 2026 don't just chat — they write code, run commands, browse the web, and manage files. But where do they run? Your laptop is too risky. Your production server is a non-starter. Spinning up cloud VMs takes minutes and costs a fortune in idle time. Docker containers share the host kernel — one exploit and your infrastructure is compromised.
Developers need disposable, isolated, instant compute environments for their agents. That's what e2a provides.
Three Inputs. One API Call. Agent Runs.
# Create a sandbox — boots in <500ms
curl -X POST https://api.e2a.bot/v1/sandboxes \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "template": "browser",
    "agent": "claude-code",
    "llm_key": "sk-ant-...",
    "task": "Research top 5 AI papers this week"
  }'
That's it. The agent gets a full Linux environment — its own kernel, filesystem, and network — and executes the task autonomously. When it's done, destroy the sandbox. No cleanup, no leaked state.
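The same call is easy to drive from code. The sketch below mirrors the curl example's request body in Python; note that the `destroy_sandbox_url` helper, and the REST-style `DELETE /v1/sandboxes/{id}` teardown route it assumes, are illustrative — check the docs for the actual endpoint.

```python
import json

API_BASE = "https://api.e2a.bot/v1"

def create_sandbox_payload(template: str, agent: str, llm_key: str, task: str) -> dict:
    """Build the POST /v1/sandboxes body (fields taken from the curl example)."""
    return {"template": template, "agent": agent, "llm_key": llm_key, "task": task}

def destroy_sandbox_url(sandbox_id: str) -> str:
    # Hypothetical: assumes a REST-style DELETE /v1/sandboxes/{id} teardown route.
    return f"{API_BASE}/sandboxes/{sandbox_id}"

payload = create_sandbox_payload(
    "browser", "claude-code", "sk-ant-...", "Research top 5 AI papers this week"
)
print(json.dumps(payload, indent=2))
# Send with your HTTP client of choice, e.g.:
#   requests.post(f"{API_BASE}/sandboxes", json=payload,
#                 headers={"Authorization": "Bearer YOUR_API_KEY"})
```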
Three Templates
Standard (0.5 vCPU · 512 MiB): terminal CLI for code execution, file management, and package installation.
Browser (1.0 vCPU · 1 GiB): headless Chromium for web scraping, testing, and browser automation.
CUA Desktop (1.0 vCPU · 2 GiB): full virtual desktop with mouse and keyboard, built for Computer Use Agents.
Persistent Workspace
Sandboxes are ephemeral by default — files die with the VM. But when you enable workspace storage, your agent's home directory (/home/user) is backed by S3 via an rclone FUSE mount. Files survive sandbox destruction. Spin up a new sandbox with the same workspace — your files are right where you left them.
Workspaces are scoped by (user_id, app_id, capset_id) — each combination gets its own isolated storage. Manage files via presigned URLs, list contents, check usage, all through the REST API.
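To see how that scoping behaves, here is a client-side sketch. The (user_id, app_id, capset_id) tuple comes from the post; the slash-joined prefix layout and the helper name are assumptions, not e2a's actual storage scheme.

```python
def workspace_prefix(user_id: str, app_id: str, capset_id: str) -> str:
    # Each (user_id, app_id, capset_id) combination gets its own isolated
    # storage area; the sandbox mounts it at /home/user. The layout below
    # is illustrative only.
    return f"{user_id}/{app_id}/{capset_id}/home/user"

# Two apps owned by the same user never share files:
print(workspace_prefix("u_42", "app_a", "cap_1"))
print(workspace_prefix("u_42", "app_b", "cap_1"))
```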
BYO LLM Key
e2a is agent-agnostic and LLM-agnostic. Bring your own API key for OpenAI, Anthropic, or Google. We provide the sandbox — you choose the brain. Because you bring your own key, we don't meter LLM token usage — you pay your provider directly. We only bill for compute (RAM × CPU × time) and workspace storage.
Credit-Based Billing
No subscriptions. No monthly minimums. Buy credits via Stripe, use them on compute. 100 free credits on signup — enough to run dozens of sandbox sessions. Auto-topup available so you never run out mid-task.
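Since compute is billed as RAM × CPU × time, a back-of-the-envelope estimator might look like the sketch below. The formula is from the post; the per-unit rate is a made-up placeholder, so check the pricing page for real numbers.

```python
def estimate_credits(vcpu: float, ram_gib: float, seconds: float,
                     rate_per_unit: float = 0.001) -> float:
    """Estimate credits as vCPU × RAM (GiB) × seconds × rate.

    The RAM × CPU × time formula is from the post; rate_per_unit is a
    hypothetical placeholder, not e2a's actual pricing.
    """
    return vcpu * ram_gib * seconds * rate_per_unit

# e.g. a Browser sandbox (1.0 vCPU, 1 GiB) running for 10 minutes:
print(estimate_credits(1.0, 1.0, 600))  # → 0.6 with the placeholder rate
```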
Get started at e2a.bot — 100 free credits, no credit card required. Read the docs →