# Integrations
Connect Lightcone to AI frameworks and orchestration platforms.
Lightcone integrates with the most popular AI frameworks and tools. Use Lightcone’s cloud computers as the execution environment for any AI workflow that needs to interact with the web.
All integrations start the same way: create a Lightcone computer session, then hand it to your framework. The session gives you a sandboxed cloud computer with stealth mode, proxy support, and session persistence.
## AI frameworks
| Integration | Description |
|---|---|
| LangChain | Document loader and tool for LangChain workflows (Python & TypeScript) |
| CrewAI | Computer automation tool for CrewAI multi-agent workflows |
| Browser-Use | Connect Browser-Use’s AI web automation to Lightcone cloud computers |
| Mastra | Lightcone actions as tools in Mastra AI workflows |
| Vercel AI SDK | Lightcone as a tool in Vercel AI SDK workflows |
## AI assistants
| Integration | Description |
|---|---|
| OpenClaw | Give your OpenClaw AI assistant cloud computers that Northstar can see and operate |
## Infrastructure
| Integration | Description |
|---|---|
| Kernel | Run Northstar on Kernel’s cloud browsers with one-click deploy |
| Playwright | Connect Playwright to Lightcone sessions via Chrome DevTools Protocol |
| MCP Server | Model Context Protocol server exposing Lightcone actions as tools |
## How integrations work
Most integrations follow one of two patterns:
Tool pattern — Lightcone actions (click, type, navigate, screenshot) are wrapped as tools that the model can call. Used by LangChain, CrewAI, Mastra, Vercel AI SDK.
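A minimal sketch of the tool pattern, assuming a hypothetical `click` action (the schema shape mirrors what function-calling frameworks generally expect; none of these names come from Lightcone's SDK):

```python
# Tool-pattern sketch: wrap a computer action as a model-callable tool.
# The action body and schema names are illustrative assumptions.

def click(x: int, y: int) -> str:
    # A real tool would forward this to a Lightcone session; stubbed here.
    return f"clicked at ({x}, {y})"

# JSON-Schema-style tool spec, the shape most function-calling frameworks use
click_tool = {
    "name": "click",
    "description": "Click a point on the cloud computer's screen.",
    "parameters": {
        "type": "object",
        "properties": {
            "x": {"type": "integer"},
            "y": {"type": "integer"},
        },
        "required": ["x", "y"],
    },
}

def dispatch(tool_call: dict) -> str:
    """Route a model's tool call to the wrapped action."""
    tools = {"click": click}
    return tools[tool_call["name"]](**tool_call["arguments"])

result = dispatch({"name": "click", "arguments": {"x": 100, "y": 200}})
# result == "clicked at (100, 200)"
```

Frameworks like LangChain or the Vercel AI SDK handle the spec registration and dispatch loop for you; the wrapper's job is just to translate tool calls into session actions.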
Remote computer pattern — Lightcone provides a cloud computer that the framework connects to via CDP (Chrome DevTools Protocol) or WebSocket. Used by Playwright and Browser-Use.
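For the remote computer pattern, the framework only needs the session's CDP WebSocket endpoint. A sketch, assuming a hypothetical endpoint scheme (the host and URL path are invented; `connect_over_cdp` is Playwright's real Python API, shown commented since it needs a live endpoint):

```python
# Remote-computer-pattern sketch: derive a session's CDP endpoint and
# hand it to Playwright. The host and URL path are hypothetical; use the
# WebSocket URL your actual session object reports.

def cdp_endpoint(session_id: str, host: str = "api.lightcone.example") -> str:
    """Build a CDP WebSocket URL for a session (hypothetical scheme)."""
    return f"wss://{host}/sessions/{session_id}/cdp"

ws_url = cdp_endpoint("sess_abc123")

# With Playwright installed, a framework (or your own script) connects like:
# from playwright.sync_api import sync_playwright
# with sync_playwright() as p:
#     browser = p.chromium.connect_over_cdp(ws_url)
#     page = browser.contexts[0].pages[0]
#     page.goto("https://example.com")
```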
Both patterns start the same way: create a Lightcone computer session, then hand the session to your framework.