Personal Assistants
Open-source AI personal assistants and coding agents on a real Linux machine. OpenClaw, OpenCode, OpenFang, ZeroClaw, and Open WebUI, all pre-wired to Ollama and hosted on EasyEnv.
What they are
Personal Assistants are open-source AI coding agents and assistants that run on their own EasyEnv machine. Each one is wrapped as a stack recipe: install via the upstream script, run as a systemd service, auto-wire to a co-deployed Ollama box.
- One-click install. Pick a recipe, get a machine. The agent comes pre-installed and ready to take a prompt.
- Real shell access. Each agent runs on a real Linux VM with root access. It can install packages, run services, and operate the system end to end.
- Bring your model. Pair with any of the Open Models recipes, or point at an external OpenAI-compatible endpoint.
Available agents
- `openclaw`: Autonomous AI assistant. Browser-based control UI, full shell access.
- `opencode`: Open-source AI coding agent (anomalyco/opencode). Headless mode, auto-wired to a co-deployed Ollama box.
- `openfang`: Rust-based agent operating system. Auto-wires to Ollama via `openfang.toml`.
- `zeroclaw`: Lightweight, zero-config AI assistant designed to spin up quickly on a small box.
- `openwebui`: Polished chat UI for any LLM. Plug in Ollama or an OpenAI-compatible endpoint and start chatting.
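For the agents that auto-wire to Ollama, the wiring amounts to a small generated config. The sketch below is illustrative only: the key names are assumptions, not the real `openfang.toml` schema, so consult the upstream docs for the actual fields.

```toml
# Illustrative sketch: key names are assumptions, not the real openfang.toml schema.
[model]
provider = "ollama"
endpoint = "http://<ollama-box>:11434"  # co-deployed Ollama box on the workspace VPN
name     = "qwen2.5-coder"
```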
The catalog is updated as new agents mature. Browse the live list at easyenv.io/ai/personal-assistants.
How it works
An agent recipe installs the upstream binary, writes a config that points at `http://<ollama-box>:11434`, and registers a systemd service. From the dashboard, the machine's Ports tab exposes the agent's control UI.
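Concretely, the registration step amounts to a unit file along these lines. This is a sketch, not the actual file a recipe writes: the unit name, binary path, and environment variable are assumptions (Ollama does serve an OpenAI-compatible API under `/v1`, but how a given agent consumes it varies).

```ini
# /etc/systemd/system/opencode.service  (illustrative; real recipes may differ)
[Unit]
Description=OpenCode agent
After=network-online.target

[Service]
# OPENAI_API_BASE is an assumed knob pointing the agent at Ollama's
# OpenAI-compatible endpoint; check the agent's own docs for the real one.
Environment=OPENAI_API_BASE=http://<ollama-box>:11434/v1
ExecStart=/usr/local/bin/opencode serve
Restart=on-failure

[Install]
WantedBy=multi-user.target
```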
Use with your own model
Boot an agent recipe and an Open Models recipe in the same workspace. Both machines join the same private VPN, the agent discovers Ollama automatically, and you can start prompting. To swap the model, edit the agent's config on the agent machine and restart the service. To swap the entire model backend, edit the agent config to point at an OpenAI-compatible URL instead.
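The swap is a one-line config edit followed by a service restart. Everything in this sketch is illustrative: the config file name, its keys, and the service name are assumptions standing in for whatever your agent actually uses.

```shell
# Illustrative only: config path, keys, and service name are assumptions;
# check your agent's docs for the real ones.
cat > agent.toml <<'EOF'
[model]
endpoint = "http://<ollama-box>:11434"
name = "llama3.1"
EOF

# Swap the model the agent talks to...
sed -i 's/^name = .*/name = "qwen2.5-coder"/' agent.toml
grep '^name' agent.toml   # name = "qwen2.5-coder"

# ...then restart the service so it picks up the change (on the agent machine):
# sudo systemctl restart opencode
```

Pointing at a different backend entirely works the same way: change `endpoint` to any OpenAI-compatible URL instead of the Ollama box.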
Booting `ollama_qwen2_5_coder` in the same workspace is the fastest path to a working autonomous coding loop.