OpenFang is a Rust-based agent operating system (RightNow-AI/openfang). The recipe installs it via the upstream curl script, runs it as a systemd service, and auto-wires it to a co-deployed Ollama instance via the [default_model] section of openfang.toml.
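That wiring lives in openfang.toml. The fragment below is an illustrative sketch only; the key names (provider, base_url, model) and values are assumptions, not confirmed against OpenFang's actual config schema, so check the upstream docs for the real ones:

```toml
# openfang.toml — illustrative sketch; key names are assumptions
[default_model]
provider = "ollama"                   # point the agent runtime at the local Ollama
base_url = "http://127.0.0.1:11434"   # Ollama's default listen address
model    = "llama3"                   # whichever model the recipe pulls
```

Because Ollama ships on the same VM, no API keys or outbound network access are needed for inference.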
Every EasyEnv recipe spins up in seconds on a real Linux VM, not a stripped-down sandbox. The OpenFang recipe is provisioned by an open Ansible role, so the machine that boots for you is reproducible, inspectable, and matches what you would get in production.
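The role's shape is roughly the following. This is a hedged sketch of what such provisioning looks like; the task names, module choices, variable name, and install path are assumptions for illustration, not copied from the published role:

```yaml
# Sketch of the provisioning steps — names and paths are assumptions,
# not taken verbatim from the published role.
- name: Run the upstream OpenFang install script
  ansible.builtin.shell: curl -fsSL {{ openfang_install_url }} | sh
  args:
    creates: /usr/local/bin/openfang   # assumed install path; keeps the task idempotent

- name: Enable and start the OpenFang service
  ansible.builtin.systemd:
    name: openfang
    enabled: true
    state: started
```

Running the same role against any other Debian-family host should yield an equivalent machine, which is the reproducibility claim above.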
$ easyenv workspace create --recipe openfang --name openfang-demo
Provisioning OpenFang...
Workspace ready in ~45s
$ easyenv workspace ssh openfang-demo
Connected. You're on the machine.

Who it's for: engineers who build and deploy AI applications, integrate LLMs into products, and manage self-hosted AI infrastructure; ML engineers who develop and deploy models and AI systems for production; and backend developers who build server-side logic, databases, APIs, and integrations for web applications.

Related recipes: self-hosted LLM inference with Ollama and OpenWebUI for private AI workloads; a self-hosted AI document assistant powered by AnythingLLM and Ollama; and automated AI workflows combining LLMs with n8n orchestration and PostgreSQL.