IBM Granite 3.2 8B served via Ollama: a tool-capable, instruction-tuned model with strong instruction following. Pair it with any agent UI that requires tool calling.
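To make "tool-capable" concrete, here is a minimal sketch of a tool-calling request against Ollama's `/api/chat` endpoint. The model tag `granite3.2:8b` and the `get_weather` tool are assumptions for illustration; the tool-spec shape (`type: function` with a JSON Schema `parameters` object) follows Ollama's chat API.

```python
import json

def build_tool_call_request(prompt: str) -> dict:
    """Build the JSON body for POST http://localhost:11434/api/chat."""
    return {
        "model": "granite3.2:8b",  # assumed tag; check `ollama list` for yours
        "stream": False,
        "messages": [{"role": "user", "content": prompt}],
        # Tools are declared as JSON Schema function specs; when the model
        # decides to invoke one, the reply carries message.tool_calls.
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool for illustration
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }

body = build_tool_call_request("What's the weather in Oslo?")
print(json.dumps(body, indent=2))
```

POST this body to the workspace's Ollama instance and inspect `message.tool_calls` in the response to see which tool the model chose and with what arguments.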
Every EasyEnv recipe spins up in seconds on a real Linux VM, not a stripped-down sandbox. The Ollama Granite 3.2 8B recipe is provisioned by an open Ansible role, so the machine that boots for you is reproducible, inspectable, and matches what you would get in production.
$ easyenv workspace create --recipe ollama_granite3_2 --name ollama_granite3_2-demo
Provisioning Ollama Granite 3.2 8B...
Workspace ready in ~45s
$ easyenv workspace ssh ollama_granite3_2-demo
Connected. You're on the machine.

Who this recipe is for:
- AI developers who build and deploy AI applications, integrate LLMs into products, and manage self-hosted AI infrastructure
- ML engineers who develop and deploy machine learning models and AI systems for production environments
- Data engineers who build and maintain data pipelines, ETL processes, and data infrastructure for analytics
Related recipes:
- Self-hosted LLM inference with Ollama and OpenWebUI for private AI workloads
- Self-hosted AI document assistant powered by AnythingLLM and Ollama
- Automated AI workflows combining LLMs with n8n orchestration and PostgreSQL
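For self-hosted inference, Ollama's `/api/generate` endpoint streams newline-delimited JSON when `"stream": true`: each line carries a `"response"` text chunk, and the final line sets `"done": true`. A small sketch of assembling such a stream (the chunk values below are illustrative, not real model output):

```python
import json

def assemble_stream(ndjson_lines):
    """Concatenate the "response" chunks from a streamed generate call."""
    text = []
    for line in ndjson_lines:
        obj = json.loads(line)
        text.append(obj.get("response", ""))
        if obj.get("done"):  # final line of the stream
            break
    return "".join(text)

# Chunks shaped like Ollama's stream (contents are made up for the example):
chunks = [
    '{"model":"granite3.2:8b","response":"Hello","done":false}',
    '{"model":"granite3.2:8b","response":", world","done":false}',
    '{"model":"granite3.2:8b","response":"","done":true}',
]
print(assemble_stream(chunks))  # → Hello, world
```

In the workspace, the same lines arrive over HTTP from `http://localhost:11434/api/generate`; setting `"stream": false` instead returns one JSON object with the full response.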