ADR-0012: WSL Ubuntu on CanEast AI Node as development environment
Status
Accepted — 2026-04-02
Context
The platform needs a development workstation capable of running Claude Code, Ollama with GPU inference, Docker, and Azure DevOps CLI tooling. The CanEast AI Node (REDACTED) dual-boots Windows and runs WSL, providing both a native Linux environment and GPU passthrough for AI workloads.
Decision
WSL on the CanEast AI Node is the primary development environment for all Archon platform work.
- OS: Ubuntu (WSL) on Windows
- GPU: REDACTED — passed through to WSL for Ollama/CUDA
- Claude Code: runs in the WSL terminal at `~/homelab/repos/`
- Ollama: Docker container with GPU access, exposed on REDACTED:11434
- Open WebUI: REDACTED:3000
- SSH to nodes: via `~/.ssh/config` with the ansible-svc-account key, port REDACTED
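The Ollama and Open WebUI containers described above could be expressed as a Compose file along these lines. This is a sketch, not the ADR's actual configuration: service names, volume names, the `nvidia` driver assumption, and the Open WebUI image are illustrative; only the host ports (11434, 3000) come from this ADR.

```yaml
# Hypothetical docker-compose.yml for the WSL host.
# Ports match this ADR; everything else is assumed.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"          # exposed on the host as in the ADR
    volumes:
      - ollama:/root/.ollama   # persist pulled models across restarts
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia   # WSL GPU passthrough surfaces as an NVIDIA device (assumed from the CUDA mention)
              count: all
              capabilities: [gpu]
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"            # Open WebUI listens on 8080 inside the container
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434   # reach Ollama over the Compose network
    depends_on:
      - ollama
volumes:
  ollama:
```

The `deploy.resources.reservations.devices` block is Compose's GPU-request syntax; it requires the NVIDIA Container Toolkit on the Docker host.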
Rationale
- WSL provides native Linux tooling (Ansible, Terraform, git, az cli) without dual-boot overhead
- GPU passthrough to WSL enables local AI inference (Qwen3:4b runs 100% on REDACTED)
- Windows host handles driver stability and desktop applications
- Single machine for development + AI + testing avoids network latency to nodes
Alternatives considered
- Native Linux on CanEast AI Node: better performance, but loses Windows desktop and GPU driver stability
- Develop directly on caneast-c1-node3: limited RAM (8GB), no GPU, laptop hardware
- Cloud dev environment: adds latency, cost, and dependency on internet for homelab work
Consequences
- All IaC development happens in WSL — nodes are deployment targets only
- WSL networking uses NAT by default — REDACTED is the Windows host IP visible on the LAN
- Docker Desktop or native Docker CE in WSL is required for local container testing
- Ansible runs from WSL against target nodes over SSH (port REDACTED)
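The per-node SSH setup this relies on can be sketched as an `~/.ssh/config` entry. Hypothetical example: the `HostName`, `User`, and `Port` values are placeholders (the ADR redacts the real port and IP); the node alias is taken from the Alternatives section and the key name from the Decision section.

```
# Hypothetical ~/.ssh/config entry; HostName, User, and Port are placeholders.
Host caneast-c1-node3
    HostName 192.0.2.10                     # node IP (redacted in this ADR)
    User ansible                            # service-account user, name assumed
    Port 2222                               # placeholder; the ADR redacts the real port
    IdentityFile ~/.ssh/ansible-svc-account
```

Because Ansible shells out to OpenSSH by default, entries like this apply automatically when playbooks run from WSL against the nodes.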