Agent Models
Opensperm supports a wide variety of open-weight and proprietary models that can be run locally within your Agent Pod.
Supported Architectures
You can hot-swap models depending on task requirements. Optimized inference engines (such as vLLM and llama.cpp) come pre-installed in the runtime.
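The hot-swap workflow can be sketched as a minimal in-process registry that keeps one model resident and unloads it when a new one is requested. Note that `ModelRegistry`, `ModelHandle`, and the engine labels below are illustrative assumptions for this sketch, not the actual Opensperm API:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ModelHandle:
    """A loaded model; 'engine' names the serving backend (hypothetical labels)."""
    name: str
    engine: str  # e.g. "vllm" for GPU batch serving, "llama.cpp" for CPU/GGUF


class ModelRegistry:
    """Keeps at most one model resident; swapping releases the previous one."""

    def __init__(self) -> None:
        self.active: Optional[ModelHandle] = None

    def swap(self, name: str, engine: str) -> ModelHandle:
        # In a real pod this would free GPU memory before loading the next model.
        if self.active is not None:
            print(f"unloading {self.active.name}")
        self.active = ModelHandle(name, engine)
        return self.active


registry = ModelRegistry()
registry.swap("llama-3-8b", "vllm")      # start with an LLM for text tasks
handle = registry.swap("llava", "vllm")  # hot-swap to a vision model
print(handle.name)
```

A production pod would additionally verify that the requested model fits in available memory before unloading the active one, so a failed swap does not leave the pod with no model at all.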
Large Language Models (LLMs)
Llama 3 (8B, 70B), Mistral, Mixtral, Qwen, and custom fine-tunes.
Vision Models
LLaVA and Qwen-VL for visual understanding, and Stable Diffusion for image generation.
