eoba

@eoba

Joined February 2026

Projects

A hybrid mixture-of-experts (MoE) model from IBM, trained for tool use.

MODEL

Updated 14 days ago

Lightweight Gemma 3-based model (270M params) trained specifically for function calling. Text-only with a 32k context window, designed to be fine-tuned into your own tool agent while remaining small enough for laptops or edge devices.
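Since the model above is meant to be fine-tuned into a tool agent, here is a minimal sketch of the dispatch side of function calling, assuming the model emits a JSON object naming a tool and its arguments (the tool registry and output format here are hypothetical, not tied to this specific model):

```python
import json

# Hypothetical tool registry: maps tool names to Python callables.
# A function-calling model is assumed to emit JSON like:
#   {"name": "add", "arguments": {"a": 2, "b": 3}}
TOOLS = {
    "add": lambda a, b: a + b,
}

def dispatch(model_output: str):
    """Parse a function-call style response and invoke the matching tool."""
    call = json.loads(model_output)
    fn = TOOLS[call["name"]]          # look up the requested tool
    return fn(**call["arguments"])    # invoke it with the model's arguments

result = dispatch('{"name": "add", "arguments": {"a": 2, "b": 3}}')
print(result)  # 5
```

In a real agent loop, the returned value would be fed back to the model as a tool response so it can compose its final answer.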

MODEL


Updated 14 days ago