eoba

@eoba

Joined February 2026

Projects

A general-purpose reasoning and chat model trained by NVIDIA.

MODEL · 1 · Updated on March 18

A hybrid MoE model from IBM, trained for tool use.

MODEL · 2 · Updated on February 21

A lightweight Gemma 3-based model (270M parameters) trained specifically for function calling. It is text-only with a 32k context window, and is designed to be fine-tuned into your own tool agent while remaining small enough to run on laptops or edge devices.

MODEL · 1 · Updated on February 21