Description
LFM2 is a family of hybrid models designed for on-device deployment. LFM2-24B-A2B is the largest model in the family, a 24B MoE model with only 2B active parameters per token, fitting in 32 GB of RAM for deployment on consumer laptops and desktops.
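The "fits in 32 GB of RAM" claim can be sanity-checked with a rough back-of-envelope estimate. The sketch below is illustrative only: the bits-per-weight figures and the 10% runtime overhead are assumptions, not measurements of the actual LFM2-24B-A2B files.

```python
# Rough RAM estimate for a 24B-parameter model at common quantization
# levels. Numbers are approximations, not measured file sizes.
TOTAL_PARAMS = 24e9

def est_gib(bits_per_param: float, overhead: float = 1.1) -> float:
    """Approximate resident memory in GiB, with ~10% assumed overhead
    for KV cache, activations, and runtime buffers."""
    return TOTAL_PARAMS * (bits_per_param / 8) * overhead / 2**30

for label, bits in [("4-bit quant", 4.5), ("8-bit quant", 8.5), ("FP16", 16)]:
    print(f"{label}: ~{est_gib(bits):.1f} GiB")
```

Under these assumptions, 4-bit (~14 GiB) and 8-bit (~26 GiB) variants fit comfortably in 32 GB, while FP16 (~49 GiB) does not, which is consistent with quantized on-device deployment.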
Stats
71.4K Downloads
12 stars
Last updated
Updated on February 24

README
LFM2-24B-A2B excels at agentic tool use, document summarization, Q&A, and local RAG pipelines. It supports nine languages and a context length of 32k tokens.
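A local RAG pipeline, as mentioned above, retrieves relevant documents on-device and prepends them to the prompt. The sketch below shows only the retrieval-and-prompt-assembly half; the toy keyword-overlap retriever, the sample documents, and the prompt format are hypothetical placeholders, not LFM2's actual chat template. In practice the assembled prompt would be sent to the locally running model.

```python
# Minimal local-RAG sketch: keyword-overlap retrieval plus prompt
# assembly. Retriever and prompt format are illustrative placeholders.
from collections import Counter

DOCS = [
    "LFM2-24B-A2B is a 24B mixture-of-experts model with 2B active parameters.",
    "The model supports a 32k context length and nine languages.",
    "Local RAG keeps documents and inference on the user's own machine.",
]

def score(query: str, doc: str) -> int:
    # Count shared lowercase tokens between query and document.
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, k: int = 1) -> list[str]:
    # Return the k highest-scoring documents for the query.
    return sorted(DOCS, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str) -> str:
    # Prepend retrieved context to the user question.
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

print(build_prompt("What context length does the model support?"))
```

A real pipeline would swap the toy scorer for an embedding model, but the control flow (retrieve, assemble, generate) is the same.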