deepseek-r1-distill-llama-70b

Public

12.8K Downloads

10 stars

Capabilities

Minimum system memory

40GB

Tags

70B
llama

Last updated

Updated on May 24 by lmmy

README

DeepSeek R1 Distill Llama 70B by deepseek-ai

Supports a context length of 128k tokens.

Distilled from DeepSeek's R1 reasoning model.

Tuned for reasoning and chain-of-thought.
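Because the model is tuned for chain-of-thought, its replies typically wrap the reasoning trace in `<think>…</think>` tags before the final answer, following DeepSeek R1's output convention. A minimal sketch for separating the two parts (the `split_reasoning` helper is hypothetical, not part of this model's tooling):

```python
import re

def split_reasoning(text: str):
    """Split an R1-style response into (reasoning, answer).

    Assumes the chain-of-thought is wrapped in <think>...</think>
    tags, as DeepSeek R1 distills conventionally emit."""
    match = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if match:
        reasoning = match.group(1).strip()
        answer = text[match.end():].strip()
    else:
        # No reasoning block found; treat the whole text as the answer.
        reasoning, answer = "", text.strip()
    return reasoning, answer

sample = "<think>2 + 2 equals 4.</think>The answer is 4."
reasoning, answer = split_reasoning(sample)
print(reasoning)  # 2 + 2 equals 4.
print(answer)     # The answer is 4.
```

This is useful when displaying only the final answer to users while logging the reasoning separately.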

Sources

The underlying model files this model uses