Meta: Llama 4 Scout

MetaID: meta-llama/llama-4-scout

Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model from Meta that activates 17 billion parameters out of a total of 109B. It accepts native multimodal input (text and image) and produces text and code output across 12 supported languages. Designed for assistant-style interaction and visual reasoning, Scout has 16 routed experts (the "16E" in its name), with a shared expert plus one routed expert active per token. It offers a native context length of up to 10 million tokens, though the providers tracked here serve 328K (see Specifications below), and was pretrained on a corpus of roughly 40 trillion tokens. Built for efficient local and commercial deployment, Llama 4 Scout uses early fusion to integrate modalities and is instruction-tuned for multilingual chat, captioning, and image-understanding tasks. Released under the Llama 4 Community License, it has a knowledge cutoff of August 2024 and launched publicly on April 5, 2025.
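To make the routing concrete, here is a minimal PyTorch sketch of top-1 MoE routing with a shared expert, matching Meta's high-level description of Scout's 16-expert design. The class name, layer sizes, and token counts are toy values for illustration, not the real model's dimensions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    """Top-1 MoE routing with a shared expert: every token passes through
    the shared expert, and the router sends it to exactly one of the
    routed experts, so only a fraction of the layer's parameters are
    active per token (for Scout, 17B of 109B model-wide)."""

    def __init__(self, d_model: int = 64, d_ff: int = 256, num_experts: int = 16):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)
        self.shared = nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                                    nn.Linear(d_ff, d_model))
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model); route each token to exactly one expert
        gate = F.softmax(self.router(x), dim=-1)   # (tokens, num_experts)
        weight, idx = gate.max(dim=-1)             # top-1 score and expert index
        routed = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = idx == e                        # tokens assigned to expert e
            if mask.any():
                routed[mask] = weight[mask, None] * expert(x[mask])
        return self.shared(x) + routed             # shared expert sees every token

layer = ToyMoELayer()
print(layer(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```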

Pricing per 1M Tokens

Input (Prompt): $0.08
Output (Completion): $0.30
Cache Read: Free
Cache Write: Free
Image: N/A
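At these rates, per-request cost is simple arithmetic. The sketch below is a hypothetical helper for estimating a single call's cost; the constants mirror the table above, and the function name is made up.

```python
INPUT_PER_M = 0.08    # USD per 1M prompt tokens (from the table above)
OUTPUT_PER_M = 0.30   # USD per 1M completion tokens (from the table above)

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimated USD cost of one request at the listed per-1M-token rates."""
    return (prompt_tokens / 1e6) * INPUT_PER_M + (completion_tokens / 1e6) * OUTPUT_PER_M

# e.g. a 20K-token prompt with a 1K-token reply:
print(f"${estimate_cost(20_000, 1_000):.6f}")  # $0.001900
```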

Specifications

Context Length: 328K
Max Output Tokens: 16K
Input Modalities: Text + Image
Output Modalities: Text
Tokenizer: Llama4
Instruct Type: N/A
Top Provider Context: 328K
Top Provider Max Output: 16K
Moderated: No
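As a usage sketch, the specs above map onto a request like the following, assuming a provider that serves this model through an OpenAI-compatible chat-completions API. The base URL, API-key variable, and image URL are placeholders; only the model ID and the 16K output cap come from this listing.

```python
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://example-provider.com/api/v1",  # placeholder endpoint
    api_key=os.environ["PROVIDER_API_KEY"],          # placeholder credential
)

response = client.chat.completions.create(
    model="meta-llama/llama-4-scout",  # ID from this listing
    max_tokens=16_000,                 # stays within the 16K output cap
    messages=[{
        "role": "user",
        # Text + Image input, Text output, per the modality specs above
        "content": [
            {"type": "text", "text": "Describe this image in one sentence."},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/photo.jpg"}},
        ],
    }],
)
print(response.choices[0].message.content)
```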

Last updated: March 23, 2026

First tracked: March 23, 2026