Meta: Llama 4 Scout

Llama 4 Scout 17B Instruct (16E) is a Mixture-of-Experts (MoE) model from Meta that activates 17B parameters out of 109B total. It accepts native multimodal input (text plus images) and generates multilingual text and code across 12 languages. Its MoE layers contain 16 routed experts; each token is processed by one routed expert plus a shared expert, so inference cost stays close to that of a 17B dense model. Scout is optimized for assistant-style interaction, visual reasoning, and large-scale context handling, with a context window of up to 10M tokens, and was pretrained on a corpus of roughly 40T tokens.
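
As a sketch of the routed-expert pattern described above, the toy PyTorch layer below sends each token to one of 16 routed experts plus an always-on shared expert, so only a fraction of the total parameters does work per token. The dimensions, top-1 routing, and softmax gating here are illustrative assumptions, not Meta's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top1MoE(nn.Module):
    """Toy top-1 routed MoE layer with a shared expert (illustrative only)."""

    def __init__(self, d_model=128, d_ff=512, n_experts=16):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)  # per-token expert scores
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )
        # Shared expert that every token passes through, alongside its routed expert.
        self.shared = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model)
        )

    def forward(self, x):  # x: (n_tokens, d_model)
        gate = F.softmax(self.router(x), dim=-1)  # routing distribution over experts
        weight, idx = gate.max(dim=-1)            # top-1 expert per token
        out = self.shared(x)                      # always-active shared path
        for e, expert in enumerate(self.experts):
            mask = idx == e
            if mask.any():  # only the tokens routed here pay for this expert
                out[mask] = out[mask] + weight[mask].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(4, 128)    # 4 tokens, d_model=128
print(Top1MoE()(tokens).shape)  # torch.Size([4, 128])
```

Only one routed expert runs per token, which is why total parameter count (all experts) and active parameter count (shared path plus one expert) diverge so sharply in models like Scout.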

Engineered for efficiency and flexible deployment, Scout uses early fusion, folding text and vision tokens into a single model backbone, and is instruction-tuned for tasks such as multilingual chat, captioning, and image understanding. Released under the Llama 4 Community License, it was trained on data up to August 2024 and made publicly available on April 5, 2025.
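
For readers who want to try the model, below is a minimal sketch of multimodal inference with Hugging Face transformers (version 4.51 or later, which added Llama4ForConditionalGeneration). The repo id matches Meta's published checkpoint; the image URL and generation settings are placeholders, and real use requires accepting the license terms and substantial GPU memory.

```python
import torch
from transformers import AutoProcessor, Llama4ForConditionalGeneration

model_id = "meta-llama/Llama-4-Scout-17B-16E-Instruct"

processor = AutoProcessor.from_pretrained(model_id)
model = Llama4ForConditionalGeneration.from_pretrained(
    model_id,
    device_map="auto",          # shard across available GPUs
    torch_dtype=torch.bfloat16,
)

# Mixed text + image turn; the chat template handles the multimodal layout.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/photo.jpg"},  # placeholder
            {"type": "text", "text": "Describe this image in one sentence."},
        ],
    }
]

inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=128)
reply = processor.batch_decode(
    outputs[:, inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)[0]
print(reply)
```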

Creator: Meta
Release Date: April 5, 2025
License: Llama 4 Community License Agreement
Context Window: 10M tokens natively (often served at 128,000)
Image Input Support: Yes
Open Source (Weights): Yes
Parameters: 109B total, 17B active at inference time
Model Weights: meta-llama/Llama-4-Scout-17B-16E-Instruct (Hugging Face)