Meta: Llama 4 Scout

Llama 4 Scout 17B Instruct (16E) is a Mixture-of-Experts (MoE) model from Meta that activates 17B of its 109B total parameters per token. It supports native multimodal input (text plus images) and generates multilingual text and code across 12 languages. The "16E" in the name refers to its 16 experts: a learned router decides which experts process each token, so only a fraction of the total weights runs on any given forward pass. Scout is optimized for assistant-style interaction, visual reasoning, and large-scale context handling, with a context window of up to 10M tokens, and was trained on a corpus of roughly 40 trillion tokens.
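To make the active-versus-total parameter split concrete, the sketch below shows top-1 expert routing in PyTorch: a router picks one expert per token, so only that expert's feed-forward weights run on the forward pass. The layer sizes and the single-expert routing policy are illustrative assumptions, not Meta's actual Llama 4 implementation.

```python
import torch
import torch.nn as nn

class MoELayer(nn.Module):
    """Toy Mixture-of-Experts layer with top-1 routing.
    Illustrative only; sizes and routing policy are assumptions,
    not Meta's Llama 4 implementation."""

    def __init__(self, d_model=512, d_ff=2048, num_experts=16):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):
        # x: (num_tokens, d_model)
        gate = self.router(x).softmax(dim=-1)   # (num_tokens, num_experts)
        weight, choice = gate.max(dim=-1)       # best expert per token
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = choice == e                  # tokens routed to expert e
            if mask.any():
                out[mask] = weight[mask].unsqueeze(-1) * expert(x[mask])
        return out

layer = MoELayer()
tokens = torch.randn(8, 512)
print(layer(tokens).shape)  # torch.Size([8, 512])
```

Even though 16 experts are defined, each token here touches only one expert's weights, which is why the active parameter count (17B) can be far smaller than the total (109B).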

Engineered for efficiency and flexible deployment, Scout uses early fusion, feeding text and vision tokens into a single model backbone, and is instruction-tuned for tasks such as multilingual chat, captioning, and image understanding. Released under the Llama 4 Community License, it was trained on data with a knowledge cutoff of August 2024 and made publicly available on April 5, 2025.
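As a usage illustration, here is a minimal sketch of multimodal chat with Scout through the Hugging Face transformers library. The repository id, the image URL, and the message layout are assumptions following common transformers conventions; consult the official model card for the exact identifiers and access requirements.

```python
from transformers import AutoProcessor, AutoModelForImageTextToText

model_id = "meta-llama/Llama-4-Scout-17B-16E-Instruct"  # assumed repo id
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForImageTextToText.from_pretrained(model_id, device_map="auto")

# One user turn mixing an image and a text instruction (hypothetical URL).
messages = [
    {"role": "user", "content": [
        {"type": "image", "url": "https://example.com/photo.jpg"},
        {"type": "text", "text": "Describe this image in one sentence."},
    ]}
]
inputs = processor.apply_chat_template(
    messages, add_generation_prompt=True, tokenize=True,
    return_dict=True, return_tensors="pt",
).to(model.device)

output = model.generate(**inputs, max_new_tokens=64)
# Decode only the newly generated tokens, skipping the prompt.
print(processor.decode(output[0][inputs["input_ids"].shape[-1]:],
                       skip_special_tokens=True))
```

The same chat-template path handles text-only prompts by omitting the image entry from the message content.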

Creator: Meta
Release Date: April 5, 2025
License: Llama 4 Community License Agreement
Context Window: 10,000,000 tokens (10M)
Image Input Support: Yes
Open Source (Weights): Yes
Parameters: 109B total, 17B active at inference time
Model Weights: Available for download (llama.com, Hugging Face)
