OpenAI: gpt-oss-120b


gpt-oss-120b is an open-weight, 117B-parameter Mixture-of-Experts (MoE) model from OpenAI, built for advanced reasoning, agentic behavior, and production use. It activates 5.1B parameters per token and, with its native MXFP4 quantization, runs on a single 80 GB H100 GPU. The model supports configurable reasoning depth, full chain-of-thought access, and native tool capabilities such as function calling, web browsing, and structured output generation.
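As a rough illustration of the capabilities above, the sketch below assembles a chat-completions payload for gpt-oss-120b served behind an OpenAI-compatible endpoint (e.g. vLLM or Ollama). The "Reasoning: high" system-prompt convention and the `get_weather` function schema are illustrative assumptions, not an official API contract.

```python
import json

def build_request(user_message, reasoning="medium"):
    """Assemble a chat-completions payload with configurable reasoning
    depth and one advertised tool (a minimal sketch, not the official API)."""
    return {
        "model": "gpt-oss-120b",
        "messages": [
            # Assumption: gpt-oss reads its reasoning level ("low",
            # "medium", "high") from the system prompt.
            {"role": "system", "content": f"Reasoning: {reasoning}"},
            {"role": "user", "content": user_message},
        ],
        # Advertise a tool so the model can emit a function call.
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool name
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }

payload = build_request("What's the weather in Paris?", reasoning="high")
print(json.dumps(payload, indent=2))
```

The payload itself is just JSON, so any OpenAI-compatible client or a plain HTTP POST to the server's `/v1/chat/completions` route can send it.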


Creator: OpenAI
Release Date: August 2025
License: Apache 2.0
Context Window: 131,072 tokens
Image Input Support: No
Open Source (Weights): Yes
Parameters: 117B total, 5.1B active at inference time
