OpenAI: gpt-oss-120b

gpt-oss-120b is an open-weight, 117B-parameter Mixture-of-Experts (MoE) model from OpenAI, built for advanced reasoning, agentic behavior, and versatile production use cases. It activates 5.1B parameters per forward pass and is optimized to run efficiently on a single H100 GPU with native MXFP4 quantization. The model supports configurable reasoning depth, full chain-of-thought access, and native tool capabilities such as function calling, web browsing, and structured output generation.
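Because the weights are open, the model is commonly served behind an OpenAI-compatible chat-completions endpoint. A minimal sketch of a request payload that exercises the configurable reasoning depth and function-calling support might look like the following; note that the `reasoning_effort` field name and the `get_weather` tool are illustrative assumptions, not part of any official specification:

```python
import json

# Hypothetical chat-completions payload for a locally hosted gpt-oss-120b.
# The "reasoning_effort" field and the "get_weather" tool definition are
# illustrative assumptions; adapt them to whatever server you run.
payload = {
    "model": "gpt-oss-120b",
    "reasoning_effort": "high",  # configurable reasoning depth: low | medium | high
    "messages": [
        {"role": "user", "content": "What's the weather in Paris right now?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

# Serialize the request body as it would be sent over HTTP.
body = json.dumps(payload)
```

A server honoring this schema would return either a plain assistant message or a tool call naming `get_weather` with a JSON `city` argument, which the client executes and feeds back as a `tool`-role message.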

Creator: OpenAI
Release Date: August 2025
License: Apache 2.0
Context Window: 131,072 tokens
Image Input Support: No
Open Source (Weights): Yes
Parameters: 117B total, 5.1B active at inference time
Model Weights: Click here