openai/o3-mini
$1.10 / 1M input tokens
$4.40 / 1M output tokens
200K context window
95 tokens/s
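The per-1M-token prices above can be turned into a per-request cost estimate. A minimal sketch (the function name and defaults are illustrative, with the listed $1.10 input and $4.40 output prices baked in):

```python
def chat_cost_usd(input_tokens: int, output_tokens: int,
                  input_price: float = 1.10, output_price: float = 4.40) -> float:
    """Estimate request cost in USD from per-1M-token prices."""
    return (input_tokens / 1_000_000 * input_price
            + output_tokens / 1_000_000 * output_price)

# e.g. a request with 12,000 input tokens and 800 output tokens
cost = chat_cost_usd(12_000, 800)  # 0.0132 + 0.00352 = 0.01672 USD
```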
Public benchmark scores
Sourced from each provider's published numbers. Verify before quoting.
Quality index: 78
MMLU: 86.5
GPQA: 79.7
HumanEval: 87.8
MATH: 97.9
SWE-bench: —
Arena Elo: —
Sources: openai-research-page
Capabilities
tools · structured_output · streaming · thinking
Input modalities: text, file
Use o3-mini via Relay
Configure the model alias in YAML, then call it from Python.
YAML

```yaml
# models.yaml
version: 1
models:
  o3:
    target: openai/o3-mini
    credential: $env.OPENAI_API_KEY
```

Python

```python
import asyncio

from relay import Hub

async def main():
    async with Hub.from_yaml("models.yaml") as hub:
        resp = await hub.chat(
            "o3",
            messages=[{"role": "user", "content": "Hello"}],
        )
        print(resp.text, resp.cost_usd)

asyncio.run(main())
```

Install with `pip install ai5labs-relay` · full docs on GitHub
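The `credential: $env.OPENAI_API_KEY` line refers to an environment variable rather than a literal key. Relay's actual resolution logic is not shown here; the sketch below only illustrates the `$env.VAR` convention with a hypothetical helper:

```python
import os
import re

# Matches $env.VAR_NAME references in config values (illustrative pattern).
_ENV_REF = re.compile(r"\$env\.([A-Za-z_][A-Za-z0-9_]*)")

def resolve_env_refs(value: str) -> str:
    """Expand $env.VAR references using the process environment."""
    return _ENV_REF.sub(lambda m: os.environ.get(m.group(1), ""), value)

# A value without a $env reference passes through unchanged.
resolved = resolve_env_refs("$env.OPENAI_API_KEY")
```

Make sure `OPENAI_API_KEY` is exported in the shell that runs the script.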