Relay by Ai5labs

Compare models

Comparing 2 models. Drop the URL into a doc; it's permalinked.

| Field             | meta-llama/llama-3.3-70b-instruct | groq/llama-3.3-70b-versatile |
|-------------------|-----------------------------------|------------------------------|
| Provider          | meta-llama                        | groq                         |
| Model ID          | llama-3.3-70b-instruct            | llama-3.3-70b-versatile      |
| Context           | 131K                              | 131K                         |
| Max output        | 33K                               |                              |
| Input / 1M        | $0.10                             | $0.59                        |
| Output / 1M       | $0.32                             | $0.79                        |
| Cached input / 1M |                                   |                              |
| Avg cost / 1M     | $0.21                             | $0.69                        |
| Speed             | 280 t/s                           |                              |
| Quality index     | 56.0                              |                              |
| MMLU              | 86.0                              |                              |
| GPQA              | 50.5                              |                              |
| HumanEval         | 88.4                              |                              |
| MATH              | 77.0                              |                              |
| SWE-bench         |                                   |                              |
| Arena Elo         |                                   |                              |
| Tools             |                                   |                              |
| Vision            |                                   |                              |
| Thinking          |                                   |                              |
| Streaming         |                                   |                              |
| JSON mode         |                                   |                              |
| Structured output |                                   |                              |
| Prompt cache      |                                   |                              |
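The "Avg cost / 1M" row is consistent with an unweighted mean of the input and output prices: ($0.10 + $0.32) / 2 = $0.21 and ($0.59 + $0.79) / 2 = $0.69. A minimal sketch of that calculation (the 50/50 weighting is inferred from the listed numbers, and the function name is illustrative, not part of any Relay API):

```python
def avg_cost_per_1m(input_price: float, output_price: float) -> float:
    """Unweighted mean of input and output price per 1M tokens.

    Assumption: the page's "Avg cost / 1M" appears to be a simple 50/50
    average of the two per-token prices; this is inferred from the
    numbers shown, not a documented formula.
    """
    return round((input_price + output_price) / 2, 2)

# meta-llama/llama-3.3-70b-instruct: $0.10 in, $0.32 out
print(avg_cost_per_1m(0.10, 0.32))  # → 0.21
# groq/llama-3.3-70b-versatile: $0.59 in, $0.79 out
print(avg_cost_per_1m(0.59, 0.79))  # → 0.69
```

If your workload is read-heavy, a usage-weighted average (e.g. weighting input tokens more heavily) would give a lower effective rate than this 50/50 figure.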

Same data, in your terminal: relay models compare meta-llama/llama-3.3-70b-instruct llama-3.3-70b

Suggested comparisons