2025 American Invitational Mathematics Examination

Publisher:
Artificial Analysis
Last Sync:
2026-01-28

Overview

This benchmark evaluates AI models on olympiad-level mathematical problems from the 2025 American Invitational Mathematics Examination (AIME). The AIME is a prestigious mathematics competition for high school students in the United States, known for its challenging problems that require creative problem-solving skills and deep mathematical knowledge.

Key Characteristics

  • Problem Count: 30 problems (15 each from the AIME I and AIME II exams)
  • Answer Format: Each answer is an integer from 000 to 999
  • Difficulty Level: Olympiad-level competition mathematics
  • Subjects Covered: Algebra, geometry, number theory, combinatorics, and other advanced mathematical topics
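Because every AIME answer is an integer in the 000–999 range, grading a model's response reduces to extracting a final integer and comparing it to the answer key. The sketch below illustrates that convention; the function name and the last-number extraction heuristic are illustrative assumptions, not the benchmark's official evaluation harness.

```python
import re


def grade_aime_answer(model_output: str, answer_key: int) -> bool:
    """Grade a free-form model response against an AIME answer key.

    AIME answers are integers in [0, 999], conventionally written with
    three digits (e.g. 042). This takes the last run of up to three
    digits in the response as the model's answer -- a simplifying
    assumption, since real harnesses may require a stricter format.
    """
    matches = re.findall(r"\d{1,3}", model_output)
    if not matches:
        return False  # no numeric answer found
    guess = int(matches[-1])
    # Valid AIME answers must lie in [0, 999].
    return 0 <= guess <= 999 and guess == answer_key


def accuracy(graded: list[bool]) -> float:
    """Fraction of the 30 problems answered correctly."""
    return sum(graded) / len(graded) if graded else 0.0
```

A usage example: `grade_aime_answer("The answer is 042.", 42)` returns `True`, while a response containing no digits is marked incorrect.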

Purpose

The AIME 2025 benchmark assesses whether AI models can handle sophisticated mathematical reasoning at a level typically reserved for top-performing high school mathematicians. Success on this benchmark indicates strong analytical and problem-solving capabilities.

