---
dataset_info:
  - config_name: v1_2024
    features:
      - name: id
        dtype: int64
      - name: description
        dtype: string
      - name: time_limit
        dtype: int64
      - name: memory_limit
        dtype: int64
      - name: checker
        dtype: string
      - name: test_cases
        list:
          - name: input
            dtype: string
          - name: output
            dtype: string
      - name: year
        dtype: int64
      - name: date
        dtype: string
      - name: difficulty
        dtype: string
      - name: contest_category
        dtype: string
      - name: contest_name
        dtype: string
    splits:
      - name: test
        num_bytes: 20187500547
        num_examples: 400
    download_size: 12737762718
    dataset_size: 20187500547
  - config_name: v1_2025
    features:
      - name: id
        dtype: int64
      - name: description
        dtype: string
      - name: time_limit
        dtype: int64
      - name: memory_limit
        dtype: int64
      - name: checker
        dtype: string
      - name: year
        dtype: int64
      - name: date
        dtype: string
      - name: difficulty
        dtype: string
      - name: contest_category
        dtype: string
      - name: contest_name
        dtype: string
    splits:
      - name: test
        num_bytes: 201028
        num_examples: 56
    download_size: 104645
    dataset_size: 201028
configs:
  - config_name: v1_2024
    data_files:
      - split: test
        path: v1_2024/test-*
  - config_name: v1_2025
    data_files:
      - split: test
        path: v1_2025/test-*
---

# AetherCode: Evaluating LLMs' Ability to Win In Premier Programming Competitions

## Introduction

Competitive programming has emerged as a critical benchmark for evaluating the reasoning and coding capabilities of Large Language Models (LLMs). Despite impressive progress on existing benchmarks, we argue that current evaluations overstate model proficiency, masking a substantial gap between LLMs and elite human programmers. This gap arises from two key limitations: insufficient difficulty and scope of benchmark problems, and evaluation bias from low-quality test cases. To address these shortcomings, we present AetherCode, a new benchmark that draws problems from premier programming competitions such as IOI and ICPC, offering broader coverage and higher difficulty. AetherCode further incorporates comprehensive, expert-validated test suites built through a hybrid of automated generation and human curation, ensuring rigorous and reliable assessment. By combining challenging problem design with robust evaluation, AetherCode provides a more faithful measure of LLM capabilities and sets a new standard for future research in code reasoning.

## Highlights

**Problem Curation from Top-Tier Competitions:** AetherCode is the first benchmark to systematically collect problems from premier programming competitions worldwide, including the Olympiad in Informatics (OI) and the International Collegiate Programming Contest (ICPC). Our process involved comprehensively collecting problems, meticulously cleaning them, and converting them from PDF to a Markdown+LaTeX format. Each problem statement was manually proofread for correctness, and a team of competitive programming experts annotated each problem with classification tags.

**High-Quality Test Case Generation:** We developed a hybrid methodology, combining automated generation with expert annotation, to create high-quality test cases for every problem. We evaluated the correctness and comprehensiveness of our test cases by validating them against a large corpus of collected solutions, enforcing a standard of zero false positives (no incorrect solution is accepted) and zero false negatives (no correct solution is rejected).
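For problems in the `v1_2024` config, the released `test_cases`, `time_limit`, and `memory_limit` fields are enough to sketch a minimal local judge. The snippet below is an illustrative sketch only, not the official evaluation harness: it assumes a hypothetical C++ submission file, treats `time_limit` as seconds, compares output token-by-token, and ignores the problem-specific `checker` that special-judge problems require.

```python
import os
import subprocess
import tempfile

def judge(problem: dict, source_path: str) -> str:
    """Minimal sketch: run a C++ submission on one AetherCode problem's test cases."""
    # Compile the candidate submission (source_path is a hypothetical local file).
    exe = os.path.join(tempfile.mkdtemp(), "solution")
    subprocess.run(["g++", "-O2", "-o", exe, source_path], check=True)

    for case in problem["test_cases"]:
        try:
            run = subprocess.run(
                [exe],
                input=case["input"],
                capture_output=True,
                text=True,
                timeout=problem["time_limit"],  # assumption: time_limit is in seconds
            )
        except subprocess.TimeoutExpired:
            return "Time Limit Exceeded"
        if run.returncode != 0:
            return "Runtime Error"
        # Plain token comparison; problems with a non-trivial `checker` need special
        # judging instead, and memory_limit is not enforced here.
        if run.stdout.split() != case["output"].split():
            return "Wrong Answer"
    return "Accepted"
```

A production harness would additionally sandbox execution, enforce `memory_limit`, and compile and invoke the `checker` program for problems whose answers are not unique.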

## Quickstart

from datasets import load_dataset

# Login using e.g. `huggingface-cli login` to access this dataset
ds = load_dataset("m-a-p/AetherCode", "v1_2024")
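Each record exposes the fields declared in the metadata above. As a quick sanity check after loading (field names taken from the schema; only `v1_2024` ships `test_cases`):

```python
from datasets import load_dataset

ds = load_dataset("m-a-p/AetherCode", "v1_2024", split="test")

problem = ds[0]
print(problem["contest_name"], problem["year"], problem["difficulty"])
print(problem["time_limit"], problem["memory_limit"])
print(len(problem["test_cases"]), "test cases")  # v1_2025 does not include test_cases
print(problem["description"][:300])              # statements are in Markdown+LaTeX
```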

## License

This project is licensed under CC-BY-4.0. See the LICENSE file for details.