AGI remains unsolved.
New ideas still needed.

Community Partners

Our goal is to support the research community in the pursuit of general intelligence. We partner with companies to offer compute credits, specialized datasets and grant opportunities.

Below is a list of our current partners who want to support ARC Prize 2025 participants. If you'd like to partner with us or want to suggest we add someone to the list, please reach out to us at team@arcprize.org.


Compute Partners


Lambda

Lambda is the #1 GPU cloud for ML/AI teams training, fine-tuning, and running inference on AI models, where engineers can easily, securely, and affordably build, test, and deploy AI products at scale. Lambda's product portfolio spans from on-prem GPU hardware to hosted GPUs in public and private clouds, serving governments, researchers, AI-native companies, and enterprises worldwide. ARC Prize participants can get $1K of credits here.

Modal

Serve custom AI models at scale. Add one line of code to run any function in the cloud, and get instant autoscaling for ML inference, data jobs, and more. ARC Prize participants can get $500 in compute credits. Sign up here and list the event as "ARC Prize Ecosystem."


Google

Leverage Google's world-class infrastructure and AI innovation on Google Cloud. Access powerful TPUs and the latest GPUs (H100s, A100s) through the unified Vertex AI platform to train, tune, and serve models—including the state-of-the-art Gemini family—at massive scale. Apply for credits here.


Hyperbolic

Hyperbolic is an open-access AI cloud providing the cheapest compute (H100s on-demand starting at $0.99/hr, with 4090s also available) and inference for the latest models from OpenAI, Qwen, DeepSeek, and more. Apply for up to $1,000 in compute credits here.


Strong Compute

Strong Compute offers $5k-$50k of compute through their research grant program. ARC Prize participants receive priority placement for review. Guillermo Barbadillo, who achieved 2nd place in 2024, participated in this program. Apply for compute grants here.


RunPod

RunPod puts serious GPU power in your hands - H200s, H100s, A100s, and more - ready when you need them. Skip the infrastructure headaches with pre-configured TensorFlow and PyTorch environments, so you can build, train, and deploy your models while the boring stuff happens in the background. ARC Prize participants can get $100 in credits. Apply here.


Model Partners


Groq

Groq's LPU was custom-built to deliver price-performant inference, giving you more compute per dollar while letting you deploy immediately and access leading models like GPT-OSS 120B, Kimi-K2, Whisper V3, and more, all through an API call. ARC Prize contestants can supercharge their AI applications with $1,000 in inference credits. Sign up here.


AI21

Creators of Jamba: frontier AI models and systems, open for builders and designed for the enterprise. Sign up here for $200 in model credits.


Productivity Partners


Granola

The AI notepad for people in back-to-back meetings. Granola takes your raw meeting notes and makes them awesome. Sign up here for 6 months of Granola Business.
