# Jua — Full Site Content

> Jua is training a foundation model of how reality behaves (EPT-2) and the agent that acts inside it (Athena). The current focus is the atmosphere — state of the art on atmospheric prediction, in production across utilities and trading desks worldwide. Next is the rest of physical reality. Jua.ai AG is an AI lab headquartered in Zürich, Switzerland (founded 2023). Research is peer-reviewed at NeurIPS and ICLR. Series A led by Ananda Impact Ventures and Future Energy Ventures. Team and collaborators span ETH Zurich, KIT, Google Research, the ETH AI Center, DeepMind, and Meta. Backed by the founder of The Climate Corporation and executives from DeepMind and Meta.

---

## Home (https://jua.ai/)

### A foundation model for reality.

Language models learned the digital world from text. The physical world is larger, older, and governed by different laws — it needs a different model.

### The thesis

The last three years changed everything on a screen. The next three will change everything else. Physical industries — medicine, planes, power, food, chips — have design cycles in years and experiments costing billions. Most ideas die because nobody can afford to try them. LLMs can't fix this. What's needed is a foundation model of reality and an agent. That is what Jua is building.

We started with the atmosphere — coupled fluids, thermodynamics, and radiative transfer at planetary scale. It worked. Now we go everywhere.

### EPT-2 — A foundation model that learns physics from data.

State of the art on atmospheric prediction versus every incumbent, including ECMWF (~40 years of development). Trained on weather; the same base fine-tuned to handle airfoils and shock waves. Physics transferred. The domain is a variable.

### Athena — An agent that resolves objectives inside physical reality.

Objective + world model + tools → simulates consequences, calls tools, resolves the objective. In production across utilities, energy traders, and hedge funds.

### The compounding loop

The agent improves the world model. The world model makes the agent more capable.

### Atmospheric prediction benchmark (aggregate skill)

Normalized scores (EPT-2 = 100):

- EPT-2: 100 (Jua)
- GenCast: 84 (Google DeepMind)
- Aurora: 79 (Microsoft)
- FourCastNet 3: 73 (NVIDIA)
- IFS (HRES): 68 (ECMWF)

Aggregate skill across RMSE, ACC, and CRPS on a held-out 2024 test set. Full tables in the EPT-2 technical report.
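The scores above fold RMSE, ACC, and CRPS into a single aggregate and rescale it so that EPT-2 sits at 100. As a rough illustration of the idea only, here is a minimal Python sketch of such a normalization, assuming an unweighted mean over metrics; the per-metric skill numbers are placeholders, not Jua's measured values, and the actual weighting is defined in the EPT-2 technical report.

```python
# Illustrative only: placeholder per-metric skills, NOT the values behind the
# published table. Each metric is assumed to be oriented so higher = better
# (e.g. ACC directly; RMSE and CRPS converted to a skill vs. a reference).
skill = {
    "EPT-2":         {"rmse": 0.42, "acc": 0.91, "crps": 0.40},
    "GenCast":       {"rmse": 0.35, "acc": 0.88, "crps": 0.33},
    "Aurora":        {"rmse": 0.33, "acc": 0.87, "crps": 0.30},
    "FourCastNet 3": {"rmse": 0.30, "acc": 0.85, "crps": 0.28},
    "IFS (HRES)":    {"rmse": 0.28, "acc": 0.84, "crps": 0.25},
}

def aggregate(metrics: dict[str, float]) -> float:
    """Unweighted mean of the per-metric skills (one possible choice of weighting)."""
    return sum(metrics.values()) / len(metrics)

# Rescale so the reference model (EPT-2) scores exactly 100.
reference = aggregate(skill["EPT-2"])
normalized = {model: 100.0 * aggregate(m) / reference for model, m in skill.items()}

for model, score in sorted(normalized.items(), key=lambda kv: -kv[1]):
    print(f"{model:<14} {score:5.1f}")
```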
### Three objectives, one agent — the agent is universal, the objective is the variable.

1. Atmospheric prediction — minimize forecast error, beat ECMWF. Athena wins on the metrics traders price.
2. Prediction-market alpha — Athena runs the Jua employee quant fund on prediction markets. Same agent, different tools. Made money from day one.
3. Accelerate research — Athena helps the team build the next model. Experiments, data, GPU cluster. Humans drive; the loop shortens.

### The plan — in this order

Each step funds and hardens the next:

1. Build the best atmospheric model — prove the foundation on the hardest dataset.
2. Point the agent at objectives that matter — atmosphere first, where it works today.
3. Expand the world model beyond atmosphere — turbomachinery, thermal, materials, any governing-equation domain.
4. Become the layer every physical AI runs on — like foundation models for language.

### Powering 100+ GW worldwide

Customers include: Axpo, TotalEnergies, Shell, Enel, Statkraft, EnBW, EDF, Hydro-Québec, Adani Energy, Vitol, Origin Energy, ESB.

---

## Company (https://jua.ai/company)

### We are making reality programmable.

Software ate information. Physics is next. $10.5B invested in world models in twelve months.

### The physical economy is the last thing software hasn't eaten.

~$85 trillion of global GDP comes from physical things. Iteration happens at human speed — weeks of CFD per turbine blade, decades per drug, lifetimes per material. World models collapse the loop. Simulation that took a day and a small town's electricity → a single GPU in minutes. The cost of "what happens if" → approximately zero. Innovation goes from generational to weekly. Whoever owns the world model owns the next fifty years.

### Global state of the art — against every incumbent.

| Model | Architecture | Score | Lab |
|-------|--------------|-------|-----|
| EPT-2 | Proprietary | 100 | Jua |
| GenCast | Graph + DiT | 84 | Google DeepMind |
| Aurora | 3D Swin Transformer | 79 | Microsoft |
| FourCastNet 3 | AFNO Transformer | 73 | NVIDIA |
| IFS (HRES) | Numerical NWP | 68 | ECMWF |

### Transfer learning — trained on weather, tested on wind tunnels. It worked.

One model, one set of weights, three physics domains:

- Atmosphere (EPT-2 vs previous SOTA)
- Compressible flow / shock waves
- Aerodynamics / vehicle body

### Deployments — one agent, three pieces of reality, same architecture, same weights.

1. Agent runs energy markets (Jua platform) — five continents, hourly. Queries the world model. Energy is the first physical market at scale.
2. Agent wagers on reality (employee quant fund) — prediction market contracts (Polymarket). Money from day one. The objective is a parameter.
3. Agent improves itself (recursive) — designs, dispatches, and evaluates experiments; trains the next world model. The recursion is closed.
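All three deployments run the same loop: an objective, a world model to simulate consequences, and a set of tools to act with; only the objective and the tools change. The sketch below is a minimal illustration of that pattern. Every class, method, and name here is hypothetical, not Jua's internal API.

```python
import random
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Objective:
    """What the agent is asked to resolve, e.g. minimize forecast error or
    price a prediction-market contract. Purely illustrative."""
    description: str
    is_resolved: Callable[[dict], bool]

class WorldModel:
    """Stand-in for a foundation model such as EPT-2: given a state and a
    candidate action, predict its consequences. Stubbed with random scores."""
    def simulate(self, state: dict, action: str) -> dict:
        return {"action": action, "score": random.random()}

class Agent:
    """Objective + world model + tools → simulate consequences, call a tool, repeat."""
    def __init__(self, world_model: WorldModel, tools: Dict[str, Callable[[dict], dict]]):
        self.world_model = world_model
        self.tools = tools

    def resolve(self, objective: Objective, state: dict, max_steps: int = 10) -> dict:
        for _ in range(max_steps):
            if objective.is_resolved(state):
                break
            # Rank candidate tool calls by their simulated consequences ...
            scores = {name: self.world_model.simulate(state, name)["score"]
                      for name in self.tools}
            best = max(scores, key=scores.get)
            # ... then act with the most promising one and observe the new state.
            state = self.tools[best](state)
        return state

# Toy usage: a single no-op tool and an objective that resolves after three steps.
agent = Agent(WorldModel(), tools={"noop": lambda s: {**s, "steps": s.get("steps", 0) + 1}})
final = agent.resolve(Objective("toy", lambda s: s.get("steps", 0) >= 3), state={})
```

Swapping the objective and the tool set (energy markets, prediction-market contracts, research automation) leaves the loop unchanged, which is the sense in which the objective is a parameter.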
### We are going to design every physical thing.

A jet engine in a week. A fusion reactor iterated in days. Alloys without experiments. A grid that never fails. Aircraft at half the fuel burn. A drug in months, not decades.

### Team

- Marvin Gabler, Founder & CEO — Germany; biotech + CS; dropped out of RWTH Aachen; founded and sold his first company at 22; built Jua.
- Benjamin Guett, COO — scaled six startups to $10M ARR.
- Roberto Molinaro, Lead Researcher — PhD under Siddhartha Mishra at ETH Zurich; co-author of Poseidon; foundation models for PDEs.

### Research

Built with ETH Zurich, KIT, Google Research, and the ETH AI Center. Peer-reviewed at NeurIPS and ICLR.

Publications:

- Universal Diffusion-Based Probabilistic Downscaling (ICLR 2026) — Molinaro, Siegenheim, Martin, Frey, Poulsen, Seitz, Gabler.
- EPT-2 Technical Report (arXiv 2025) — Jua Team.
- Poseidon: Efficient Foundation Models for PDEs (NeurIPS 2024) — Herde, Raonić, Rohner, Käppeli, Molinaro, de Bézenac, Mishra (ETH Zurich).
- Generative AI for fast and accurate statistical computation of fluids (arXiv 2024).
- EPT-1.5 Technical Report (arXiv 2024).

---

## Customers (https://jua.ai/customers)

### Weather is the first domain — and also the smallest thing Jua will do.

Started with the hardest continuous-physics dataset humanity has recorded. Put the product in front of people whose decisions depend on it.

### In production across utilities and trading desks worldwide

Axpo (Switzerland), TotalEnergies (France), Shell (UK), Enel (Italy), Statkraft (Norway), EnBW (Germany), EDF (France), Hydro-Québec (Canada), Adani Energy (India), Vitol (Netherlands), Origin Energy (Australia), ESB (Ireland).

### Why accuracy is worth money

- €25M per GW of wind per year — 1 GW of wind produces ~3 TWh per year; a 20 percentage-point forecast accuracy improvement → ~€25M/year (illustrative).
- $2.6B to bring one new drug to market.
- 60% of global GDP comes from physical industries.

### Beyond the atmosphere — the model doesn't care what fluid it's looking at.

One foundation, many domains: turbomachinery, thermal design, aerospace, materials, drug discovery, robotics.

---

## Energy Trading (https://jua.ai/energy-trading)

### Trade the move. Before the market sees it.

Athena watches your positions, queries the models, and surfaces shifts on owned assets and spreads. Powered by EPT-2 — the most accurate forecast on the planet.

### What Athena does on the desk

Three personas, same agent:

- Traders — position-shaped Q&A (e.g., Germany–France spread, overnight changes).
- Analysts — model comparison and pattern analysis (e.g., Feb 2021 event comparison, wind model divergence).
- Portfolio managers — cross-asset views and reporting (e.g., morning meeting summaries, ramp risk).

### What Athena reads from — your edge model, market models, and the market itself.

- Jua models: EPT-2, EPT-2 Early, Ensemble, Rapid Refresh, High Resolution (European assets), EPT-1.5.
- Market models: ECMWF (IFS, ENS, EC46, AIFS), ICON, GFS (+ GraphCast), Aurora, Météo-France AROME, KNMI HARMONIE, UKMO.
- Data sources: ENTSO-E, EEX, EPEX, BMRS/ELEXON, Netztransparenz, prediction markets (Polymarket).

### Capabilities

- Power forecasts — MW, not m/s; power actuals nowcast; asset-aware.
- Workflows & alerts — triggers, agent runs, actions to HTTP/OMS/chat.
- Bring your own data — positions, contracts, metadata; joined with forecasts.
- Market briefings — scheduled; generated, not templated; email/chat/webhook.
- Model benchmarking — live skill versus reanalysis and stations; per variable and region.
- Historical & backtesting — decades of data; replay periods; strategy testing.

### Forecast skill (2024 held-out evaluation)

Trading edge starts with forecast skill. EPT-2 achieves lower RMSE at 7 days versus IFS, Aurora, and GFS across key atmospheric variables. Full tables and methodology in the EPT-2 technical report.

### Access

Web dashboard, REST API, Python SDK, CLI. Sign in at https://athena.jua.ai/
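For programmatic access, the REST API and Python SDK are the usual routes from a trading stack. The snippet below is only a rough sketch of what a forecast request could look like; the base URL, path, query parameters, and response shape are assumptions for illustration, not Jua's documented API.

```python
import requests

# Hypothetical endpoint and parameters, for illustration only. Consult the
# actual Athena / EPT-2 API documentation for real paths, auth, and fields.
BASE_URL = "https://api.jua.ai/v1"   # assumed base URL
API_KEY = "YOUR_API_KEY"             # issued with your account

def point_forecast(lat: float, lon: float, variable: str = "wind_speed_100m") -> dict:
    """Fetch a forecast time series for one location (illustrative request shape)."""
    response = requests.get(
        f"{BASE_URL}/forecast",      # assumed path
        params={"lat": lat, "lon": lon, "variable": variable, "model": "ept2"},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    data = point_forecast(47.37, 8.54)   # roughly Zürich
    print(data)                          # exact schema will differ in practice
```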
---

## Careers (https://jua.ai/careers)

### Putting humanity on the next exponential.

AI is entering the physical world. Foundation models for forecasting, materials, simulation. Energy, manufacturing, medicine, space. The atoms-side economy at software speed. Work in Zürich.

### Values

Mission-driven. Clock-speed. Research-led. Optimistic about technology. Radical candor. Factfulness.

### Benefits

Vacation, sick leave, parental leave, local holidays. Flexible and outcomes-focused. Equity. Learning budget. Equipment budget. Retreats. One month per year of work from anywhere (Zürich-based). Daily chef lunches.

### Hiring process — seven steps, about three weeks.

Most candidates hear back within 48 hours.

1. Application & questionnaire
2. Intro call (Marvin Gabler or Benjamin Guett, 15 min, Google Meet)
3. Online assessments
4. First interview (45 min, Zürich or virtual)
5. Trial day (Zürich, lunch on Jua)
6. Final interview (Marvin, 30 min, Meet or in-person)
7. Offer (+ references)

Open positions are loaded dynamically from Ashby. Contact: careers@jua.ai

---

## Contact (https://jua.ai/contact)

### Talk to the team that built it.

Athena, the world model, your book — pick a channel. ~30-minute reply during European working hours.

- Book a demo — 30 min, Athena + EPT-2 + your book.
- Email — hello@jua.ai
- Sign in to Athena — https://athena.jua.ai/

Enterprise plans include 24/7 support and a named contact.

---

## Blog (https://jua.ai/blog)

Technical reports, product launches, and research notes. Posts available at https://jua.ai/blog/.

---

## Products

- EPT-2 (Earth Physics Transformer): foundation model for atmospheric physics. Global, hourly-updating. Outperforms ECMWF HRES, IFS ENS, GEFS, NVIDIA FourCastNet 3, and Google GraphCast on RMSE, ACC, and CRPS. Model family includes EPT-2 HRRR (high-resolution Europe), EPT-2 RR (hourly global), and EPT-2e (extended range to 60 days, roadmap to 180 days).
- Athena (https://athena.jua.ai): AI agent for energy traders. Pairs the world model with a controller and a toolset. In production across utilities and trading desks worldwide.

## Papers

- Universal Diffusion-Based Probabilistic Downscaling (ICLR 2026) — https://openreview.net/forum?id=8N6HgVeXbo
- EPT-2 Technical Report (arXiv 2025) — https://arxiv.org/abs/2507.09703
- Poseidon: Efficient Foundation Models for PDEs (NeurIPS 2024) — https://proceedings.neurips.cc/paper_files/paper/2024/hash/84e1b1ec17bb11c57234e96433022a9a-Abstract-Conference.html
- Generative AI for fast and accurate statistical computation of fluids (arXiv 2024) — https://arxiv.org/abs/2409.18359
- EPT-1.5 Technical Report (arXiv 2024) — https://arxiv.org/abs/2410.15076

## Social

- LinkedIn: https://www.linkedin.com/company/juaai/
- GitHub: https://github.com/juaAI

## Sitemap

- https://jua.ai/sitemap.xml