France 2030 Budget: €54B ▲ Total allocation | Deployed: €35B+ ▲ 65% of total | Companies Funded: 4,200+ ▲ +800 in 2025 | Startups Funded: 850+ ▲ +150 in 2025 | Competitions: 150+ ▲ 12 currently open | Gigafactories: 15+ ▲ In construction | Jobs Created: 100K+ ▲ Direct employment | Battery Capacity: 120 GWh ▲ 2030 target | H2 Electrolyzers: 6.5 GW ▲ 2030 target | Nuclear SMRs: 6+ ▲ In development | Regions: 18 ▲ All covered |

Jean Zay is France’s primary national AI supercomputer — the central piece of public compute infrastructure that has made BLOOM, French language model research, climate science, drug discovery, and dozens of other AI-dependent research programs possible. Operated by IDRIS (Institut du Développement et des Ressources en Informatique Scientifique), a national computing center of the CNRS located at Paris-Saclay, Jean Zay has been progressively upgraded under France 2030 to become one of the most capable AI computing systems in Europe. It is not a hyperscaler’s data center — it is a national research infrastructure asset, accessible to the French academic and research community, that embodies the public investment dimension of France’s AI sovereignty strategy.

Name and Institutional Context

The system is named after Jean Zay (1904-1944), France’s Minister of National Education from 1936 to 1939 under the Popular Front government. Zay’s tenure was marked by a strong commitment to science funding and public research; the groundwork he laid led to the creation of the CNRS in October 1939, just months before France’s fall to Germany. Arrested by the Vichy regime, imprisoned, and ultimately murdered by the Milice in June 1944, Zay became a martyr of the French Resistance and was posthumously inducted into the Panthéon in 2015. Naming France’s national AI supercomputer after a founding figure of the CNRS is a statement of intellectual lineage and political values: public research as a democratic and national good.

IDRIS, which operates Jean Zay, is one of three national computing centers in France (alongside CINES in Montpellier and TGCC at CEA Bruyères-le-Châtel). It is the national “very large scale” computing center specializing in compute-intensive CPU and GPU applications: molecular dynamics, quantum chemistry, fluid dynamics, and, since 2019, AI. IDRIS reports to the CNRS national research directorate and operates under the scientific oversight of the French Ministry of Higher Education and Research.

Technical Specifications and Upgrades

Jean Zay was delivered in phases starting in 2019:

| Phase | Year | Addition | Cumulative AI performance |
|-------|------|----------|---------------------------|
| Phase 1 | 2019 | 2,296 V100 GPUs (Atos/Bull system) | ~1.5 petaflops |
| Phase 2 | 2020 | Additional V100 partitions, CPU expansion | ~3 petaflops |
| Phase 3 | 2022 | 1,456 A100 SXM4 80GB GPUs | ~14 petaflops |
| Phase 4 | 2023 (France 2030) | Additional A100 and H100 partitions | ~28+ petaflops |

The France 2030 upgrade in 2023, funded with approximately €22 million, added the H100 partition — NVIDIA’s current-generation flagship AI training accelerator, each H100 delivering approximately 4× the training performance of an A100. The current system architecture uses InfiniBand HDR200 interconnects between GPU nodes, achieving high-bandwidth, low-latency communication essential for distributed training of large models.

Total system: approximately 4,000+ NVIDIA GPUs organized in training partitions of 8, 32, 64, and 256 GPUs, with Lustre parallel file storage providing high-throughput data access. The system’s 28+ petaflops of AI compute places it among the top 5-8 AI supercomputers in Europe — behind German systems (JUWELS Booster at Jülich: ~70 petaflops AI) but ahead of most national academic computing systems.
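The training partitions above are used for data-parallel work, where every optimizer step ends with an all-reduce that averages gradients across all GPUs; that collective is why the high-bandwidth, low-latency interconnect matters. A toy, pure-Python sketch of the averaging it performs (real jobs on a system like Jean Zay would use NCCL via a framework such as torch.distributed over the InfiniBand fabric, not this):

```python
# Toy illustration of the gradient all-reduce that dominates inter-node
# traffic in data-parallel training. Each "worker" holds a gradient vector
# for the same parameters; the collective replaces every copy with the
# element-wise mean so all workers step identically.

def all_reduce_mean(per_gpu_grads: list[list[float]]) -> list[float]:
    """Average gradients element-wise across all workers."""
    n_workers = len(per_gpu_grads)
    return [sum(vals) / n_workers for vals in zip(*per_gpu_grads)]

# Four simulated workers, each with a gradient for the same 3 parameters.
grads = [[1.0, 2.0, 3.0],
         [3.0, 2.0, 1.0],
         [2.0, 2.0, 2.0],
         [2.0, 2.0, 2.0]]
print(all_reduce_mean(grads))  # → [2.0, 2.0, 2.0]
```

The volume of data moved per step scales with model size, not batch size, which is why partitions of 256 GPUs need a fabric like InfiniBand rather than ordinary Ethernet.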

Governance and Access Policy

Jean Zay’s access governance is designed to maximize strategic value from the infrastructure, balancing open academic access with France 2030’s industrial objectives.

Academic tier: French academic researchers and public research organizations (CNRS, INRIA, CEA, INSERM, etc.) access Jean Zay through GENCI (Grand Équipement National de Calcul Intensif), France’s national high-performance computing organization. Computing time is allocated through competitive calls evaluated by scientific merit. Typical allocation: millions of GPU-hours per year for major projects.

Industrial and startup tier: France 2030 created a specific access pathway for startups and industrial actors engaged in qualifying projects — enabling companies that could not afford AWS or Azure at scale to access national infrastructure. This pathway requires French registration and engagement in France 2030-aligned research or development activities. Access is time-limited and competitively allocated, but has enabled French AI startups to conduct substantial training runs.

International collaborations: Jean Zay has been used for international collaborative projects including the BigScience BLOOM training, which involved 1,000+ researchers from 60+ countries but was operationally hosted in France. This positions French infrastructure as the backbone of global open AI research — a soft power outcome with real strategic value.

BLOOM: The Flagship Achievement

The most internationally visible use of Jean Zay under France 2030 was the BigScience project (2021-2022), which trained BLOOM (BigScience Large Open-science Open-access Multilingual Language Model), a 176-billion-parameter multilingual LLM.

The context: GPT-3 was released by OpenAI in May 2020 and was immediately recognized as a paradigm-shifting AI capability. But GPT-3’s weights were not released; it was accessible only through OpenAI’s commercial API. The AI research community had no open-weight model of comparable size and quality. BigScience, organized by Hugging Face and hosted on Jean Zay, was the community’s answer: a massively collaborative effort to train and openly release a comparable model.

Training ran for approximately 117 days on 384 A100 GPUs, consuming roughly 1.1 million GPU-hours. The resulting model supports 46 natural languages and 13 programming languages, has been downloaded millions of times from Hugging Face, and served as the foundation for dozens of subsequent research projects. BLOOM demonstrated that Jean Zay was genuinely capable of frontier AI work: not a second-tier academic cluster but a system competitive with the infrastructure used by major US AI labs.
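The headline run parameters (384 GPUs for about 117 days) convert to GPU-hours with simple arithmetic:

```python
# Convert a training run's GPU count and wall-clock duration into the
# GPU-hours metric used for compute allocations. Figures are the
# approximate ones quoted for the BLOOM run.

def gpu_hours(n_gpus: int, days: float) -> float:
    """Total GPU-hours for n_gpus running continuously for `days` days."""
    return n_gpus * days * 24

total = gpu_hours(384, 117)
print(f"{total:,.0f} GPU-hours")  # → 1,078,272 GPU-hours
```

GPU-hours is also the unit in which GENCI allocations are expressed, which makes runs of this size directly comparable to the annual budgets of major academic projects.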

Current Research Programs

Beyond BLOOM, Jean Zay hosts a continuous portfolio of AI research:

French NLP and language models: INRIA, the 3IA institutes, and university teams have trained numerous French language models on Jean Zay, including CamemBERT and FlauBERT (French BERT-style models) and GPT-style models for French text generation. These models underpin French-language AI applications in the government, media, legal, and customer service sectors.

Climate and Earth sciences: Météo-France uses Jean Zay for AI-enhanced weather prediction, training neural network weather models (such as Pangu-Weather-style architectures) that outperform traditional physics-based models for medium-range forecasting. France 2030’s decarbonization objectives depend partly on more accurate renewable energy forecasting — a direct link between AI infrastructure and climate policy.

Drug discovery and bioinformatics: INSERM and Institut Pasteur teams use Jean Zay for protein structure prediction (AlphaFold-related), drug-target interaction modeling, and genomic data analysis. France 2030’s health innovation sector directly benefits from this compute access.

Industrial AI: Under the France 2030 industrial AI programs, companies including EDF, Airbus, and several SMEs access Jean Zay time for training AI models for nuclear safety, aircraft design optimization, and predictive maintenance.

Limitations and the Compute Gap

Jean Zay’s 28+ petaflops, while significant for European academic computing, represents a small fraction of the compute available to leading US AI labs:

  • OpenAI: estimated access to 10,000+ A100 equivalents through Microsoft Azure — roughly 100+ petaflops sustained
  • Google DeepMind: TPU v4 pods delivering 1,000+ petaflops
  • Anthropic: AWS-hosted compute at comparable scale

The gap means that French researchers and companies training at the absolute frontier — GPT-4 scale or beyond — must use US cloud infrastructure for the largest training runs. France 2030 has acknowledged this gap and committed to expanding national AI compute capacity. GENCI is planning a significant upgrade, potentially including an exascale-class French system in the 2027-2029 timeframe, in coordination with the EuroHPC Joint Undertaking.
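The hedged estimates above make the gap concrete as a ratio; these are order-of-magnitude figures only, not measured benchmarks:

```python
# Rough ratio of estimated AI compute, using the approximate figures in
# the text: Jean Zay ~28 petaflops vs. ~100+ petaflops (OpenAI via Azure)
# and ~1,000+ petaflops (a Google TPU v4 pod).

JEAN_ZAY_PF = 28  # approximate AI petaflops after the 2023 upgrade

def gap(other_pf: float) -> float:
    """How many Jean Zays of compute the other system represents."""
    return other_pf / JEAN_ZAY_PF

print(f"OpenAI estimate: ~{gap(100):.1f}x Jean Zay")     # ~3.6x
print(f"TPU v4 pod estimate: ~{gap(1000):.1f}x Jean Zay")  # ~35.7x
```

Even on these generous assumptions, the leading US labs operate with one to two orders of magnitude more compute, which is the quantitative case behind the exascale plans discussed below.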

France 2030 and Exascale Plans

France’s computing ambitions under France 2030 extend beyond Jean Zay’s current capability. France participates in the EuroHPC Joint Undertaking, whose flagship systems include the JUPITER exascale machine at Jülich (Germany) and the LUMI pre-exascale system in Finland. France’s own EuroHPC hosting commitment runs through the Jules Verne consortium, which is slated to host a European exascale system (Alice Recoque) at CEA’s TGCC site; it is intended to complement Jean Zay, which remains the national AI-focused resource, rather than replace it.

The longer-term plan — subject to funding confirmation — is a French national exascale contribution in the 2027-2030 window, targeted at AI, climate, and nuclear simulation workloads. The cost: approximately €500-700 million. France 2030’s computing roadmap identifies this investment as necessary to maintain French credibility in frontier AI research.
