Exa Infrastructure

The silent infrastructure behind world-changing science

23 September 2025
In today’s fractured global landscape, collaboration is under pressure on many fronts. But there’s one area quietly defying the trend: research and education (R&E) networks.

They’ve evolved into some of the most data-intensive systems humanity has ever built. From real-time climate modelling and AI research to the coordination of massive physics experiments, modern science is more dependent than ever on robust digital infrastructure.

These networks form an international mesh of national and regional backbones – from GÉANT in Europe to Internet2 in the US, RedCLARA in Latin America and TEIN in APAC – all carrying staggering volumes of data. In Europe alone, GÉANT has seen a 30% average annual increase in network traffic over the last five years with an average of 9 petabytes of data carried every day. And similar growth patterns are emerging across other continents as collaborative science intensifies.
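To put that growth figure in perspective, a 30% average annual increase compounds quickly. A quick sketch (using only the article's stated 30% rate and five-year window):

```python
# Back-of-envelope check: what does 30% average annual growth
# compound to over five years? Illustrative only; the 30% figure
# and five-year horizon are the ones stated above.
growth_rate = 0.30
years = 5
multiplier = (1 + growth_rate) ** years
print(f"Traffic multiplier after {years} years: {multiplier:.2f}x")
```

In other words, sustained 30% annual growth means total traffic nearly quadruples over the period.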

The people working to solve some of the world's most urgent problems now require connectivity that can meet extraordinary technical demands. But it's not just about bandwidth: these networks also need long-haul reliability, infrastructure sovereignty, cross-border coordination, and the flexibility to adapt to unpredictable, ever-evolving needs.

It’s important to consider the sheer scale this sector operates on: scientific networks connect tens of millions of users at thousands of institutions worldwide, enabling research that crosses borders, disciplines, and decades. If you were to print those 9 petabytes of data as text on paper, the stack would reach roughly halfway to the moon.
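That comparison checks out as a back-of-envelope estimate. The sketch below uses assumed figures (1 byte per character, a dense 4,000-character page, 0.1 mm per sheet) rather than anything from the article beyond the 9 petabytes:

```python
# Rough sanity check of the "halfway to the moon" claim, under
# assumed printing figures (all constants below are assumptions,
# except the 9 PB daily volume stated in the article).
BYTES_PER_DAY = 9e15            # 9 petabytes, 1 byte per character
CHARS_PER_PAGE = 4000           # assumption: dense plain-text page
SHEET_THICKNESS_M = 1e-4        # assumption: 0.1 mm per sheet
MOON_DISTANCE_KM = 384_400      # mean Earth-moon distance

pages = BYTES_PER_DAY / CHARS_PER_PAGE
stack_km = pages * SHEET_THICKNESS_M / 1000
print(f"Stack height: {stack_km:,.0f} km "
      f"({stack_km / MOON_DISTANCE_KM:.0%} of the way to the moon)")
```

Under these assumptions the stack comes to about 225,000 km, a little over half the mean Earth-moon distance, so "roughly halfway" is in the right ballpark.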

To complicate matters somewhat, these volumes are not driven by predictable, peak business hours traffic. They are often the result of multi-terabyte bursts triggered by specific moments in global collaboration. Climate modelling institutions in Europe synchronise simulations with partners in the US. AI researchers jointly train models using federated learning across compute clusters in Asia, Europe and North America. At CERN, particle physicists move massive datasets from Switzerland to a distributed network of data centres around the world.
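The scale of those bursts is easy to underestimate. A small sketch of the transfer-time arithmetic, where the link speed and dataset size are assumptions chosen for illustration, not figures from any specific network:

```python
# Illustrative transfer-time arithmetic for bursts like those
# described above. The 500 TB dataset and 100 Gbps link below are
# assumed round numbers, not figures from GÉANT or CERN.
def transfer_hours(terabytes: float, gbps: float) -> float:
    """Hours to move `terabytes` of data over a `gbps` link at full line rate."""
    bits = terabytes * 1e12 * 8          # TB -> bits (decimal units)
    seconds = bits / (gbps * 1e9)        # Gbps -> bits per second
    return seconds / 3600

# A 500 TB burst over a dedicated 100 Gbps path:
print(f"{transfer_hours(500, 100):.1f} hours")   # ~11 hours at line rate
```

Even saturating a dedicated 100 Gbps path, a burst of that size occupies the link for the better part of a day, which is why path diversity and burst handling matter as much as headline bandwidth.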

These activities demand not just high throughput but extremely resilient, neutral and flexible infrastructure. R&E use cases are also marked by long-term workflows that often span decades, and by the need to maintain scientific continuity across institutional and even national boundaries. In this context, connectivity is no longer a utility. It’s scientific infrastructure, just as vital as the lab bench, the particle accelerator, or the genome sequencer.

Enterprise and cloud models don’t map cleanly onto this world. Standard telecom offerings tend to be based on fixed port capacities, short-term service contracts, and tightly bundled SLAs. These can be ill-suited to scientific missions that need:

● Bursty, unpredictable traffic handling

● Custom routing and path diversity

● Infrastructure neutrality, including the ability to integrate with diverse vendor ecosystems

● Spectrum-based provisioning that allows capacity to scale without overhauling contracts

Timing is just as critical. When a global grant releases funds for a joint experiment, the network needs to be there immediately, not after a six-month procurement cycle. If commercial models don’t align with these cycles, they create friction at the very moments science is meant to accelerate. This is why public–private partnerships need a rethink. Infrastructure providers that build around scientific workflows, rather than quarterly targets, are the ones that will stay relevant.

Take Exa Infrastructure’s 15-year agreement with GÉANT, part of GÉANT’s intercontinental investment programme co-funded by the European Commission via the GN5-IC1 project. It provides spectrum-based transatlantic capacity linking Europe and North America – two of the planet’s most important hubs for science.

But even more important than the capacity itself is the fact that the project isn’t framed as a rigid product. It gives GÉANT the levers: control over spectrum, vendor choice, and scalability over decades. That kind of flexibility is what scientific continuity requires. It’s also a model that can be applied more widely: combining commercial innovation with scientific mission-driven planning, building redundancy into research backbones, and enabling data mobility across borders regardless of political or technical change.

R&E networks aren’t just heavy data users; they’re the silent scaffolding behind pandemic response, climate forecasting, new energy materials, and medical breakthroughs. Their demands aren’t simply for more bandwidth: they need sovereignty, flexibility, and guarantees that the network itself won’t become the bottleneck to discovery.

As collaboration becomes more globally distributed, the ability to move petabytes across borders, time zones, and ecosystems is now table stakes. That doesn’t happen by accident. It takes trust, patience, and partnerships designed to outlive individual product cycles or political shifts. Done right, these partnerships don’t just deliver connectivity; they lay down the digital foundations on which tomorrow’s science – and perhaps society itself – will stand.
