The New Oil? Compute Power and the Battle for Digital Infrastructure

Introduction: Compute as Strategic Commodity

In the 21st century, computational power – the combination of energy, advanced chips, and AI models – is emerging as the foundation of technological and economic sovereignty. Much like oil in the last century, “compute” has become a strategic resource: it is scarce, concentrated in the hands of a few, and indispensable for both economic strength and military power (iaps.ai). The capacity to generate and apply computing power underpins progress in artificial intelligence, biotechnology, autonomous systems and more, making it a defining lever of state power in our time. Indeed, compute is now often compared to what oil and steel represented in the past – an essential input for national development and security (iaps.ai).

This analogy is more than rhetorical. High-end semiconductors and cloud data centers form the backbone of modern economies, much as oil fields and refineries once did. Nations are treating access to cutting-edge chips and computing infrastructure as matters of strategic priority. Policymakers have noted that advanced semiconductors play a role akin to oil in their dual civilian-military importance (livemint.com). Scarcity in this domain can be a national vulnerability: leading AI labs spend more on cloud compute than on any other input – OpenAI reportedly spends nearly ten times more on compute than on employee salaries (iaps.ai). In turn, greater computing capacity has translated directly into more capable AI systems, reinforcing the competitive advantage of those with access to massive compute reserves (iaps.ai).

Recent trends underscore how critical and contested this resource has become. The hardware cost of top AI supercomputers has been doubling roughly every year (iaps.ai), as companies and countries pour capital into ever-larger clusters of GPUs and specialized AI chips. A clear correlation has emerged between the amount of compute used to train AI models and their performance on complex tasks (iaps.ai). In short, computing power fuels intelligence – economic, artificial, and military. And as demand soars, control over compute has become a matter of statecraft. Just as past conflicts were shaped by access to oil, today’s geopolitical competition increasingly turns on access to silicon, energy, and the algorithms they empower. The following sections explore how this “new oil” is driving a global race to secure digital infrastructure, the heavy energy costs that come with it, Europe’s struggle to keep up, and future scenarios in which compute is either jealously guarded or collaboratively managed as a global resource.
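
To make that growth rate concrete, here is a minimal Python sketch of the compounding it implies; the $100 million starting cost is an illustrative assumption, not a figure from the sources cited above.

```python
# A minimal sketch of what "hardware cost doubling roughly every year" implies.
base_cost_usd = 100e6   # assumption: a $100M frontier training cluster today

for year in range(6):
    cost = base_cost_usd * 2 ** year   # doubling once per year
    print(f"Year {year}: ~${cost / 1e9:.1f}B")

# Year 0: ~$0.1B ... Year 5: ~$3.2B -- a 32x increase in five years,
# which is why only a handful of firms and states can stay at the frontier.
```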

The Global Race for Compute Hegemony

Figure: Global share of advanced AI computational capacity (as of 2023). The United States holds roughly 75% of the world’s AI supercomputing performance, far ahead of China at about 15%, with the European Union and others comprising the rest (iaps.ai).

A fierce global race is underway to build sovereign “compute stacks” – the vertically integrated ecosystems of chips, data centers, and AI models that together constitute national AI power. The United States and China are clearly in the lead, with the U.S. currently commanding an estimated 75% of global AI supercomputing capacity, compared with about 15% for China (iaps.ai). This dominance stems partly from America’s massive investments and partly from its strategic use of export controls to hobble China’s access to high-end chips (iaps.ai). Trillions of dollars are being mobilized as nations recognize that controlling compute power means controlling the future of innovation and intelligence.

The United States has explicitly prioritized maintaining its lead in this domain. Washington’s recent policies – from the 2022 CHIPS and Science Act, which allocated $52 billion in semiconductor subsidies (mecouncil.org), to the American AI Action Plan in 2025 – aim to establish an American-controlled AI technology stack “from our advanced semiconductors to our models to our applications” (brookings.edu). U.S. officials speak of ensuring American AI is the “gold standard” and have even taken equity stakes in strategic chipmakers (such as a 10% stake in Intel) to bolster domestic capacity (brookings.edu). The U.S. government’s technological statecraft is evident in sweeping export regulations that not only bar China from obtaining top-tier NVIDIA AI chips but also restrict where those chips can be deployed globally (csis.org). In late 2024 and 2025, U.S. rules introduced a three-tier “AI Diffusion” framework to tightly control who gets access to advanced GPUs, effectively creating a first-class tier (the U.S. and close allies), a second class (nations allowed limited GPUs under strict conditions), and a banned tier (adversaries) (csis.org). These moves signal that Washington views compute as a geostrategic chokepoint – one it intends to secure. American cloud hyperscalers (Amazon Web Services, Microsoft Azure, Google Cloud), for their part, are being courted and corralled as extensions of U.S. influence. They operate global infrastructure, but tightening U.S. regulations mean these firms must weigh national security priorities alongside market logic. Cloud providers naturally seek data center sites with cheap power and welcoming business climates, which might include neutral or non-Western countries – yet under the new rules, deploying AI infrastructure in certain jurisdictions could be curtailed or monitored to prevent unintended technology transfer (csis.org). In short, U.S. tech giants now find themselves cast as geopolitical actors, navigating between profit motives and their home country’s strategic mandates.

China, meanwhile, is racing to build an autonomous tech stack insulated from Western pressure. Facing U.S. export bans on cutting-edge chips and tooling, Beijing has doubled down on indigenous semiconductor development and a massive expansion of domestic computing infrastructure (reuters.com). Under President Xi Jinping’s “Eastern Data, Western Computing” initiative, China is investing billions to construct eight giant data center hubs in its energy-rich western regions (reuters.com). The idea is to harness abundant coal, hydro, and renewable power in the west to fuel supercomputing centers that serve demand in the populous eastern provinces – effectively an internal redistribution of compute resources for greater resilience. By mid-2024 China had poured over ¥43.5 billion (more than $6 billion) into this program, installing almost 2 million server racks across those hubs (reuters.com). These efforts are part of a broader drive for compute self-reliance: China’s tech giants are developing homegrown AI chips and seeking alternatives to U.S. technology. Notably, when the U.S. barred NVIDIA’s most advanced AI GPUs, Chinese firms pivoted to designing their own or optimizing algorithms to do more with less – a response that some argue could make China formidable in efficient AI once it closes the hardware gap (iaps.ai). Beijing’s large language model labs, from Baidu to Alibaba, continue to push forward, and China’s first highly capable GPT-4-class model, DeepSeek, arrived in 2025 partly through such efficiency innovations (cnas.org). Still, bottlenecks remain: as of 2023, over 90% of the world’s most advanced logic chips were fabricated by Taiwan’s TSMC (iaps.ai), and the critical lithography machines come solely from the Netherlands’ ASML (iaps.ai). Chinese access to both has been choked by export controls. Thus, China’s strategic play is twofold – buy time by expanding capacity at home (even on mature process nodes or with older-generation chips it can produce domestically) and forge alternative supply chains (including tapping sympathetic suppliers or investing in research to leapfrog current architectures). The outcome of this race remains to be seen, but one measure is clear: even under heavy sanctions, China’s AI investment has not relented. By one account, Chinese investors and state funds have funneled tens of billions into AI start-ups and chip ventures, and the nation continues to consume a huge share of global high-end chips – through both legal channels and gray markets – to feed its AI ambitions (cnas.org). Compute has become central to China’s quest for great-power status, viewed as the engine for everything from smart manufacturing to military command systems.

Beyond the U.S. and China, other states are entering the fray to secure a foothold in the compute hierarchy. The oil-rich Gulf states, in particular, see AI and digital infrastructure as key to their post-petroleum future. The phrase “data is the new oil” carries literal resonance in the Gulf, where leaders are investing petrodollars to acquire tech capabilities. Saudi Arabia and the United Arab Emirates have been buying compute might in bulk – reportedly procuring tens of thousands of top-tier NVIDIA GPUs (including, for Saudi Arabia, at least 3,000 of the latest H100 chips at roughly $40,000 each) to build national AI supercomputers (mecouncil.org). Both countries have published lofty national AI strategies and are pouring capital into AI research centers and semiconductor ventures (mecouncil.org). The Saudi government, for example, launched a $20 billion “Transcendence” initiative aimed at AI and semiconductors, and the Public Investment Fund is backing new data centers and domestic chip fabrication attempts (pwc.com). The UAE, for its part, has stood up a state-backed AI firm (G42) and, in partnership with its Mubadala sovereign fund, announced a plan to mobilize $100 billion for AI and chip-related deals (mecouncil.org). These Gulf states are leveraging their advantages – cheap energy, access to capital, and authoritarian agility in decision-making – to become regional compute hubs. Notably, their efforts have attracted external partnership and scrutiny in equal measure. U.S. tech firms and even the U.S. government have begun to collaborate with the Gulf on AI projects, partly to ensure these emerging AI power centers remain in a friendly orbit. In mid-2025, the U.S. announced major AI cooperation agreements with Saudi Arabia and the UAE, including plans for a 5-gigawatt AI computing campus in Abu Dhabi and allowances for the UAE to import up to 500,000 advanced NVIDIA chips annually (cnas.org). In parallel, Washington quietly pressured Gulf investors to distance themselves from Chinese AI ventures – for example, nudging Saudi Arabia’s fund to divest from a Chinese-linked AI chip startup (mecouncil.org). The Gulf’s role thus highlights how hyperscale cloud capacity itself is becoming a geopolitical bargaining chip. Amazon, Microsoft, and Google are building data centers in Saudi Arabia and the UAE under government incentives, effectively extending U.S.-aligned cloud empires into the Middle East (pwc.com). In doing so, the Gulf states gain cutting-edge infrastructure, while the U.S. secures influence over a critical new locus of compute growth, hoping to preempt these platforms from falling into rival (e.g. Chinese) ecosystems (cnas.org).
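
For a rough sense of physical scale, the short Python sketch below estimates how many H100-class accelerators a 5-gigawatt power budget could feed; the per-chip wattage and facility overhead are assumptions, and the estimate ignores CPUs, storage, and networking.

```python
# Back-of-envelope: how many accelerators could a "5-gigawatt AI campus" power?
campus_watts = 5e9    # the reported Abu Dhabi campus power budget
chip_watts = 700      # assumption: roughly an NVIDIA H100's rated board power
pue = 1.2             # assumption: facility overhead (cooling, power losses)

chips = campus_watts / (chip_watts * pue)
print(f"~{chips / 1e6:.1f} million accelerators")   # ~6.0 million

# For comparison, the reported UAE allowance of 500,000 chips per year would
# take on the order of a decade to fill a build-out of that size.
```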

Underpinning this race are a few chokepoints that every nation must navigate. One is the semiconductor supply chain, which remains extraordinarily concentrated and delicate. Despite national efforts to diversify, the world still relies on a handful of firms for advanced chips: NVIDIA (and to some extent AMD) designs the majority of AI processors; TSMC (Taiwan) and Samsung (Korea) fabricate almost all leading-edge chips; ASML (Netherlands) alone provides the extreme ultraviolet lithography tools those fabs require (iaps.ai). This concentration means compute power can be controlled – and the U.S. has demonstrated as much by leveraging its influence over these chokepoints. American-led restrictions now block not only direct chip exports to certain countries but also China’s access to the tooling and talent needed to develop equivalents (iaps.ai). In effect, the U.S. and its allies have formed a “silicon shield” akin to an OPEC for chips, constraining who can produce and use the most advanced computing fuel. Another chokepoint is cloud infrastructure itself. A few U.S. corporations operate the bulk of global cloud data centers, which means access to scalable compute can be withdrawn if political winds shift. European officials, for instance, have fretted that U.S. law (such as the CLOUD Act) could compel American providers to cut off or hand over data from European servers (brookings.edu). Meanwhile, sanctioned countries like Russia and Iran have seen Western tech companies withdraw cloud and software services, illustrating how digital infrastructure can be weaponized much like a trade embargo. This dynamic gives the owners of compute infrastructure outsized leverage on the world stage – a new kind of oil barony, ruled by tech giants and the states that host them. As a result, nations are scrambling either to cultivate domestic cloud champions or to hedge by aligning with a dominant provider from abroad. It is a high-stakes game of capacity-building and alliance formation, all centered on ensuring one’s economy (and military) is not left stranded in a future where intelligence = compute.

The Energy Cost of Intelligence

The meteoric rise of AI and digitalization comes with a hefty thermodynamic price tag. Running sprawling data centers and training AI models on millions of processors consume enormous amounts of electricity and generate copious heat. As AI adoption accelerates, the energy footprint of compute is expanding so quickly that it has drawn comparisons to the early days of industrial power consumption. As of 2024, the world’s data centers already consume roughly 415 terawatt-hours (TWh) of electricity annually – about 1.5% of worldwide electricity use (iea.org). That demand is on track to more than double by 2030, reaching an estimated 945 TWh, or nearly 3% of global power (iea.org). To put this in perspective, if current trends hold, within a few years the world’s data centers will use more electricity than some large economies – 945 TWh is roughly the entire power consumption of Japan (carbonbrief.org). In an era of climate concern, such projections are sounding alarm bells about the environmental sustainability of unchecked compute growth (carbonbrief.org).
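
As a quick arithmetic check on those figures, the implied compound annual growth rate can be computed directly from the two IEA data points cited above:

```python
# Implied annual growth of data-center electricity demand, from the IEA
# figures above: 415 TWh (2024) rising to a projected 945 TWh (2030).
start_twh, end_twh, years = 415, 945, 2030 - 2024

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # ~14.7% per year
```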

Crucially, AI is the dominant driver of this surge. Traditional IT workloads have grown steadily, but the recent spike in data center power use is largely due to the deployment of power-hungry AI accelerators (GPUs and other chips) for training and running machine learning models (iea.org). The International Energy Agency (IEA) notes that “AI is the most important driver” of data-center demand growth this decade (carbonbrief.org). High-performance AI servers not only draw more electricity themselves but also require robust cooling and power backup infrastructure, compounding the energy needs. Today, AI workloads account for an estimated 5–15% of data center electricity use, but by 2030 that share could rise to 35–50% (carbonbrief.org). In advanced hyperscale facilities designed for AI, the power density (watts per square foot) far exceeds that of conventional server rooms (pwc.com). This “thermodynamic burden of intelligence” manifests in striking ways: training a single large AI model (such as a GPT-4-sized network) can consume millions of kilowatt-hours over weeks or months – equivalent to the annual power usage of hundreds of homes. Running fleets of such models continuously for cloud AI services only multiplies the load. As one analysis put it, the trajectory of AI demand risks overwhelming power grids and undermining climate goals unless mitigated by efficiency gains or clean-energy scaling (carbonbrief.org). Already, tech companies that pride themselves on climate pledges have reported jumps in carbon emissions tied to their data center expansions for AI (carbonbrief.org).
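
To see where “millions of kilowatt-hours” comes from, consider a minimal back-of-envelope sketch; every input below (cluster size, chip power, cooling overhead, training duration, household consumption) is an illustrative assumption, not a disclosed figure for any real model.

```python
# Back-of-envelope energy estimate for one large training run.
num_gpus = 5_000          # assumed accelerator count
gpu_watts = 700           # roughly an NVIDIA H100's rated board power
pue = 1.2                 # assumed facility overhead (cooling, power losses)
days = 60                 # assumed training duration

kwh = num_gpus * gpu_watts / 1000 * pue * days * 24
home_kwh_per_year = 10_700   # approximate annual use of a U.S. household

print(f"Training energy: ~{kwh / 1e6:.1f} million kWh")
print(f"Equivalent to ~{kwh / home_kwh_per_year:.0f} U.S. homes for a year")
# ~6.0 million kWh, on the order of several hundred homes -- consistent
# with the "millions of kilowatt-hours" figure in the text.
```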

Where and how all this compute gets built is increasingly dictated by the search for cheap, reliable energy and infrastructure stability. Electricity has become the lifeblood of the digital economy, and differences in power prices and grid capacity are shaping the geography of AI. Data centers thrive where power is affordable and abundant. The Middle East exemplifies this: nations like Saudi Arabia and the UAE boast some of the lowest electricity costs in the world – on the order of $0.05–$0.06 per kWh, roughly half the U.S. average (pwc.com). Combined with large tracts of inexpensive land and ample capital, this makes the Gulf an attractive site for energy-intensive AI server farms (pwc.com). Indeed, the region’s largest players are capitalizing on their energy advantage by inviting major cloud providers to build locally and by planning their own mega-scale data centers, such as Saudi Arabia’s planned 1 GW “Neom” data center (pwc.com). They see an opportunity to turn oil and gas wealth into digital infrastructure – literally converting one form of energy into another, fossil fuels into computing power. Similarly, other regions with cheap renewable resources are vying for a piece of the action. In Northern Europe, countries like Norway, Sweden, and Iceland pitch their abundant hydroelectric power and cool climates (natural cooling for servers) as a sustainable home for global data centers. These locations offer not just low energy prices but also grid stability in mature electric networks – a key factor, since an outage or fluctuation can cripple a data center. The United States, which currently hosts the largest data center capacity, also benefits from pockets of low-cost power (for instance, the Pacific Northwest’s hydro dams or the wind-rich plains states) and a generally stable grid. Hyperscalers are strategically siting new server farms in places like Oregon, Iowa, or Alabama where land and power are cheap, while negotiating long-term power purchase agreements for dedicated renewable energy to lock in pricing.
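
A simple calculation shows why a few cents per kilowatt-hour matters at hyperscale; the 100 MW facility size is a hypothetical, and the comparison rates are chosen to mirror the Gulf-versus-U.S. gap described above.

```python
# Annual electricity bill for a hypothetical 100 MW facility running
# continuously, at Gulf-style vs. roughly double (US-average-style) rates.
facility_mw = 100
hours_per_year = 8760
annual_kwh = facility_mw * 1000 * hours_per_year

for label, rate in [("Gulf (~$0.055/kWh)", 0.055), ("US avg (~$0.11/kWh)", 0.11)]:
    print(f"{label}: ${annual_kwh * rate / 1e6:.0f}M per year")

# Roughly $48M vs ~$96M -- a ~$48M annual gap per 100 MW, compounding
# over a facility's multi-decade lifetime.
```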

However, sovereign priorities sometimes override pure market logic in deciding where compute lives. National security and regulatory concerns are prompting countries to insist on local data centers even where energy costs are higher. Many European countries, for example, mandate that sensitive data (from government or health sectors) be stored and processed within national or EU borders, spawning local data center builds despite electricity being pricier than in, say, the Middle East. India has similarly pushed for data localization, driving a boom in Indian data center construction despite a relatively coal-heavy grid. These choices reflect that digital sovereignty – control over one’s data and digital destiny – can take precedence over cost efficiency. Europe in particular faces a quandary: it has high energy prices and stringent environmental rules, which make building huge compute facilities challenging, yet it is keen not to be entirely dependent on foreign cloud providers. The result is ambitious initiatives like Gaia-X, a European effort to federate cloud infrastructure under common standards, and the EU’s own Chips Act to incentivize semiconductor fabs on European soil. While laudable for sovereignty, such moves are expensive. TSMC’s new fabs in the U.S. and prospective fabs in Europe, for instance, are estimated to cost 30–50% more to operate than in Taiwan due to labor and energy costs (livemint.com). Similarly, connecting large new data centers to the grid in Europe can be slow and difficult: in major hubs like Frankfurt, Dublin, or London, utility capacity is so strained that lead times of 3–5 years are now common to get a data center fully energized (mckinsey.com). These bottlenecks have even led to moratoria on new data centers in some locales until grid upgrades catch up. By contrast, countries with more flexible regimes can move faster – China’s Western Computing hubs were built in part because the government could swiftly allocate land and power in remote provinces for national AI needs (reuters.com).

The thermodynamic reality is that compute expansion and energy strategy are inseparable. Going forward, we can expect compute infrastructure to gravitate toward jurisdictions that offer a trifecta of low-cost power, stable grids, and supportive policy. Renewable energy will play a huge role: cloud operators are among the world’s largest buyers of wind and solar energy as they seek to both reduce carbon footprint and secure long-term fixed energy prices. Regions with a surplus of clean energy (say, desert solar in the Middle East, offshore wind in the North Sea, or geothermal in Iceland) could become the new loci of “energy-to-AI” pipelines. Yet there is also a risk that compute ends up clustered in places with cheap but dirty power (e.g. coal-abundant areas), which could complicate climate efforts. Balancing these factors will be crucial. Ultimately, just as oil refineries were built near oil fields or ports, AI “giga-factories” (data centers) will rise where electrons are cheapest – unless strategic concerns dictate otherwise. Governments may choose to subsidize electricity for domestic compute projects, viewing it like a modern Manhattan Project or a public utility necessity for competitiveness. Already, some U.S. states and European countries offer discounted power rates or tax breaks specifically to attract data center investments, effectively competing to host the engines of the digital economy. The global map of compute is thus being drawn at the intersection of power engineering and geopolitics: whoever secures the most energy-efficient, resilient compute capacity will hold a key advantage in the AI age.

Europe’s Dilemma: Rules Without Scale

Nowhere is the contrast between regulatory might and infrastructure weakness more apparent than in Europe. The European Union has become a global trendsetter in digital regulation – enforcing strict data privacy (GDPR), adopting AI governance rules (the EU AI Act), and reining in Big Tech with competition and content laws (the DSA and DMA). Yet for all its normative power, Europe has struggled to achieve scale in the compute arena. The continent that produced the Industrial Revolution now finds itself lagging in the digital infrastructure revolution, raising fears of becoming a “compute colony” dependent on foreign technology.

Europe’s strategic vulnerability starts with its lack of indigenous hyperscalers. While the U.S. is home to the major cloud providers and China has its own (Alibaba Cloud, Tencent, and others), Europe has none of comparable scale. As of mid-decade, over 74% of EU member states rely at least partially on U.S. cloud providers to meet their computing needs, whereas only 14% make any use of EU-based clouds (brookings.edu). European companies and governments store enormous troves of data on Amazon, Microsoft, and Google servers. This reliance extends across the tech stack: by one estimate, more than 80% of Europe’s overall digital stack – from chips to operating systems to cloud platforms – is imported (brookings.edu). Europe produces only around 10% of the world’s semiconductors (mostly legacy and specialty chips), and its share of cutting-edge chip manufacturing is smaller still (brookings.edu). Despite some excellent research supercomputers and niche cloud firms, the EU lacks any player with the global footprint of an AWS or a TSMC.

This imbalance has led European leaders to worry openly about digital sovereignty. They ask: how wise is it for Europe’s data, AI models, and even critical services to run on infrastructure largely controlled from Seattle or Silicon Valley? The political winds have shifted toward urgency. Axel Voss, a senior German Member of the European Parliament, lamented in 2025 that “we do not have a reliable U.S. partner any longer” and urged Europe to develop its own “sovereign AI and secure cloud” (brookings.edu). Such statements reflect anxieties that geopolitical rifts – for instance, a more isolationist turn in Washington – could suddenly cut off Europe’s access to compute resources or change its terms (brookings.edu). The U.S. export controls on chips to China – and, more quietly, to other countries like the Gulf states – did not go unnoticed in Europe; they underscored that the owner of the compute can dictate who gets to use it. Should a transatlantic dispute arise, Europe fears being at the mercy of decisions made in Washington or in the boardrooms of American firms. The CLOUD Act and past revelations (like the 2013 Snowden spying leaks) have already strained trust by demonstrating that U.S. agencies can reach into data on European soil (brookings.edu). As a result, Europe senses the risk of becoming a digital vassal – following rules set by others, its data colonized in foreign servers.

Ironically, Europe’s strength in making rules has not yet translated into building alternatives. Initiatives like Gaia-X, launched in 2020 with the vision of federating European cloud providers under common standards for security and interoperability, have had limited practical effect so far (brookings.edu). European cloud firms remain relatively small, and many have been acquired or outcompeted by U.S. giants. The EU’s Chips Act, which allocates €43 billion to stimulate on-continent chip production, aims to double Europe’s chip output to 20% of the world market by 2030 (livemint.com). But even if achieved, 20% would still leave Europe far from self-sufficient in the semiconductor value chain, and success is uncertain: TSMC and Intel have tentatively agreed to build fabs in Germany and Italy, but these projects face cost overruns and delays, and will rely on U.S. or Asian equipment and expertise. In AI, Europe has world-class research – DeepMind was founded in London, and Stability AI and AlphaFold grew out of European talent – yet many top AI startups and researchers have migrated to the U.S. or China in search of greater compute resources and funding (brookings.edu). This brain drain exacerbates the infrastructure gap: talented Europeans often end up building on American compute platforms anyway.

Thus, Europe finds itself with “rules without scale.” Brussels can decree stringent AI ethics guidelines, but European companies may still end up training their AI models on American cloud servers using American chips. The risk is that Europe becomes a consumption market and regulatory supervisor in AI, rather than a production power. Some have termed this the fate of a “digital colony” – where Europe’s data and the economic value of its AI flow outward to the platforms of others, even as it tries to tax or regulate those activities (molnett.com; brookings.edu). A prominent European economist, Cristina Caffarra, drew an analogy: if Europe’s roads, power grids, and railways were largely in foreign hands, it would be deemed unacceptable – yet a large share of Europe’s cloud and AI infrastructure is effectively in foreign hands today (brookings.edu). That uncomfortable realization is fueling calls for bold action to build sovereign capacity, even at the price of short-term inefficiencies or higher costs.

To Europe’s credit, it has begun to address these challenges. The EU’s regulatory stance, often criticized in Silicon Valley as hostile, is actually part of a strategy to forge a third way – ensuring that European values (privacy, competition, safety) are embedded in technology while negotiating from a position of strength for access. By being a tough regulator, the EU gains leverage to demand data-localization or transparency concessions from foreign cloud providers that want access to the lucrative European market. Europe is also leveraging partnerships: discussions are underway about transatlantic alignment on AI standards and about investing collectively in infrastructure (for example, through NATO or G7 frameworks, Europe seeks inclusion in the “first-class” tier of AI access that the U.S. controls) (csis.org). Moreover, individual European countries are stepping up. France has been investing in high-performance computing and recently expanded its Jean Zay AI supercomputer with NVIDIA GPUs to support local research – a bid to reduce scientists’ reliance on U.S. clouds. Germany is funding a consortium to build an open-source large language model on European hardware. While these efforts are modest next to U.S. and Chinese scales, they indicate that Europe does not intend simply to cede the field.

Still, structural issues persist. Energy costs in Europe are high, which discourages energy-intensive data center operations – a situation worsened by the 2022–2023 energy crisis that sent electricity prices soaring. Power availability in key hubs like Dublin, Amsterdam, and Frankfurt is now so constrained that data center growth has been paused or requires creative solutions (such as operators agreeing to run backup generators during peak national demand). Additionally, Europe’s emphasis on green and ethical AI, while laudable, may slow deployment of AI systems domestically compared to more permissive environments abroad. If European regulations severely restrict certain AI uses, for example, companies might choose to develop and host those innovations outside of Europe. This raises the specter that Europe could become an “AI importer” – implementing AI-driven services that were mostly developed elsewhere. In economic terms, that means missing out on the value creation and jobs in the AI supply chain.

Europe’s dilemma can be summed up as a race to build scale before it is too late. The window is still open for Europe to carve out a significant presence in the global compute landscape, especially by partnering with like-minded allies. Proposals have emerged for a “third AI stack” championed by Europe – not going it entirely alone, but leading a consortium of democratic countries in a cooperative alternative to the U.S. and Chinese ecosystems (brookings.edu). This might involve pooling investments on European soil, sharing cloud capacity among allies, and jointly developing open-source AI models that reflect European languages and values. Such a vision, however, will require Europe to marshal strengths beyond regulation – namely investment, industrial coordination, and talent retention (brookings.edu). The coming years will show whether the EU can translate its considerable market power and regulatory clout into tangible computing power. If not, Europe may find itself effectively outsourcing its digital sovereignty, contenting itself with setting guardrails while others drive the digital revolution. In the worst case, Europe could end up as a well-regulated backwater of the AI world – comfortable with privacy and ethics, but with little say in the cutting-edge developments happening across the Atlantic or in Asia. Avoiding that outcome is paramount for European strategists today.

Future Scenarios: Compute Sovereignty or Fragmentation

As compute power solidifies its role as the new strategic resource, the global order could evolve in one of two broad directions. One path is toward compute sovereignty and blocs, where nations or tight alliances achieve self-reliance in AI infrastructure, resulting in a fragmented or even bifurcated digital world. The other path is toward some form of cooperative management – an “AI Bretton Woods” – where nations establish agreements or institutions to share or govern compute in a stable, collective framework. The stakes of these scenarios are high: they will influence whether AI becomes a field of intense zero-sum rivalry or a more open, globally accessible good.

In the first scenario, fragmentation deepens. If current trends continue, we could see the emergence of multiple insulated compute spheres. The U.S. and a coalition of allies might operate a dominant high-performance compute network, while China and aligned states develop a parallel stack. Nations in between could be forced to choose sides or risk being shut out of the most advanced capabilities. There are already signs of this techno-decoupling. U.S. export controls have effectively drawn a line excluding China (and those too closely connected to it) from top-tier AI hardware (csis.org). Even some friendly countries now find themselves in a second tier, allowed access only to slightly older generations of chips – kept “at least a generation behind the frontier,” as official U.S. policy states (csis.org). If such policies persist or tighten, resentment among “Tier 2” countries (perhaps including middle powers like Turkey, Vietnam, or even Eastern European states) could grow (csis.org). They may respond by accelerating their own sovereign compute initiatives or by seeking alternative partnerships – a country left in the cold by U.S. rules might turn to Chinese cloud providers or promote a domestic chip industry. The result could be a patchwork of semi-interoperable AI ecosystems, each with its own standards and perhaps even its own internet-cloud structures – an AI analogue of the “splinternet.” The global flow of AI knowledge might also suffer: advanced models and research could be guarded as national secrets or trade weapons. Already, cutting-edge models like GPT-4 are closely held (OpenAI has not open-sourced it, partly for safety and partly for competitive edge), and export control debates have begun to cover not just hardware but also cloud access and model weights (csis.org). In a fragmented future, compute becomes weaponized. Nations could wield control of compute as a lever against others – a country controlling a crucial cloud region might throttle a rival’s access during a conflict, akin to an oil embargo. Military strategists are certainly treating enemy data centers as potential wartime targets, just as oil refineries or power plants would be. One can imagine a future skirmish in which cyber or even kinetic strikes aim to cripple an adversary’s AI computation nodes, seeking to blind its algorithms. Security alliances may extend into compute sharing – there are inklings of this in arrangements like AUKUS, where the U.S., UK, and Australia collaborate on sensitive technologies including AI and quantum, effectively pledging not to let each other fall behind in compute (cnas.org). All these developments point to an arms-race dynamic in which compute capacity is stockpiled and jealously guarded, much like nuclear stockpiles in the Cold War. Unlike nuclear weapons, however, AI compute is not about deterrence through destruction; it is about deterrence through capability dominance – the fear that “whoever masters AI will dominate” drives each bloc to push harder, potentially sacrificing global cooperation in the process.

The fragmented outcome carries significant risks. It could lead to a widening gap between AI haves and have-nots, exacerbating global inequality. Smaller or less developed countries might find themselves dependent clients of one of the big AI powers, or cut off entirely from advanced AI services (with implications for their economic growth and security). In an extreme fragmentation, international scientific collaboration could suffer: shared global challenges like climate modeling or pandemic prediction, which benefit from pooled computing resources and data, might receive less collective compute if each bloc hoards capacity for its own priorities. There is also the danger of miscalculation and conflict. If each side views the others’ AI progress with suspicion, that could ignite a spiral of mistrust. For example, the use of AI in military command and control might make crises more unstable if adversaries fear an AI-enabled first strike or surveillance advantage. Some analysts warn of a new kind of compute arms race, where nations race to build ever-bigger training clusters to achieve AI breakthroughs or even Artificial General Intelligence first – an outcome that could be both wasteful and dangerous if not coordinated. We are already hearing rhetoric akin to an arms race: one country’s announcement of a billion-dollar AI lab prompts others to announce their own; chip export bans are met with retaliatory bans on critical minerals. The lessons of the 20th century suggest that such arms races are costly and fraught with peril.

On the other hand, the second scenario envisions an attempt at global coordination, or at least partial détente – essentially bringing compute into the realm of international governance. Influential voices have begun floating ideas for an international framework to manage the proliferation of powerful AI. Even the United Nations Secretary-General has backed the idea of an international AI watchdog agency “inspired by” the International Atomic Energy Agency, which oversees nuclear technology (reuters.com). The notion is that an “International AI Agency” could monitor and perhaps even ration the deployment of the most advanced AI models, track the global flow of AI-relevant chips, and ensure baseline safety standards (reuters.com). While that might sound far-fetched, it signals a recognition that completely uncontrolled competition may be untenable. Another idea is a global compute reserve or consortium, in which major powers collectively maintain a pool of computing resources for mutual benefit – for example, for global scientific challenges, or to backstop countries that need AI for critical needs (much as the IMF stabilizes economies with loans). Some AI experts, including at OpenAI, have suggested international oversight that could even include “compute monitoring” to detect whether any actor is concentrating enough compute to develop a superintelligent AI without others knowing (reuters.com). Embryonic steps are visible: the G7’s “Hiroshima AI process” is one forum where leaders discussed cooperative measures for AI governance, and successive summits (beginning with the UK’s Global AI Safety Summit) are tackling how to handle frontier models and their compute needs (reuters.com). There are also proposals for compute usage caps or reporting requirements – analogous to carbon emissions targets – under which nations would agree to limits on training runs above a certain size unless there is international notification (arxiv.org).
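
As an illustration of how such a reporting requirement might work mechanically, the sketch below estimates a training run’s compute with the common “roughly 6 × parameters × tokens” heuristic and checks it against a notification threshold; the 1e26 FLOP line mirrors the reporting threshold in the 2023 U.S. executive order on AI, while the model sizes are hypothetical.

```python
# Checking hypothetical training runs against a compute-reporting threshold.
THRESHOLD_FLOP = 1e26   # mirrors the 2023 U.S. executive order's reporting line

def training_flop(params: float, tokens: float) -> float:
    """Rough training-compute estimate (~6 FLOP per parameter per token)."""
    return 6 * params * tokens

# Illustrative model sizes, not figures for any real system.
runs = {
    "mid-size model": training_flop(params=70e9, tokens=2e12),
    "frontier model": training_flop(params=1e12, tokens=20e12),
}

for name, flop in runs.items():
    status = "report" if flop >= THRESHOLD_FLOP else "below threshold"
    print(f"{name}: {flop:.1e} FLOP -> {status}")
```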

For a true “AI Bretton Woods” moment – referencing the 1944 conference that established the post-war financial order – the major powers would need to find common ground despite strategic competition. This could take shape if a shared concern emerges that unchecked AI arms races pose an existential risk, for example if advanced AI could threaten humanity or massively destabilize economies. Under that pressure, even rivals might see wisdom in treaties to limit how far things go, much like the nuclear arms control treaties of the Cold War. A hypothetical agreement might involve verifiable compute inspections, just as arms treaties have inspections; large chip fabs and data centers could be monitored to ensure no one is covertly amassing a destabilizing level of AI capability beyond agreed limits (arxiv.org). Additionally, mechanisms to share the benefits of AI could be instituted – say, a global research network in which developing countries get access to a certain amount of cloud AI time for education and healthcare applications, subsidized by richer nations. This optimistic scenario would frame compute power not just as a national asset but as a global commons to be managed for the collective good, with guardrails to prevent misuse.

Of course, reality may unfold somewhere between these extremes. We might see regional cooperatives – for example, a trusted network among democracies that coordinate their compute resources, as hinted by U.S. partnerships with Europe, Japan, and others on chips and AI (brookings.edu; cnas.org) – while a separate bloc forms around China and aligned states. Within each, there could be rules of the road to prevent escalation (perhaps an AI version of a “no first use” pledge, or agreements not to automate nuclear launch decisions). Meanwhile, international bodies might at least set norms for responsible use of AI, as the OECD and UN are attempting, even if a full compute-sharing regime is out of reach. Another possibility is the rise of non-state compute actors – for instance, a few tech companies (or a coalition of them) voluntarily establishing an AI compute trust or cloud that any researcher can access for approved purposes. That could alleviate some inequities, though it would not resolve geopolitical rivalry.

At present, the momentum is arguably toward fragmentation; tech-nationalist impulses are strong, and trust between great powers is low. The U.S. has demonstrated its willingness to go to great lengths to retain compute superiority, even at the cost of fragmenting global trade – sanctioning not just Chinese firms but also third countries re-exporting chips to China (mecouncil.org). China, for its part, is charting a separate course with its own standards and is unlikely to subject its AI progress to external policing. But it is worth remembering that early in the nuclear era, similar skepticism surrounded global cooperation, yet treaties and institutions did emerge once the peril was undeniable. It may take a catalyzing event – perhaps a serious AI incident or near-miss that scares everyone – to push the world toward an AI equivalent of Bretton Woods or the Non-Proliferation Treaty. Should that happen, compute power might be governed by agreements that limit extreme concentration and encourage sharing for the sake of stability. In essence, humanity would recognize compute as the new oil and seek to avoid the resource curse and conflicts that oil brought by building an order around it.

In conclusion, compute power is quickly becoming the coin of the realm in global affairs. Its significance echoes that of oil in the 20th century, reshaping strategic calculations and spawning competition for control of supply lines – only now those supply lines are server racks and chip fabs rather than wells and pipelines. Nations that navigate this transition successfully will be those who secure the energy, hardware, and talent needed to harness AI, all while balancing the risks of over-concentration or conflict. Whether the future is one of every country for itself, or a more coordinated approach to managing this vital resource, will profoundly affect global prosperity and security. In the meantime, the race for digital infrastructure dominance is on, with echoes of past resource struggles guiding and warning us. The world is, in a very real sense, entering an era where compute is power – and all the opportunities and dilemmas that come with it.

Disclaimer: This think piece is provided for informational purposes by Sterling Asset Group’s Knowledge Center and does not constitute investment advice or an offer to buy or sell securities. The opinions expressed are those of the author and are subject to change. While the information is obtained from sources believed to be reliable and credible, Sterling Asset Group makes no guarantee of its accuracy or completeness. All forward-looking statements and scenarios are speculative and for analysis only. Neither Sterling Asset Group nor the author shall be liable for any actions taken based on the content of this publication. Readers should consult professional advisors before making any decisions based on the material presented.
