AI is hitting a wall — not in compute, not in algorithms, but in infrastructure.
Earlier this week we saw a major move to address this gap. At a White House press conference, Donald Trump stood alongside Larry Ellison, Masayoshi Son, and Sam Altman to announce Stargate, a new U.S.-based AI company investing at least $500 billion in AI infrastructure. The initiative will fund massive data centers and the next generation of AI compute, all built in the United States. The first phase? Ten colossal 500,000 sq. ft. data centers in Texas, with more locations already in the works.
This is more than an investment; it's a statement of intent: AI will be built at scale, at speed, and on American soil. The reality is nuanced, of course. Some of this is hype, and some of it will make a very tangible difference to our daily lives and work in the years ahead.
But is the classic Internet able to support AI’s exponential growth? And what happens when deregulation collides with monopolization concerns, national security debates, and the need for high-performance AI-native networks suitable for enterprise needs?
AI’s Infrastructure War: Speed vs. Control, Innovation vs. Regulation
Two opposing forces are shaping the future of AI.
On one side, we have unleashed innovation. With Trump also repealing Biden’s AI Executive Order, federal oversight on AI development in the US is disappearing. Companies can now move faster, scale quicker, and deploy AI models without mandatory government safety reviews. This could drive an unprecedented wave of AI breakthroughs — but it also means less accountability, fewer checks on bias, and more fragmentation in AI standards.
On the other side, regulatory pushback is intensifying. The FTC has launched investigations into Microsoft’s OpenAI partnership, Amazon’s AI dominance, and potential anti-competitive practices in AI infrastructure. At the same time, Biden’s AI Diffusion Rule — due to come into force later in 2025 — is facing fierce industry backlash, with NVIDIA, Oracle, and the Semiconductor Industry Association warning that it could weaken U.S. innovation rather than protect it.
Meanwhile, the Stargate AI initiative is racing ahead. With an initial $100 billion committed and already pouring into AI infrastructure, the question is no longer who builds the best AI models; it's who controls the infrastructure those models run on.
Interconnectivity: The Hidden Backbone of AI at Scale
There’s a misconception that AI’s biggest challenge is better models, better chips, more power, or bigger data centers. In reality, AI is only as fast as the networks that connect it. Today, that typically means an Internet that is past its sell-by date and no longer fit for current or future needs.
Right now, AI workloads require massive data flows across the globe, ultra-low latency, and high-bandwidth connectivity. Training state-of-the-art models like GPT-5 or Gemini-2 means moving petabytes of data across data centers, cloud providers, and edge computing nodes. But today’s infrastructure isn’t optimized for that.
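To see why moving petabytes is a networking problem and not just a storage problem, a back-of-envelope calculation helps. The figures below (link speed, utilization) are illustrative assumptions, not measured values from any specific deployment:

```python
def transfer_time_hours(data_petabytes: float, link_gbps: float,
                        utilization: float = 0.7) -> float:
    """Hours to move `data_petabytes` over a `link_gbps` link,
    assuming a sustained `utilization` fraction of line rate."""
    bits = data_petabytes * 1e15 * 8                 # PB -> bits
    effective_bps = link_gbps * 1e9 * utilization    # usable bits/sec
    return bits / effective_bps / 3600

# One petabyte over a 100 Gbps link at 70% sustained utilization:
print(f"{transfer_time_hours(1, 100):.1f} hours")    # -> 31.7 hours
```

Over a day to move a single petabyte on a dedicated 100 Gbps link; multi-petabyte training datasets shuffled between regions quickly turn the network, not the GPUs, into the pacing item.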
Without AI-native networking, AI models will bottleneck before they even reach deployment, and the need for these advanced networks will only become more critical.
The solution? A new Internet: globally available, high-performance interconnectivity platforms with advanced features for AI-optimized routing, such as Stelia's Hyperband. These networks will:
- Empower enterprises with simple, secure access to GPU compute and other AI resources.
- Prioritize AI data flows dynamically, ensuring real-time responses for mission-critical AI applications.
- Leverage edge computing to reduce latency, moving AI processing closer to where it’s needed.
- Automate network scaling with AI-driven traffic optimization, reducing downtime and improving efficiency.
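As a rough intuition for what "prioritizing AI data flows dynamically" means in practice, consider a scheduler that dequeues latency-sensitive traffic (real-time inference) ahead of bulk transfers (checkpoint replication). This is a minimal toy sketch; the class names and priority scheme are illustrative assumptions, not any real platform's API:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Flow:
    priority: int                      # lower value = more urgent
    name: str = field(compare=False)   # name excluded from ordering

class FlowScheduler:
    """Toy priority scheduler: always serves the most urgent flow next."""
    def __init__(self) -> None:
        self._queue: list[Flow] = []

    def submit(self, flow: Flow) -> None:
        heapq.heappush(self._queue, flow)

    def next_flow(self) -> str:
        return heapq.heappop(self._queue).name

sched = FlowScheduler()
sched.submit(Flow(priority=2, name="checkpoint-replication"))
sched.submit(Flow(priority=0, name="realtime-inference"))
sched.submit(Flow(priority=1, name="feature-store-sync"))

print(sched.next_flow())  # realtime-inference is served first
```

A production network would do this per-packet or per-flow in hardware, with priorities adjusted continuously by telemetry rather than fixed at submission time, but the principle is the same: mission-critical AI traffic must not queue behind bulk data movement.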
Without this AI-native network infrastructure, even a $500B investment in AI compute won’t be enough to sustain AI’s exponential growth.
The Big Questions for AI Leaders & Investors
So where does this leave AI builders, investors, and policymakers? Will AI deregulation unlock new breakthroughs — or lead to unintended consequences that slow adoption in the long run? Can AI-native networks scale fast enough to keep up with the speed of AI development, or will infrastructure be the real bottleneck? And as AI compute becomes the new gold rush, are we levelling the playing field — or handing too much power to a few dominant players?
AI’s next breakthroughs won’t happen in isolation — they will be decided by infrastructure, policy, and competition.