Daily Brief Apr 13, 2026 · 13 min read

Daily Brief: Nvidia bets on open chips as UK's AI infrastructure dreams stall

Today, 13.04.2026

Good morning, Human. The week closes with a study in contrasts: one company betting $400 million that the future of AI compute might not look like the present, while a government discovers that ambition without infrastructure is just a press release. The signals this week suggest Europe's AI sovereignty conversation is entering a new phase – one where the physics of energy and the economics of chips matter more than the rhetoric of strategy documents.

In Brief

What: Nvidia-backed SiFive has raised $400 million at a $3.65 billion valuation to bring open-source RISC-V chip architecture into AI data centers – a move that could reshape the compute layer beneath the AI stack.

Why it matters: The world's dominant GPU maker is hedging its bets by backing an alternative to the x86 and ARM architectures that currently power its empire, signaling that the AI infrastructure landscape may be more fluid than it appears.

What it means for Europe: As the continent struggles with energy costs, grid constraints, and regulatory uncertainty – all factors that just paused OpenAI's Stargate UK project – open chip architectures could offer a path to compute sovereignty that doesn't require building everything from scratch.

These questions – about infrastructure, sovereignty, and who actually controls the AI stack – won't be resolved in newsletters. They'll be debated in person at Human x AI Europe in Vienna on May 19, where the people shaping these decisions will be in the same room.

The Infrastructure Play: Nvidia's Open Architecture Bet

Here's something that doesn't happen often: the company that dominates a market investing in technology that could eventually compete with its own ecosystem. Yet that's precisely what Nvidia did this week, participating in SiFive's $400 million funding round alongside Atreides Management, Apollo Global Management, D1 Capital Partners, and T. Rowe Price.

SiFive, founded in 2015 by the UC Berkeley engineers who created the RISC-V open-source instruction set architecture, has been quietly building an alternative to the proprietary chip designs that underpin modern computing. Unlike Intel's x86 or ARM's architecture – the two dominant CPU families that currently feed Nvidia's GPU empire – RISC-V is open, customizable, and free from licensing fees. The company's business model mirrors ARM's original approach: licensing chip designs rather than manufacturing chips directly.

The strategic logic becomes clearer when you consider what SiFive is actually building. Its designs will integrate with Nvidia's CUDA software ecosystem and NVLink Fusion rack server architecture, allowing RISC-V-based CPUs to plug directly into what Nvidia calls its "AI factory." In other words, while Intel and AMD compete head-to-head with Nvidia's GPUs, Nvidia is backing a startup that could design the CPUs sitting alongside those GPUs – on an entirely different architectural foundation.

The valuation jump tells its own story. SiFive last raised in March 2022, bringing in $175 million at a pre-money valuation of $2.33 billion. Four years later, the company has increased its valuation by more than 50% while the broader tech funding environment has contracted. According to TechCrunch, the round was oversubscribed – a signal that investors see RISC-V's moment arriving.
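The arithmetic behind that jump is simple to check; a quick sketch, using only the two valuations reported above:

```python
# Valuation growth implied by SiFive's funding rounds (figures from the text).
prev_valuation_bn = 2.33   # pre-money valuation, March 2022 round ($bn)
new_valuation_bn = 3.65    # reported valuation for the new round ($bn)

increase_pct = (new_valuation_bn / prev_valuation_bn - 1) * 100
print(f"Valuation increase: {increase_pct:.0f}%")  # → Valuation increase: 57%
```

Roughly 57%, consistent with the "more than 50%" figure – notable in a funding environment that has otherwise contracted.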

For European policymakers watching the AI infrastructure race, this matters for reasons beyond Silicon Valley dealmaking. RISC-V has been gaining traction in European research and industrial applications, with the Barcelona Supercomputing Centre's recent Cinco Ranch TC1 chip demonstrating that European institutions can fabricate RISC-V silicon on advanced process nodes. The RISC-V Summit Europe, scheduled for June in Bologna, will bring together the ecosystem building on this foundation. If open architectures can deliver competitive performance for AI workloads, they offer a path to compute sovereignty that doesn't require replicating the entire proprietary stack.

The Policy Situation: Stargate UK's Pause and What It Reveals

The contrast with OpenAI's Stargate UK pause couldn't be sharper. While SiFive raises capital to build the future of AI compute, OpenAI has put its British infrastructure ambitions on hold, citing energy costs and regulatory uncertainty. The project, announced in September 2025 alongside President Trump's state visit to the UK, was meant to deploy up to 8,000 Nvidia GPUs at sites including Cobalt Park in North Tyneside – part of the government's designated AI Growth Zones.

The stated reasons are straightforward: UK industrial electricity prices rank among the highest in the world, running roughly four times higher than in the United States. For a data center drawing 100 megawatts, that differential isn't a line item – it's a structural constraint that compounds as capacity scales. Grid connection requests in the UK surged from 41 gigawatts in November 2024 to 125 gigawatts by June 2025, with an estimated 75 gigawatts attributable to data center projects. Buildings can be constructed in 18 to 24 months; grid connections take three to eight years.
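To see why a 4x price gap is structural rather than a line item, consider a rough annual cost sketch. The per-kWh US price below is an illustrative assumption; only the 4x ratio and the 100-megawatt draw come from the figures above.

```python
# Rough annual energy-cost comparison for a 100 MW data center.
# Assumed: ~$0.08/kWh US industrial rate (illustrative); the UK rate
# is taken as 4x that, per the ratio cited in the text.
draw_mw = 100
hours_per_year = 24 * 365
annual_kwh = draw_mw * 1_000 * hours_per_year  # 876 million kWh

us_price = 0.08              # $/kWh, assumed for illustration
uk_price = 4 * us_price      # $/kWh, applying the cited 4x ratio

gap_millions = annual_kwh * (uk_price - us_price) / 1e6
print(f"Annual cost gap: ~${gap_millions:.0f}M")  # → Annual cost gap: ~$210M
```

At these assumed rates, the UK premium alone approaches a quarter of a billion dollars per year for a single 100 MW site – and it scales linearly with every additional megawatt deployed.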

But the energy story is only half the picture. OpenAI also pointed to regulatory uncertainty, specifically around copyright and AI training. The UK government's March 2026 Report on Copyright and Artificial Intelligence abandoned its preferred approach of a broad text-and-data-mining exception with opt-out – the model that would have made it easier for AI companies to train on copyrighted content. Following backlash from creative industries, including high-profile opposition from figures like Paul McCartney and Elton John, the government now says it will "not introduce reforms to copyright law until we are confident that they will meet our objectives."

For AI companies making decade-long infrastructure investments, this uncertainty creates a material business risk. A UK data center isn't just a power facility – it creates legal jurisdiction. If the UK eventually adopts a copyright framework more restrictive than the US, operating infrastructure in Britain could expose companies to liability or compliance costs that don't apply elsewhere.

The government's response has been defensive. A spokesperson noted that the UK's AI sector has attracted more than £100 billion in private investment since the current administration took office. But as Politico reported, the pause represents "a major blow to the government's AI ambitions" – particularly given how prominently Stargate UK featured in the UK-US tech prosperity deal announced during Trump's visit.

Think Tank Watch: Ada Lovelace on AI in Career Guidance

While infrastructure debates dominate headlines, the Ada Lovelace Institute has released a report that deserves attention for what it reveals about AI deployment in public services. "Navigating the Future" examines how AI is being used in career guidance for young people – a domain that might seem peripheral but actually illuminates the broader challenges of responsible AI adoption.

The report, produced in collaboration with the Nuffield Foundation, finds that career development practitioners are experimenting with AI tools to deliver careers information, advice, and guidance, while young people are turning to tools like ChatGPT for career exploration. The landscape is characterized by uncertainty: practitioners don't know how different types of AI work, what data they're trained on, or how to ensure data privacy. They also struggle to advise students on how AI is transforming employment and recruitment.

The findings matter beyond career guidance because they illustrate a pattern playing out across public services. AI tools are being adopted faster than governance frameworks can keep pace. The report calls for "purpose-led" approaches, comprehensive workforce development, and greater oversight and evaluation – recommendations that apply equally to healthcare, education, and municipal services.

For European policymakers, the Ada Lovelace work reinforces a point that often gets lost in sovereignty debates: the challenge isn't just building AI infrastructure, it's ensuring that AI deployment serves public interest outcomes. The EU AI Act's August 2026 deadline for high-risk system compliance is approaching, and many organizations deploying AI in consequential contexts – including career guidance that affects young people's life trajectories – haven't yet developed the governance capacity the regulation will require.

The Numbers That Matter

$3.65 billion – SiFive's new valuation, up from $2.33 billion in 2022, reflecting growing confidence in RISC-V for AI workloads.

$400 million – The oversubscribed funding round, with Nvidia among the investors backing open chip architecture.

8,000 GPUs – The initial deployment OpenAI had planned for Stargate UK, now on indefinite hold.

4x – The approximate ratio of UK industrial electricity prices to US prices, a structural barrier to AI infrastructure investment.

125 GW – UK grid connection requests as of June 2025, up from 41 GW in November 2024, with data centers driving much of the surge.

11,520 – Responses to the UK government's copyright and AI consultation, with the majority reflecting creative industry concerns about AI training on copyrighted works.

6.5% – Projected European data center vacancy rate by end of 2026, down from below 10% in late 2024, as AI demand outpaces supply.

The Week Ahead

April 14-18: Watch for continued fallout from the Stargate UK pause, particularly any government response or revised infrastructure commitments.

April 17-24: RISC-V Summit Europe author notifications begin, signaling which technical advances will be showcased at the June conference in Bologna.

Ongoing: The UK government's Creative Content Exchange pilot platform is expected to launch by summer 2026, testing commercial licensing models for AI training data – a potential path forward on the copyright impasse.

August 2, 2026: EU AI Act transparency obligations take effect, requiring chatbots to disclose that users are interacting with AI, and AI-generated content to carry machine-readable markings. Organizations serving EU users should be finalizing compliance preparations.

The Thought That Lingers

There's something almost poetic about this week's juxtaposition. Nvidia, the company that built its empire on proprietary GPU dominance, is investing in open architecture. The UK, which positioned itself as an AI superpower, discovers that sovereignty requires more than strategy documents – it requires electrons and grid connections and legal clarity that takes years to build.

The deeper lesson may be that AI infrastructure is becoming too important to be left to any single architecture, any single company, or any single jurisdiction. The future likely belongs to those who can navigate multiple paths simultaneously – open and proprietary, sovereign and collaborative, ambitious and realistic about constraints. The question for Europe isn't whether to pursue AI sovereignty, but whether it can build the physical and institutional infrastructure fast enough to make that sovereignty meaningful.

Frequently Asked Questions

What is RISC-V and why does it matter for AI infrastructure?

RISC-V is an open-source instruction set architecture that allows companies to design custom chips without paying licensing fees to Intel or ARM. For AI infrastructure, this means organizations can create specialized processors optimized for their specific workloads while maintaining compatibility with existing software ecosystems like Nvidia's CUDA.

Why did OpenAI pause its UK Stargate project?

OpenAI cited two main factors: UK industrial electricity prices that are roughly four times higher than US rates, and regulatory uncertainty around copyright law for AI training. The combination creates both immediate cost pressures and long-term legal risks for large-scale AI infrastructure investments.

How does the EU AI Act affect organizations using AI in career guidance?

Starting August 2026, organizations using AI systems that significantly impact career decisions must comply with transparency requirements, including disclosing AI use and ensuring proper oversight. Many career guidance providers haven't yet developed the governance frameworks needed for compliance.

What are AI Growth Zones in the UK?

AI Growth Zones are designated areas where the UK government aims to accelerate AI infrastructure development through streamlined planning processes and targeted support. However, the Stargate UK pause shows that designation alone doesn't solve underlying challenges like energy costs and grid capacity.

How significant is Nvidia's investment in SiFive?

Nvidia's participation in SiFive's $400 million funding round represents a strategic hedge – the GPU giant is backing technology that could eventually compete with its current ecosystem. This suggests Nvidia sees the AI infrastructure landscape as more fluid than its current dominance might suggest.

Human×AI Daily Brief is compiled from TechCrunch, CNBC, BBC, Reuters, Politico, The Next Web, Ada Lovelace Institute, GOV.UK, Data Centre Review, and World Economic Forum. This is meant to be useful, not comprehensive.

Created by People. Powered by AI. Enabled by Cities.

One day to shape Europe's AI future

Early bird tickets available. Secure your place at the most important AI convergence event in Central Europe.