Point of View

DeepSeek upends AI’s power structure, threatens Silicon Valley dominance

“How can a small, smart team with a $10 million budget take on AI’s trillion-dollar giants?”

Just a year ago, an AI CEO confidently claimed: “It’s totally hopeless to compete with us on training foundation models.”

Enter DeepSeek from China.

With a mere $5.6 million—a fraction of what OpenAI, Google, and Meta spent—DeepSeek built a large language model (LLM) that rivals ChatGPT, Gemini, and Llama. Even more disruptive? It’s open source.

Wall Street panicked. The Nasdaq dropped 3%, and NVIDIA plunged 17%, its worst drop in years.

For enterprises, this signals a fundamental shift in AI’s competitive landscape. The cost barrier to developing advanced AI models has dropped dramatically, introducing new business opportunities—and risks. Should enterprises double down on existing AI investments or explore alternatives? Will open source become the new standard, or will proprietary models dominate the enterprise space?

These are no longer theoretical questions; they will define how enterprises adopt, scale, and compete with AI.

Models aren’t just commoditizing; they’re becoming infrastructure

The past few years have seen significant investment in creating LLMs, with upgrades constantly announced in the race to capture market share. This race, run primarily by corporations with trillion-dollar valuations and money to burn, was considered cost-prohibitive for others to compete in. DeepSeek changed the game overnight, even if that $5.6 million figure must be taken with a pinch of salt.

The developments vindicate those who believed models would become commoditized and applications would be at the center of the action. DeepSeek’s launch will likely accelerate releases planned by OpenAI, Google, and others, putting more capabilities at the fingertips of developers leveraging models to build industry-specific solutions. AI models are going the way of cloud computing—ubiquitous and essential but no longer the main differentiator. Just as enterprises today don’t obsess over AWS vs. Azure but focus on what they build on top, LLMs will be judged not by size but by their business impact.

Open source gains the edge, but it may not be a winner-take-all

The launch of DeepSeek, with its open-source architecture and low cost, has shifted the debate between closed and open-source models toward open source. Open sourcing makes the model affordable and gives companies a low-cost platform on which to innovate.

However, cost alone won’t dictate the winner. Tools and solutions that seamlessly interact with existing infrastructure are essential, especially given enterprise data security and privacy concerns. Many existing customers will likely continue scaling with Microsoft because its ecosystem integrates seamlessly, and enterprises are comfortable with it. Familiarity makes it the path of least resistance.

The future may not be one or the other, but both coexisting and competing for market share, as iOS and Android do in smartphones.

Demand for compute stays high, but AI’s energy game is changing

AI’s demand for compute is reshaping the energy sector. Data centers are projected to consume up to 12% of all US electricity, more than triple their 2023 levels, driven mainly by AI workloads. This has AI giants scrambling to secure energy sources, including betting on nuclear power—Google is buying 500 megawatts from Kairos, Amazon has invested $500 million in X-Energy, and Microsoft is backing a $1.6 billion reactor renovation at Three Mile Island.

But DeepSeek’s approach challenges the assumption that AI breakthroughs require ever-increasing energy consumption. It was trained on fewer chips and runs on a smaller parameter set, requiring less computation per inference—directly reducing power consumption. While DeepSeek may be the first, competitors will follow suit, optimizing models to reduce energy use without sacrificing performance.

For enterprises, this means opportunities to cut AI costs and improve sustainability. Running LLMs locally on optimized hardware rather than solely relying on cloud-based GPUs can reduce expenses and carbon footprints.

Still, AI isn’t getting any less compute-intensive overall—it’s just shifting how and where compute is used. More open models will fuel more AI experimentation, keeping demand for compute high. NVIDIA will likely continue to benefit but will face intensifying competition from hyperscalers such as Google, Microsoft, and Amazon, which are rapidly developing AI chips to control their compute destiny.

The geopolitical battle for AI dominance will be multi-polar

Technological developments have played a crucial role in shaping today’s world. Radar and codebreaking were pivotal to the Allies’ victory in World War II, and nuclear missile technology was the sword hanging over the world during the Cold War. Technological advancement has always shaped geopolitical power, and AI is the arena in which countries now vie for leadership. As we opined in March 2024, the need to lead in AI made semiconductors the next frontier of geopolitics.

To gain AI leadership, the US government under President Biden passed the CHIPS and Science Act, promising to invest nearly $53 billion to bring semiconductor supply chains back to the United States to “create jobs, support American innovation, and protect national security.” The $500 billion Stargate initiative announced by President Trump builds on the theme. It proposes constructing vast data centers to scale access and capability in AI to advance developments across healthcare, national defense, and enterprise productivity.

Silicon Valley appeared to lead the AI landscape, but DeepSeek’s arrival shows otherwise. Despite sanctions and restrictions, China has caught up in the race. DeepSeek and Alibaba’s Qwen show that US dominance in foundation models is no longer guaranteed. However, this shift also raises significant national security concerns.

With the cost of entry no longer a barrier, nation-states and rogue actors now have a path to build powerful AI models using far fewer resources. The genie is out of the bottle, and policymakers have yet to establish safeguards. AI governance will become as much a national security priority as AI development itself. Governments will either accelerate sovereign AI efforts or impose strict regulations to prevent AI misuse and protect their ecosystems.

The AI race is no longer just about technology; it’s about control. The next battleground won’t be model size or token count but who controls the infrastructure on which AI runs. While these developments challenge the US and Silicon Valley, AI is an innovation-driven industry, and competition will spur further advances.

The Bottom Line: Enterprises must adapt or fall behind.

Using open-source models to deploy AI will reduce deployment costs, sparking experimentation and innovation. Enterprises should shift from chasing the latest models to building AI applications that solve real business problems.

To succeed in this new AI landscape, enterprises must:

  • Address technical, cultural, and data debt that impedes effective AI integration
  • Prioritize interoperability and avoid lock-in to a single AI ecosystem
  • Stay agile amid geopolitical shifts—the AI race is as much about regulation and control as technology

It’s no longer about who builds the biggest AI model but who uses it to build the most innovative businesses.
