On Jan. 20, Chinese artificial intelligence start-up DeepSeek released its first-generation reasoning models. In the release, the company made some astonishing claims.
First, DeepSeek said its DeepSeek-R1 model achieves performance comparable to OpenAI's o1, widely considered the best-performing model available across most domains. Considering the Chinese company is working with significantly less capable hardware than OpenAI and other American companies, that's certainly remarkable.
Even more impressive is that the company claims to have achieved these results at an incredibly low cost. R1 was built on DeepSeek's V3 large language model, released in December. The company estimates the compute cost for training V3 came to just $5.6 million. To put that in perspective, OpenAI's GPT-4 reportedly cost more than $100 million to train.
DeepSeek achieved similar performance at a fraction of the cost. And since its models are open source and its methods are publicly documented, allowing anyone to copy its techniques, the release will have lasting implications for the entire industry.
Two companies, in particular, are in a great position to benefit from DeepSeek’s innovations.
The long-term impact of DeepSeek-V3 and R1
DeepSeek focused on maximizing the efficiency of its limited hardware. Because of AI chip export restrictions, Nvidia isn't able to sell its most powerful H100 GPUs in China. Instead, it sells H800 GPUs, which are specifically designed to comply with U.S. regulations. The H800 has a lower chip-to-chip transfer rate, which slows the training of large AI models.
Because of those limitations, DeepSeek developed processes that reduce the amount of data it needs to move around the system at any given time. For example, its “mixture of experts” architecture, DeepSeekMoE, introduced last year, activates only a small portion of the model’s parameters for each query instead of running the entire model.
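To make the idea of sparse activation concrete, here is a minimal sketch of a mixture-of-experts layer with top-k routing, written in PyTorch. The expert count, layer sizes, and routing details are illustrative assumptions for this sketch, not DeepSeek's actual DeepSeekMoE design.

```python
# Minimal sketch of a sparsely activated mixture-of-experts layer.
# Illustrative only: the expert count, sizes, and top-k routing here are
# assumptions, not DeepSeek's actual DeepSeekMoE implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, dim=512, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Each "expert" is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        # The router scores every expert for each token.
        self.router = nn.Linear(dim, num_experts)

    def forward(self, x):                                    # x: (tokens, dim)
        scores = self.router(x)                              # (tokens, num_experts)
        weights, chosen = torch.topk(scores, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        # Only each token's top-k experts run; the rest stay idle, so most of
        # the model's parameters are never touched for any given query.
        for slot in range(self.top_k):
            for idx, expert in enumerate(self.experts):
                mask = chosen[:, slot] == idx
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

tokens = torch.randn(16, 512)
print(SparseMoELayer()(tokens).shape)  # torch.Size([16, 512])
```

The key point is that for any one token, only a couple of experts do any work, which is what keeps the compute, and the data shuffled between chips, per query low.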
DeepSeek isn’t the only company using mixture-of-experts models, but its approach also made training more efficient. Most sparse architectures accept extra training overhead in exchange for a lower cost of inference later on.
The start-up also developed methods to reduce the amount of memory required for AI inference by compressing important data before storing and transmitting it. And it brought new approaches to load balancing, which determines how work is distributed across its network of GPUs.
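As a rough illustration of why compressing cached data saves memory, the sketch below caches a small latent vector per token instead of full attention keys and values, and reconstructs them on demand. The dimensions and projection scheme are assumptions made for this example, not DeepSeek's exact implementation.

```python
# Rough sketch: cache a compressed latent instead of full keys and values,
# then reconstruct them when needed. Dimensions and the projection scheme
# are illustrative assumptions, not DeepSeek's exact design.
import torch
import torch.nn as nn

dim, latent_dim, seq_len = 4096, 512, 2048

down = nn.Linear(dim, latent_dim, bias=False)       # compress before caching
up_keys = nn.Linear(latent_dim, dim, bias=False)    # rebuild keys on demand
up_values = nn.Linear(latent_dim, dim, bias=False)  # rebuild values on demand

hidden = torch.randn(seq_len, dim)

naive_cache = torch.cat([hidden, hidden], dim=-1)   # storing full keys + values
latent_cache = down(hidden)                         # storing only the compressed latent

print(naive_cache.numel() / latent_cache.numel())   # 16.0 -- far less data to store and move
keys, values = up_keys(latent_cache), up_values(latent_cache)
```

Holding and moving less data is exactly what matters on bandwidth-limited hardware like the H800.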
The result of these and other breakthroughs isn’t just an AI model that’s faster and cheaper to train. The longer-term impact of DeepSeek’s innovations is that the model is cheaper to run and can run on less-capable hardware. In other words, AI inference just got a lot more accessible.
In a world where AI systems can potentially run on hardware that fits in your pocket, for a tiny fraction of a penny per query, there are two very big winners: Apple (NASDAQ: AAPL) and Meta Platforms (NASDAQ: META). Here’s why.
Making on-device AI a reality
When Apple started developing its artificial intelligence features for the iPhone and other devices, it put data privacy at the forefront of its efforts. Apple Intelligence is designed to run as much as possible on the iPhone. When it has to make a call to the cloud, it takes every step it can to encrypt user data in the process.
There’s a reason the new AI features Apple introduced last year are only available on iPhones released in the last 15 months. Since Apple is trying to keep everything on the device, it needs enough processing power and memory to run its AI. The newest iPhone chip, the A18 Pro, boosted the memory bandwidth to support faster AI processing.
Apple could adopt many of DeepSeek’s methods to make the iPhone more capable of handling AI inference. That opens the door for features like a more conversational and context-aware Siri, faster translation with no internet connection needed, smart camera features, and better productivity tools. More advanced AI features could boost Apple’s iPhone sales and services revenue.
Apple stock currently trades for a relatively high multiple of 32.5 times forward earnings. But with its massive cash flow, which it uses to buy back shares, and improving profitability from services revenue, it can justify that high multiple, especially considering the consistency Apple has exhibited in recent years. The potential boost from major improvements to on-device AI could be a catalyst for growth over the next few years.
Scaling AI to 3 billion people
Meta’s AI spending is growing fast as it works to scale its capabilities and expand AI features to more parts of its business. Capital expenditures grew about 40% in 2024, and management said it expects a 60% increase in 2025. Those AI investments have paid off well for Meta, resulting in stronger engagement, better advertising tools, and new features like Meta AI, which have the potential to become meaningful sources of revenue down the road.
One important decision Meta made when it came to AI was to open-source its AI model Llama. One of the motivations behind that decision was to let the broader community help make the model more efficient. In fact, DeepSeek distilled R1’s reasoning capabilities into smaller models built on Llama, so this is exactly the kind of outcome Meta hoped for.
Reducing the cost of AI inference could unlock huge profits for Meta. It’s a problem Meta’s been working on for a long time. “A lot of the stuff is expensive, right, to kind of generate an image or a video or a chat interaction,” CEO Mark Zuckerberg said during an earnings call in February 2023. “So one of the big interesting challenges here also is going to be how do we scale this and make this work more efficient so that way, we can bring it to a much larger user base.”
DeepSeek’s answering that challenge and giving Meta the tools it needs to scale AI to its 3 billion users. While Meta might not slow down its spending on AI anytime soon, it’s now capable of making a lot more money off the spending it’s committed to.
Meta stock has zoomed higher on the DeepSeek news, reaching a new all-time high. Still, shares trade for 26.8 times forward earnings estimates as of this writing. Meta’s also a cash cow, using excess free cash flow to buy back shares and support strong earnings-per-share growth. If it can make AI more profitable, it stands to see earnings climb substantially over the next few years, making it well worth the price.
Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board of directors. Adam Levy has positions in Apple and Meta Platforms. The Motley Fool has positions in and recommends Apple, Meta Platforms, and Nvidia. The Motley Fool has a disclosure policy.
The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.