Artificial Intelligence Boom Continues, But Nvidia Faces Risks

Even though there is no sign of a slowdown in the artificial intelligence (AI) boom, AI stocks have fallen sharply in 2025 amid the Trump administration's tariff threats and policies. That uncertainty has increased market volatility and dented investor confidence.

Over the last two years, AI stocks have experienced a substantial surge, leading to high valuations as they entered this year. However, fears of an impending economic recession have raised doubts about the ambitious AI investments that major tech companies had projected earlier this year, causing a reevaluation of their strategies.

Fortunately, on Wednesday, April 9, the administration eased the most extreme reciprocal tariffs, which was a relief for many in the tech industry. Additionally, several high-profile tech CEOs reaffirmed their substantial plans for AI investments this year, indicating that the demand for AI technologies remains exceptionally robust, despite the surrounding challenges.

However, these optimistic statements from tech leaders come with cautionary notes about potential risks for the largest AI company, Nvidia (NVDA), which has been a key player in the AI landscape.

Understanding the Resilience of AI Amid Market Uncertainty

The positive news is that, despite the significant market turmoil, the AI revolution continues to thrive. This week, two CEOs from the “Magnificent Seven” tech companies confirmed the ongoing momentum of AI, even as broader markets faced declines. Their insights are crucial for understanding the future trajectory of AI investments.

One notable instance was Alphabet (GOOG, GOOGL), which hosted its Google Cloud Next 2025 event last week. The event may have gone unnoticed by many investors, but it featured several groundbreaking announcements, particularly around Google's new Gemini 2.5 model. CEO Sundar Pichai emphasized that Alphabet remains committed to investing a staggering $75 billion in AI data centers this year, reflecting confidence in the sector's potential for growth.

Moreover, it wasn’t just Alphabet that shared positive news. Executives from Google highlighted their AI offerings and the enthusiasm of their cloud customers. Notably, Intuit announced that it is “doubling down” on AI initiatives, while another significant client, Verizon, reported substantial advantages from utilizing Google’s AI models, showcasing the practical benefits of these technologies.

At the same time, the challenges facing AI stocks are not due solely to tariffs. The arrival of China's low-cost DeepSeek R1 model in January also contributed to the turbulence. Nevertheless, Amazon (AMZN) CEO Andy Jassy addressed these cost concerns during a CNBC interview, providing reassurance to investors and stakeholders:

"People often misunderstand the dynamics at play. We observed this with AWS [Amazon Web Services] as well; customers greatly appreciate when the cost per unit decreases, as it allows them to save money. However, this doesn't translate to reduced overall spending. Instead, it empowers them to innovate further, leading to increased expenditures in total."

In his recent letter to shareholders, Jassy elaborated on the transformative nature of generative AI, stating, “Generative AI is poised to revolutionize nearly every customer experience we currently understand and create entirely new experiences we have only dreamed of.” He also noted that Amazon is witnessing triple-digit growth rates in its AI revenue streams, underscoring the sector’s vitality.

Despite the macroeconomic concerns, industry insiders remain steadfast in their belief that generative AI will reshape various sectors of the economy, and they are eager to participate in this transformation. This belief suggests that AI spending is likely to maintain its strength, irrespective of broader economic conditions.

Major Cloud Providers Commit to Reducing AI Costs, Directly Challenging Nvidia

While the AI revolution appears resilient, the landscape is shifting, particularly concerning AI-related costs. These dynamics could pose additional challenges, especially if a recession materializes, impacting profitability across the sector.

This shift could incrementally complicate matters for Nvidia. Until now, Nvidia has been synonymous with AI development, with strong demand for its latest Blackwell chip indicating a healthy appetite for its offerings.

However, both Amazon and Google have made it clear that they are heavily focused on reducing AI costs. Jassy particularly emphasized the necessity of lowering AI expenses, stating:

AI does not need to be as costly as it currently is, and it won’t be in the future. The primary issue lies with the chips. Most AI systems developed so far have relied on a single chip provider, which drives up costs considerably.

It is evident that Jassy is referring to Nvidia, whose chips sell for roughly $30,000 to $40,000 each. That pricing explains why large-scale AI deployments, hundred-thousand-GPU or even million-GPU clusters, can reach staggering costs. With Nvidia posting gross margins of 75% or higher, there is an argument that the company may be over-earning, provided competitors can develop cost-effective alternatives.

While Nvidia’s technology and its CUDA software lead the market at present, the well-capitalized cloud giants are determined to disrupt this status quo. Jassy mentioned that Amazon’s current Trainium2 chip generation offers 30%-40% better price performance compared to Nvidia’s existing models, likely referring to the H100. As Nvidia works on scaling its new Blackwell chip, Amazon is concurrently developing the next iteration, Trainium3.

In his interview, Jassy remarked, “If you were to join our meetings with the AWS team right now, you would see their commitment to making AI costs substantially lower than today’s levels.” This determination reflects Amazon’s strategic focus on enhancing affordability in AI.

Similarly, during the Alphabet event, executives introduced an impressive new in-house chip known as Ironwood. Ironwood represents the company’s seventh generation of tensor processing units (TPUs), which are essential for Alphabet’s internal AI operations.

This cutting-edge chip is designed to run in 256-chip servers or in massive 9,216-chip clusters. Google intends to use Ironwood not only for its Gemini models but also for customers who wish to train their own AI models on Google's infrastructure. Each Ironwood chip has six times the memory capacity of the previous TPU generation and delivers peak compute of 4,614 teraflops, a tenfold increase over the fifth-generation TPU and a fivefold increase over the sixth generation, setting a new benchmark in the industry.

The Competitive Landscape: Nvidia Faces New Challenges

Indeed, Nvidia enjoys a multi-year head start in the AI chip market, and its CUDA software provides a competitive edge, at least for the time being. However, Amazon and Google are formidable players capable of producing chips at roughly wafer cost, a significant threat to Nvidia's gross margins of 75% or more.

This disparity implies that a chip bought from Nvidia costs roughly four times what an equivalent in-house chip would cost a cloud provider to produce. As both companies strive to cut AI costs and reduce their reliance on Nvidia, Nvidia's revenue growth could decelerate, or its profit margins could compress. How far depends on whether Amazon, Google, and the other cloud giants can successfully design and deploy their own silicon while keeping it easy for AI developers to use.
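The rough "four times" figure follows directly from the gross-margin arithmetic. A minimal sketch, using an assumed $32,000 per-chip price picked from the $30,000-$40,000 range quoted above (not a reported cost figure):

```python
# Illustrative gross-margin arithmetic with assumed figures.
# At a 75% gross margin, the selling price is 4x the cost of goods sold.
gross_margin = 0.75
sale_price = 32_000  # hypothetical per-chip price from the $30k-$40k range

cost_of_goods = sale_price * (1 - gross_margin)  # approximate production cost
markup = sale_price / cost_of_goods              # price paid vs. wafer cost

print(f"cost of goods: ${cost_of_goods:,.0f}")   # cost of goods: $8,000
print(f"markup: {markup:.0f}x")                  # markup: 4x
```

Under these assumptions, a cloud provider building comparable silicon at wafer cost would pay about a quarter of Nvidia's price, which is exactly the gap Amazon and Google are targeting.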
