
TICKERS: GOOGL, CRDO, META, NVDA

The #1 Threat to the AI Stock Surge
Contributed Opinion

View Important Disclosures for this Article

Source: Stephen McBride of RiskHedge shares what he believes to be the biggest threat to AI.

What is the biggest threat to the artificial intelligence (AI)-fueled stock market rally?

It's not an economic downturn, excessive spending by tech giants, or geopolitical tensions with China. The biggest risk is failing to invest in the right AI stocks.

The AI megatrend keeps evolving, uncovering new bottlenecks and crowning new winners.

Initially, Nvidia Corp. (NVDA:NASDAQ) reigned supreme . . . then utilities providing energy to AI data centers experienced a surge.

Earlier this year, Disruption_X members doubled their capital when we identified a networking constraint. Credo Technology Group Holding Ltd. (CRDO:NASDAQ) is among the few firms resolving high-speed data transmission within data centers. We secured a 118% return as the market abruptly recognized that AI's scalability hinged on enhanced connectivity.

Now the spotlight is shifting from training to inference.

The inaugural wave of AI was dominated by training — instructing immense models like ChatGPT to comprehend the world. This necessitated sprawling data centers equipped with tens of thousands of GPUs operating for weeks on end.

Nvidia dominated this wave as the sole company with GPUs capable of satisfying AI's prodigious compute requirements.

We deftly navigated that wave, with Disruption Investor members pocketing +500% NVDA gains. But presently, we've entered the inference era.

Inference refers to AI actually accomplishing tasks.

When you ask an AI to answer a question, write an email, summarize a document, plan a vacation, or solve a math problem, all of that is inference.

Training occurs once. Inference transpires trillions of times daily. Alphabet Inc. Class A (GOOGL:NASDAQ) alone now processes 1.3 quadrillion AI "tokens" monthly!

Thus far, nearly all inference still unfolds inside colossal data centers rivaling small cities in scale. Increased AI utilization translates to more GPUs, which is why AI behemoths like Meta Platforms Inc. (META:NASDAQ) and OpenAI are constructing data centers the size of Manhattan.

Introducing "edge AI."

We're transitioning from a world where all AI models reside and operate in gargantuan data centers to one where you can run top-tier models on your laptop. Research from premier AI analysis firm Epoch AI reveals that with a single off-the-shelf Nvidia chip, you can run models on your laptop that would have been the world's most sophisticated a mere six to 12 months ago.

Algorithmic advancements are rendering AI models so efficient and economical that soon you'll be able to run "GPT-5" on your phone!

I realize that's a bit technical. But it represents one of the biggest risks and opportunities in AI today.

Rather than every interaction with your beloved AI assistant being routed to some immense data center, your devices will address most queries.

Your phone will soon boast a built-in personal AI. That's edge AI. And only when the local AI encounters a challenge will it "phone home" to the massive data center model for assistance.

This transition from large, centralized inference to billions of miniature inference engines constitutes the most significant shift in the AI landscape since ChatGPT's debut. Not because people cease using AI, but because usage stops correlating 1:1 with data center expenditure. It fundamentally reshapes who profits in the next wave. Instead of the straightforward "more GPUs, bigger data center" mantra. . . 

Memory emerges as the big winner from edge AI.

Most investors fixate on the GPUs powering AI "brains." But beneath the surface, AI relies on two markedly different kinds of chips:

Logic chips, like Nvidia's GPUs, handle the cognition.

Memory chips store and supply the data those GPUs require.

Every AI task — querying ChatGPT, generating an image, summarizing a document — boils down to two things occurring at breakneck speed: a logic chip executing immense mathematical calculations, and that chip perpetually retrieving data from memory, then transmitting results back.

A helpful analogy is a kitchen. The GPU is the chef. Memory is the pantry. If the pantry is situated across the building, the chef squanders most of his time traversing back and forth for ingredients. But if the pantry resides adjacent to the stove, cooking accelerates dramatically.

That's precisely the dilemma AI confronts today.

Compute power has skyrocketed. Memory access hasn't kept pace.

According to SK Hynix, over 90% of the time an AI model takes to respond is spent shuttling data between logic and memory, not performing computation. That's the so-called "memory wall."

GPUs continue to accelerate. But memory bandwidth and proximity haven't matched that progress. The consequence is that $50,000 AI chips sit idle, awaiting data.

As models expand and endeavor to "remember" lengthier conversations, images, and context, the quantity of memory they'll need in close proximity to the compute will explode. And that's why memory represents the next trillion-dollar upheaval in AI spending.

In October, we added our inaugural memory stock to our Disruption Investor portfolio.

We're already up 33% on it. For comparison, Nvidia is down 1% over the same period, and the S&P 500 has been flat. That illustrates just how crucial it is to invest in the right segment of the AI market. The most substantial gains stem from positioning ahead of the next bottleneck — not after it garners headlines.

If you want to stay ahead of where the AI puck is headed, make sure you’re getting The Jolt. It’s my free weekly letter where I break down the biggest market shifts and show you where disruption is creating new investment opportunities — before they hit the front page. Join here.


Important Disclosures:

  1. Stephen McBride: I, or members of my immediate household or family, own securities of: None. My company has a financial relationship with: None. My company has purchased stocks mentioned in this article for my management clients: None. I determined which companies would be included in this article based on my research and understanding of the sector.
  2. Statements and opinions expressed are the opinions of the author and not of Streetwise Reports, Street Smart, or their officers. The author is wholly responsible for the accuracy of the statements. Streetwise Reports was not paid by the author to publish or syndicate this article. Streetwise Reports requires contributing authors to disclose any shareholdings in, or economic relationships with, companies that they write about. Any disclosures from the author can be found  below. Streetwise Reports relies upon the authors to accurately provide this information and Streetwise Reports has no means of verifying its accuracy. 
  3.  This article does not constitute investment advice and is not a solicitation for any investment. Streetwise Reports does not render general or specific investment advice and the information on Streetwise Reports should not be considered a recommendation to buy or sell any security. Each reader is encouraged to consult with his or her personal financial adviser and perform their own comprehensive investment research. By opening this page, each reader accepts and agrees to Streetwise Reports' terms of use and full legal disclaimer. Streetwise Reports does not endorse or recommend the business, products, services or securities of any company. 

For additional disclosures, please click here.
