The Best Chip Stock That's Not Nvidia: A Case for the "Boring" Winners

Why the smartest money in AI isn't chasing the next GPU disruptor: it's betting on the companies that win no matter which chipmaker comes out on top.
Everyone wants to find the next Nvidia. And I get it—the returns have been nothing short of spectacular. But here's the problem with that mindset: when you're hunting for "the next Nvidia," you're essentially betting you can outguess the market on which company will dethrone an entrenched incumbent with a massive software moat. That's a hard game to win.
What if there's a better question to ask?
Instead of "who beats Nvidia," ask yourself: "who wins regardless of whether Nvidia keeps winning?" That reframe changes everything. And when you start looking at the semiconductor landscape through that lens, a much clearer investment picture emerges.
The Real AI Trade: Picks and Shovels, Not Gold Nuggets
During the California Gold Rush, most prospectors went broke. You know who got rich? The people selling picks, shovels, and blue jeans. The same principle applies to the AI boom.
Nvidia makes the "gold nuggets"—the GPUs that power AI training. But there's an entire supply chain of companies making the equipment, providing the manufacturing, and supplying the specialized components that every AI system needs. These companies don't have to win a competitive war. They just have to show up and collect their share of a massively expanding market.
Let me walk you through the landscape.
TSMC: The Company Everyone Depends On
If I had to pick a single non-Nvidia semiconductor investment today, it would be Taiwan Semiconductor Manufacturing Company (TSM).
Here's why: TSMC doesn't compete with Nvidia. It manufactures Nvidia's chips. It also manufactures AMD's chips. And Apple's. And Qualcomm's. And Broadcom's custom AI accelerators. Every major fabless chipmaker on the planet depends on TSMC's cutting-edge foundries to turn their designs into actual silicon.
This is what I call "the multiplier effect." When Nvidia sells more GPUs, TSMC wins. When AMD gains market share, TSMC wins. When Google, Amazon, and Meta build custom AI chips (which they're all doing), TSMC wins. The company captures value from the entire AI ecosystem without having to pick winners.
The financial case is compelling too. TSMC trades at roughly 22x forward earnings—a meaningful discount to ASML, its closest infrastructure peer, which sits at around 26.5x. But here's what makes TSMC more interesting right now: projected EPS growth of nearly 40% in 2025, compared to ASML's 35%. And in 2026? TSMC is projecting 11-12% growth while ASML's growth essentially flatlines.
Why the difference? ASML sells the equipment foundries use to make chips. When foundries are in heavy expansion mode, ASML does great. But we're entering a phase where foundries are focused on utilizing the massive capacity they've already built. That's when the foundry itself—TSMC—benefits from running at full utilization, while equipment orders slow.
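If you want to sanity-check that valuation comparison yourself, here's a minimal sketch of a growth-adjusted (PEG-style) calculation using only the forward multiples and 2025 growth estimates quoted above. Treat the inputs as this article's snapshot, not live market data.

```python
# Illustrative only: growth-adjusted multiples using the figures cited in this article.
# Forward P/E and projected EPS growth are the article's estimates, not live data.

estimates = {
    "TSMC": {"forward_pe": 22.0, "eps_growth_2025": 0.40},
    "ASML": {"forward_pe": 26.5, "eps_growth_2025": 0.35},
}

for name, e in estimates.items():
    # Rough PEG-style ratio: forward P/E divided by growth rate (in percentage points).
    # Lower means you pay less per unit of expected earnings growth.
    peg = e["forward_pe"] / (e["eps_growth_2025"] * 100)
    print(f"{name}: forward P/E {e['forward_pe']}x, "
          f"2025 EPS growth {e['eps_growth_2025']:.0%}, PEG ~ {peg:.2f}")
```

On that crude yardstick, TSMC lands around 0.55 versus roughly 0.76 for ASML, consistent with the "discount plus faster growth" argument above.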
TSMC also has pricing power that's rare in semiconductors. The company has announced 5-10% price increases on its most advanced nodes starting in 2026. When you're the only company capable of manufacturing the world's most advanced chips at scale, you get to name your price.
The geopolitical risk? Yes, Taiwan's location matters. But consider this: TSMC is so systemically important that Apple, Nvidia, AMD, and Qualcomm all depend on it for their most critical products. That creates a kind of mutually assured defense. The company is also actively diversifying, building advanced fabs in the US and Japan. The risk is real but not unpriced—in fact, the "Taiwan discount" arguably makes TSMC undervalued relative to its fundamentals.
Broadcom: The Quiet Juggernaut
If TSMC is the best pure infrastructure play, Broadcom (AVGO) might be the best risk-adjusted play.
Broadcom has two engines. The first is its infrastructure software business, supercharged by the VMware acquisition. This segment generated $6.8 billion in Q3 2025 revenue with mind-boggling margins—93% gross and 77% operating. Software revenue is predictable, sticky, and recession-resistant. It's the ballast that keeps Broadcom stable when semiconductor cycles get choppy.
The second engine is custom AI silicon. Broadcom designs bespoke AI accelerators for hyperscale customers like Google and Meta. These aren't off-the-shelf chips competing head-to-head with Nvidia. They're custom designs optimized for specific workloads, built through multi-year engineering partnerships that create massive switching costs.
The company also dominates AI networking—the Ethernet switches that connect massive GPU clusters. Every time a hyperscaler builds out AI infrastructure, Broadcom's networking chips are there making sure all those expensive accelerators can talk to each other efficiently.
Q4 2025 AI semiconductor revenue is projected to hit $6.2 billion, up 66% year-over-year. The company has a consolidated backlog of $110 billion. That's not a typo—one hundred ten billion dollars in committed future revenue.
The beauty of Broadcom's position is that it's insulated from the Nvidia vs. AMD battle entirely. Its AI revenue comes from custom silicon and networking infrastructure, not competing GPUs. And if AI spending ever slows, the software business provides a safety net that pure-play semiconductor companies don't have.
AMD: The High-Beta Bet
Now, what about the companies actually trying to challenge Nvidia directly?
Advanced Micro Devices (AMD) is the primary contender. The company has been executing well, with its MI300 accelerators ramping faster than any product in AMD history. Major cloud providers like Oracle are deploying AMD's AI chips at scale. The roadmap is aggressive: MI350 launching soon, MI450 in late 2026, MI500 in 2027.
AMD is projecting a data center AI revenue CAGR exceeding 80% over the next 3-5 years. If they hit those numbers, the stock will do very well.
But here's the reality check: AMD's AI GPU market share is still under 10%. Nvidia's dominance isn't just about hardware—it's about CUDA, the software ecosystem that developers have built on for over a decade. Replicating that ecosystem is harder than building competitive silicon.
Historical context is instructive. AMD spent over five years—twenty consecutive quarters—grinding away at Intel's server CPU market share before reaching 20%. And that was in a market where the software ecosystem was far less entrenched than CUDA is today. Expecting faster progress in AI accelerators seems optimistic.
AMD is a high-beta play. If you have conviction in their execution and patience for a multi-year timeline, it could deliver outsized returns. But it requires near-flawless roadmap execution, successful software ecosystem development, and sustained hyperscaler willingness to diversify away from Nvidia. That's a lot of things that need to go right.
Intel: The Turnaround That Hasn't Turned Yet
Intel (INTC) is attempting an ambitious reinvention. The IDM 2.0 strategy aims to restore Intel's manufacturing leadership while building a world-class foundry business. The "five nodes in four years" roadmap would be an unprecedented feat.
The strategic logic is sound. Geopolitical concerns about semiconductor manufacturing concentration in Taiwan have governments eager to see alternative capacity in the US and Europe. Intel is positioning itself as the solution.
But execution has been challenging. The company is burning through capital at a staggering rate while trying to simultaneously catch up on manufacturing technology and build an external foundry business. The 18A process node is reportedly behind schedule. Intel still depends on TSMC for about 10% of its revenue—a dependency it's trying to eliminate while also competing with TSMC for foundry customers.
In the AI accelerator market, Intel has pivoted to a value positioning, pricing its Gaudi chips significantly below Nvidia's offerings. It's a pragmatic strategy, but it also limits the upside.
Intel is a speculative turnaround play. If the company executes its ambitious roadmap, there's significant value to unlock. But the execution risk is high enough that I wouldn't make it a core holding.
Micron: Riding the Memory Supercycle
The AI story isn't just about compute—it's also about memory. AI accelerators require High Bandwidth Memory (HBM) to function efficiently, and that's transformed the memory business.
Micron (MU) is projecting gross margins to expand from 41% in 2025 to 52% in 2026, driven by HBM demand. The data center segment now represents 56% of revenue. HBM3e supply is essentially sold out. The company is ramping capital expenditure to over $18 billion in fiscal 2026 to build out capacity.
This is a different Micron than the purely cyclical memory company of years past. AI is providing secular demand that smooths out the boom-bust cycles memory has historically experienced.
The risk is competition. Samsung is aggressively trying to capture HBM market share from the leader, SK Hynix. More competition could pressure pricing. But with demand growth likely to outpace any pricing pressure, and the richer HBM product mix expanding margins, Micron offers leveraged exposure to AI infrastructure growth.
The Bottom Line: Choose Your Own Adventure
Here's how I'd frame the decision:
TSMC is the highest-conviction pick for investors who want core infrastructure exposure. You're buying the indispensable manufacturer that wins regardless of which chipmaker leads the next generation. The valuation is reasonable, the growth is strong, and the moat is enormous.
Broadcom is ideal for investors who want AI exposure with downside protection. The software business provides margin stability that pure semiconductors can't match. The custom ASIC and networking businesses capture AI infrastructure spending without the competitive risks of the GPU market.
AMD is for investors with high risk tolerance and a multi-year horizon. If AMD executes its roadmap and builds out a competitive software ecosystem, the upside is substantial. But it requires patience and tolerance for volatility.
Micron offers leveraged exposure to AI memory demand. It's not a core infrastructure play like TSMC, but it captures a critical piece of the AI supply chain with an improving margin profile.
If I'm building a semiconductor allocation outside Nvidia, I'm overweighting TSMC as the foundation, adding Broadcom for stability and diversification, and sizing AMD appropriately for its higher risk profile.
The AI boom is real. The infrastructure buildout is happening. But you don't have to bet on who wins the accelerator wars to profit from it. Sometimes the smartest trade is buying the companies that collect the tolls while everyone else fights for the gold.
Disclaimer: This article is for informational purposes only and does not constitute financial advice. Always conduct your own due diligence before making investment decisions.