Most investors chasing AI stocks today are buying the wrong thing. They’re snapping up anything with “artificial intelligence” in the pitch, piling into software companies and application makers that promise transformative products—while the real money flows to the companies building the pipes, the power, and the silicon that make AI possible at all. That’s the asymmetry most people miss, and it has cost them serious returns since the 2022 ChatGPT launch sparked the current mania.
The distinction isn’t subtle: build-out companies are the firms supplying the infrastructure that AI models actually run on—data centers, networking equipment, power generation, and the chips that process trillions of calculations. Product companies are those building applications on top of that infrastructure—ChatGPT itself, Copilot features, AI-powered sales tools, or any software claiming to use AI for end users. One category has defendable moats and proven revenue streams tied to undeniable demand. The other is crowded, competitive, and full of companies whose “AI advantage” might disappear in eighteen months.
I’ve spent years analyzing tech equities, and here’s what the investment community is slowly waking up to: the infrastructure layer of AI is systematically underappreciated by retail investors, while the application layer is systematically overhyped. This creates a clear opportunity—if you know where to look and what metrics actually matter.
The most straightforward way to profit from AI build-out is buying companies whose core business is infrastructure itself. These aren’t firms adding AI features to existing products; they’re selling the shovels in the gold rush.
NVIDIA remains the king here, and no amount of “it’s already too expensive” hand-wringing changes the fundamental demand reality. Their data center revenue—the segment that includes AI chips—reached approximately $47.5 billion in fiscal year 2024, up over 200% year-over-year. That’s not a product company dabbling in AI; that’s an infrastructure company whose entire trajectory is defined by AI demand. But here’s the nuance most analysts miss: NVIDIA is becoming less of a pure play every quarter. Their acquisition of Mellanox in networking, their software stack (CUDA, AI Enterprise), and their move into full data center systems mean they’re evolving from chip supplier to full-stack infrastructure player. That’s not necessarily a bearish signal—but it does mean you’re buying a more complex company than you were three years ago.
Broadcom is the other infrastructure name that deserves serious attention. Their custom ASIC business—specifically the accelerators they design for hyperscale customers like Google and Meta—generated approximately $12 billion in AI-related revenue in fiscal 2024, roughly tripling year over year. Unlike NVIDIA’s general-purpose GPUs, Broadcom’s custom silicon is deeply embedded in the largest AI deployments in the world. If you’re looking for a less obvious play that still benefits directly from AI infrastructure spend, Broadcom is it.
Takeaway: Start your screen with companies deriving the majority of revenue from data center products. If AI disappeared tomorrow, would their core business still exist? If the answer is yes, you’re probably looking at a product company, not a build-out play.
Here’s something the AI investment analysis community largely ignores: every ChatGPT query uses roughly ten times the electricity of a standard Google search. Training GPT-4 consumed electricity equivalent to roughly 1,700 American homes for a year. And we’re just at the beginning of model scaling.
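To see why this scales into an investable theme, it helps to run the back-of-envelope math. The sketch below uses purely illustrative inputs: the per-search wattage and the daily query volume are assumptions, with only the “roughly ten times” multiplier taken from the claim above.

```python
# Back-of-envelope sketch of AI query energy demand.
# All numeric inputs are illustrative assumptions, not measured data.

GOOGLE_SEARCH_WH = 0.3                 # assumed Wh per conventional search
AI_QUERY_WH = GOOGLE_SEARCH_WH * 10    # the "roughly ten times" multiplier
QUERIES_PER_DAY = 100_000_000          # hypothetical daily AI query volume

daily_kwh = AI_QUERY_WH * QUERIES_PER_DAY / 1000
annual_gwh = daily_kwh * 365 / 1_000_000

print(f"Per-query draw: {AI_QUERY_WH:.1f} Wh vs {GOOGLE_SEARCH_WH:.1f} Wh")
print(f"Annual load at {QUERIES_PER_DAY:,} queries/day: {annual_gwh:,.0f} GWh")
```

Even at these modest assumed volumes, the annual load lands in the hundreds-of-gigawatt-hours range, which is the kind of sustained, predictable demand utilities build around.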
This makes power infrastructure one of the most asymmetrically underappreciated AI investment themes. While everyone debates which application wins, the companies supplying electricity to data centers are quietly collecting revenue regardless of which AI company dominates.
Vistra Corp operates the largest competitive power generation fleet in the United States, with significant natural gas and nuclear assets positioned near major data center hubs in Texas, the PJM Interconnection region, and California. Their retail electricity business serves data center customers directly. In 2023, Vistra generated over $8 billion in revenue, and their growth trajectory is increasingly tied to data center demand for constant, reliable power.
NextEra Energy, the world’s largest utility by market cap, has been aggressively positioning for data center load growth across Florida and other high-demand regions. Their renewable energy footprint—particularly solar and battery storage—positions them to supply the additional gigawatts that AI data centers will require over the coming decade.
The honest limitation here: utilities are slow-growth, regulated businesses. You’re not going to see the returns you might get from a breakout chip company. But the resilience is real, and the AI demand tailwinds are increasingly undeniable. If you want to reduce portfolio volatility while capturing AI infrastructure growth, power companies are your best vehicle.
Takeaway: Don’t ignore the electricity bill. The companies generating and distributing power to AI data centers have business models that don’t require picking a winner in the application layer—they get paid either way.
This is where most retail investors get caught. A company announces “AI-powered features” and the stock rallies 15%. Six months later, the revenue impact is negligible, but the share price has already corrected.
The critical skill is learning to evaluate revenue attribution—what percentage of a company’s actual sales is genuinely driven by AI demand versus enhanced by AI marketing. This requires reading financial statements with a skeptical eye.
When Microsoft reports Azure revenue growth, they break out “AI services” as a contributor—but that category includes everything from the actual GPUs they rent out to minor API integrations. In Q2 2024, Microsoft reported Azure growth of approximately 29%, with AI services contributing about 8 percentage points of that growth. That’s meaningful, but it also means the majority of Azure’s growth is still coming from traditional cloud services—compute, storage, and software hosting. Microsoft is legitimately a build-out play because of their massive data center infrastructure, but their revenue is more diversified than the AI headline suggests.
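The decomposition is simple arithmetic, and doing it explicitly shows why the headline flatters the AI story. Using the approximate figures cited above (about 29 points of Azure growth, about 8 of them from AI services):

```python
# Decomposing reported cloud growth into AI and non-AI contributions,
# using the approximate publicly reported figures discussed above.
total_growth_pts = 29.0       # Azure year-over-year growth, percentage points
ai_contribution_pts = 8.0     # points attributed to AI services

ai_share_of_growth = ai_contribution_pts / total_growth_pts
traditional_pts = total_growth_pts - ai_contribution_pts

print(f"AI services drove {ai_share_of_growth:.0%} of Azure's growth")
print(f"Traditional cloud drove the other {traditional_pts:.0f} points")
```

Roughly a quarter of the growth is AI-attributed; the rest is conventional compute, storage, and hosting. That is the number to hold against the AI-centric narrative.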
Contrast this with C3.ai, an enterprise AI software company that has seen its revenue grow but at a pace that hasn’t justified the valuation extremes the stock has experienced. Their customer count and revenue have increased, but the competitive moat in enterprise AI software is notoriously thin, and many of their deals are pilot programs rather than permanent revenue streams.
The rule: if a company needs to explain why their AI revenue matters, it probably doesn’t matter yet. Companies with genuine AI infrastructure revenue don’t need to convince you—they can point to purchase orders, capacity sold out through next year, or earnings beats driven specifically by AI chip demand.
Takeaway: Strip out the AI marketing and look at what percentage of revenue is actually tied to AI-specific demand. If it’s under 20% and the stock is priced like it’s entirely an AI company, you’re looking at a product company in disguise.
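One way to make this takeaway mechanical is a simple screen. The function below is an illustrative sketch, not a trading rule: the 20% threshold comes from the takeaway above, while the 2x sector-premium cutoff and all the input figures are my assumptions.

```python
# Illustrative screen for "product companies in disguise": firms where
# AI-specific revenue is a small slice of the business but the market
# prices the stock like a pure AI play. All figures are hypothetical.

def ai_mispricing_flag(ai_revenue, total_revenue, ps_ratio, sector_median_ps):
    """Flag when AI revenue share is under 20% but the stock trades at
    more than twice its sector's typical price/sales multiple."""
    ai_share = ai_revenue / total_revenue
    premium = ps_ratio / sector_median_ps
    return ai_share < 0.20 and premium > 2.0

# Hypothetical company: 15% AI revenue share, 3.1x sector premium -> flagged
print(ai_mispricing_flag(ai_revenue=1.5, total_revenue=10.0,
                         ps_ratio=25.0, sector_median_ps=8.0))   # True

# Hypothetical company: 80% AI revenue share at the same premium -> not flagged
print(ai_mispricing_flag(ai_revenue=8.0, total_revenue=10.0,
                         ps_ratio=25.0, sector_median_ps=8.0))   # False
```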
The four cloud hyperscalers—Amazon, Microsoft, Google, and Oracle—all benefit massively from AI infrastructure spending. But their exposure varies significantly, and that matters for your allocation.
Amazon Web Services remains the dominant cloud platform, generating approximately $25 billion in quarterly revenue. Their AI play involves both providing GPU instances (rental time on Nvidia-powered servers) and building their own custom silicon (Trainium and Inferentia chips). However, Amazon’s massive e-commerce business means you’re also buying significant retail exposure. For a purer AI infrastructure play from the hyperscaler group, Microsoft is the stronger choice—Azure’s AI services are growing faster than AWS’s, and their partnership with OpenAI gives them embedded exposure to the most talked-about AI company in the world.
Google has positioned themselves uniquely through their TPU (Tensor Processing Unit) chips, which they use both internally for their own AI products and externally for cloud customers. This gives them a more vertically integrated infrastructure play than competitors who rely entirely on Nvidia hardware. Their 2024 revenue showed cloud growth accelerating to approximately 28% year-over-year, with AI services becoming an increasingly meaningful contributor.
Oracle is the most controversial of the group. Their cloud infrastructure business is much smaller than the big three, but their database customers are increasingly moving to Oracle Cloud. The question is whether they can scale fast enough to matter in the AI infrastructure race.
Takeaway: The hyperscalers are all good infrastructure plays, but Microsoft and Google offer the cleanest exposure to AI-specific growth. Amazon’s diversification is a strength if you’re worried about cloud market share shifts, but it dilutes your AI focus.
If you’ve followed AI infrastructure investing at all, you’ve heard about GPUs and data centers. But the networking equipment that moves data between servers is equally critical—and dramatically underappreciated in the investment community.
Training an AI model requires thousands of GPUs to communicate with each other constantly. The faster they can share data, the faster models train. This has made high-speed networking equipment—specifically Arista Networks, Marvell Technology, and Cisco—essential to the AI build-out.
Arista Networks has been the clear winner in AI networking. Their revenue has grown approximately 30% year-over-year in recent quarters, driven largely by hyperscale customers deploying massive AI clusters. Their top customers include Microsoft, Meta, and Google—all companies aggressively building AI infrastructure. Arista’s switches and networking software are specifically designed for AI workloads, which gives them a genuine competitive edge over more generalist networking vendors.
Marvell Technology plays a different game. Their custom silicon business—including chiplets and networking semiconductors—positions them as a supplier to the hyperscalers rather than a direct networking competitor. Their 2024 revenue showed particular strength in AI-specific semiconductor demand, and their positioning with major customers suggests this is sustainable.
One counterintuitive admission: I’ve been slower to appreciate networking than I should have been. I focused heavily on GPUs and data centers initially, assuming networking was a commoditized business. The reality is that AI networking requirements are genuinely different from traditional enterprise networking—the latency tolerances, the bandwidth demands, and the scale requirements create real moats for companies that have solved these problems. Arista is one of the most underappreciated AI stocks in the market today.
Takeaway: Don’t ignore the pipes. The companies building the networking infrastructure that connects AI systems are essential, less crowded, and trading at valuations that don’t fully reflect their AI exposure.
Training large language models requires enormous amounts of data, and that data has to be stored somewhere. But the storage requirements for AI are different from traditional enterprise storage—AI training datasets need to be accessible at extremely high speeds, which has driven demand for flash storage and specialized data lake infrastructure.
Dell Technologies and HPE have both positioned themselves as storage providers for AI infrastructure, though their exposure varies. Dell’s infrastructure solutions group, which includes storage, generated approximately $9 billion in recent quarterly revenue. Their partnership with Nvidia on AI infrastructure solutions gives them a credible position in the build-out chain.
Pure-play storage companies like NetApp have also seen AI-driven demand increases, particularly for their AFF storage systems optimized for AI workloads. NetApp’s pivot to cloud-connected storage and their partnerships with the hyperscalers have positioned them as a potential beneficiary of increased AI data management needs.
The honest assessment: storage is the least exciting AI infrastructure category, and the returns will be more modest than chips or networking. But for a diversified AI infrastructure portfolio, storage exposure provides a useful hedge—the demand is real regardless of which application companies win.
Takeaway: Include storage for diversification, but don’t overweight it. The growth rates aren’t as compelling as networking or chips, and the competitive moats are weaker.
Here’s a trend that will reshape the AI semiconductor landscape over the next three to five years: custom silicon is eating into the general-purpose GPU market. Every major hyperscale company is developing their own AI chips to reduce dependence on Nvidia and capture more of the value chain.
Google’s TPUs have been around for years. Amazon’s Trainium and Inferentia chips are in their second generation. Microsoft has introduced its own Maia AI accelerator. Meta has developed the MTIA (Meta Training and Inference Accelerator). The trend is clear: the biggest customers want their own silicon.
This matters for your investment analysis because it represents a potential headwind for Nvidia’s long-term dominance—and an opportunity for companies positioned to benefit from custom silicon design and manufacturing.
TSMC is the clear beneficiary of custom silicon proliferation. Regardless of which company designs the chip, TSMC manufactures it. Their advanced node capacity (particularly 5nm and 3nm processes) is essential for cutting-edge AI semiconductors. If the AI chip market grows to $150 billion or more by 2027 as some analysts project, TSMC captures a meaningful slice regardless of market share shifts among chip designers.
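The logic here is worth making explicit: if one foundry fabricates every designer’s advanced AI chips, its revenue depends on the total market size, not on which designer wins share. The toy model below illustrates that invariance; the foundry capture rate and the scenario shares are hypothetical numbers, not TSMC disclosures.

```python
# Toy model: foundry revenue is invariant to designer market-share shifts
# when the foundry fabricates for every designer. All numbers hypothetical.

def foundry_revenue(market_size_b, designer_shares, foundry_capture=0.30):
    """Foundry take, in $B, summed across chip designers."""
    return sum(market_size_b * share * foundry_capture
               for share in designer_shares.values())

market = 150.0  # assumed 2027 AI chip market, $B (the projection cited above)

scenario_a = {"incumbent_gpu": 0.80, "custom_silicon": 0.20}
scenario_b = {"incumbent_gpu": 0.40, "custom_silicon": 0.60}

print(foundry_revenue(market, scenario_a))
print(foundry_revenue(market, scenario_b))  # identical despite the share shift
```

In both scenarios the foundry collects the same revenue. The designers fight over margin; the manufacturer gets paid either way, which is the core of the TSMC thesis.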
Takeaway: The custom silicon trend doesn’t mean Nvidia is doomed—but it does mean their addressable market will face more competition. TSMC is the safer bet if you believe custom silicon adoption accelerates.
Let me be direct: most AI companies publicly traded today are product companies masquerading as infrastructure plays, and their valuations are disconnected from their actual earnings potential. This is where the most common investor mistakes happen.
Consider the AI application and software space. Companies like Salesforce, ServiceNow, and Adobe have all added AI features, and their stocks have rallied on AI enthusiasm. But here’s the uncomfortable truth: their core businesses weren’t broken by AI, and AI hasn’t fundamentally changed their competitive positions. Salesforce adding generative AI to their CRM doesn’t suddenly make them worth 30x revenue when their growth rate is high-single-digits.
The same applies to the numerous small AI companies that IPO’d in 2023-2024. Many have seen their stocks trade down 50-70% from peak valuations as reality set in, and the lesson was a hard one: AI enthusiasm doesn’t equal AI revenue, and AI revenue doesn’t equal sustainable profits.
One more honest admission: I’ve been wrong about the timeline for AI product company maturation. I expected enterprise adoption of AI tools to accelerate faster than it has. The integration challenges, the concerns about data security, and the difficulty of measuring ROI have slowed enterprise AI spending. This means the build-out phase—where infrastructure companies profit—will likely last longer than I initially expected.
Takeaway: If a company is selling AI products to end users rather than building the infrastructure those products run on, apply rigorous valuation standards. The hype cycle for product companies has consistently exceeded their fundamentals.
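“Rigorous valuation standards” can be quantified. One simple sanity check: with a flat share price, how many years of revenue growth does it take for a rich sales multiple to compress to a fair one? The inputs below are illustrative, chosen to match the 30x-revenue, high-single-digit-growth scenario discussed above, and the 8x “fair” multiple is my assumption.

```python
import math

# Sanity check: years of revenue growth needed for a sales multiple to
# compress to "fair" value with a flat share price. Inputs illustrative.

def years_to_grow_into(current_ps, fair_ps, growth_rate):
    """Solve (1 + g)^n = current_ps / fair_ps for n."""
    return math.log(current_ps / fair_ps) / math.log(1 + growth_rate)

# A stock at 30x sales, growing revenue 8%/yr, versus an assumed fair 8x:
print(f"{years_to_grow_into(30.0, 8.0, 0.08):.1f} years of dead money")
```

The answer is on the order of seventeen years, which is the arithmetic behind the claim that an AI feature announcement cannot justify that multiple at that growth rate.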
AI infrastructure is a global phenomenon, but your investment location matters. Companies with significant Chinese exposure face regulatory risks that could materialize quickly.
The October 2022 export controls on advanced AI semiconductors to China already significantly impacted companies like NVIDIA (who had to create modified chips for the Chinese market) and the broader semiconductor supply chain. Additional restrictions could further reshape the competitive landscape.
For now, the U.S. hyperscalers and chip companies derive most of their revenue from domestic and allied-nation customers. But as AI becomes more politically charged, regulatory exposure will increasingly matter. Companies with cleaner geographic exposure—focused on the U.S., Europe, and friendly Asian nations—carry less political risk than those with significant China revenue.
Takeaway: Factor regulatory risk into your AI infrastructure analysis. Companies with exposure to Chinese markets face headwinds that could intensify, while U.S.-centric plays have cleaner growth trajectories.
The most important thing you can do as an AI infrastructure investor is develop a framework for evaluation rather than just collecting stock picks. Markets shift, and today’s winner can become tomorrow’s laggard.
Ask these questions of any AI investment: Would the core business exist if AI demand vanished tomorrow? What percentage of revenue is genuinely tied to AI-specific demand, and can management point to purchase orders rather than press releases? Is the valuation defensible against the actual growth rate? And how much of the revenue base is exposed to Chinese regulatory risk?
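A rough way to operationalize such a framework is a scoring checklist. The sketch below encodes the criteria this article emphasizes (infrastructure over product, AI revenue attribution, valuation versus growth, China exposure); the field names and thresholds are my assumptions, not industry standards.

```python
# Illustrative checklist for evaluating an AI investment candidate.
# Thresholds and field names are assumptions drawn from the screens
# discussed in this article, not standardized metrics.

def evaluate_ai_play(company):
    checks = {
        "infrastructure, not product": company["survives_without_ai"] is False,
        "majority AI-driven revenue":  company["ai_revenue_share"] >= 0.50,
        "valuation vs growth sanity":  company["ps_ratio"] / max(company["revenue_growth"], 0.01) < 100,
        "limited China exposure":      company["china_revenue_share"] < 0.20,
    }
    return sum(checks.values()), checks

# Hypothetical infrastructure name (all figures invented for illustration)
chipmaker = {
    "survives_without_ai": False,   # core business IS the AI build-out
    "ai_revenue_share": 0.75,
    "ps_ratio": 25.0,
    "revenue_growth": 0.60,         # 60% revenue growth
    "china_revenue_share": 0.15,
}

passed, detail = evaluate_ai_play(chipmaker)
print(f"{passed}/4 checks passed")
```

A genuine build-out play should clear most of these; a product company dressed in AI marketing typically fails the first two immediately.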
The AI infrastructure build-out is a multi-year, possibly multi-decade theme. We’re in the early innings—the hyperscalers are still building capacity, the chip companies are still ramping production, and the power companies are just starting to respond to data center demand. The companies that navigate this build-out successfully will generate enormous wealth for patient investors.
But the key is specificity. Generic “AI stock” advice is worthless. Understanding which companies actually provide the infrastructure—the pipes, the power, the chips, the networking—that makes AI possible at scale, and buying those at reasonable valuations, is how you outperform.
The build-out continues. The opportunity remains. The difference between those who profit and those who get burned comes down to whether you’re buying the products everyone talks about, or the infrastructure everyone depends on.