The semiconductor equipment super cycle is coming! Applied Materials (AMAT.US) joins forces with two major memory chip giants, triggering a wave of upgrades and capacity expansion.

Applied Materials (AMAT.US), one of the world's largest semiconductor equipment manufacturers, together with leading memory chip maker Micron Technology (MU.US) and South Korea-based SK Hynix, announced on Tuesday Eastern Time that they have reached a significant collaboration to develop and build cutting-edge solutions and upgrade pathways for DRAM, high-bandwidth memory (HBM), and data center NAND storage systems. The goal is to comprehensively raise overall memory chip capacity and the performance of AI training and inference systems.

Applied Materials and the American memory giant Micron plan to leverage Applied Materials' Silicon Valley-based EPIC supercenter and Micron's innovation R&D center in Boise, Idaho, to further strengthen the U.S. domestic semiconductor innovation pipeline. Applied Materials stated that the two companies will focus on accelerating production technology for DRAM, HBM, and NAND, combining expertise from Applied Materials' EPIC center with Micron's innovations from Boise.

The collaboration between Applied Materials and South Korea’s SK Hynix will focus on improving advanced materials used in storage chips, integrating advanced process techniques, and developing 3D packaging technology for next-generation high-performance DRAM and HBM storage systems. These efforts will also be carried out at Applied Materials’ EPIC center.

That Applied Materials is jointly developing cutting-edge memory chips with two of the world's top three memory manufacturers (Micron and SK Hynix) while actively supporting capacity expansion highlights a broader point: amid the booming global AI infrastructure build-out and the macro "super cycle" in memory chips, semiconductor equipment companies are entering a super growth phase. They stand to benefit the most from the rapid expansion of AI chip capacity (including AI GPUs and AI ASICs) and DRAM/NAND memory capacity.

Recently, several Wall Street financial giants issued research reports stating that the semiconductor equipment sector is one of the biggest winners amid surging AI computing and storage demands. As tech giants like Microsoft, Google, and Meta lead the construction of massive AI data centers worldwide, chip manufacturers are accelerating production of advanced process AI chips below 3nm, as well as expanding CoWoS/3D packaging, DRAM, and NAND storage capacities. The long-term bullish case for semiconductor equipment is becoming increasingly solid.

A recent report from Bank of America argues that the global AI arms race is still in its "early to mid-stage"; Vanguard, one of the world's largest asset managers, recently estimated that the AI investment cycle has reached only about 30-40% of its peak. According to the latest analyst forecasts compiled by research institutions, Amazon, Google parent Alphabet, Facebook parent Meta Platforms, Oracle, and Microsoft are expected to spend around $650 billion on AI-related capital expenditures in 2026. Some analysts believe total spending could exceed $700 billion, implying a year-over-year increase of over 70% in AI capital investment.
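As a quick sanity check on those figures, the prior-year baseline implied by the forecast can be backed out from the stated growth rate. A rough sketch: the $700 billion total and the 70% growth rate are the article's figures; the baseline is derived, not reported.

```python
# Back out the implied prior-year baseline from the article's figures:
# ~$700B of AI capex in 2026 at >70% year-over-year growth.
capex_2026 = 700e9          # projected 2026 AI capital expenditure, USD
yoy_growth = 0.70           # stated year-over-year increase

implied_prior_year = capex_2026 / (1 + yoy_growth)
print(f"Implied prior-year baseline: ${implied_prior_year / 1e9:.0f}B")  # ≈ $412B
```

A baseline a little above $400 billion is consistent with the scale of hyperscaler capex widely reported for 2025, which lends the 70% figure some plausibility.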

In a recent statement, Micron Chairman, President, and CEO Sanjay Mehrotra said: "High-performance memory and storage are key drivers of AI technology development, and continuous innovation in these areas is crucial to unlocking AI's full potential. For decades, Micron has collaborated with Applied Materials to provide materials engineering innovations for new memory and storage devices. We are pleased to expand this partnership to Applied Materials' new EPIC center in Silicon Valley. Coupled with Micron's R&D and manufacturing hubs in the U.S., this collaboration creates a unique innovation pipeline from lab to final wafer fabrication, advancing U.S. memory technology innovation."

This deep collaboration also includes joint R&D on 3D advanced packaging technology to deliver high-bandwidth, low-power, comprehensive memory and storage solutions for demanding AI workloads. Both companies also noted that Applied Materials' new $5 billion EPIC supercenter is one of the largest investments in cutting-edge semiconductor equipment R&D in the U.S.

Scott DeBoer, Micron’s EVP and Chief Technology and Product Officer, said: “Our collaboration with Applied Materials at the EPIC center goes beyond next-generation advanced process nodes—it aims to drive disruptive advancements in devices, materials, and processes, enabling future memory components, storage architectures, and technologies with higher performance and energy efficiency to meet the needs of large-scale customers.”

“Google AI compute chain” and “NVIDIA GPU chain” rely heavily on storage chips

With the full-scale outbreak of the US-Iran conflict spreading across the Middle East and igniting a new geopolitical storm for the global economy, investors' risk appetite has declined sharply amid soaring oil and gas prices. Concern that runaway energy prices could push the fragile global economic recovery into stagflation has recently battered global equity, bond, and cryptocurrency markets.

However, Bank of America analysts recently reported that supply chain surveys and industry tracking indicate the global storage industry, centered on memory chips, remains in a "super cycle." The impact of the Middle East conflict on the storage industry, its key supply chains, and fund managers' bullish outlook on the sector is close to zero, especially since core semiconductor equipment comes mainly from the U.S. and Europe and is typically shipped by air, avoiding the Strait of Hormuz.

Whether it is Google's massive TPU compute clusters or NVIDIA's vast AI GPU clusters, both depend on HBM systems fully integrated with the AI chips. Beyond HBM, Google and OpenAI are rapidly building or expanding AI data centers, requiring large-scale procurement of server-grade DDR5 memory and high-performance enterprise SSD/HDD storage. Unlike Seagate and Western Digital, which focus on dominating high-capacity nearline HDDs, or SanDisk, which focuses on high-performance eSSDs, the three major memory manufacturers (Samsung Electronics, SK Hynix, and Micron) operate across several core memory and storage fields at once: HBM, server DRAM (including DDR5/LPDDR5X), and high-end enterprise SSDs (eSSDs). They are the most immediate beneficiaries within the "AI memory + storage stack," collectively capturing the "super dividend" of AI infrastructure.

BNP Paribas recently issued a research report predicting that DRAM contract prices will surge 90% in Q1 2026 compared to the previous quarter, while NAND flash contract prices are expected to rise a significant 55%, continuing the upward trend that began in late 2025. The bank's analysts also set a 12-month target price of $500 for Micron, which closed Tuesday at $403.11, up 3.54%.

BNP Paribas's outlook on storage price increases is echoed by TrendForce, which raised its Q1 2026 forecast for conventional DRAM contract prices from a 55-60% quarter-over-quarter increase to 90-95%, and also revised NAND flash contract prices upward to a 55-60% QoQ rise. The surge is driven by soaring demand from North American cloud providers for enterprise SSDs, whose prices are expected to rise another 53-58% in the first quarter. These figures underscore that memory chips are becoming as critical as NVIDIA's AI chips in the AI super wave, and are among the first to experience supply-demand imbalances and pricing-power shifts.
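To make those quarter-over-quarter ranges concrete for a buyer, the forecast increases can be applied to a notional contract price. A minimal sketch: the percentage ranges are TrendForce's figures from the article, while the $100 baseline is purely illustrative.

```python
# Apply TrendForce's revised Q1 2026 QoQ contract-price forecasts
# to a notional $100 baseline for each product category.
forecasts = {
    "conventional DRAM": (0.90, 0.95),  # 90-95% QoQ increase
    "NAND flash":        (0.55, 0.60),  # 55-60% QoQ increase
    "enterprise SSD":    (0.53, 0.58),  # 53-58% QoQ increase
}

base = 100.0  # illustrative starting contract price, USD
for product, (low, high) in forecasts.items():
    print(f"{product}: ${base * (1 + low):.0f} to ${base * (1 + high):.0f}")
```

In other words, a buyer paying $100 for conventional DRAM in Q4 2025 would face roughly $190 to $195 one quarter later under this forecast, a near-doubling within a single contract cycle.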

AI compute and storage chip demand is exploding! Semiconductor equipment enters a super cycle

The unprecedented AI infrastructure wave and storage super cycle have pushed semiconductors into a new phase characterized by material intensiveness, process control complexity, and advanced packaging: the combination of 3D structures and new materials on the logic side, stacking and interconnect upgrades for HBM on the storage side, and system-level performance improvements via CoWoS/hybrid bonding on the packaging side. These forces collectively increase the value density of key processes like deposition, etching, CMP, advanced packaging, and metrology, transforming semiconductor equipment demand from cyclical to structurally expansive.

Particularly notable is the shift from "bump bonding" to "hybrid bonding," which shortens interconnects, increases I/O density, and reduces power consumption, directly addressing the bandwidth, latency, and power constraints of AI training and inference. Applied Materials has detailed on its website the performance and power advantages of hybrid bonding over conventional bump-based interconnects, and has launched scalable hybrid bonding platforms. Through its investment in Besi (BE Semiconductor Industries, a leading hybrid-bonding equipment maker), Applied Materials is strengthening its position in "process-equipment synergy."

Currently, global AI compute infrastructure and enterprise storage chip demand continue to grow exponentially, far outpacing supply. This is evident from TSMC’s (TSM.US) recent stellar earnings and capital expenditure guidance, as well as the strong performance and outlook of Applied Materials and Lam Research, the leading semiconductor equipment suppliers.

Breaking down semiconductor equipment capabilities, Applied Materials' core strengths lie in "materials engineering, cutting-edge process integration, and advanced packaging." Its key role in storage expansion involves not just traditional thin-film deposition but also etch, CMP, metrology, hybrid bonding, and 3D advanced packaging systems for HBM/DRAM/NAND. The company explicitly emphasizes HBM as a strategic focus, noting that performance gains come not only from advanced-process DRAM dies but also from 3D packaging and interconnect technologies. Its hybrid bonding solutions are already applicable to NAND and are viewed by major memory manufacturers as a key path to further stacking of DRAM and HBM. In essence, Applied Materials functions as an integrated platform combining materials, processes, and packaging for mass production.

In chip manufacturing, Applied Materials is ubiquitous. Unlike ASML, which focuses solely on lithography, or Lam Research, which specializes in etching, cleaning, patterning, and critical thin-film processes (especially the high-aspect-ratio (HAR) etch and deposition steps needed for advanced HBM), Applied Materials' high-end equipment plays a vital role in nearly every step of chip fabrication, including ALD, CVD, PVD, RTP, CMP, wafer etching, and ion implantation.

In its latest technical overview, Applied Materials states that HBM manufacturing adds about 19 additional materials engineering steps compared to traditional DRAM, with its most advanced equipment covering approximately 75% of these steps. It also announced a hybrid bonding system aimed at advanced packaging and storage chip stacking. These are expected to be long-term growth vectors for the company, with new nodes like GAA (Gate-All-Around) and BPD (Backside Power Delivery) serving as key growth drivers.
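Taken at face value, those two figures imply how many of the incremental HBM steps Applied Materials' tools address. A back-of-envelope sketch; both inputs are the article's numbers, and the step count is approximate to begin with.

```python
# HBM manufacturing adds ~19 materials engineering steps over traditional
# DRAM; Applied Materials says its equipment covers ~75% of them.
additional_steps = 19
coverage = 0.75

covered = additional_steps * coverage
print(f"~{covered:.0f} of {additional_steps} incremental HBM steps covered")
```

That works out to roughly 14 of the 19 incremental steps, which is why the company frames HBM as a per-wafer content gain rather than just a volume story.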
