
Meta shatters silicon cycle: Four new AI chips by 2027

Meta says four new MTIA chip generations are on the way as it expands its in-house AI infrastructure.
(Image source: about.fb.com)

Meta has detailed an ambitious new roadmap for its in-house AI chips, saying it is developing and deploying four new generations of Meta Training and Inference Accelerator (MTIA) hardware within the next two years.

In a post, the company said the new MTIA lineup will support ranking, recommendations, and generative AI workloads, with custom silicon now sitting at the center of its broader AI infrastructure strategy.

MTIA 300 is already in production

The clearest near-term detail is that MTIA 300 is already in production. Meta said the chip will be used for ranking and recommendations training, giving MTIA a broader role than inference acceleration alone.

The company added that MTIA 400, 450, and 500 will be capable of handling all workloads, although Meta expects to use those later generations primarily for generative AI inference in the near term and into 2027.

That gives Meta a much faster public chip cadence than is typical in the AI space: four new generations rolling out over just two years, a pace the company itself described as quicker than standard chip cycles.

Meta wants its own chips to play a much bigger role

Meta first introduced MTIA in 2023 as a family of custom-built chips for AI workloads, but the company’s latest update makes clear that the project is no longer a side effort. Meta said it already deploys hundreds of thousands of MTIA chips for inference workloads across organic content and ads in its apps, and it argued that these chips are more compute-efficient and more cost-efficient than general-purpose silicon for the company’s intended uses.

The broader message is straightforward: Meta wants more control over the hardware running its AI systems. Instead of relying only on outside suppliers, the company is trying to build more of its own stack for the workloads that matter most to its platforms.

Independence from the silicon giants

While Meta recently signed a multi-billion-dollar deal for Nvidia’s latest GPUs, this roadmap is a clear signal that Menlo Park is tired of waiting on external supply chains. By moving its massive inference workloads, which account for the bulk of its AI costs, onto custom MTIA hardware, Meta is directly challenging the market dominance of third-party suppliers. It’s a shift from being a "buyer" to an "architect," giving the company control over its own infrastructure costs for the first time.

The next chips focus heavily on inference

Meta said its chip strategy prioritizes rapid iteration, an inference-first design approach, and easier adoption through industry-standard software and hardware ecosystems. In practice, that means MTIA 450 and 500 are being optimized first for generative AI inference, with the ability to support other workloads as needed, including ranking and recommendation training and inference as well as generative AI training.

That focus makes sense for a company operating services at Meta’s scale. Inference is where large AI products can become especially expensive, and Meta is clearly designing these later generations to better match that demand.

Faster development is part of the plan

Meta also said it has built its MTIA roadmap around shorter release cycles. While the industry often launches a new AI chip every one to two years, Meta said it now has the capacity to release new MTIA generations every six months or less by reusing modular designs. The company said this should help it adapt more quickly to changing AI techniques and reduce the cost of building and deploying new hardware.

Another practical advantage is compatibility with existing infrastructure. Meta said the modularity of its silicon allows the new chips to drop into current rack system infrastructure, which should help speed deployment.

AI infrastructure is now the battlefield

AI infrastructure is becoming one of the most important battlegrounds in tech, and Meta’s announcement shows how seriously it is treating chip design as part of that fight. Four new MTIA generations in two years is an aggressive roadmap on paper, but the bigger takeaway is that Meta no longer talks about custom silicon as an experiment. It is now presenting MTIA as a core part of how Facebook, Instagram, and its other platforms will handle ranking, recommendations, and generative AI workloads going forward.

Darryl Linington, 2026-03-13 (Update: 2026-03-13)