Get this — Qualcomm, the company best known for the chips that make your phones, laptops, and tablets feel smart, just decided to take on Nvidia — aka the company printing money from AI GPUs right now.

They’re rolling out two new AI chips — the AI200 (coming next year) and the AI250 (set for 2027).

And here’s the twist: they’re built on cellphone tech. Literally.

These chips draw from the same Hexagon neural processor tech that powers AI features in phones and laptops. Except now, Qualcomm's stacking up to 72 of those bad boys in a rack so they act like one massive AI computer, just like Nvidia's and AMD's rack-scale GPU systems.

So What Makes These Different?

Unlike Nvidia’s monster GPUs that train huge AI models, Qualcomm’s chips are designed for inference — aka the “doing” part of AI.  

Basically, instead of teaching the AI, these chips let already-trained models do their thing — faster, cheaper, and way more efficiently.
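To make that training-vs-inference split concrete, here's a toy sketch in plain Python (nothing to do with Qualcomm's actual stack): training is the expensive loop that repeatedly adjusts a model's weights, while inference just applies the already-learned weights in a single cheap pass.

```python
# Toy model: y = w * x. Training learns w; inference only uses it.

def train(data, lr=0.1, epochs=100):
    """Gradient-descent loop -- the expensive, compute-heavy phase."""
    w = 0.0
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # derivative of squared error w.r.t. w
            w -= lr * grad
    return w

def infer(w, x):
    """Inference -- one cheap forward pass with frozen weights."""
    return w * x

data = [(1, 2), (2, 4), (3, 6)]  # samples from y = 2x
w = train(data)                  # slow part: many passes over the data
print(round(infer(w, 5), 2))     # fast part: close to 10.0
```

Chips like the AI200 target only the second function: the model is already trained elsewhere, and the hardware's job is to run that forward pass as cheaply as possible, millions of times.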

The AI200 packs 768GB of RAM and is tuned for peak inference performance.

As for the AI250? Qualcomm says it will deliver a “generational leap in efficiency.” In plain terms: it could significantly reduce power consumption — which matters, especially with AI servers guzzling energy like there’s no tomorrow.

And here's a plot twist — Humain, an AI company backed by Saudi Arabia’s Public Investment Fund, already called dibs.

They’re planning to use these Qualcomm chips to build AI datacenters across Saudi Arabia.

So yeah, Qualcomm’s not just talking about entering the AI chip race — they’ve already got a partner and a pipeline.

The Big Picture:

If this works, it could seriously shake up the AI chip hierarchy.

Nvidia’s still king, sure — but Qualcomm’s playing the long game: smaller, cheaper, and way more efficient.

Because at the end of the day, if your AI can think just as fast using a chip made from phone parts, maybe raw power isn’t the future — maybe efficiency is.

Go look it up if that piques your curiosity. 
