
Alright, quick question.
When was the last time you read something online and thought, “This says a lot… but means absolutely nothing”?
Like, the words are technically English. The video technically exists. But your brain keeps screaming, “this is junk.”
Well, congrats. You’ve just encountered slop.
And as of this year, that’s not just a vibe. It’s official.
Merriam-Webster named “slop” the 2025 Word of the Year.
Which honestly feels less like an announcement and more like a cry for help.
So what exactly is “slop”?
According to Merriam-Webster, slop refers to low-quality digital content, usually mass-produced by AI.
Now, it's not always wrong. It’s not illegal. It’s just… empty.
They even compared it to slime, sludge, and muck. You know, stuff that oozes. Stuff you don’t want to touch. Stuff that somehow gets everywhere.
And honestly, if that doesn’t describe half your feed… congrats. You’re on the good side of the algorithm.
Why this word, this year?
Because 2025 was the year the internet hit a breaking point.
Greg Barlow, Merriam-Webster’s president, basically said AI content went from fascinating… to annoying… to kind of ridiculous.
And the numbers back it up.
A study earlier this year found that around 74% of new web content had some level of AI involvement.
And when we say web content, we mean everything: videos, podcasts, music, ads, books, even entire websites that exist purely to exist.
So yeah, welcome to the slop economy, where the playbook looks like this:
Generate content at massive scale
Spend almost zero human effort
Feed it to algorithms
Monetize attention, not quality
Platforms at this point don’t really care if the content’s good. They care if it gets clicks. So accuracy drops, originality drops, and genuinely useful information gets buried under a mountain of AI-written nothingness.
If you ask me, this is especially brutal in places where details actually matter, like finance, tech, and crypto, where one bad AI-generated explainer can lead to real people making really bad decisions.
And it’s not just Merriam-Webster noticing.
Other dictionaries saw the same chaos and crowned similar winners:
Oxford picked “rage bait” as its WOTY (word of the year) winner
Collins went with “vibe coding”
Australia’s Macquarie Dictionary straight-up chose “AI slop”
So you see: different terms, same message.
So… what happens next?
We’re already seeing pushback:
Tools that verify human-made content
Curated platforms that prioritize quality
Hybrid workflows where humans actually stay in the loop
And yes, possible rules that force AI disclosure
Because here’s the real takeaway:
AI isn’t the problem. Unchecked, low-effort, soulless AI output is.
In fact, critics have pointed out something even more uncomfortable.
As AI companies push paid tiers, the internet is quietly splitting in two.
On one side, people who can afford paywalled, higher-quality content. On the other, people stuck with a digital diet of slop, which, as you can imagine, is very light on actual informational value.
So if something feels off while you’re scrolling, trust that instinct.
