So, here’s the scoop: Anthropic—the brains behind Claude—woke up this week and decided your late-night “explain quantum physics like a pirate” chats are way too valuable to delete.

Starting September 28, every Claude user has a choice:

  1. Opt out and keep your convos private.

  2. Do nothing… and let your data help train future Claude models for the next five years.

And they’re not exactly shouting this from the rooftops. More like whispering it in a blog post buried halfway down their site. Cute, right?

Here’s What’s Changed:

  • Before: Anthropic deleted user data within 30 days unless flagged for policy violations, in which case retention could stretch up to two years.

  • Now: Claude Free, Pro, Max, and Claude Code users are all part of the training pool unless they opt out.

  • Who’s safe: Enterprise customers (Claude Gov, Work, Education, API users) remain exempt—same playbook as OpenAI’s enterprise shield.

Anthropic says this is all about user choice, safety, and making Claude smarter at coding, reasoning, and analysis. Which sounds lovely.

But in reality? This is about fuel. LLMs thrive on mountains of real-world conversations. Anthropic’s competing with OpenAI and Google, and tapping into millions of user chats is like striking gold.

But then they pulled off a classic sneaky UX move.

New users get a clear choice upfront.

Existing users? They’re hit with a splash screen: a giant “Accept” button… and a tiny training-permissions toggle that’s switched on by default. Click too fast, and boom—you’ve just handed over your entire Claude history.

Privacy experts are side-eyeing this hard, and the FTC has explicitly warned companies about dark patterns like this. But hey… who reads fine print, right?

Our Take:

No one’s shocked AI companies want your chats—your data is the secret sauce for building smarter models. But Anthropic framing this as a friendly “choice” while making it way too easy to say yes by accident? Yeah… that’s a trust-killer.

Also, it’s not the end of privacy, but it’s another nudge toward a world where “delete” means “archive forever.” If nothing else, take this as a sign to actually read those pop-ups before you smash that big black “ACCEPT” button.

Because in the AI age? The fine print is the plot twist.

Maybe go check out Anthropic’s blog post for all the nitty-gritty details on this quiet-but-huge shift.
