Technologist Mag
Tech News

Anthropic Will Use Claude Chats for Training Data. Here’s How to Opt Out

By technologistmag.com · 30 September 2025 · 2 Mins Read

If a user doesn’t opt out of model training, the updated training policy covers all new and revisited chats. That means Anthropic is not automatically training its next model on your entire chat history unless you go back into the archives and reignite an old thread. After that interaction, the old chat is reopened and fair game for future training.

The new privacy policy also arrives with an expansion of Anthropic’s data retention policies for those who don’t opt out. For those users, Anthropic increased the amount of time it holds onto user data from 30 days in most situations to a much more extensive five years. Users who opt out will still be covered by the 30-day policy.

Anthropic’s change in terms applies to consumer-tier users, free as well as paid. Commercial users, such as those licensed through government or educational plans, are not affected by the change, and conversations from those users will not be used in the company’s model training.

Claude is a favorite AI tool for some software developers who’ve latched onto its abilities as a coding assistant. Since the privacy policy update includes coding projects as well as chat logs, Anthropic could gather a sizable amount of coding information for training purposes with this switch.

Prior to this update, Claude was one of the only major chatbots that did not use conversations for LLM training automatically. By comparison, the default settings for both OpenAI’s ChatGPT and Google’s Gemini on personal accounts allow conversations to be used for model training unless the user chooses to opt out.

Check out WIRED’s full guide to AI training opt-outs for more services where you can request generative AI not be trained on user data. While choosing to opt out of data training is a boon for personal privacy, especially when dealing with chatbot conversations or other one-on-one interactions, it’s worth keeping in mind that anything you post publicly online, from social media posts to restaurant reviews, will likely be scraped by some startup as training material for its next giant AI model.
