{"id":98755,"date":"2025-08-29T06:29:49","date_gmt":"2025-08-29T06:29:49","guid":{"rendered":"https:\/\/neclink.com\/index.php\/2025\/08\/29\/anthropic-users-face-a-new-choice-opt-out-or-share-your-chats-for-ai-training\/"},"modified":"2025-08-29T06:29:49","modified_gmt":"2025-08-29T06:29:49","slug":"anthropic-users-face-a-new-choice-opt-out-or-share-your-chats-for-ai-training","status":"publish","type":"post","link":"https:\/\/neclink.com\/index.php\/2025\/08\/29\/anthropic-users-face-a-new-choice-opt-out-or-share-your-chats-for-ai-training\/","title":{"rendered":"Anthropic users face a new choice \u2013 opt out or share your chats for AI training"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div>\n<p id=\"speakable-summary\" class=\"wp-block-paragraph\">Anthropic is making some big changes to how it handles user data, requiring all Claude users to decide by September 28 whether they want their conversations used to train AI models. While the company directed us to its <a href=\"https:\/\/www.anthropic.com\/news\/updates-to-our-consumer-terms\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">blog post<\/a> on the policy changes when asked about what prompted the move, we\u2019ve formed some theories of our own.<\/p>\n<p class=\"wp-block-paragraph\">But first, what\u2019s changing: Previously, Anthropic didn\u2019t use consumer chat data for model training. Now, the company wants to train its AI systems on user conversations and coding sessions, and it said it\u2019s extending data retention to five years for those who don\u2019t opt out.<\/p>\n<p class=\"wp-block-paragraph\">That is a massive update. 
Previously, users of Anthropic\u2019s consumer products were told that their prompts and conversation outputs would be automatically deleted from Anthropic\u2019s back end within 30 days \u201cunless legally or policy\u2011required to keep them longer\u201d or their input was flagged as violating its policies, in which case a user\u2019s inputs and outputs might be retained for up to two years.<\/p>\n<p class=\"wp-block-paragraph\">By consumer, we mean the new policies apply to Claude Free, Pro, and Max users, including those using Claude Code. Business customers using Claude Gov, Claude for Work, Claude for Education, or API access will be unaffected, mirroring how OpenAI shields its enterprise customers from data-training policies.<\/p>\n<p class=\"wp-block-paragraph\">So why is this happening? In that post about the update, Anthropic frames the changes around user choice, saying that by not opting out, users will \u201chelp us improve model safety, making our systems for detecting harmful content more accurate and less likely to flag harmless conversations.\u201d Users will \u201calso help future Claude models improve at skills like coding, analysis, and reasoning, ultimately leading to better models for all users.\u201d<\/p>\n<p class=\"wp-block-paragraph\">In short, help us help you. But the full truth is probably a little less selfless.<\/p>\n<p class=\"wp-block-paragraph\">Like every other large language model company, Anthropic needs data more than it needs people to have fuzzy feelings about its brand. 
Training AI models requires vast amounts of high-quality conversational data, and accessing millions of Claude interactions should provide exactly the kind of real-world content that can improve Anthropic\u2019s competitive positioning against rivals like OpenAI and Google.<\/p>\n<p class=\"wp-block-paragraph\">Beyond the competitive pressures of AI development, the changes would also seem to reflect broader industry shifts in data policies, as companies like Anthropic and OpenAI face increasing scrutiny over their data retention practices. 
OpenAI, for instance, is currently fighting a court order that forces the company to retain all consumer ChatGPT conversations indefinitely, including deleted chats, because of a lawsuit filed by The New York Times and other publishers.<\/p>\n<p class=\"wp-block-paragraph\">In June, OpenAI COO Brad Lightcap called this \u201ca <a href=\"https:\/\/openai.com\/index\/response-to-nyt-data-demands\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">sweeping and unnecessary demand<\/a>\u201d that \u201cfundamentally conflicts with the privacy commitments we have made to our users.\u201d The court order affects ChatGPT Free, Plus, Pro, and Team users, though enterprise customers and those with Zero Data Retention agreements are still protected.<\/p>\n<p class=\"wp-block-paragraph\">What\u2019s alarming is <a href=\"https:\/\/techcrunch.com\/2024\/10\/02\/meta-confirms-it-may-train-its-ai-on-any-image-you-ask-ray-ban-meta-ai-to-analyze\/\">how much confusion<\/a> all of these changing usage policies are creating for users, many of whom remain oblivious to them.<\/p>\n<p class=\"wp-block-paragraph\">In fairness, everything is moving quickly now, so as the tech changes, privacy policies are bound to change. But many of these changes are fairly sweeping and mentioned only fleetingly amid the companies\u2019 other news. 
(You wouldn\u2019t think Tuesday\u2019s policy changes for Anthropic users were very big news based on where the company placed this update on its press page.)<\/p>\n<figure class=\"wp-block-image alignfull size-large\"><img loading=\"lazy\" decoding=\"async\" height=\"417\" width=\"680\" src=\"https:\/\/techcrunch.com\/wp-content\/uploads\/2025\/08\/Claude-screenshot.png?w=680\" alt=\"\" class=\"wp-image-3040915\"\/><figcaption class=\"wp-element-caption\"><span class=\"wp-block-image__credits\"><strong>Image Credits:<\/strong> Anthropic<\/span><\/figcaption><\/figure>\n<p class=\"wp-block-paragraph\">But many users don\u2019t realize the guidelines to which they\u2019ve agreed have changed because the design practically guarantees it. Most ChatGPT users keep clicking on \u201cdelete\u201d toggles that aren\u2019t technically deleting anything. Meanwhile, Anthropic\u2019s implementation of its new policy follows a familiar pattern.<\/p>\n<p class=\"wp-block-paragraph\">How so? New users will choose their preference during signup, but existing users face a pop-up with \u201cUpdates to Consumer Terms and Policies\u201d in large text and a prominent black \u201cAccept\u201d button with a much tinier toggle switch for training permissions below in smaller print \u2014 and automatically set to \u201cOn.\u201d<\/p>\n<p class=\"wp-block-paragraph\">As observed <a href=\"https:\/\/www.theverge.com\/anthropic\/767507\/anthropic-user-data-consumers-ai-models-training-privacy\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">earlier<\/a> today by The Verge, the design raises concerns that users might quickly click \u201cAccept\u201d without noticing they\u2019re agreeing to data sharing.<\/p>\n<p class=\"wp-block-paragraph\">Meanwhile, the stakes for user awareness couldn\u2019t be higher. Privacy experts have long warned that the complexity surrounding AI makes meaningful user consent nearly unattainable. 
Under the Biden administration, the Federal Trade Commission even stepped in, <a rel=\"nofollow\" href=\"https:\/\/www.ftc.gov\/policy\/advocacy-research\/tech-at-ftc\/2024\/01\/ai-companies-uphold-your-privacy-confidentiality-commitments\">warning<\/a> that AI companies risk enforcement action if they engage in \u201csurreptitiously changing its terms of service or privacy policy, or burying a disclosure behind hyperlinks, in legalese, or in fine print.\u201d<\/p>\n<p class=\"wp-block-paragraph\">Whether the commission \u2014 now operating with just <a href=\"https:\/\/arstechnica.com\/tech-policy\/2025\/03\/trump-fires-both-ftc-democrats-in-challenge-to-supreme-court-precedent\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">three<\/a> of its five commissioners \u2014 still has its eye on these practices today is an open question, one we\u2019ve put directly to the FTC.<\/p>\n<\/div>\n<p><br \/>\n<br \/><a href=\"https:\/\/techcrunch.com\/2025\/08\/28\/anthropic-users-face-a-new-choice-opt-out-or-share-your-data-for-ai-training\/\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Anthropic is making some big changes to how it handles user data, requiring all Claude users to decide by September 28 whether they want 
their<\/p>\n","protected":false},"author":1,"featured_media":98756,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[149],"tags":[],"class_list":["post-98755","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-business"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts\/98755","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/comments?post=98755"}],"version-history":[{"count":0,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts\/98755\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/media\/98756"}],"wp:attachment":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/media?parent=98755"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/categories?post=98755"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/tags?post=98755"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}