{"id":98617,"date":"2025-08-26T05:41:33","date_gmt":"2025-08-26T05:41:33","guid":{"rendered":"https:\/\/neclink.com\/index.php\/2025\/08\/26\/us-attorneys-general-tell-ai-companies-they-will-be-held-accountable-for-child-safety-failures\/"},"modified":"2025-08-26T05:41:33","modified_gmt":"2025-08-26T05:41:33","slug":"us-attorneys-general-tell-ai-companies-they-will-be-held-accountable-for-child-safety-failures","status":"publish","type":"post","link":"https:\/\/neclink.com\/index.php\/2025\/08\/26\/us-attorneys-general-tell-ai-companies-they-will-be-held-accountable-for-child-safety-failures\/","title":{"rendered":"US Attorneys General tell AI companies they &#8216;will be held accountable&#8217; for child safety failures"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div data-article-body=\"true\">\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">The US Attorneys General of 44 jurisdictions have signed a <a data-i13n=\"cpos:1;pos:1\" href=\"https:\/\/mcusercontent.com\/cc1fad182b6d6f8b1e352e206\/files\/89a33a31-1401-7e41-24e4-e40e1595f481\/AI_Chatbot_FINAL_44_.pdf\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:letter;cpos:1;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">letter<\/a> [PDF] addressed to the Chief Executive Officers of multiple AI companies, <a data-i13n=\"cpos:2;pos:1\" href=\"https:\/\/www.azag.gov\/press-release\/attorney-general-mayes-joins-44-states-demanding-tech-companies-end-predatory-ai\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:urging them;cpos:2;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">urging them<\/a> to protect children &#8220;from exploitation by predatory artificial intelligence products.&#8221; In the letter, the AGs singled out Meta and said its policies &#8220;provide an instructive opportunity to candidly convey [their] concerns.&#8221; Specifically, they mentioned a recent report by <em>Reuters<\/em>, which 
revealed that Meta <a data-i13n=\"cpos:3;pos:1\" href=\"https:\/\/www.engadget.com\/ai\/an-internal-meta-ai-document-said-chatbots-could-have-sensual-conversations-with-children-191101296.html\" data-ylk=\"slk:allowed its AI chatbots;cpos:3;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">allowed its AI chatbots<\/a> to &#8220;flirt and engage in romantic roleplay with children.&#8221; <em>Reuters<\/em> got its information from an internal Meta document containing guidelines for its bots.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">They also pointed out a previous <em>Wall Street Journal<\/em> investigation wherein Meta&#8217;s AI chatbots, even those using the voices of celebrities like Kristen Bell, were <a data-i13n=\"cpos:4;pos:1\" href=\"https:\/\/www.engadget.com\/ai\/metas-ai-chatbots-were-reportedly-able-to-engage-in-sexual-conversations-with-minors-193726679.html\" data-ylk=\"slk:caught having sexual roleplay;cpos:4;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">caught having sexual roleplay<\/a> conversations with accounts labeled as underage. The AGs briefly mentioned a <a data-i13n=\"cpos:5;pos:1\" href=\"https:\/\/edition.cnn.com\/2024\/10\/30\/tech\/teen-suicide-character-ai-lawsuit\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:lawsuit against Google and Character.ai;cpos:5;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">lawsuit against Google and Character.ai<\/a>, as well, accusing the latter&#8217;s chatbot of persuading the plaintiff&#8217;s child to commit suicide. 
Another lawsuit they mentioned, also against Character.ai, was filed after a chatbot allegedly told a teenager that it&#8217;s <a data-i13n=\"cpos:6;pos:1\" href=\"https:\/\/www.bbc.com\/news\/articles\/cd605e48q1vo\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:okay to kill their parents;cpos:6;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">okay to kill their parents<\/a> after they limited the teen&#8217;s screen time.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">&#8220;You are well aware that interactive technology has a particularly intense impact on developing brains,&#8221; the Attorneys General wrote in their letter. &#8220;Your immediate access to data about user interactions makes you the most immediate line of defense to mitigate harm to kids. And, as the entities benefitting from children\u2019s engagement with your products, you have a legal obligation to them as consumers.&#8221; The group specifically addressed the letter to Anthropic, Apple, Chai AI, Character Technologies Inc., Google, Luka Inc., Meta, Microsoft, Nomi AI, OpenAI, Perplexity AI, Replika and XAi.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">They ended their letter by warning the companies that they &#8220;will be held accountable&#8221; for their decisions. 
Social networks have caused significant harm to children, they said, in part because &#8220;government watchdogs did not do their job fast enough.&#8221; But now, the AGs said they are paying attention, and companies &#8220;will answer&#8221; if they &#8220;knowingly harm kids.&#8221;<\/p>\n<\/div>\n<p><br \/>\n<br \/><a href=\"https:\/\/www.engadget.com\/ai\/us-attorneys-general-tell-ai-companies-they-will-be-held-accountable-for-child-safety-failures-035213253.html?src=rss\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The US Attorneys General of 44 jurisdictions have signed a letter [PDF] addressed to the Chief Executive Officers of multiple AI companies, urging them to<\/p>\n","protected":false},"author":1,"featured_media":98618,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[157],"tags":[],"class_list":["post-98617","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-gadget"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts\/98617","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/comments?post=98617"}],"version-history":[{"count":0,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts\/98617\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/media\/98618"}],"wp:attachment":[{"href":"https:\/\/neclink.
com\/index.php\/wp-json\/wp\/v2\/media?parent=98617"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/categories?post=98617"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/tags?post=98617"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}