Can you tell AI-generated people from real ones?
March 6, 2024

If you recently had trouble figuring out whether an image of a person is real or generated by artificial intelligence (AI), you're not alone.

A new study from University of Waterloo researchers found that people had more difficulty than expected distinguishing real people from artificially generated ones.

The study provided 260 participants with 20 unlabeled pictures: 10 of real people obtained from Google searches, and 10 generated by Stable Diffusion or DALL-E, two commonly used AI image-generation programs.

Participants were asked to label each image as real or AI-generated and to explain their decision.
Only 61 per cent of participants could tell the difference between AI-generated people and real ones, far below the 85 per cent threshold the researchers expected.

"People are not as adept at making the distinction as they think they are," said Andreea Pocol, a PhD candidate in Computer Science at the University of Waterloo and the study's lead author.

Participants looked for details such as fingers, teeth, and eyes as possible indicators of AI-generated content, but their assessments weren't always correct.

Pocol noted that the study allowed participants to scrutinize photos at length, whereas most internet users look at images in passing.

"People who are just doomscrolling or don't have time won't pick up on these cues," Pocol said.

Pocol added that the extremely rapid pace of AI development makes it particularly difficult to assess the potential for malicious use of AI-generated images. Academic research and legislation often cannot keep up: AI-generated images have become even more realistic since the study began in late 2022.

AI-generated images are particularly threatening as a political and cultural tool, allowing any user to create fake images of public figures in embarrassing or compromising situations.

"Disinformation isn't new, but the tools of disinformation have been constantly shifting and evolving," Pocol said. "It may get to a point where people, no matter how trained they are, will still struggle to differentiate real images from fakes. That's why we need to develop tools to identify and counter this.
It's like a new AI arms race."

The study, "Seeing Is No Longer Believing: A Survey on the State of Deepfakes, AI-Generated Humans, and Other Nonveridical Media," appears in the journal Advances in Computer Graphics.

Source: https://www.sciencedaily.com/releases/2024/03/240306003456.htm