{"id":78975,"date":"2024-03-26T19:59:19","date_gmt":"2024-03-26T19:59:19","guid":{"rendered":"https:\/\/neclink.com\/index.php\/2024\/03\/26\/this-camera-captures-156-3-trillion-frames-per-second\/"},"modified":"2024-03-26T19:59:19","modified_gmt":"2024-03-26T19:59:19","slug":"this-camera-captures-156-3-trillion-frames-per-second","status":"publish","type":"post","link":"https:\/\/neclink.com\/index.php\/2024\/03\/26\/this-camera-captures-156-3-trillion-frames-per-second\/","title":{"rendered":"This camera captures 156.3 trillion frames per second"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div>\n<p>Scientists have created a blazing-fast scientific camera that shoots images at an encoding rate of 156.3 terahertz (THz) to individual pixels \u2014 equivalent to 156.3 trillion frames per second. Dubbed SCARF (swept-coded aperture real-time femtophotography), the research-grade camera could lead to breakthroughs in fields studying micro-events that come and go too quickly for today\u2019s most expensive scientific sensors.<\/p>\n<p>SCARF has successfully captured ultrafast events like absorption in a semiconductor and the demagnetization of a metal alloy. The research could open new frontiers in areas as diverse as shock wave mechanics or developing more effective medicine.<\/p>\n<p>Leading the research team was <a data-i13n=\"elm:context_link;elmt:doNotAffiliate;cpos:1;pos:1\" class=\"link \" href=\"https:\/\/inrs.ca\/en\/research\/professors\/jinyang-liang\/\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:Professor Jinyang Liang;elm:context_link;elmt:doNotAffiliate;cpos:1;pos:1;itc:0;sec:content-canvas\">Professor Jinyang Liang<\/a> of Canada\u2019s Institut national de la recherche scientifique (INRS). He\u2019s a globally recognized pioneer in ultrafast photography who built on his breakthroughs from a separate study six years ago. 
The current research was <a data-i13n=\"elm:context_link;elmt:doNotAffiliate;cpos:2;pos:1\" class=\"link \" href=\"https:\/\/www.nature.com\/articles\/s41467-024-45820-z\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:published;elm:context_link;elmt:doNotAffiliate;cpos:2;pos:1;itc:0;sec:content-canvas\">published<\/a> in <em>Nature<\/em>, <a data-i13n=\"elm:context_link;elmt:doNotAffiliate;cpos:3;pos:1\" class=\"link \" href=\"https:\/\/inrs.ca\/en\/news\/pushing-back-the-limits-of-optical-imaging-by-processing-trillions-of-frames-per-second\/\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:summarized;elm:context_link;elmt:doNotAffiliate;cpos:3;pos:1;itc:0;sec:content-canvas\">summarized<\/a> in a press release from INRS, and first <a data-i13n=\"elm:context_link;elmt:doNotAffiliate;cpos:4;pos:1\" class=\"link \" href=\"https:\/\/www.sciencedaily.com\/releases\/2024\/03\/240325135711.htm\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:reported;elm:context_link;elmt:doNotAffiliate;cpos:4;pos:1;itc:0;sec:content-canvas\">reported<\/a> on by <em>Science Daily<\/em>.<\/p>\n<p>Professor Liang and his colleagues framed their research as a fresh take on ultrafast cameras. Typically, these systems use a sequential approach: capture frames one at a time, then piece them together to observe an object in motion. But that approach has limitations. 
\u201cFor example, phenomena such as femtosecond laser ablation, shock-wave interaction with living cells, and optical chaos cannot be studied this way,\u201d Liang said.<\/p>\n<figure class=\"caas-figure\">\n<div class=\"caas-figure-with-pb\" style=\"max-height: 283px\">\n<div>\n<div class=\"caas-img-container caas-img-loader\" style=\"padding-bottom:29%\"><img decoding=\"async\" class=\"caas-img caas-lazy has-preview\" alt=\"Components of a research-grade camera spread in a row on a scientific table.\" src=\"https:\/\/s.yimg.com\/ny\/api\/res\/1.2\/pVNDArGBX7LuoeTK3NhR9w--\/YXBwaWQ9aGlnaGxhbmRlcjt3PTk2MDtoPTI4Mw--\/https:\/\/s.yimg.com\/os\/creatr-uploaded-images\/2024-03\/96b53710-eb9f-11ee-beec-23c95ee5fafa\"\/><noscript><img decoding=\"async\" alt=\"Components of a research-grade camera spread in a row on a scientific table.\" src=\"https:\/\/s.yimg.com\/ny\/api\/res\/1.2\/pVNDArGBX7LuoeTK3NhR9w--\/YXBwaWQ9aGlnaGxhbmRlcjt3PTk2MDtoPTI4Mw--\/https:\/\/s.yimg.com\/os\/creatr-uploaded-images\/2024-03\/96b53710-eb9f-11ee-beec-23c95ee5fafa\" class=\"caas-img\"\/><\/noscript><\/div>\n<\/div>\n<\/div>\n<p><figcaption class=\"caption-collapse\"><em>SCARF<\/em><span class=\"caption-credit\"> (Institut national de la recherche scientifique)<\/span><\/figcaption><\/p>\n<\/figure>\n<p>The new camera builds on Liang\u2019s previous research to upend traditional ultrafast camera logic. \u201cSCARF overcomes these challenges,\u201d INRS communication officer Julie Robert wrote in a statement. \u201cIts imaging modality enables ultrafast sweeping of a static coded aperture while not shearing the ultrafast phenomenon. This provides full-sequence encoding rates of up to 156.3 THz to individual pixels on a camera with a charge-coupled device (CCD). 
These results can be obtained in a single shot at tunable frame rates and spatial scales in both reflection and transmission modes.\u201d<\/p>\n<p>In extremely simplified terms, that means the camera uses a computational imaging modality to capture spatial information by letting light enter its sensor at slightly different times. Not having to process the spatial data in the moment is part of what frees the camera to capture those extremely quick \u201cchirped\u201d laser pulses at up to 156.3 trillion times per second. The images\u2019 raw data can then be processed by a computer algorithm that decodes the time-staggered inputs, transforming each of the trillions of frames into a complete picture.<\/p>\n<p>Remarkably, the camera did so \u201cusing off-the-shelf and passive optical components,\u201d as the paper describes. The team characterizes SCARF as low-cost, with low power consumption and high measurement quality compared to existing techniques.<\/p>\n<p>Although SCARF is geared more toward research than consumers, the team is already working with two companies, Axis Photonique and Few-Cycle, to develop commercial versions, presumably for researchers at other universities and scientific institutions.<\/p>\n<p>For a more technical explanation of the camera and its potential applications, you can <a data-i13n=\"elm:context_link;elmt:doNotAffiliate;cpos:5;pos:1\" class=\"link \" href=\"https:\/\/www.nature.com\/articles\/s41467-024-45820-z\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:view the full paper in Nature;elm:context_link;elmt:doNotAffiliate;cpos:5;pos:1;itc:0;sec:content-canvas\">view the full paper in <em>Nature<\/em><\/a>.<\/p>\n<\/div>\n<p><br \/>\n<br \/><a href=\"https:\/\/www.engadget.com\/this-camera-captures-1563-trillion-frames-per-second-184651322.html?src=rss\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Scientists have created a blazing-fast scientific camera that shoots images at an encoding rate of 156.3 terahertz 
(THz) to individual pixels \u2014 equivalent to 156.3<\/p>\n","protected":false},"author":1,"featured_media":78976,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[157],"tags":[],"class_list":["post-78975","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-gadget"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts\/78975","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/comments?post=78975"}],"version-history":[{"count":0,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts\/78975\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/media\/78976"}],"wp:attachment":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/media?parent=78975"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/categories?post=78975"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/tags?post=78975"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}