{"id":99620,"date":"2025-09-19T06:17:00","date_gmt":"2025-09-19T06:17:00","guid":{"rendered":"https:\/\/neclink.com\/index.php\/2025\/09\/19\/meta-ray-ban-display-hands-on-discreet-and-intuitive\/"},"modified":"2025-09-19T06:17:00","modified_gmt":"2025-09-19T06:17:00","slug":"meta-ray-ban-display-hands-on-discreet-and-intuitive","status":"publish","type":"post","link":"https:\/\/neclink.com\/index.php\/2025\/09\/19\/meta-ray-ban-display-hands-on-discreet-and-intuitive\/","title":{"rendered":"Meta Ray-Ban Display hands-on: Discreet and intuitive"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div data-article-body=\"true\">\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">I&#8217;ve been testing smart glasses for almost a decade. And in that time, one of the questions I&#8217;ve been asked the most is &#8220;oh, but can you see anything in them?&#8221; For years, I had to explain that no, glasses like that don&#8217;t really exist yet.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">That&#8217;s no longer the case. And while I&#8217;ve seen a bunch of glasses over the last year that have some kind of display, the Meta Ray-Ban Display glasses feel the closest to fulfilling what so many people envision when they hear the words &#8220;smart glasses.&#8221;<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">To be clear, they don&#8217;t offer the kind of immersive AR that&#8217;s possible with <a data-i13n=\"cpos:1;pos:1\" href=\"https:\/\/www.engadget.com\/ar-vr\/metas-orion-prototype-offers-a-glimpse-into-our-ar-future-123038066.html\" data-ylk=\"slk:Meta's Orion prototype;cpos:1;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">Meta&#8217;s Orion prototype<\/a>. In fact Meta considers &#8220;display AI glasses&#8221; to be a totally separate category from AR. 
The display is only on one lens \u2014 the right \u2014 and its 20-degree field of view is much smaller than the 70 degrees on Orion. That may sound like a big compromise, but it doesn&#8217;t feel like one.<\/p>\n<figure class=\"relative col-body mb-4\">\n<div class=\"relative\"><img loading=\"lazy\" alt=\"The Meta Ray-Ban Display glasses.\" width=\"960\" height=\"640\" decoding=\"async\" data-nimg=\"1\" class=\"md:rounded-[10px]\" style=\"color:transparent\" src=\"https:\/\/s.yimg.com\/ny\/api\/res\/1.2\/GA6gJ8hnPf3XKUKJo.2kQA--\/YXBwaWQ9aGlnaGxhbmRlcjt3PTk2MDtoPTY0MA--\/https:\/\/s.yimg.com\/os\/creatr-uploaded-images\/2025-09\/30113ea0-94ea-11f0-b5ef-5d7b928050e5\"\/><\/div><figcaption class=\"relative text-[0.875rem]\/[1.25rem] mt-1 engadget-slate-gray line-clamp-2 mt-2.5 md:mt-2 pr-2.5\">\n<div>\n<div class=\"flex items-center space-x-[2px]\"><span>Karissa Bell for Engadget<\/span><\/div>\n<\/div>\n<\/figcaption><\/figure>\n<p 
class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">The single display feels much more practical for a pair of glasses you&#8217;ll want to wear every day. It&#8217;s meant to be something you can glance at when you need it, not an always-on overlay. The smaller size also means that the display is much sharper, at 42 pixels per degree. This was especially noticeable when I walked outside with the glasses on; images on the display looked even sharper than in indoor light, thanks to automatic brightness features.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">I also appreciated that you can&#8217;t see any light from the display when you&#8217;re looking at someone wearing the glasses. In fact the display is only barely noticeable at all when you at them up close.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">Having a smaller display also means that the glasses are cheaper, at $799, and that they don&#8217;t look like the chunky AR glasses we&#8217;ve seen so many times. At 69 grams, they are a bit heavier and thicker than the second-gen Meta Ray-Bans, but not much. As someone who has tried on way too many pairs of thick black smart glasses, I&#8217;m glad Meta is offering these in a color besides black. All Wayfarer-style frames look wide on my face but the lighter &#8220;sand&#8221; color feels a lot more flattering.<\/p>\n<figure class=\"relative col-body mb-4\">\n<div class=\"relative\"><img loading=\"lazy\" alt=\"The Meta Ray-Ban Display (left) and second-gen Ray-Ban Meta glasses (right.) 
The display glasses are a little thicker.\" width=\"960\" height=\"540\" decoding=\"async\" data-nimg=\"1\" class=\"md:rounded-[10px]\" style=\"color:transparent\" src=\"https:\/\/s.yimg.com\/ny\/api\/res\/1.2\/I1VmsgpPZCmY42Z2TbAvQw--\/YXBwaWQ9aGlnaGxhbmRlcjt3PTk2MDtoPTU0MA--\/https:\/\/s.yimg.com\/os\/creatr-uploaded-images\/2025-09\/ebd71e80-94e9-11f0-b579-e622f4760786\"\/><\/div><figcaption class=\"relative text-[0.875rem]\/[1.25rem] mt-1 engadget-slate-gray line-clamp-2 mt-2.5 md:mt-2 pr-2.5\">\n<div>\n<div class=\"flex items-center space-x-[2px]\">\n<p>The Meta Ray-Ban Display (left) and second-gen Ray-Ban Meta glasses (right). 
The display glasses are a little thicker.<\/p>\n<p><span>(Karissa Bell for Engadget)<\/span><\/p><\/div>\n<\/div>\n<\/figcaption><\/figure>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">The Meta Neural Band wristband that comes with the display glasses functions pretty much the same as the band I used on the Orion prototype. It uses sensors to detect the subtle muscle movements in your hand and wrist and can translate that into actions within the glasses&#8217; interface.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">It&#8217;s hard to describe, but the gestures for navigating the glasses&#8217; interface work surprisingly well. I can see how it could take a little time to get used to the various gestures for navigating between apps, bringing up Meta AI, adjusting the volume and other actions, but they are all fairly intuitive. For example, you use your thumb to swipe along the top of your index finger, sort of like a D-pad, to move up and down and side to side. And you can raise and lower the speaker volume by holding your thumb and index finger together and rotating your wrist right or left like it&#8217;s a volume knob.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">It&#8217;s no secret that Meta&#8217;s ultimate goal for its smart glasses is to replace, or almost replace, your phone. 
That&#8217;s not possible yet, but having an actual display means you can look at your phone a whole lot less.<\/p>\n<figure class=\"relative col-body mb-4\">\n<div class=\"relative\"><img loading=\"lazy\" alt=\"The Neural Wristband.\" width=\"960\" height=\"640\" decoding=\"async\" data-nimg=\"1\" class=\"md:rounded-[10px]\" style=\"color:transparent\" src=\"https:\/\/s.yimg.com\/ny\/api\/res\/1.2\/bTUM3ZnXO0w3gdav_MO7Gg--\/YXBwaWQ9aGlnaGxhbmRlcjt3PTk2MDtoPTY0MA--\/https:\/\/s.yimg.com\/os\/creatr-uploaded-images\/2025-09\/cd5d5770-94ea-11f0-9bd9-beed797f3ad5\"\/><\/div><figcaption class=\"relative text-[0.875rem]\/[1.25rem] mt-1 engadget-slate-gray line-clamp-2 mt-2.5 md:mt-2 pr-2.5\">\n<div>\n<div class=\"flex items-center space-x-[2px]\"><span>Karissa Bell for Engadget<\/span><\/div>\n<\/div>\n<\/figcaption><\/figure>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">The display can surface 
incoming texts, navigation with map previews (for walking directions), and info from your calendar. I was also able to take a video call from the glasses \u2014 unlike Mark Zuckerberg&#8217;s attempted live demo during his keynote \u2014 and it was way better than I expected. Not only could I clearly see the person I was talking to and their surroundings, I could also turn on my glasses&#8217; camera and see a smaller version of the video from my side.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">I also got a chance to try the Conversational Focus feature, which gives you live captions of the person you&#8217;re speaking with, even in a loud environment where they may be hard to hear. There was something very surreal about getting real-time subtitles for a conversation with a person standing directly in front of me. As someone who tries really hard not to look at screens when I&#8217;m speaking to people, it almost felt a little wrong. But I can also see how this would be incredibly helpful to people who have trouble hearing or processing conversations. 
It would also be great for translations, something Meta AI already does very well.<\/p>\n<div class=\"col-[body-start\/body-end] mb-5 w-full px-0\">\n<div aria-hidden=\"true\">\n<div class=\"my-2 items-center\">\n<h2 class=\"mt-2 text-2xl font-bold text-inkwell dark:text-dirty-seagull\">Meta Ray-Ban Display glasses.<\/h2>\n<\/div>\n<div class=\"relative text-[0.875rem]\/[1.25rem] engadget-slate-gray text-xl leading-6 line-clamp-2 pr-2.5\">\n<div>\n<p>You can just barely see the display from the front of the lenses.<\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">I also appreciated that the wristband allows you to invoke Meta AI with a gesture so you don&#8217;t always have to say &#8220;Hey Meta.&#8221; It&#8217;s a small change, but I&#8217;ve always felt weird about talking to Meta AI in public. The display also addresses another one of my longtime gripes with the Ray-Ban Meta and Oakley glasses: framing a photo is really difficult. But with a display, you can see a preview of your shot, as well as the photo after the fact, so you no longer have to just snap a bunch and hope for the best.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 engadget-charcoal\">I&#8217;ve only had about 30 minutes with the glasses, so I don&#8217;t really know how having a display could fit into my daily routine. 
But even after a short time with them, they really do feel like the beginning of the kind of smart glasses a lot of people have been waiting for.<\/p>\n<\/div>\n<p><a href=\"https:\/\/www.engadget.com\/wearables\/meta-ray-ban-display-hands-on-discreet-and-intuitive-002334346.html?src=rss\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>I&#8217;ve been testing smart glasses for almost a decade. And in that time, one of the questions I&#8217;ve been asked the most is &#8220;oh, but<\/p>\n","protected":false},"author":1,"featured_media":99621,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[157],"tags":[],"class_list":["post-99620","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-gadget"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts\/99620","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/comments?post=99620"}],"version-history":[{"count":0,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts\/99620\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/media\/99621"}],"wp:attachment":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/media?parent=99620"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/categories?post=9
9620"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/tags?post=99620"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}