**Meta Unveils Ray-Ban Display Glasses: A Hands-On Look at Its Next-Gen Wearable**
At its annual Connect event, Meta unveiled its latest foray into consumer wearable technology—the $799 Meta Ray-Ban Display glasses. While the glasses themselves are impressive as Meta’s first consumer smart glasses with a built-in display, it was the accompanying neural wristband that truly stole the show during hands-on demonstrations.
**A Step Closer to Meta’s Vision**
Meta’s CEO, Mark Zuckerberg, has been vocal about his vision for the future of computing: a world where smart glasses and headsets eventually replace smartphones as our primary digital companions. The new Ray-Ban Display glasses represent a tangible move in that direction. Unlike last year’s Orion prototype—bulky, demo-only glasses that required an external computing puck and could overlay elaborate 3D visuals—the Ray-Ban Display is a consumer-ready product, available in the United States beginning September 30.
The display, embedded in the right lens, is relatively simple compared to the sci-fi visions of augmented reality. But it offers several practical features, such as reading incoming messages, previewing photos, and even displaying live captions during conversations. While it’s not yet the immersive AR experience some may have hoped for, it marks an important incremental step toward wearable, always-available information.
**A Subtle, Translucent Display**
The first thing users will notice when wearing the Ray-Ban Display glasses is the small, translucent screen that appears just below eye level in the right lens. This miniaturized display is comparable to a smartphone notification bar, but it floats in your field of vision without blocking your view of the real world. The icons and text are high-resolution, but depending on the lighting and background they can sometimes appear less crisp or a bit murky, especially when viewed directly against real-world objects.
This limitation, however, is by design. The display isn’t meant to create a fully immersive or distracting overlay, but rather to provide quick, glanceable information and simple utility. For example, users can activate the camera, check which song is playing on Spotify, or read a short message without reaching for their phone.
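To make that design philosophy concrete, you could imagine each of those utilities as a short-lived "glance card" with a source, a single line of text, and an auto-dismiss timeout. The sketch below is purely illustrative; Meta has not published how the glasses actually represent on-screen items:

```python
from dataclasses import dataclass

# Purely illustrative: Meta has not published how on-screen items are
# modeled. Field names and the auto-dismiss timeout are assumptions.

@dataclass
class GlanceCard:
    source: str         # e.g. "Spotify", "Messages", "Camera"
    text: str           # a single short line, readable at a glance
    ttl_seconds: float  # auto-dismiss so the overlay never lingers

def now_playing_card(track: str, artist: str) -> GlanceCard:
    return GlanceCard(source="Spotify", text=f"{track} - {artist}", ttl_seconds=5.0)
```

The timeout is the key design idea: information appears, gets read in a glance, and disappears before it can become a distraction.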
**Controlling the Glasses: The Neural Wristband**
What truly sets the Meta Ray-Ban Display apart from other smart glasses is its innovative neural wristband. This gray, fuzzy accessory looks unassuming but houses a sophisticated electromyography (EMG) sensor. When worn, it detects the electrical signals generated by the user’s muscles, allowing for gesture-based control of the glasses.
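How does raw muscle activity become a discrete command? Meta hasn't published its pipeline, but a common approach in EMG research is to slice the multi-channel signal into short windows, extract simple energy features, and match them against per-user calibration data. The sketch below illustrates that general idea; the feature choice and nearest-centroid classifier are assumptions, not Meta's implementation:

```python
import numpy as np

# Illustrative only: the windowing, RMS features, and nearest-centroid
# classifier are assumptions about how EMG gesture recognition can work,
# not Meta's published pipeline.

GESTURES = ["rest", "pinch", "thumb_swipe", "fist_clench"]

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per channel for one short EMG window.

    window: (n_samples, n_channels) array of raw sensor readings.
    """
    return np.sqrt(np.mean(window ** 2, axis=0))

def classify_gesture(window: np.ndarray, centroids: np.ndarray) -> str:
    """Match a window's features to the closest calibration centroid.

    centroids: (len(GESTURES), n_channels) mean feature vectors,
    gathered during a hypothetical per-user calibration pass.
    """
    feats = rms_features(window)
    distances = np.linalg.norm(centroids - feats, axis=1)
    return GESTURES[int(np.argmin(distances))]
```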
Putting on the wristband is as simple as fastening a watch, but users may feel a faint electric jolt as the device activates—noticeable, but not uncomfortable. Once on, the wristband unlocks a unique, intuitive method of interacting with the glasses. For instance, clenching your fist and swiping your thumb across your pointer finger works like a touchpad, letting you scroll through apps and menus.
Opening the camera app is accomplished by pinching your index finger and thumb together. This gesture-based interface is novel and fun, but not always 100% reliable on the first try; mastering the timing and cadence of pinches can take practice. The learning curve is reminiscent of the early days of computer mice—simple in concept, but requiring a bit of muscle memory to perfect. During the demonstration, the reviewer found themselves repeatedly pinching their fingers in the air, feeling both slightly awkward and amused, recalling comedic sketches about miming gestures.
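That occasional flakiness is partly why gesture systems typically debounce their output: a pinch only fires an action once the recognizer has agreed with itself across several consecutive windows. A minimal sketch of that idea, with an invented gesture-to-action mapping:

```python
from collections import deque

class GestureDebouncer:
    """Fire an action only after a gesture persists for several
    consecutive classification windows, reducing accidental pinches.
    The window count and action mapping are illustrative assumptions."""

    ACTIONS = {"pinch": "open_camera", "thumb_swipe": "scroll"}

    def __init__(self, required_hits: int = 3):
        self.required_hits = required_hits
        self.recent = deque(maxlen=required_hits)

    def update(self, gesture: str) -> str | None:
        self.recent.append(gesture)
        if (len(self.recent) == self.required_hits
                and len(set(self.recent)) == 1
                and gesture in self.ACTIONS):
            self.recent.clear()  # avoid re-firing while the pinch is held
            return self.ACTIONS[gesture]
        return None
```

Tuning `required_hits` trades responsiveness against false triggers, which maps directly onto the "timing and cadence" practice the reviewer describes.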
**Hands-Free, Voice-Activated Features**
In addition to gesture controls, the Ray-Ban Display glasses can be operated using Meta’s AI-powered voice assistant. This feature carries over from previous generations of Meta’s smart glasses, offering hands-free functionality for taking photos, searching information, or controlling playback.
During the demonstration, the reviewer attempted to use Meta AI to identify art on the walls by snapping a photo and asking the assistant for information. Although the AI didn’t respond as expected before the demo ended, the potential for this feature is clear—imagine instantly having context and details about what you’re looking at, simply by asking.
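Meta hasn't released a public SDK for this flow, but conceptually it is a capture-and-ask loop: grab a frame from the glasses' camera and hand it to a multimodal assistant along with a question. The interfaces in this sketch (`capture_photo`, `ask_about_image`) are invented purely to make the shape of the interaction concrete:

```python
# Hypothetical sketch: the device and assistant interfaces below are
# invented for illustration; Meta has not published an equivalent SDK.

def identify_artwork(glasses, assistant) -> str:
    photo = glasses.capture_photo()  # one frame from the glasses' camera
    return assistant.ask_about_image(
        image=photo,
        prompt="What is this artwork, and who made it?",
    )
```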
One area where the glasses’ display shone was live captioning. Even in a noisy environment filled with music and chatter, the system accurately transcribed the tour guide’s spoken words and displayed them in real time, much like closed captions on a TV. This could be incredibly helpful in loud environments where following a conversation is otherwise difficult.
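Under the hood, live captioning of this kind is usually a streaming loop: a speech-to-text engine emits rolling partial transcripts, and only the last few words are rendered so the overlay stays glanceable. A minimal sketch, assuming a transcript queue and a display handle rather than any real Meta API:

```python
import queue

MAX_CAPTION_WORDS = 8  # keep the overlay short and glanceable, like a TV caption

def caption_loop(transcripts: "queue.Queue[str | None]", display) -> None:
    """Render rolling partial transcripts until a None sentinel arrives.

    `transcripts` is fed by a streaming speech-to-text engine (assumed);
    `display` is any object with a show_caption(str) method (assumed).
    """
    while True:
        text = transcripts.get()  # blocks until the ASR engine emits an update
        if text is None:          # sentinel: the audio stream ended
            break
        words = text.split()[-MAX_CAPTION_WORDS:]  # show only the newest words
        display.show_caption(" ".join(words))
```

Trimming to the newest handful of words is what keeps the captions readable at a glance instead of scrolling like a transcript.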
