Six months into its life, the Meta Ray-Ban Display is starting to look less like an experiment and more like a finished product, thanks to what is arguably the most significant update Meta has ever pushed for the device.
The headline feature is Neural Handwriting, which is now available to every Ray-Ban Display owner, having spent its early months in limited access for Messenger and WhatsApp users.
What is Neural Handwriting?
For those catching up, the feature uses the Neural Band, the sEMG wristband Meta ships in the box with the $799 glasses, to detect subtle finger movements and translate them into typed text in supported messaging apps.
To use it, put on the Ray-Ban Display, and with the Neural Band on your wrist, move your fingers as though you were writing a letter. The glasses convert those in-air finger movements into a message in WhatsApp, Messenger, Instagram, or your phone’s native messaging app.
The feature works on both Android and iOS. While Neural Handwriting opens a new use case for the Ray-Ban Display and is certainly generating headlines, the update also opens the device to third-party web app developers for the first time.
What else did Meta update?
To me, that sounds like Meta is treating the glasses as a platform, not just a product it sells to end users.
This could enable developers to build AI assistants, productivity tools, navigation overlays, accessibility features, and gesture-controlled experiences that could expand the device’s appeal beyond messaging and media capture.
Beyond those two developments, Meta is also bringing Display Recording to the glasses, a new mode that captures the lens display output, camera footage, and surrounding audio in a single video file.
Walking directions now cover the entire United States, along with major international cities like London, Paris, and Rome. The live captions feature is expanding to WhatsApp, Messenger, and Instagram DM voice messages. Additionally, Muse Spark AI is coming to the glasses this summer.