
Ten years ago, a Fitbit was about as sophisticated a wearable as you could get. The Apple Watch soon supplanted it, quickly becoming the world’s best-selling smartwatch. Then came the sleeker, more unassuming Oura ring.
Now there’s a new breed of wearables—built for your head. Instead of tracking your step count, heart rate, and skin temperature, these devices are designed to read your brain waves. Using electroencephalography, or EEG, they detect electrical impulses produced by the brain and use AI to make sense of them.
Take Elemind, for example. Rather than just tracking your sleep, the Cambridge, Massachusetts-based company’s device aims to actually improve it. The $350 headband feels straight out of Star Trek: it reads a person’s brain signals to tell whether they’re asleep or awake, then delivers a type of acoustic stimulation known as pink noise to nudge the brain from wakeful patterns toward delta waves, the slow rhythms of deeper sleep. In a small trial of 21 participants, the device helped more than three-quarters of them fall asleep faster.
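To make the mechanism concrete, here’s a minimal sketch in Python of the general idea, not Elemind’s actual algorithm: estimate how much of the EEG signal sits in the slow delta band, and while the brain still looks wake-like, cue up a burst of pink noise. The sampling rates, band edges, threshold, and pink-noise recipe are all assumptions for illustration.

```python
# Hypothetical sketch only: gate acoustic stimulation on an EEG delta-band estimate.
# Band edges, threshold, and the pink-noise method are illustrative assumptions.
import numpy as np
from scipy.signal import welch

FS_EEG = 256      # assumed EEG sampling rate, Hz
FS_AUDIO = 44100  # assumed audio sampling rate, Hz

def delta_ratio(eeg_window, fs=FS_EEG):
    """Fraction of EEG power in the delta band (0.5-4 Hz), a rough proxy for deep sleep."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs * 2)  # window assumed to span several seconds
    band = (freqs >= 0.5) & (freqs < 4.0)
    return psd[band].sum() / psd.sum()

def pink_noise(duration_s, fs=FS_AUDIO):
    """Approximate pink (1/f) noise by shaping the spectrum of white noise."""
    n = int(duration_s * fs)
    spectrum = np.fft.rfft(np.random.randn(n))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum[1:] /= np.sqrt(freqs[1:])        # 1/f power falloff
    noise = np.fft.irfft(spectrum, n)
    return noise / np.abs(noise).max()        # normalize to [-1, 1]

def stimulation_step(eeg_window, sleep_threshold=0.5):
    """Return a one-second pink-noise burst while the brain still looks wake-like."""
    if delta_ratio(eeg_window) < sleep_threshold:
        return pink_noise(1.0)                # would be sent to the headband's speaker
    return None                               # deep-sleep rhythms dominate; stay quiet
```

In a real closed loop, the delta estimate would be recomputed every few seconds and the audio faded in and out rather than switched on abruptly.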
If you’re the type to work smarter rather than harder, you can buy a $500 pair of headphones from Boston-based Neurable to hack your productivity. Equipped with EEG sensors, the headphones track brain activity associated with concentration—namely, beta waves—to tell users how focused they are. When I tried them out last year, they confirmed what I already suspected: My most focused working hours are in the morning. The device also nudges you to take the occasional break if it thinks you’ve been deeply focused for too long, a feature I appreciate as someone who spends a lot of time in front of a computer screen.
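For the curious, here is a rough Python sketch of how a band-power “focus” score could be computed. It is not Neurable’s actual metric: the heuristic below treats beta power relative to slower theta and alpha rhythms as a proxy for concentration, smooths it over time, and nudges a break after a long high-focus stretch. The bands, constants, and break rule are all assumptions.

```python
# Hypothetical focus metric: beta power relative to slower theta and alpha activity.
# All band edges and constants here are illustrative assumptions.
import numpy as np
from scipy.signal import welch

def band_power(eeg_window, fs, low, high):
    """Sum the power spectral density over a frequency band."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs * 2)
    return psd[(freqs >= low) & (freqs < high)].sum()

def focus_score(eeg_window, fs=256):
    """Higher beta (13-30 Hz) relative to theta (4-8 Hz) plus alpha (8-13 Hz) reads as more focused."""
    theta = band_power(eeg_window, fs, 4, 8)
    alpha = band_power(eeg_window, fs, 8, 13)
    beta = band_power(eeg_window, fs, 13, 30)
    return beta / (theta + alpha + 1e-12)

def smooth(scores, ema=0.1):
    """Exponential moving average so a blink or stray muscle twitch doesn't whipsaw the readout."""
    out, current = [], scores[0]
    for s in scores:
        current = ema * s + (1 - ema) * current
        out.append(current)
    return out

def should_nudge_break(recent_scores, threshold=1.0, windows=90):
    """Suggest a break after a long, uninterrupted run of high-focus windows."""
    return len(recent_scores) >= windows and min(recent_scores[-windows:]) > threshold
```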
Apple is also getting into wearable brain tech. The company filed a patent in 2023 for EEG-sensing AirPods, though they have yet to hit the market. Earlier this year, however, Apple unveiled an accessibility feature that lets its Vision Pro be controlled with brain waves instead of physical movement. That means the augmented reality headset can now be integrated with brain-computer interfaces, or BCIs—systems that read brain signals so users can control devices with their thoughts.
One neurotech company, Cognixion, is already taking advantage of the new Apple feature. The Santa Barbara, California, startup built an augmented reality app to run on the Vision Pro and a custom headband that detects brain signals. For now, Cognixion is focused on using the tech to help restore communication in people with speech impairments due to paralysis. But it’s not hard to see how a Vision Pro equipped with a BCI could be adopted by a wider population for things like gaming or texting with your mind.
Earlier this year, I spoke with Andreas Melhede of Elata Biosciences, who’s building what he calls the “open internet of brains,” an open-source network where anyone can create a neuro app to run on an EEG device. The nonprofit organization created its own device and a Pong app, which it demoed this fall during a crypto conference in Singapore. Around 30 people gathered on a restaurant patio to compete in a Pong tournament, but instead of handheld controllers, competitors were fitted with headsets that tracked their brain signals. Their goal: Hit a ball on a screen with their paddle using just their thoughts.
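The demo’s exact control scheme isn’t described here, but a bare-bones version of thought-controlled Pong could look something like the Python sketch below: calibrate a resting baseline for each player, then map beta-band power above or below that baseline to paddle speed. Every number in it is made up for illustration.

```python
# Purely a sketch of one plausible mapping from EEG to a Pong paddle; not Elata's scheme.
# Gains, bands, and the dead zone are invented values for demonstration.
import numpy as np
from scipy.signal import welch

def relative_beta(eeg_window, fs=256):
    """Share of EEG power in the beta band (13-30 Hz)."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs * 2)
    return psd[(freqs >= 13) & (freqs < 30)].sum() / psd.sum()

def calibrate_baseline(rest_windows, fs=256):
    """Average relative beta power over a short rest period before the match."""
    return float(np.mean([relative_beta(w, fs) for w in rest_windows]))

def paddle_velocity(eeg_window, baseline, fs=256, gain=600.0, dead_zone=0.02):
    """Signed paddle speed in pixels per frame; small fluctuations are ignored."""
    shift = relative_beta(eeg_window, fs) - baseline
    if abs(shift) < dead_zone:
        return 0.0
    return gain * shift
```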
