EEG in everyday headphones sounds like science fiction. It isn't. Here's what's actually happening when you put on a pair of brain-sensing headphones — and why it works.
What EEG is (and isn't)
EEG stands for electroencephalography. It's a method of measuring the electrical activity produced by neurons in your brain. When neurons fire, they generate small electrical signals on the order of microvolts. With the right sensors positioned against the scalp, those signals can be detected, amplified, and recorded.
EEG is not invasive. There are no needles, no implants, no radiation. The sensors sit on the surface of the skin. The signals they detect are real — the same signals measured in hospital EEG labs for decades — but now captured through sensors small enough to fit inside the ear cups of consumer headphones.
The hardware: where the sensors go
In a traditional clinical EEG setup, a patient wears a cap covered in electrodes connected by wires to recording equipment. That's not viable for everyday use.
In lab settings, researchers use a variety of hardware: wet electrodes with conductive gel, gel caps, dry electrodes, and novel experimental devices. Setup time, comfort, and data quality are the usual trade-offs in deciding which EEG hardware to use.
In a consumer headphone form factor, EEG sensors are embedded at key contact points: typically at the ear, along the headband, and sometimes at the temples. These are locations where brain signals can be measured and contact can be maintained during normal use.
The placement is less comprehensive than a clinical cap — which can have 64 or more electrodes — but it's sufficient to capture meaningful signals from the brain associated with attention, workload, and cognitive state. The engineering challenge is making this work with everyday movement, variable fit, and without conductive gel.
The signal: what you're actually measuring
Raw EEG data is a waveform — a continuous stream of electrical signal measured in microvolts. On its own, it looks like noise. The meaningful information is in the frequency components of that signal.
Different frequency bands correspond to different brain states:
- Alpha waves (8-13 Hz): reflect relaxed wakefulness, sensory inhibition, and attentional control
- Beta waves – frequently split into Low and High Beta:
- Low Beta (13-20 Hz): supports sustained attention, active thinking, and problem-solving. Moderate increases are associated with cognitive engagement and mental focus
- High Beta (20-30 Hz): linked to alertness, mental effort, working memory load, and focused attention. Excessive high beta is associated with anxiety, hyperarousal, or stress
- Gamma waves (30-80 Hz): associated with high-level cognitive processing such as complex problem-solving; elevated gamma can also reflect muscular tension associated with physical or emotional stress
- Theta waves (4-8 Hz): support memory encoding, retrieval, and working memory; associated with drowsiness, creative thought, and emotional regulation
- Delta waves (0.5-4 Hz): dominant during deep sleep; excess delta while awake may indicate impaired cortical processing
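A minimal sketch of this frequency analysis, using NumPy's FFT on a synthetic trace (this is an illustration of the general technique, not any particular headphone's pipeline; the band edges follow the list above):

```python
import numpy as np

# Band edges (Hz) as listed above; half-open intervals [lo, hi) so the
# bands tile 0.5-80 Hz without overlap.
BANDS = {
    "delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
    "low_beta": (13, 20), "high_beta": (20, 30), "gamma": (30, 80),
}

def band_powers(signal, fs):
    """Return each band's share of total 0.5-80 Hz power for a 1-D trace."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2        # unnormalized power spectrum
    total = psd[(freqs >= 0.5) & (freqs < 80)].sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

# Synthetic 2-second trace at 256 Hz dominated by a 10 Hz (alpha) rhythm.
fs = 256
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
trace = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

powers = band_powers(trace, fs)
```

For this trace, the alpha band dominates the relative-power dictionary, as expected for a 10 Hz rhythm. Real pipelines typically use windowed averaging (e.g. Welch's method) rather than a single raw FFT, but the band-summing idea is the same.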
By analyzing the relative power and patterns of these frequency bands in real time, and measuring them against an individual’s baselines, it's possible to make reliable inferences about cognitive changes.
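The baseline comparison can be sketched as a simple z-score against the individual's own calibration data (a hypothetical illustration with made-up numbers, not a description of any product's actual scoring):

```python
import numpy as np

def baseline_zscore(current, baseline_samples):
    """How many standard deviations the current value sits from this
    wearer's own baseline distribution."""
    baseline = np.asarray(baseline_samples, dtype=float)
    mu, sigma = baseline.mean(), baseline.std(ddof=1)
    return (current - mu) / sigma

# Relative alpha power from a calibration session (illustrative units).
calibration = [0.31, 0.28, 0.35, 0.30, 0.33, 0.29, 0.32, 0.34]

# A new window well above this wearer's typical alpha level.
z = baseline_zscore(0.45, calibration)
```

Because the comparison is against the individual's own history rather than a population norm, the same absolute reading can be "unusual" for one person and ordinary for another.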
Where AI comes in
Raw frequency analysis is only the starting point. The hard problem is that EEG signals are noisy — affected by muscle movement, environmental interference, and significant variation between individuals. A signal that indicates focus in one person might look different in another.
This is where AI makes the category viable. Machine learning models trained on large, diverse datasets of real-world EEG can learn to separate signal from noise, account for individual variation, and produce reliable cognitive state classifications in real time.
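As a toy illustration of the idea (nothing like a production model): given labeled windows of band-power features, even plain logistic regression can learn a per-band weighting that separates two cognitive states. The feature values and labels below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic features: [alpha, low_beta] relative power per window.
# "Focused" windows: lower alpha, higher low-beta (and vice versa).
focused = rng.normal([0.25, 0.35], 0.05, size=(200, 2))
distracted = rng.normal([0.35, 0.20], 0.05, size=(200, 2))
X = np.vstack([focused, distracted])
y = np.concatenate([np.ones(200), np.zeros(200)])  # 1 = focused

# Logistic regression by batch gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(focused)
    grad = p - y
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

preds = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
accuracy = (preds == y).mean()
```

Real systems train far richer models on far messier data; the point is only that classification on top of band-power features is what turns a spectrum into a state label.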
Neurable's AI platform is trained on the largest real-world EEG dataset in existence — collected from actual consumer use, not controlled lab conditions. That distinction matters. Lab data doesn't capture what the brain looks like when you're commuting, or working in a loud office, or wearing headphones for eight hours. Real-world data does.
The output: what you actually get
The end result of all of this is a real-time cognitive health feed: a continuous, interpretable signal that tells you — or tells the software or device you're using — something meaningful about your mental state in the moment and how it’s changing over time.
Focus score. Fatigue level. Cognitive load. These are derived from direct measurement of brain activity, processed through AI models validated against research standards.
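On the consuming side, a real-time feed is typically smoothed before display so momentary noise doesn't whipsaw the score. A hypothetical sketch using an exponential moving average (the class name and 0-100 scale are illustrative, not an actual API):

```python
class FocusFeed:
    """Smooth a noisy per-window focus estimate into a stable feed."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha   # EMA smoothing factor (not the EEG band)
        self.value = None

    def update(self, raw_score):
        """Fold one raw 0-100 focus estimate into the smoothed value."""
        if self.value is None:
            self.value = raw_score
        else:
            self.value += self.alpha * (raw_score - self.value)
        return self.value

feed = FocusFeed()
for raw in [40, 85, 42, 44, 90, 43]:   # spiky raw estimates
    smoothed = feed.update(raw)
```

The smoothed value tracks the trend while damping the spikes, which is usually what a user-facing score wants.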
It works. And it works in headphones you'd buy anyway.
Distraction Stroop Task experiment: The Stroop Effect (also known as cognitive interference) is a psychological phenomenon describing the difficulty people have naming a color when it's used to display the name of a different color. During each trial of this experiment, we flashed the words "Red" or "Yellow" on a screen. Participants were asked to respond to the color of the words and ignore their meaning by pressing four keys on the keyboard ("D", "F", "J", and "K"), which were mapped to the colors "Red," "Green," "Blue," and "Yellow," respectively. Trials in the Stroop task were categorized as congruent, when the text content matched the text color (e.g., the word "Red" displayed in red), or incongruent, when the text content did not match the text color (e.g., the word "Red" displayed in yellow). The incongruent case was counter-intuitive and more difficult. We expected to see lower accuracy, higher response times, and a drop in Alpha band power in incongruent trials. To mimic the chaotic distraction environment of in-person office life, we added a layer of complexity by floating the words over different visual backgrounds (a calm river, a roller coaster, a calm beach, and a busy marketplace). Both the behavioral and neural data we collected showed consistently different results in incongruent trials, such as longer reaction times and lower Alpha band power, particularly when the words appeared on top of the marketplace background, the most distracting scene.
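An illustrative reconstruction of the trial bookkeeping described above (our own sketch with made-up trials, not the study's actual analysis code): classify each trial as congruent or incongruent, then compare accuracy and mean reaction times between the two kinds.

```python
# Key-to-color mapping from the experiment description.
KEY_MAP = {"D": "Red", "F": "Green", "J": "Blue", "K": "Yellow"}

# (word shown, ink color, key pressed, reaction time in ms) -- synthetic.
trials = [
    ("Red", "Red", "D", 410),        # congruent, correct
    ("Yellow", "Yellow", "K", 395),  # congruent, correct
    ("Red", "Yellow", "K", 520),     # incongruent, correct
    ("Yellow", "Red", "K", 545),     # incongruent, wrong key
]

stats = {"congruent": {"rt": [], "correct": []},
         "incongruent": {"rt": [], "correct": []}}
for word, ink, key, rt in trials:
    kind = "congruent" if word == ink else "incongruent"
    stats[kind]["rt"].append(rt)
    stats[kind]["correct"].append(KEY_MAP[key] == ink)

mean_rt = {k: sum(v["rt"]) / len(v["rt"]) for k, v in stats.items()}
accuracy = {k: sum(v["correct"]) / len(v["correct"]) for k, v in stats.items()}
```

With these synthetic trials, incongruent trials show the expected pattern: slower mean reaction times and lower accuracy than congruent ones.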
Interruption by Notification: It's widely known that push notifications decrease focus. In our three Interruption by Notification experiments, participants performed the Stroop tasks described above with and without push notifications, each of which consisted of a sound played at a random time followed by a prompt to complete an activity. Our behavioral analysis and focus metrics showed that, on average, participants had slower reaction times and were less accurate during blocks of time with distractions compared to those without them.



