Editor’s Note: Welcome to my weekly column, Virtual Case Notes, in which I interview industry experts for their take on the latest cybersecurity situation. Each week I will take a look at a new case from the evolving realm of digital crime and digital forensics. For previous editions, please type “Virtual Case Notes” into the search bar at the top of the site.

EEG headsets aren’t just for medical use anymore—in recent years, brain wave sensing technology and wearables have been marketed to consumers for personal and recreational use, at relatively low costs ranging from about $150 to $800. Recent versions of these brain-computer interface devices can connect to your phone or computer, sending your brain wave signals to the system, where they can be analyzed by an app or other software and tracked for wellness purposes or translated into actions for video games.

But like all other devices tangled up in the internet of things that increasingly connects our modern world, EEG headsets are not immune to security threats. Don’t worry—hackers can’t use them for mind control or to read your thoughts just yet. But as researchers from the University of Alabama at Birmingham discovered in recent experiments, a malicious actor could take data from the headsets—your brain waves—and learn snippets of information from them: namely, your PINs and passwords.

“One issue with these devices is that any application that might be running on a computer, or even a website, could actually access the brain signals without asking any explicit permission from the users, as opposed to, for example, the camera or microphone where a user has to give his consent before the recording can take place,” said Nitesh Saxena, researcher and associate professor of computer science at UAB, in an interview with Forensic Magazine. “And that could be a problem because then if somehow the attacker can learn your signals, and then coordinate with some information that you might be thinking about (…) the user would not be aware and the website or malicious application could access that information.”

Saxena and his research team, including UAB Ph.D. student Ajaya Neupane and then-UAB master’s student Md Lutfor Rahman, conducted a study to see if a malicious application could in fact not only access one’s brain waves, but also use them to steal valuable information, such as the PINs and passwords the user inputs while wearing a personal EEG headset. The experiment involved participants typing in specified sets of numbers and characters—similar to a CAPTCHA request—while wearing a consumer Emotiv-brand EEG headset. The researchers developed a machine-learning model that gradually built an understanding of which brain signals corresponded to which character and number inputs, so that when the participants later typed in a password or PIN while still wearing the headset, the model could guess which keys were being pressed based on the brain waves being received.

“There were two scenarios we considered for our study—virtual keyboards and physical keyboards, and pins and passwords,” Neupane explained to Forensic Magazine. Participants were given four-digit PINs and six-character passwords to input during the study, Neupane said. Each was tested both with physical keyboards and with virtual keyboards, which involved clicking keys with a mouse on a keyboard layout displayed on screen, similar to the keyboard on a touchscreen device. The results revealed that the model, named PEEP, was able to predict each digit and character with much greater accuracy than a random guess, in both the PIN and password scenarios.
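The study itself is not public code, so the sketch below is an illustrative reconstruction of the attack pipeline described above, with assumed details: synthetic EEG windows stand in for real recordings, simple per-channel statistics stand in for the paper's features, and a random forest stands in for PEEP's actual classifier.

```python
# Hypothetical sketch of a PEEP-style keystroke-inference pipeline.
# Data, features, and model choice are assumptions, not the paper's.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def extract_features(window):
    """Reduce one EEG window (channels x samples) to a feature vector.
    Per-channel mean and standard deviation stand in for real features."""
    return np.concatenate([window.mean(axis=1), window.std(axis=1)])

# Training phase: the attacker shows CAPTCHA-like prompts, so each EEG
# window can be labeled with the key the victim pressed (digits 0-9).
n_train, channels, samples = 500, 14, 128
keys = rng.integers(0, 10, size=n_train)
# Synthetic windows whose statistics shift slightly with the pressed key.
windows = rng.normal(size=(n_train, channels, samples)) + keys[:, None, None] * 0.05
X_train = np.array([extract_features(w) for w in windows])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, keys)

# Attack phase: the victim types a PIN; the attacker classifies each window.
pin = [4, 7, 1, 9]
pin_windows = rng.normal(size=(4, channels, samples)) + np.array(pin)[:, None, None] * 0.05
X_attack = np.array([extract_features(w) for w in pin_windows])
guessed = clf.predict(X_attack)
print("guessed digits:", guessed.tolist())
```

The key point the sketch illustrates is the labeling trick: because the attacker controls the prompts during the "training" interaction, no access to the victim's actual secrets is needed to build the classifier.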

“We were able to identify the digits (in a PIN) with an accuracy of around 47 percent. So if you make a random guess, there are 10 digits, the accuracy would be around 10 percent. But our machine learning model, our classification model, was able to predict the numbers with an accuracy of 47 percent in this scenario,” Neupane said. When it came to six-character passwords, the accuracy rate was about 37 percent, Neupane said, also significantly greater than the approximately 3.8 percent chance of randomly guessing one out of 26 characters.
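Those per-character rates compound. Assuming, for the sake of a rough calculation, that each guess is independent and equally accurate (a simplification the study does not claim), the chance of hitting an entire secret on the first try works out as follows:

```python
# Back-of-envelope arithmetic on compounding per-character accuracy.
# Independence and uniform accuracy are simplifying assumptions.
p_digit, p_char = 0.47, 0.37   # reported per-digit / per-character accuracy

pin_hit = p_digit ** 4          # whole 4-digit PIN on the first try, ~4.9%
pin_random = (1 / 10) ** 4      # random baseline: 0.01%
pw_hit = p_char ** 6            # whole 6-character password, ~0.26%
pw_random = (1 / 26) ** 6       # random baseline: ~3e-7 percent

print(f"PIN first-try:      {pin_hit:.4%} vs random {pin_random:.4%}")
print(f"password first-try: {pw_hit:.4%} vs random {pw_random:.8%}")
```

Even under these crude assumptions, the model's first-try odds are hundreds of times better than random for a PIN and orders of magnitude better for a password, which is what makes the leak practically worrying rather than a curiosity.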

“Our learning models were pretty good at predicting these digits and characters. There was a good possibility of inferring the numbers and the characters from the brain signals,” Neupane concluded.

A malicious program “peeping” into our minds to steal our PINs and passwords may sound like a dystopian sci-fi scenario, but as Saxena explains, the model isn’t pulling the information directly from one’s thoughts or memories.

"We didn’t really have the users memorize the passwords. We had these passwords written down and they were just entering, so it was mostly due to the motor movements, like the hand movement, or their eye muscle movement as they were looking at the screen,” Saxena said. “These are also signals that are captured by these brain-computer interfaces, because as you’re wearing them on your head, they’re meant to kind of capture many of these tiny signals.”

Importantly, these signals are not universal; they vary from person to person. Saxena and Neupane’s research found that a generalized model trained on all the participants was less accurate at predicting the digits and characters, though still more accurate than random chance. The researchers also tested a clinical-grade EEG set on one participant and found it performed about the same as the recreational headset.

A time of widespread EEG headset hacking may seem far in the future, as these devices are currently more of a novelty than a common household staple, but the researchers say that as they grow in popularity, especially for gaming, hackers could find ways to trick users into giving up their brain signals to a similar learning model. A hacker could design a seemingly benign, EEG-headset-compatible gaming app that requests CAPTCHA inputs from the user, then uses what it learns to analyze the signals the user gives off when switching out of that app to log into a banking or social media account, still wearing the headset. In a real-world scenario, predicting PINs and passwords might be even easier, according to Saxena, as hackers who can accurately predict just a few characters can use dictionaries of commonly used passwords to guess the rest of a string.
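That last step, combining a few confidently predicted characters with a password dictionary, is straightforward to sketch. The word list and predicted positions below are invented for illustration, not drawn from the study:

```python
# Hypothetical sketch: filtering a dictionary of common passwords using
# a few high-confidence character predictions from the classifier.
dictionary = ["dragon", "monkey", "shadow", "master", "qwerty", "banana"]

# Suppose the model was confident only about positions 0 and 3.
partial = {0: "m", 3: "k"}   # position -> predicted character

candidates = [w for w in dictionary
              if len(w) == 6 and all(w[i] == c for i, c in partial.items())]
print(candidates)   # → ['monkey']
```

Two reliable characters were enough to cut a six-word list down to a single guess; against real dictionaries of millions of entries, each confidently predicted position shrinks the search space in the same way.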

Saxena says one solution to the threat could be for the device to transmit “noise,” or meaningless pre-recorded brain signals, to the system whenever a PIN or password is being input. Another is to require applications to ask users’ permission before recording their brain signals, letting them make an informed decision, similar to the permissions required for cameras and microphones. Lastly, users can protect themselves by using longer, more complex passwords, lessening the chance that a malicious program could guess the entire string.
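The first defense Saxena describes can be sketched as a gate in the device's data stream. This is a minimal illustration under assumed details (random noise rather than pre-recorded signals, and a simple flag for "password field focused"), not the researchers' implementation:

```python
# Minimal sketch of the noise defense: while a password field has focus,
# the driver drowns the real EEG window in noise before sending it on.
import numpy as np

rng = np.random.default_rng(1)

def stream_to_host(window, password_field_focused):
    """Pass EEG through normally; during password entry, obscure the
    signal so keystroke-correlated patterns cannot be recovered."""
    if password_field_focused:
        return window + rng.normal(scale=10.0, size=window.shape)
    return window

raw = rng.normal(size=(14, 128))   # one window, 14 channels x 128 samples
safe = stream_to_host(raw, password_field_focused=True)
print("signal obscured:", not np.allclose(raw, safe))
```

The design trade-off is that legitimate applications also lose signal during password entry, which is acceptable precisely because nothing legitimate should need brain data at that moment.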

“It’s hard to change the behavior of the users, but clearly if the users are using complex passwords, they’re using passwords that combine digits and characters and maybe special characters, that would probably already be a defense for many attacks, including the ones that we are discussing,” Saxena said.

Saxena and his team are continuing to conduct research on potential privacy threats posed by consumer EEG headsets. They are currently studying whether machine-learning models could predict whether a user has a mental condition, such as alcoholism, from their brain waves.