Mind Readers Can Steal Your Biometric Info

By now, most people have come to the conclusion that passwords suck. The momentum for alternate means of authentication is growing, and researchers are working on how to use biometric technology for mainstream login activities. As I have pointed out, there are a number of emerging biometric techniques: iris scans, facial recognition, and behavioral characteristics. All of these methods have flaws, which pose a problem for authentication non-repudiation.

In a post at IEEE Spectrum, Megan Scudellari writes that fingerprints can be stolen, iris scans spoofed, and facial recognition software fooled. In the wake of these flaws, researchers have turned to brain waves as the next step in biometric identification. Biometric identification is any means by which a person can be uniquely identified by evaluating one or more distinguishing biological traits. Unique identifiers include fingerprints, hand geometry, earlobe geometry, retina and iris patterns, voice waves, DNA, and signatures.

The researchers are racing to prove how accurately and accessibly they can verify a person’s identity using electroencephalograph (EEG) data. An EEG is a test that detects electrical activity in the brain using electrodes attached to the scalp. The IEEE article explains that as your eyes skim over these pixels and turn them into meaningful words, your brain cells are flickering with a pattern of electrical activity that is unique to you. These unique patterns can be used like a password or biometric identification. In fact, researchers have taken to calling them “passthoughts”.

Using brainwaves to authenticate people goes back a while. Back in 2012, I wrote about the Muse headband sensor, which promised to “create a specific brainwave signature or a password they would never have to say out loud or type into a computer.” More recently, psychologists and engineers at Binghamton University in New York achieved 100 percent accuracy at identifying individuals using brain waves captured with a 30-electrode skullcap. Scientists at the University of California at Berkeley developed a set of earbud sensors that achieved 80 percent accuracy.

The problem is that our brains don’t produce a single, clear signal that can be checked like a fingerprint. The article says our brains emit a messy, vibrant symphony of personal information, including one’s emotional state, learning ability, and personality traits. The author contends that as EEG technology becomes cheaper, more portable, and more ubiquitous—not only for identity authentication, but in apps, games, and more—there’s a high likelihood that someone will tap into that concerto of information for malicious purposes. Abdul Serwadda, a cybersecurity researcher at Texas Tech University, told Spectrum:

If you have these apps, you don’t know what the app is reading from your brain or what [the app’s creators are] going to use that information for, but you do know they’re going to have a lot of information

The Texas Tech team performed experiments to see if they could glean sensitive personal information from brain data captured by two popular EEG-based authentication systems. Surprise, surprise: they were able to capture sensitive personal information from brain data.


Mr. Serwadda presented his results at the IEEE International Conference on Biometrics. The Texas Tech researchers examined EEG-based authentication systems that claimed high levels of authentication accuracy. One system examined was based on the Berkeley model, and the second was based on the Binghamton model. The article explains that these EEG-based authentication systems utilize specific features, or markers, of brain activity to identify a person, like isolating the melody of a specific orchestra instrument to identify a song.
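In very simplified form, a marker-based system like this boils down to extracting a feature vector from the raw signal and comparing it against an enrolled template. The sketch below is an illustrative assumption, not the actual Binghamton or Berkeley pipeline: it uses average spectral power in the classic EEG frequency bands as the "markers" and a cosine-similarity threshold as the match test.

```python
import numpy as np

def band_powers(signal, fs=256):
    """Average spectral power in the classic EEG bands (delta, theta,
    alpha, beta) -- a simple stand-in for the markers real systems use."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    bands = [(0.5, 4), (4, 8), (8, 13), (13, 30)]
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands])

def matches_template(sample, template, threshold=0.95):
    """Cosine similarity between a fresh sample's features and the
    enrolled template; accept if above the (assumed) threshold."""
    a, b = band_powers(sample), template
    sim = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return sim >= threshold

# Enrollment: record a session and store its feature vector as the template.
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / 256)
enrolled = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
template = band_powers(enrolled)

# Login: a new sample with the same dominant alpha rhythm should match,
# while a signal dominated by a different band should not.
login = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
impostor = np.sin(2 * np.pi * 25 * t) + 0.1 * rng.standard_normal(t.size)
print(matches_template(login, template))
print(matches_template(impostor, template))
```

The privacy issue the Texas Tech team exploited follows directly from this design: the same band-power features that distinguish one person from another also correlate with traits the user never consented to reveal.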

The researchers wanted to see if those markers also contained sensitive personal information—in this case, a tendency for alcoholism. They ran old EEG scans, which included both alcoholics and non-alcoholics, through the systems. Using the brain wave data, they were able to accurately identify 25% of the alcoholics in the sample. That’s 25% of people who just lost their privacy. Mr. Serwadda said:

We weren’t surprised, because we know the brain signal is so rich in information … But it is scary. [Wearable brain measurement] is an application that’s just about to go mainstream, and you can infer a lot of information about users.

The researcher said that malicious third parties could mine brain data to make inferences about learning disabilities, mental illnesses, and more. He told Spectrum, “Imagine if you made these things public, and insurance companies became aware of them … It would be terrible.”

IOActive senior consultant Alejandro Hernández told The Register that dangerous vulnerabilities exist in EEG kits. EEG’s security problems are depressingly familiar results of bad software design, Hernández said. EEG devices are vulnerable to man-in-the-middle attacks, as well as less-severe application vulnerabilities and ordinary crashes. Mr. Hernández says:

… some applications send the raw brain waves to another remote endpoint using the TCP/IP protocol, that by design doesn’t include security, and therefore this kind of traffic is prone to common network attacks such as man-in-the-middle where an attacker would be able to intercept and modify the EEG data sent.
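The attack Hernández describes is easy to stage against any plaintext TCP stream. Below is a self-contained local sketch with a stub “headset,” an attacker relay, and a client app; the names, ports, and message format are all invented for illustration:

```python
import socket
import threading

def headset(ready, info):
    """Headset stub: streams one raw EEG sample over plain TCP --
    no encryption, no authentication, exactly the flaw described above."""
    srv = socket.create_server(("127.0.0.1", 0))
    info["dev_port"] = srv.getsockname()[1]
    ready.set()
    conn, _ = srv.accept()
    conn.sendall(b"EEG:alpha=10.2uV")
    conn.close(); srv.close()

def mitm(ready, info, stolen):
    """Attacker relay sitting between app and headset: plaintext traffic
    can be both read (stolen) and silently modified in transit."""
    ready.wait()
    srv = socket.create_server(("127.0.0.1", 0))
    info["mitm_port"] = srv.getsockname()[1]
    info["mitm_ready"].set()
    conn, _ = srv.accept()
    upstream = socket.create_connection(("127.0.0.1", info["dev_port"]))
    data = upstream.recv(1024)
    stolen.append(data)                            # passive interception
    conn.sendall(data.replace(b"10.2", b"99.9"))   # active tampering
    conn.close(); upstream.close(); srv.close()

ready, stolen = threading.Event(), []
info = {"mitm_ready": threading.Event()}
threading.Thread(target=headset, args=(ready, info)).start()
threading.Thread(target=mitm, args=(ready, info, stolen)).start()
info["mitm_ready"].wait()
app = socket.create_connection(("127.0.0.1", info["mitm_port"]))
seen = app.recv(1024)
app.close()
print(stolen[0])   # the attacker's copy of the raw sample
print(seen)        # the tampered reading the app actually received
```

The fix is equally ordinary: wrap the stream in TLS so the relay can neither read nor alter it without detection.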

The IOActive consultant found that components like the acquisition device, middleware, and endpoints lack authentication, meaning an attacker can connect to a remote TCP port and steal raw EEG data. That same flaw lets attackers pull off the more dangerous replay attacks.
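One standard way to close both holes is to authenticate each packet with a keyed MAC and a monotonically increasing sequence number, so a tampered packet fails verification and a replayed one arrives with a stale number. This is a minimal sketch; the framing format and shared-key setup are assumptions, not IOActive’s recommendation:

```python
import hashlib
import hmac
import os
import struct

SECRET = os.urandom(32)  # shared key, assumed established at pairing time

def frame(seq, eeg_payload, key=SECRET):
    """Authenticate an EEG packet: prepend a sequence number and append
    an HMAC-SHA256 tag computed over (seq || payload)."""
    header = struct.pack(">Q", seq)
    tag = hmac.new(key, header + eeg_payload, hashlib.sha256).digest()
    return header + eeg_payload + tag

class Receiver:
    """Rejects tampered frames (bad MAC) and replayed frames (stale seq)."""
    def __init__(self, key=SECRET):
        self.key = key
        self.last_seq = -1

    def accept(self, packet):
        header, payload, tag = packet[:8], packet[8:-32], packet[-32:]
        expected = hmac.new(self.key, header + payload,
                            hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            return False                      # modified in transit
        (seq,) = struct.unpack(">Q", header)
        if seq <= self.last_seq:
            return False                      # replayed packet
        self.last_seq = seq
        return True

rx = Receiver()
pkt = frame(1, b"raw-eeg-sample-bytes")
print(rx.accept(pkt))        # fresh, authentic packet is accepted
print(rx.accept(pkt))        # an exact replay is rejected
tampered = pkt[:10] + b"X" + pkt[11:]
print(rx.accept(tampered))   # a modified payload fails the MAC
```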

Unfortunately, the researchers do not have a solution for how to secure such information—though in the study, compromising a little on authentication accuracy did reduce the ability to detect who was an alcoholic. Mr. Serwadda hopes other research teams will now take privacy, and not just accuracy, into account when optimizing such systems. He concludes, “We have to prepare for the movement of brain wave [assessment] into our daily lives.”

Rb-

Given the willingness of app developers to sell or share any info with any third party, and the unwillingness of the public to take even basic steps to secure their info online, everyone’s deepest personal information could be hacked in the future.

Another problem with passthoughts, identified by UC Berkeley’s John Chuang, is that stress, mood, alcohol, caffeine, medicine, and mental fatigue can change the electrical signals the brain generates.

Despite advances in logging in with your mind, there might always be a need for an old-fashioned eight-plus character phrase with no spaces. “Passwords will never go away,” says Berkeley’s Chuang. He reasons that for a computer, a typed password may be the easiest way to verify identity, while a finger swipe may be best for a touch screen.

But we need to think beyond those to future devices—wearables, for instance—for which there will be neither a keyboard nor a touch screen. “For each device, we must figure out what are the most natural, intuitive ways to tell the device that we are who we are,” Professor Chuang says. Going directly to the brain seems like an obvious choice.

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.
