The EEG Headband & Security
Wearable tech is part of our world, and increasingly bodyhackers are finding ways to enhance themselves using these common devices. Many of these devices incorporate off-the-shelf tech, which is convenient. But IOActive consultant Alejandro Hernandez gave a talk at DEF CON 23 about one example of common medical technology with considerable security weaknesses—the increasingly ubiquitous electroencephalography (EEG) headset.
The EEG is something of a technomedical curiosity. A (usually) non-invasive means of tracking voltage fluctuations in the brain with a series of electrodes mounted on the scalp, this technology offers a window into the functioning of the brain. Unfortunately, as it turns out, brains are complicated, and so the EEG as mind-reading device is somewhat limited in utility. A neurologist present at Hernandez’s talk described trying to use an EEG to detect a thought as “picking out one conversation in a football stadium from the bridge of the Goodyear Blimp.”
That said, the EEG has a number of common applications. In recreation, it has found use in the Shippo tail, as well as some common toys: Hernandez made use of the NeuroSky MindWave, a common example of a home EEG. In medical contexts, EEGs have been used as a diagnostic measure for sleep disorders, coma, epilepsy, and even brain tumors and strokes. For the most part the higher-stakes medical outcomes are now handled by imaging studies, but the EEG has a place in neurology, particularly because wearable EEGs can be sent home with patients to record activity over a longer period.
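For readers curious about how a device like the MindWave actually exposes its data, NeuroSky's desktop software (the ThinkGear Connector) serves readings as plain JSON over a local TCP socket. The Python sketch below shows roughly what consuming that stream looks like. The port number, handshake message, line delimiter, and field names follow NeuroSky's published ThinkGear Socket Protocol, but treat them as assumptions and check them against your own installation. Note that everything on this socket travels as unencrypted, unauthenticated plaintext.

```python
import json
import socket

def parse_reading(line: str) -> dict:
    """Pull the eSense summary values (attention/meditation) from one
    JSON line, returning an empty dict if the line has none."""
    data = json.loads(line)
    return data.get("eSense", {})

def stream_readings(host="127.0.0.1", port=13854):
    """Yield eSense readings from a running ThinkGear Connector.
    Port 13854 is the documented default; adjust if yours differs."""
    with socket.create_connection((host, port)) as sock:
        # Ask the connector for JSON output instead of raw binary.
        sock.sendall(b'{"enableRawOutput": false, "format": "Json"}\n')
        buf = b""
        while True:
            buf += sock.recv(4096)
            # The protocol documentation describes \r-terminated lines.
            while b"\r" in buf:
                line, buf = buf.split(b"\r", 1)
                if line.strip():
                    yield parse_reading(line.decode())
```

Nothing in this exchange identifies or authenticates either party, which is exactly the design gap Hernandez's talk highlights.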
It is these devices, medical and recreational, that Hernandez was most interested in—mostly because they are an area of active research as learning aids and control interfaces. “One quick example would be an EEG device connected to a local computer to process the brain waves which in turn is sending instructions to a remote device, let’s say a drone. Currently, many of these EEG protocols are implemented over TCP/IP, that as we know, by design doesn’t count with encryption or authentication, so, if the EEG technology implemented in this example doesn’t count with these security mechanisms in the application layer, perhaps an attacker in the network could sniff valid EEG traffic, make a local copy of it, and later on, the attacker could modify this data and send it back to the drone.” For those unfamiliar with the term, this is a “replay attack”: valid data is captured and then either sent again later or modified to produce a different result. Pulling this off against a live EEG session would be difficult, but substituting a recording from an entirely different brain could have a significant impact on the health care choices a provider makes. On top of that, the idea of hacking someone’s Shippo tail is hilarious on its own, but it reveals a broader lack of concern with security. Without a significant move toward secure implementations, any bodyhacker using commercially available EEG gear is at risk of these exploits.
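To make the mechanics of a replay attack concrete, here is a deliberately toy Python sketch. The packet format (a sequence number plus one raw sample) is invented purely for illustration; the point is that a receiver with no cryptographic check cannot distinguish a sniffed-and-resent packet, or a modified one, from fresh legitimate traffic.

```python
import struct

def make_packet(seq: int, sample: int) -> bytes:
    # Hypothetical unauthenticated EEG packet: a 4-byte sequence number
    # followed by a 2-byte raw sample value. Invented for illustration.
    return struct.pack(">IH", seq, sample)

def receiver_accepts(packet: bytes) -> bool:
    # With no MAC, signature, or session secret, the receiver can only
    # verify framing: any well-formed 6-byte packet looks legitimate.
    return len(packet) == 6

captured = make_packet(seq=42, sample=512)  # attacker sniffs real traffic
replayed = captured                         # ...resends it later unchanged
forged = struct.pack(">IH", 42, 9999)       # ...or rewrites the sample value
assert receiver_accepts(replayed) and receiver_accepts(forged)
```

Real protocols are messier than this, but the failure mode is the same: if acceptance depends only on the shape of the data, captured data can be replayed or rewritten at will.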
Hernandez is realistic about the scope of the problem. “I only did research on EEG software in terms of acquisition, processing, transmission and storage, in order to identify different flaws that we currently know in other technologies, such as the lack of encryption/authentication and application vulnerabilities (memory corruption, Denial of Service, etc.).” The major issue is that EEG transmission standards simply haven’t been updated. As Hernandez says, “10 years ago only a few people were talking about ICS/SCADA security, nowadays, there are still many PLCs crashing with basic malformed packets and many critical systems vulnerable to replay attacks (no authentication).” These are similar to the replay attacks described above, but often on a much larger scale. ICS (Industrial Control Systems) and SCADA (Supervisory Control and Data Acquisition) are computer systems that monitor and control industrial processes. The point is that we have known about vulnerabilities in these systems for a long time, yet to this day very basic attacks still work against them. In medicine, the Meaningful Use standards in the US are changing the way providers handle electronic health information (and none too soon), but for educational devices like home EEGs, implementing security is left up to the individual.
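None of this requires waiting for a new standard; application-layer authentication alone goes a long way. Below is a minimal Python sketch, using only the standard library's hmac module, of how a shared key plus a monotonically increasing sequence number defeats both tampering and replay. The packet layout and pre-shared key are hypothetical; the technique itself (a message authentication code with a freshness check) is a standard one.

```python
import hashlib
import hmac
import struct

KEY = b"pre-shared-secret"  # hypothetical key provisioned on both ends

def sign_packet(seq: int, sample: int) -> bytes:
    # Body: 4-byte sequence number + 2-byte sample, then a 32-byte HMAC tag.
    body = struct.pack(">IH", seq, sample)
    tag = hmac.new(KEY, body, hashlib.sha256).digest()
    return body + tag

def verify_packet(packet: bytes, last_seq: int) -> bool:
    body, tag = packet[:6], packet[6:]
    expected = hmac.new(KEY, body, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return False  # tampered: the tag no longer matches the body
    seq, _sample = struct.unpack(">IH", body)
    return seq > last_seq  # replayed packets reuse an old sequence number

fresh = sign_packet(seq=10, sample=512)
assert verify_packet(fresh, last_seq=9)         # genuine and in order
assert not verify_packet(fresh, last_seq=10)    # replay of an old packet
tampered = fresh[:4] + b"\x27\x0f" + fresh[6:]  # forged sample bytes
assert not verify_packet(tampered, last_seq=9)  # tag check catches it
```

An attacker without the key cannot produce a valid tag for modified data, and resending an old packet fails the sequence check; this is the kind of basic protection still missing from many of the systems Hernandez describes.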
The real solution, however, comes from good design, not grudging regulatory compliance. Hackers and bodyhackers alike have to be aware that the tools they use are only as secure as their designers knew how to make them. Per Hernandez, “From the deep of my heart, I think that secure technical designs, programming and implementations, are much better than just a marked checkbox in a legal A4 piece of paper.”
Alejandro Hernandez is a consultant for IOActive and is reachable at @nitr0usmx.