Security of Wearables

By now, we’re becoming more aware of the security risks that wearables and the ‘internet of things’ pose to us and our data. Whether it’s Fitbit users having their sexual activity logged and viewable online, or people realizing that fitness trackers can prove someone’s whereabouts (and thus innocence or guilt) through GPS tracking, consumers are increasingly conscious that our quantified-self apps and devices can leave a lot of our PII (personally identifiable information) easily compromised.

Symantec recently released a report covering the security concerns around wearable devices, apps, and medical office equipment, along with its calls to action for ensuring that these products are built securely in the future. Its research found that 52% of apps or devices used for personal tracking didn’t even have a privacy policy, and that 20% sent login information in cleartext, with passwords plainly readable rather than encrypted, leaving users vulnerable.

So whether you’re using a wearable device or a personal tracking app on your phone, there are numerous points where your information can be compromised, from the device broadcasting over Bluetooth LE to an app syncing with its database. Devices that communicate over Bluetooth usually rely on a pairing PIN, and (setting aside any backing up to the cloud) an attacker would need to be close enough to ‘sniff’ the traffic. These PINs tend to be around six digits long, which takes a surprisingly short amount of time to brute-force. With some easily available tools, an attacker can quickly recover the six-digit PIN, disable the encryption between devices (if they even have it), and read whatever information passes between them. That can include login credentials, plain-text messages, photos, or GPS location data.
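To see why a six-digit pairing PIN offers so little protection, it helps to look at the numbers. The sketch below is purely illustrative: `check_pin` is a hypothetical stand-in for verifying a candidate PIN against captured pairing traffic, and the guess rate is an assumed figure, not a measurement of any real tool.

```python
def check_pin(candidate: str, secret: str) -> bool:
    """Hypothetical stand-in for testing a candidate PIN against
    sniffed pairing traffic (illustration only)."""
    return candidate == secret

def brute_force_pin(secret: str) -> str:
    """Try every six-digit PIN, 000000 through 999999, until one verifies."""
    for n in range(1_000_000):
        candidate = f"{n:06d}"  # zero-padded, e.g. 000042
        if check_pin(candidate, secret):
            return candidate
    raise ValueError("PIN not found")

# A six-digit PIN has only 10^6 possibilities. Even at an assumed
# modest 5,000 guesses per second, the whole keyspace falls in minutes.
keyspace = 10 ** 6
rate = 5_000  # assumed guesses per second
worst_case_seconds = keyspace / rate  # 200 seconds, under 4 minutes

print(brute_force_pin("482913"))
```

The point of the arithmetic is that the keyspace, not the attacker’s skill, is the bottleneck: exhausting every possible PIN is faster than brewing a pot of coffee.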

Symantec, along with consumers and other companies, has called for future apps and devices to be designed with security in mind throughout the entire production cycle, rather than leaving it as an afterthought. In a recent survey of healthline.com readers, more than 45% of wearable or app users were concerned about the privacy of their data, and not without cause: HP studied 10 ‘industry leading’ smartwatches and found that 9 used no encryption at all, and that all had at least one “area that raises security concerns.” [LINK 3]

This doesn’t mean we need to stop using our quantified-self tools, though. By following best practices, such as choosing a unique, complex password for each account, and by understanding the privacy policies you are agreeing to, users can stay on top of their data risks. This, combined with the calls to action raised by consumers and corporations alike, will continue to encourage device and app developers to take a realistic look at the security of their products during the development and deployment life cycles.

Susan Butler
