Cracked Glass: Why Wearables Are The Next Security Maelstrom

Google Glass has plenty of issues. There's a fair chance you'll get laughed at for wearing it, or at the very least stared at. Battery life won't last you a day, and the list of things you can actually do with the wearable is limited. For all the Saturday Night Live skits and "Glasshole" jokes, though, wearables aren't going away, and that means a new set of security problems for those whose job it is to keep data safe. We sat down with Marc Rogers, long-time threat intelligence expert and current Principal Security Researcher at Lookout Mobile Security, to talk wearable risks, what happens when your Nest turns against you, and the big Glass elephant in the room.

At first glance, Rogers seems deceptively innocent. When we meet up with him, he's only been in the country for a few hours, but he's already assembled a photo gallery of potentially exploitable tech: displays in the airport that have inadvertently booted into the wrong mode, computers left unlocked out in public, even ATMs that have somehow glitched and left themselves exposed. As he flicks through the casually-snapped pictures, he gives a brief explanation of what he might be tempted to try were he aiming for some unofficial use.

That's before you get into his nondescript black holdall, packed with devices of varying degrees of nefariousness. There are high-power NFC scanners that can pull the data from a contactless credit card from a meter or so away, for instance; Rogers points out that, back when banks were first putting NFC chips into cards, they couldn't imagine tag-reading hardware being broadly available. Now, most high-end Android phones can scan bank cards with the right app loaded.
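
To give a sense of how low that bar now is, below is a minimal, hypothetical sketch of the kind of Android reader Rogers describes – the class name is ours, and it assumes a handset with NFC reader mode (Android 4.4 and later) plus a card speaking the standard EMV contactless protocol. All it does is select the card's payment directory; a real read would go further, but every API involved is public and available to any app.

```java
import android.app.Activity;
import android.nfc.NfcAdapter;
import android.nfc.Tag;
import android.nfc.tech.IsoDep;
import java.io.IOException;

// Hypothetical demo class – not from Lookout or Google. It registers for NFC
// reader mode and selects the card's contactless payment directory ("PPSE").
public class CardReaderDemo extends Activity implements NfcAdapter.ReaderCallback {

    @Override
    protected void onResume() {
        super.onResume();
        NfcAdapter adapter = NfcAdapter.getDefaultAdapter(this);
        if (adapter != null) {
            // Listen for ISO 14443 A/B tags – the family contactless bank cards use.
            adapter.enableReaderMode(this, this,
                    NfcAdapter.FLAG_READER_NFC_A | NfcAdapter.FLAG_READER_NFC_B, null);
        }
    }

    @Override
    public void onTagDiscovered(Tag tag) {
        IsoDep isoDep = IsoDep.get(tag);
        if (isoDep == null) return; // not an ISO-DEP (EMV-capable) card
        try {
            isoDep.connect();
            // SELECT "2PAY.SYS.DDF01", the EMV contactless payment directory.
            byte[] selectPpse = {
                    (byte) 0x00, (byte) 0xA4, (byte) 0x04, (byte) 0x00, // SELECT by name
                    (byte) 0x0E,                                        // 14 bytes follow
                    '2', 'P', 'A', 'Y', '.', 'S', 'Y', 'S', '.', 'D', 'D', 'F', '0', '1',
                    (byte) 0x00
            };
            byte[] fci = isoDep.transceive(selectPpse);
            // The response lists the card's payment applications; a full read would
            // SELECT one of those application IDs and issue READ RECORD commands to
            // reach the card number and expiry – no PIN or secret required.
        } catch (IOException e) {
            // Card left the field mid-read.
        } finally {
            try { isoDep.close(); } catch (IOException ignored) { }
        }
    }
}
```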

However, Rogers also has with him Google's Glass, the sleek wearable computer that has left geeks giddy and prompted no small amount of concern about privacy in a world where cameras are wearable. Glass has finally made it into the hands of developers over the past few months, albeit in limited "Explorer Edition" form with a $1,500 price tag; the number of units in the wild may still be small, but the privacy furor it has whipped up is exponentially greater.

[aquote]Fears over who's taking your photo are only a small aspect of the Glass risk[/aquote]

As Rogers sees it, fears around who might be taking your photograph, and whether you might know that it's happening, are only a small aspect of the new risks Glass presents. Lookout is already working with Google on one security flaw the team discovered in the wearable, he admitted, though he declined to give exact details while the fix is still a work in progress.

"There are some issues with Glass, we can't actually talk about what they are – responsible disclosure – we're working with Google to fix it," Rogers told us. "But what I can say is that it's a new piece of technology which is used in a new way, and the vulnerabilities are out there."

Some of Glass' stumbles are clear from the outset, and have a strong whiff of "beta" about them. The wearable lacks any sort of onboard security, for instance – there's no way to lock it if you have to take it off, leaving third-party developers to patch the hole unofficially – and its focus on cloud-based processing means data is constantly being shuffled off the headset and to someone else's servers. They're not, though, the biggest issues.

As the Lookout security researcher points out, what potentially leads to loopholes in Glass – and, indeed, other wearables – is users and developers approaching the headset with the same mindset as they might, say, a smartphone. Even though the internal components may be broadly the same, there's potentially a world of difference in how they could be used – or misused.

"You have to think about anything you start using for a new purpose: what's the threat model around that?" Rogers asks. "How would you change the economy of it for a bad guy? Is it now more interesting; is it collecting interesting data? Or, conversely, is it more vulnerable? Are you exposing something that was never exposed before?"

The fact that Glass will be out of a pocket or bag far more often than a smartphone, can presume a persistent data connection, and blurs the line between when you know a person is using their device and when they're not, adds up to a piece of equipment with new danger potential, Rogers says. "Things aren't just victims – they're also the aggressors. I can wear Glass in situations where people are not expecting a computer. And that means I can do interesting stuff."

"The first wearable computer was designed in 1960, by a mathematics professor, who used it to break blackjack at casinos" Rogers points out, referring to mathematician Edward O. Thorp whose co-development of the wearable with Claude Shannon earned him membership of the Blackjack Hall of Fame. Thorp's computer – roughly the size of a cigarette pack – could be used to predict where a roulette ball would land, giving the user an expected gain of around 44-percent. "People weren't expecting someone to be able to take a computer into that environment," Rogers says.

[aquote]Glass could identify every security camera and plot you a path through[/aquote]

In fact, as he reels off the first few potential uses of Glass for nefarious purposes, it starts to dawn on you that having your photo taken at the bus stop when you weren't expecting it might not be the worst thing a wearable user could do to you. "Who's to say what Glass will allow. Industrial espionage, identifying flaws in buildings; scoping out security positions," he lists. "It would be easy to modify Glass to identify every single security camera, and plot you a path you could walk through a shopping center where you're not going to be recorded."
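
Strip away the eyewear and that last trick is just a path search over mapped blind spots. The toy sketch below – a hypothetical illustration, not anything Glass ships with – treats the shopping-center floor as a grid, marks every cell a camera can see as off-limits, and runs a breadth-first search for the shortest unrecorded route.

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Queue;

public class BlindSpotRoute {
    // Returns the length of the shortest camera-free route, or -1 if every
    // path crosses at least one camera's field of view.
    public static int shortestUnseenPath(boolean[][] seenByCamera,
                                         int startR, int startC, int goalR, int goalC) {
        int rows = seenByCamera.length, cols = seenByCamera[0].length;
        int[][] dist = new int[rows][cols];
        for (int[] row : dist) Arrays.fill(row, -1);
        Queue<int[]> queue = new ArrayDeque<>();
        dist[startR][startC] = 0;
        queue.add(new int[]{startR, startC});
        int[] dr = {1, -1, 0, 0}, dc = {0, 0, 1, -1};
        while (!queue.isEmpty()) {
            int[] cell = queue.poll();
            if (cell[0] == goalR && cell[1] == goalC) return dist[goalR][goalC];
            for (int i = 0; i < 4; i++) {
                int r = cell[0] + dr[i], c = cell[1] + dc[i];
                if (r >= 0 && r < rows && c >= 0 && c < cols
                        && !seenByCamera[r][c] && dist[r][c] == -1) {
                    dist[r][c] = dist[cell[0]][cell[1]] + 1;
                    queue.add(new int[]{r, c});
                }
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        boolean[][] seen = {
                {false, true,  false},
                {false, true,  false},
                {false, false, false}
        };
        // Two cameras cover the top of the middle column; the only unrecorded
        // route from top-left to top-right loops around the bottom: 6 steps.
        System.out.println(shortestUnseenPath(seen, 0, 0, 0, 2));
    }
}
```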

Could the answer be some sort of low-level disabling of the camera hardware, with Google perhaps entrusting the keys to the system to some morally-minded party? Rogers isn't sure that's necessarily the way forward, despite the calls for more sweeping controls to stop restroom photography or conversations being unexpectedly recorded.

"I think such an idea is interesting. As a technologist, I'm dubious to see it implemented" he told us. "And also, you have to consider the potential for abuse. Do you want some kind of secret deactivation switch built into your technology? What happens if some bad guy is able to turn it on? There's no way I can think of, off the top of my head, that couldn't be abused."

Moreover, focusing all your efforts on one device – in this case Google's attempt at a wearable with Glass – or on one component – like the camera – misses the bigger picture. We've already seen multiple projects come out of the woodwork since Project Glass broke cover, many based on Android but otherwise with completely different architecture from what Google has developed. That, Rogers argues, leads to a classic case of paranoid myopia.

"What is the difference between wearing Google Glass and wearing a pinhole camera? I'd argue it's more easily accessible to access [a pinhole camera]. That kind of stuff is mainstream," he points out. "What Glass has done is draw people's attention to new concepts. So, if people are talking about the risk of Glass, in reality that risk has been around much longer, it's just Glass is making you think about it."

If anything, then, it's the ubiquity of the underlying OS that Rogers thinks should give security experts sleepless nights, not the wearable form-factor. "Convergence is not just your device's functions: most of them are running the same OS, or variants of the same OS. If you're talking about Android, your phone's running Android, your Glass is running Android, your TV is running Android," the researcher points out. "And if you break it down even further, with Linux: your phone's running Linux, your Glass is running Linux, your TV is running Linux, your thermostat is running Linux!"

That may sound flippant, but if knowledge is power then your HVAC controls might know more about your household movements than you'd be comfortable with. We're a long way from the days of a mechanical switch by the door in the hall.

[aquote]Hacking a Nest? I now own your house[/aquote]

"Some of the things that we're connecting, and allowing to build huge amounts of data... I keep on coming back to the Nest thermostat because it's an awesome example," Rogers says. "Because the Nest knows how many people are in your house, it knows when they come home, it knows what temperature they like it, it knows when you're on holiday, it knows if you've got any pets, it knows if you've got any wireless networks, it knows what's on the wireless networks, and it knows how to connect to the wireless networks. So suddenly, hacking a thermostat – not cool. Hacking a Nest? I now own your house."

SlashGear asked Nest about the security systems in its eponymous smart thermostat, and the company pointed us to its official privacy statement. There, among other things, Nest says it uses "industry-standard methods" to secure the information the thermostat collects "while transmitted over your home network and through the Internet to our cloud servers." Data on the device itself "is encrypted and cannot easily be accessed," the company said, though it did not go into further detail on the nature of that security.

Neither Rogers nor Lookout is suggesting we tear our thermostats from the wall. However, he is calling for more intelligent assessment of how today's devices modify the security landscape. "Thinking about it, if you change the purpose of these things [like smart thermostats], how do you assess that?" he asked rhetorically. "Look at all the new bits of data this thing has, and ensure that you put in appropriate levels of security onboard. Is there a patch management process in place? You can no longer say, well, these things are just updated as firmware; you need a scalable process. These are all things that we have to think about, and I don't see many people doing it."

That means keeping an open mind to low- and high-tech ways to evolve security as the devices we use themselves evolve. "If I'm building a shopping mall, I'll consider the physical threats to it ... Glass is the same, so if you are running an organization where you're worried about, say, data being stolen, identify Glass as a theft risk," Rogers suggests. "I know of a company that physically destroys the lenses of any camera-phone they buy, because that's the most effective way to ensure that that camera-phone can't be used to record sensitive documents."

As we pointed out, and Rogers conceded, taking a hammer to the camera isn't really a practical mainstream way to balance privacy and security with future wearables. Instead, we need a combination of awareness – education as to risks and responsibilities, even for those who may never wear something like Glass – and the technology to make it efficient.

"You have to assess what the risk is from people walking through. If it's someone walking through an enterprise, and the risk is data theft, then you have a policy in place to make sure awareness training" he sketched out to us. "Make sure everyone knows that, if you see a device like this, report that kind of suspicious behavior and you intercept the person that's doing it. You have surveillance cameras around that can detect it, maybe even software that identifies someone is wearing it."

[aquote]Anti-virus and anti-malware isn't sexy[/aquote]

We're arguably a long way from that. At the moment, there's a pervading sense that smartphones are immune to hacking or exploits, much as there was among the Mac user-base a few years back. Anti-virus and anti-malware software isn't sexy, and even if you offer it free (as Lookout does with its basic Mobile Security package) there's no guarantee that users will care enough to install it.

Lookout has a number of deals with carriers in North America and Europe, preloading its software on phones, but most handsets are still going out into the wild unprotected by any vendor. In fact, many users don't even have a PIN locking their handset, and are quite willing to hit "OK" without reading the security run-down of any third-party app.

As Rogers sees it, you can either be disheartened by that, or you can use it as a wake-up call: getting realistic both about what people need and what they're willing to accommodate in their mobile lives. "You have to think about what's a scalable process for assessing the security risk of new things as they come online," he said, "and that allows us to develop that process, to manage the security process on that thing moving forward, bearing in mind that security doesn't stand still."

Wearables aren't going anywhere, and neither is the growing need to develop ways to protect them, and protect against them. "Right now, there's no answer," Rogers concluded, "but it's something we have to think about."
