There Is No "Simple Trick" To Privacy
As 2019 draws to a close, if the past twelve months – indeed, the past decade – have taught us anything, it's that you can't take privacy for granted. While there have been plenty of high-profile hacks, leaks, and data exposed through general mismanagement by companies large and small, the reality is that much of the time our personal information gets distributed not because it's stolen, but because we don't take sufficient care with it.
Already we're looking to close out 2019 with another corporate confession of misappropriated data. Security camera company Wyze is behind the latest mea culpa in our inbox, admitting that its databases were exposed for a time, and accessed by an unknown third party.
Wyze, though, is by no means the only company that has discovered, to its embarrassment, that its ability to secure the data its customers share is far less impressive than its ability to collect it in the first place. Certainly, the fact that the company is in the security field makes the irony more stinging. Yet far bigger companies than the connected camera and alarm startup have had to come, cap in hand, and tell their users that a screw-up has occurred.
Embarrassment, though, is arguably an insufficient motivator for meaningful change. If there's one thing companies understand – and quickly – it's an impact on their bottom line. There, though, we're not living up to our end of the bargain.
Take Facebook, for example. Its track record over the past decade when it comes to handling your personal information has been dire, frankly. Investigative attention from regulators has forced it, seemingly grudgingly, to massage its privacy policies and the ways in which it uses our data, but it's hard to escape the feeling that this is the digital equivalent of closing the stable door after the horse has bolted.
Facebook growth-hacked its way to dominance, for instance, by using the phone number you provided for extra security to encourage more people to add you as their friend. Now it won't do that any more, which is good, but the damage is already done. And I'm not sure that it would ever have stopped misusing two-factor authentication numbers in that way, had it not suddenly found itself under the microscope for privacy mismanagement.
You can't just blame Facebook, though. Google, Apple, Microsoft, and many other vast firms have equally vast databases of our data. Their privacy policies undoubtedly differ, but do you really know how? We're just as guilty when it comes to clicking "accept" on privacy policies and user agreements without reading them, too focused on getting to use a new toy or service to consider the longer-term implications of what we're giving permission to.
Companies – and hackers – have already figured out that personal data is arguably the most valuable currency out there today. Adages like "if you're not paying for it, you're the product" may be commonly quoted, but there's little indication that we're taking the sentiment to heart. Sure, there are meaningful barriers like privacy agreements dripping in legalese and a dozen pages long, but even so, it's tough to argue that most of us are doing our due diligence.
Today, with likely hundreds of thousands of smart speakers freshly installed in homes across the world after Amazon, Google, and others pushed them so eagerly over the holidays, it's easy to assume that tapping the microphone-mute button or sliding a camera shutter is enough to secure your privacy. The reality is different. You could well argue it's not sensible to bring an always-on microphone into your home in the first place, but any feeling of wellbeing from knowing that Alexa, the Google Assistant, or whichever other AI isn't listening to you is outweighed by the rest of the data you're freely sharing every time you go online.
There are signs that changes are afoot. You may have noticed an uptick in emails landing in your inbox, from companies notifying you that they've updated their privacy policies. That's down to the imminent arrival of the California Consumer Privacy Act (CCPA), which comes into force on January 1st.
The CCPA won't change what data companies can collect about you: they'll still be able to gather up as much as you'll willingly give them. What it changes, though, is user access to that data. Those in California will be able to find out what personal data a company has saved on them, access it, request it be deleted (with a few security-minded exceptions), and not only discover if it has been sold or disclosed, but deny permission for such a sale.
Though it'll be the toughest consumer privacy law in the US, it's still not perfect. The CCPA only covers information shared by the consumer; if a company purchases data, or gathers it from publicly-available sources, the law doesn't apply. A business needs to be either large (with gross annual revenues above $25 million) or deal significantly with personal information (either buying or selling the data on 50,000 or more consumers or households, or earning more than half its annual revenue from the sale of such information) to be subject to the new rules.
Other limits fall around the repercussions of contravening the CCPA. If a company doing business in California suffers a data breach, and can't demonstrate it had maintained "reasonable security procedures and practices," it can be fined and become the target of class-action lawsuits. However, there's no explicit punishment for companies that sell data even after a user has told them not to.
The CCPA, in theory, only applies in the state of California. However, several big companies – including Microsoft – are adopting its mandates nationwide, even as others protest what they claim is a lack of clarity from the law's authors. And where rule-breakers persist, it'll be down to organizations like the office of California's Attorney General to actually step up and enforce the CCPA's requirements. It's unclear whether federal legislators have the appetite to roll out a US-wide version.
And there, again, we come to individual responsibilities, the counterpart to our digital rights. Rules like the CCPA may outline our expectations from the companies we entrust with our digital lives, but all the transparency in the world about privacy policies and access to records is for naught if we don't actually read them. The CCPA might force disclosure of data being collected, but that's only useful if we ourselves read that disclosure, and make balanced decisions about who we'll then share that data with.
In short, there's no "simple trick" to ensuring privacy and the security of your personal information. Even starting out 2020 by deleting your Facebook account won't be enough to keep you "safe" online. Rules like the CCPA and the proposed Online Privacy Act may end up giving us the tools to control how our information is shared and monetized, but only if we acknowledge that there's no quick fix to taking care of our most important data.