There's probably something you do right now you wouldn't really want everyone to know about. Maybe you're letting a Fitbit gather dust while you eat Doritos and watch The Good Wife (understandable). Maybe you're in the habit of driving around at 3 AM when you can't sleep. Whatever you do, if you're doing it while using "internet of things" devices, those private vices may not be so private.
Without proper oversight, companies that make gadgets and appliances that record our movements and behaviors could share profiles on your bad habits with your bosses, insurers, and creditors. And in many cases, there are no laws stopping them.
This sounds, I realize, like something culled from a dystopian novel. But the Federal Trade Commission's report on the internet of things makes it very clear that data-gathering sensors can suck up enough information to make us lose jobs or see our insurance rates jacked up. The FTC's report highlights how broad the internet-connected dragnet on our data can be:
As one participant put it, "researchers are beginning to show that existing smartphone sensors can be used to infer a user's mood; stress levels; personality type; bipolar disorder; demographics (e.g., gender, marital status, job status, age); smoking habits; overall well-being; progression of Parkinson's disease; sleep patterns; happiness; levels of exercise; and types of physical activity or movement."
And that's just from smartphone sensors! Since internet-connected devices generate even more data about the things people do inside their homes, they're even more potent as tools to collect information and create profiles on people. This FTC report is important because there need to be more rules for how the mounds of data we create by using internet-connected devices can be used against people. The devices we buy could end up turning into sneaky surveillance gadgets that could screw up our lives:
"One researcher has hypothesized that although a consumer may today use a fitness tracker solely for wellness-related purposes, the data gathered by the device could be used in the future to price health or life insurance or to infer the user's suitability for credit or employment (e.g., a conscientious exerciser is a good credit risk or will make a good employee)."
This means your lazy workout habits, as determined by a fitness band or connected home gym, could be used to decide whether you get a raise or even keep your job. This means that, if you decide to buy a connected car for safety reasons, the times you hit the brakes hard could be analyzed and shared with your insurer.
Told your boss you quit smoking, but somehow she got data on your e-cig use? Good luck explaining shit like that away.
Now, many of the individual companies selling internet of things products are already coming out and saying they won't sell data to third parties, like BMW with its current crop of connected cars. Nest, for instance, asks permission before sharing your data, so right now your boss couldn't use it to bust you. And the Fair Credit Reporting Act places some limits on what companies selling internet-connected devices can share. But the FCRA isn't comprehensive:
The FCRA excludes most "first parties" that collect consumer information; thus, it would not generally cover IoT device manufacturers that do their own in-house analytics. Nor would the FCRA cover companies that collect data directly from consumers' connected devices and use the data to make in-house credit, insurance, or other eligibility decisions – something that could become increasingly common as the IoT develops.
Without laws limiting how in-house data analytics are shared, people will perpetually be at risk of the stuff they buy turning into surveillance equipment for corporations. Even if someone agrees to share data with their bosses as part of a company-wide healthcare initiative, the point of living a quantified life is supposed to be self-improvement, not volunteering for a relentless digital panopticon of judgment and punishment.
Image: Scott Bedford