The following is an open letter to iRobot CEO Colin Angle. His company makes the very popular Roomba robotic vacuum cleaner. On Monday, ZDNet’s Jake Smith wrote about iRobot’s intention to sell mapping data from customers’ homes to other companies.
One of the ways Webster defines “dear” is “highly valued” or “precious.” So when I start a letter with “Dear Colin,” you could interpret my statement as saying I highly value our relationship. But the fact is, we’ve never met and I’m just using a commonly accepted way of starting a letter.
This is relevant to our discussion because you’ve recently talked about taking from your customers information that is dear to them, even though you’ve never met most of the people who enjoy the benefits of your products. When your customers buy your products, there are some common expectations.
It looks like you may be planning to violate those expectations and, by extension, the trust your customers have placed in you. Even worse, you could be opening the door to security risks that far outweigh the few extra bucks you'd make on the side.
In a recent Reuters interview, you talked about the value of mapping data, both for doing the job of cleaning a room, and for understanding the environment where Internet-connected things need to interoperate. So far, I’m with you.
While you didn’t dive into technical details in that interview, I can definitely infer from your comments how a robot given free roaming privileges in a home or office space might build up a map of that space.
With good sensors, it might even be able to place the exact location of other smart devices on that map. And, by combining image analysis with radar or sonar, and with radio signal analysis, a room-roaming robot might be able to derive a useful representation of an environment.
That data could be used to help maximize lighting, tune sound, optimize microphones, determine when people or pets are in a space, and help conserve energy. All that is good.
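The mapping described above can be approximated with a simple occupancy grid, the textbook representation for this kind of problem. Here's a minimal sketch, assuming idealized range-sensor readings; the grid size, readings, and function names are all invented for illustration and have nothing to do with iRobot's actual software:

```python
# Illustrative occupancy-grid mapping sketch -- not iRobot's actual code.
# A robot at a known position marks the cell its range sensor reports as
# occupied; everything along the beam before the hit is marked free.

GRID_SIZE = 10  # 10x10 cells, each standing in for a patch of floor


def empty_grid(size=GRID_SIZE):
    """Start with every cell unknown ('?')."""
    return [["?"] * size for _ in range(size)]


def record_hit(grid, robot_x, row, hit_x):
    """Mark cells free up to the obstacle on one horizontal sensor beam."""
    for x in range(robot_x, hit_x):
        grid[row][x] = "."   # free floor
    grid[row][hit_x] = "#"   # obstacle: wall, furniture leg, power cord


grid = empty_grid()
# Robot sits at column 0 and sweeps three rows, hitting obstacles
# at columns 7, 4, and 7 respectively (made-up readings).
for row, hit in [(2, 7), (3, 4), (4, 7)]:
    record_hit(grid, 0, row, hit)

print("\n".join("".join(r) for r in grid))
```

Fuse enough of those beams together and you have a floor plan, which is exactly why the data is valuable beyond vacuuming.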
I can even see how you might want to develop interoperability with the other big players in the home space, whether that's Amazon with Alexa, Apple, Google (particularly Google Home), or others. The tech industry has a long and well-regarded history of building APIs, interprocess communication protocols, and data interchange formats (XML and JSON come to mind) to create systems that are greater than the sum of their components.
If you want to work with Apple, Amazon, and Google — with our permission, of course — to help create better home automation environments, I’m good with that.
But here’s the thing that has the whole internet a-flutter. Apparently, you’re trying to sell that mapping data. I understand that iRobot is a smaller company than any of the big three, but once you get into the mode of selling data, the potential for abuse rears its oh-so-ugly head.
After all, if you’re just developing good interop, there’s no reason to look at selling that mapping data. But when you’re talking about selling data that describes our homes to companies like Amazon and Google — companies that thrive on being able to get inside our heads and our wallets — something far more disturbing comes to mind.
You’re no longer mapping our homes to make sure you don’t tear out a power cord or fall down a flight of stairs. You’re moving into the realm of spying on your customers. In your case, though, it’s far worse than those stories of possible always-on webcams or TV sets.
See, none of those other devices can move around the house on their own power. If my TV is in the living room, I know it's there. If I'm concerned about my privacy, I'm probably not going to parade my naked butt in front of it. But a Roomba can decide to wake itself up, wander around the house, measure, map, and, with its onboard camera, even take pictures.
What would you do with the pictures your Roomba snags? Would you send them to Amazon’s or Google’s machine learning so they can develop an inventory of our possessions? What could they do with that information?
It’s not just a worry that they might discover I like coffee and pitch me on better coffee beans. It’s that they might learn about medical conditions, lifestyle choices, or anything else we want to keep private. Where would that information go? Would you give it up to the government if you got a National Security Letter or subpoena?
Now, at least, if our houses are searched by a government entity, we'd have a pretty good chance of knowing, because someone would have to enter with a warrant. But if your robots are spending their days mapping, snapping, and spying, will we even know who our data is shared with?
And what about foreign or terrorist organizations? How good can spear phishing get if the bad guys truly know our interests and our home environments? You might not know you’re selling data to ISIS or Iran, or even China. The concept of a front company has been around for a very long time.
Sure, we live in a time where selfies and YouTube and Facebook and Instagram mean we voluntarily give up enormous levels of privacy. But that’s our choice, even if it might be a foolish one.
But here’s where you might be going wrong. If you’re asking for permission to map homes to provide better service capabilities, that’s one thing. But if you’re capturing all that data, storing it on your own servers, and then selling it to any random buyer that comes along, then you’re spying.
I know you’re familiar with Isaac Asimov’s Three Laws of Robotics. But for our readers, let’s recap them here:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
I know you’re familiar with these laws because you chose to name your company, iRobot, after I, Robot, the classic collection that introduced Asimov’s Laws to most readers.
Colin, by potentially selling or sharing information about our lifestyles that is dear to us, you may well be violating the First Law. And if you know your Asimov, you know that won’t end well.