iRobot Roomba violates users’ privacy, photos circulate on the web

Robot vacuum cleaners from iRobot are hugely popular today, especially the Roomba series. Small, cylindrical and compact, these trusty household helpers have been making themselves at home with many of us for some time now. Their technology has also evolved over the years to offer the best possible service, worthy of a human.

iRobot, the largest seller of robot vacuum cleaners, whose acquisition by Amazon was recently announced, has confirmed the news: its Roomba robot vacuums captured images of some users, even in intimate situations, and those images have leaked. The MIT Technology Review managed to obtain 15 of these photos, which had also been posted in closed social media groups. In the following article we will dig deeper into the story, trying to explain how this happened and offering a glimpse of a seemingly underground world.

The Roomba “victims” and iRobot’s response

The Roomba robots in question have taken photos of people all over the world. One, which will likely become infamous, shows a woman in the bathroom. Others show a young boy, who appears to be about eight years old, face exposed, lying on the floor. All of the images, however, show homes in general, some occupied by people, one by a dog. The most striking detail is that every object is marked with a rectangular bounding box, much as a Terminator's vision would render it. Furniture, decorations and even items hung on the wall are labeled "TV", "plant_or_flower", "ceiling light", to name just a few examples.
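
To make the labeling concrete, here is a minimal sketch of what one such annotated frame might look like, assuming a generic JSON-style record; the field names and coordinates are hypothetical, and only the object labels come from the photos described above.

```python
# Hypothetical example of a labeled frame from an object-detection dataset.
# Field names and numbers are illustrative; the labels ("TV",
# "plant_or_flower", "ceiling light") are those visible in the leaked images.
annotated_frame = {
    "frame_id": "dev_unit_0042_000317",   # made-up identifier
    "annotations": [
        # Each object gets a rectangular bounding box: top-left corner (x, y)
        # plus width and height in pixels, and a class label.
        {"label": "TV",              "bbox": {"x": 412, "y": 96,  "w": 280, "h": 170}},
        {"label": "plant_or_flower", "bbox": {"x": 35,  "y": 210, "w": 90,  "h": 140}},
        {"label": "ceiling light",   "bbox": {"x": 300, "y": 10,  "w": 60,  "h": 45}},
    ],
}

# A training script or a reviewer can then iterate over the boxes:
for obj in annotated_frame["annotations"]:
    box = obj["bbox"]
    print(f"{obj['label']}: {box['w']}x{box['h']} px at ({box['x']}, {box['y']})")
```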

For its part, iRobot claims to have supplied the incriminated Roombas only to employees and paid data collectors. The latter are also said to have signed an agreement acknowledging that they were sending data streams, including video, back to the company. According to the company, it was up to them to keep anything confidential out of the robot's reach. Finally, the manufacturer stressed that these were robots with hardware and software modifications that are not present on the Roombas sold to consumers, past or present.

Who is really behind the distribution of the images?

iRobot Roomba devices today feature powerful sensors and, in general, very capable hardware. From highly sophisticated cameras to other instruments, everything contributes to collecting data to be processed by artificial intelligence. For this data to make sense, however, humans must first view, classify, label and give context to each piece of information.

There is always a group of people somewhere doing data categorization work for artificial intelligence. It is "gig workers" who carry out this data annotation process. iRobot has confirmed that it subcontracts this work to the Scale AI platform, but it defends itself by saying that this image leak points to something bigger than the actions of a single company.
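
Since the point here is that a person has to look at every frame in order to attach labels to it, here is a minimal sketch of that human-in-the-loop step, assuming a generic workflow; none of the names below correspond to Scale AI's or iRobot's actual tools, they are purely illustrative.

```python
# Generic illustration of human-in-the-loop data annotation.
# All names (RawFrame, request_human_label, annotate) are hypothetical.
from dataclasses import dataclass, field

@dataclass
class RawFrame:
    frame_id: str
    pixels: bytes                          # raw image data captured by the robot
    labels: list[str] = field(default_factory=list)

def request_human_label(frame: RawFrame) -> list[str]:
    """Stand-in for the step a gig worker performs: viewing the image and
    typing (or drawing) the labels. Here the answer is simply simulated."""
    return ["couch", "ceiling light"]

def annotate(frames: list[RawFrame]) -> list[RawFrame]:
    # The crucial point: at this step a person actually sees every frame,
    # including anything sensitive the camera may have captured.
    for frame in frames:
        frame.labels = request_human_label(frame)
    return frames

if __name__ == "__main__":
    batch = [RawFrame("unit_01_0001", pixels=b"\x00" * 16)]
    for labeled in annotate(batch):
        print(labeled.frame_id, labeled.labels)
```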

It is striking to see how far data travels in order to "train" artificial intelligence algorithms. In this case, the data came from homes in North America, Europe and Asia and ended up on iRobot's servers in Massachusetts; from there it went to Scale AI's servers in San Francisco and finally to contract workers around the world, including, in this case, the freelancers in Venezuela who shared the images in private social media groups.

The IO project on iRobot's Roomba and how our data travels around the world

Of all the companies that have emerged in the last decade, Scale AI has become a leader in its field. Founded in 2016, it has built an entire remote-work business in less wealthy countries with its Remotasks platform. In 2020, its workers were given a new task: the IO project. It involved images taken at an upward tilt of about 45 degrees, showing walls, ceilings and floors of homes, and, of course, the people inside them, whose faces were clearly visible.

The workers then discussed the task on Facebook, Discord and other groups created to exchange all kinds of advice, whether about chasing late payments, finding the best-paying tasks or asking for help with labeling; the social groups welcomed all of it. In the end, while iRobot pointed to Scale AI's violation of their agreements, Scale AI in turn blamed the workers.

The fundamental problem is that our face is like a password that cannot be changed. Once someone records our facial "signature", they can use it forever to find us in photos or videos. Worse still, this kind of illicit distribution is almost impossible to police on sharing platforms; the workers, in short, are difficult to monitor one by one.

Once again, the iRobot Roomba affair highlights that technology carries enormous, and potentially harmful, power. But it is always the use we make of it that determines whether it turns against us or, as in this case, benefits someone else.