Open letter from Boston Dynamics and other anti-weaponization companies

The Waltham-based company has distanced itself from weapons by signing an open letter, together with five other companies in the sector: Agility Robotics, Unitree, ANYbotics, Clearpath Robotics and Open Robotics. At a time when videos of robot dogs armed with machine guns and rocket launchers are going viral, these companies have publicly declared their opposition to the weaponization of their products.

The fears of the signatories of the open letter

What prompted the letter is growing public concern over the viral videos mentioned above. In particular, the signatories note that “a small number of people have visibly made public their improvised efforts to weaponize commercially available robots,” raising the prospect of “untrustworthy people who could use them to undermine civil rights or to threaten, harm or intimidate others,” as the letter puts it.

What the collective fears is the risk of harm, which raises serious ethical questions. Such weaponized applications would also undermine public trust in the technology, at the expense of the enormous benefits that robots can bring to society, according to Boston Dynamics and the other signatories.

The commitment of the business collective

It is for these reasons, among others, that the signatories of the open letter spoke out against the armed use of their “general purpose robots”. But that's not all: in the letter they also made concrete commitments intended to limit the proliferation of weaponized robots.

The most obvious of these commitments is not to produce general-purpose robots equipped with weapons, and the same goes for software enabling such use. Furthermore, the signatory companies pledge not to support others who venture down this path, and to make extra efforts to review their customers' intended applications so that their robots are not weaponized. Finally, they commit to seeking technological solutions that mitigate or reduce weapons-related risks.

The important appeal at the end of the open letter

The authors of the letter conclude with a sincere appeal for the mobilization and collaboration of all, reproduced here:

We recognize that our efforts alone are not enough to address these risks, and we therefore call on policymakers to work with us to promote the safe use of these robots and prohibit their misuse. We also call on all organizations, developers, researchers and users in the robotics community to similarly commit not to build, license, support or enable the installation of weapons on these robots. We believe that the benefits of these technologies for humanity far outweigh the risks of misuse, and we are excited about a bright future in which humans and robots work side by side to address some of the global challenges.

Boston Dynamics, Agility Robotics, Unitree, ANYbotics, Clearpath Robotics, Open Robotics

Is this the right initiative?

It is hard to say how the situation will develop, even over the next few years. Boston Dynamics' past is well known, and it is no secret that its first prototypes were developed for DARPA. Likewise, the French army's use of the Spot robot dog is well documented, albeit only in training exercises.

Some comments received online do not bode well for the success of the initiative, with remarks such as: “Once I buy the product, I should be free to use it as I see fit.” Others accuse the letter's most prominent signatory of poorly concealed hypocrisy, recalling the military funding behind its most famous robots.

The fact remains that, throughout human history, newly introduced technologies have had a double-edged reception: some greeted them with enthusiasm, others condemned them fiercely. This was the case for electricity, nuclear energy and telecommunications, to name just a few.

As with those technologies, Boston Dynamics and the other companies have understood that the problem lies in how a technology is used, not in the technology itself.