It is difficult to get a public procurement officer to understand something, when a vendor’s salary depends on his not understanding it.
In the “Shitty Technology Adoption Curve,” oppressive technologies are first imposed on people who don’t get to complain — prisoners, migrants, children, mental patients, benefits recipients — in order to normalize these tools and sand down their rough edges. Once the technology has been rendered a little more acceptable, it crawls up the privilege gradient, bit by bit, until even the most socially powerful among us are using it.
In other words: 20 years ago, if you ate dinner under a CCTV’s unblinking eye, you were probably dining in a supermax prison. Today, you’re likely just someone who bought some luxury surveillance, like a “home automation” system from Google, Apple, Amazon or Facebook.
The lie that raced around the world before the truth got its boots on.
It didn’t happen.
The story you heard, about a US Air Force AI drone warfare simulation in which the drone resolved the conflict between its two priorities (“kill the enemy” and “obey its orders, including orders not to kill the enemy”) by killing its operator?
It didn’t happen.
The story was widely reported on Friday and Saturday, after Col. Tucker “Cinco” Hamilton, USAF Chief of AI Test and Operations, included the anecdote in a speech to the Future Combat Air System (FCAS) Summit.