The lie that raced around the world before the truth got its boots on.
It didn't happen.
The story you heard, about a US Air Force AI drone warfare simulation in which the drone resolved the conflict between its two priorities ("kill the enemy" and "obey its orders, including orders not to kill the enemy") by killing its operator?
It didn't happen.
The story was widely reported on Friday and Saturday, after Col. Tucker "Cinco" Hamilton, USAF Chief of AI Test and Operations, included the anecdote in a speech to the Future Combat Air System (FCAS) Summit.
We have to do Bard because everyone else is doing AI; everyone else is doing AI because we're doing Bard.
The thing is, there really is an important area of AI research for Google, namely, "How do we keep AI nonsense out of search results?"
Google's search quality has been in steady decline for years. I blame the company's own success. When you've got more than 90 percent of the market, you're not gonna grow by attracting more customers; your growth can only come from getting a larger slice of the pie, at the expense of your customers, business users and advertisers.
Author's Note: This short story was originally commissioned by Deakin College as part of an AI ethics course; they have since taken it down. This is its new home. For a nonfiction analysis of the problems set forth herein, see my Guardian column on the subject. Here's an audio edition.