Will AI Be Too Human?

W.J. Astore

Watch out if the robots and computers copy their human creators

Robot dogs as potential enforcers. AI chatbots that write scripts, craft songs, and compose legal briefs. Computers and cameras everywhere, all networked, all connected, all watching—and possibly learning?

Artificial intelligence (AI) is all the rage as science fiction increasingly becomes science fact. I grew up reading and watching sci-fi, and the genre's lessons about AI are rarely reassuring.

Consider three TV shows and movies I know well:

  1. Star Trek, “The Ultimate Computer”: In this episode of the classic 1960s TV series, a computer is put in charge of the ship, replacing its human crew. Programmed to think for itself while also replicating the priorities and personality of its human creator, the computer attacks four other human-crewed starships in its own quest for survival before Captain Kirk and his crew manage to outwit and unplug it.
  2. The Terminator: In this 1980s movie, a robot-assassin is sent from the future to kill the mother of its human nemesis, thereby ensuring the survival of Skynet, a sophisticated AI network created by the U.S. military that gains consciousness and decides to eliminate its human creators. Many sequels!
  3. The Matrix: In this 1990s movie, the protagonist, Neo, discovers his world is an illusion, a computer simulation, and that humans are being used as batteries, as power sources, for a world-dominating AI computer matrix. Many sequels!

Sci-Fi books and movies have been warning us for decades that AI networks may be more than we humans can handle. Just think of HAL from Stanley Kubrick’s 2001: A Space Odyssey. Computers and droids of the future may not be like R2-D2 and C-3PO from Star Wars, loyal servants to their human creators.

Nothing to worry about: It’s a cute “Digidog” featured by the New York Police Department. You may be the one begging and rolling over, however.

As they say, it’s only a movie, but I do worry about all the hype surrounding AI. If AI becomes a reflection of its human creators, especially a distorted one, we could have much to worry about.

Assuming computers could truly learn from their human creators, it makes sense they would act like us, pursuing violence and issuing death sentences in the name of AI’s security and progress.

To AI networks of the future, linked to robotic enforcer dogs and armed aerial drones, humans just might be the “terrorists.”

Killer Robots

My money is on the killer robots

W.J. Astore

Killer robots!  How many “Terminator” movies do we have to see before we conclude this is not a good idea?

You guessed it: the U.S. military is at it again.  Awash in cash, it’s investigating killer robots in earnest, striving for ever more “autonomy” for its machines and thereby reducing the need for humans in the loop.  Part of this drive toward robotic warfare comes from the Covid-19 pandemic, notes Michael Klare at TomDispatch.com.  America’s tech-heavy approach to warfare puts lots of people in close proximity in confined spaces, whether on ships and submarines or in planes and tanks.  “Social distancing” really isn’t practical even on the largest ships, such as the aircraft carrier Theodore Roosevelt, briefly put out of commission by the pandemic.  So why not build ships that need few or no people?  Why not build autonomous killer robot ships?

Obviously, the Pentagon thinks that movies like “The Terminator” and “The Matrix,” among so many others that warn about humanity’s overreliance on machinery and the possibility the machines themselves might become conscious and turn on their creators, are just that: movies.  Fantasies.  Because technology never has unpredictable results, right?

So, killer robots are on the horizon, making it even easier for the U.S. military to wage war while risking as few troops as possible.  I’m sure once America invests billions and billions in high-tech semi-autonomous or fully autonomous killing machines, we’ll keep them in reserve and use them only as a last resort.  Just like we do with our big bombs.

To read Michael Klare’s piece on killer robots, follow this link.