
Stay Hungry. Stay Foolish.

A blog by Leon Oudejans

Can robots be our partners?

You probably recall C-3PO, the tall golden human-like robot from Star Wars. He has a high-pitched voice and is afraid of almost everything. He is harmless, isn’t he? He could be your friend. Suppose, just suppose, that you had watched a scene in which C-3PO killed another robot. Would you still trust C-3PO? That same reasoning explains why humans don’t trust other humans.

Today’s blog title refers to a recent Guardian interview with AI ethicist Kate Darling: “Robots can be our partners”. There are some movies about this theme: Westworld (IMDb-1973, IMDb-2016 on HBO), Bicentennial Man (1999, IMDb), Her (2013, IMDb), and Ex Machina (2014, IMDb).

There’s an interesting dilemma regarding robots: the more robots become like humans, the less we will trust them. This dilemma could, obviously, be solved by limiting their human likeness (eg, to facial similarities only). Probably, that would be (too) boring for us. Similar to the Sumerian Tower of Babel, humans will again act like gods and build mirror images of themselves. Murphy’s Law will be next.

Robotics has been used in industrial processes since 1961 (eg, automotive). Basically, robots do (ie, deeds) the same as humans but “harder, better, faster, stronger” (Daft Punk song, video). More recently, robots have been able to converse (ie, using words) with humans (eg, Alice the care robot, my 2016 blog). Still, these robots do not have intentions (my blogs).

Most dogs are extremely loyal towards humans. They rarely show hostile intentions. However, dogs that bite (humans) are often terminated. The 1984 sci-fi movie The Terminator is about a cyborg assassin sent back in time to kill a human. Today, armies are already employing robot soldiers (eg, Guardian-2020) and killer robot tanks (eg, Futurism, Forbes).

The reason why many people do not trust artificial intelligence is similar to the Dutch saying: “Zoals de waard is, vertrouwt hij zijn gasten”. A literal translation would be: “The innkeeper trusts his guests after his own character”. The English equivalent is: “Ill doers are ill deemers” (source). In other words: we trust others as we trust ourselves.

In general, humans take their time before they fully trust the intentions of other humans. Perhaps we will extend this courtesy to robots, as we do with our pets (eg, cats, dogs). I doubt it.

“Logical but not reasonable. Wasn’t that the definition of a robot?” A quote from The Naked Sun (1956) by Isaac Asimov (1920-1992), an American writer and professor of biochemistry.

The Logical Song (1979) by Supertramp

artists, lyrics, video, Wiki-1, Wiki-2

Won’t you please, please tell me what we’ve learned? 

I know it sounds absurd 

But please, tell me who I am

Note: all markings (bold, italic, underlining) by LO unless stated otherwise.
