
Speaking at Bond University on Tuesday night (29 May), AI ethics expert John C. Havens foreshadowed a future where humans outsource difficult or tedious conversations to robots, from breaking up with a partner to talking to grandma, with the other parties none the wiser.
Havens, the executive director of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, expanded on the potential of Google's artificial-intelligence assistant Duplex, which attracted criticism earlier this month when it was shown apparently fooling a hair salon receptionist into thinking she was talking to a real person on the telephone.
Havens said he was not anti-Google, but warned the technology could be used to denigrate workers in so-called menial jobs because they would not be considered worthy of receiving a personal call.
"Is it cool if a robot calls you because your job as a hair salon person means you're not really worth a human calling you? Because that's kind of what's implied here, right?" Havens said.
"My mom is 78, I love calling her, seeing how she's doing, but I think for a lot of people it's like, 'Ugh! I gotta call my mom, it's the weekend, she's gonna ask me the same things.' (I could get a robot to say) 'Hi mom, good, kids are good, I'm taking my meds, I love you'.
"Do you start to get a sense of what identity becomes? I am going to let things be usurped from me because they can. I'm giving permission not to just one thing but the mentality of giving permission for the rest of my life.
"I'm not fighting Google. I'm fighting the idea that we are going to give these things over to technology without thinking about them."
Havens said the technology behind Google Duplex was "glorious" but there was no doubt it would be misused.
"Machine learning, vocal synthesis, all that stuff, it's astounding. But the (Duplex's) 'mm-hmm', and the 'huh', all that stuff was programmed to trick you into thinking you're talking to a human," he said.
"What I know is that someone is going to get Duplex and say, 'I don't want to use it to call my barber but I am uncomfortable in other situations. I'm going to program this technology to break up with my girlfriend because that's hard. Confrontation is hard.
"'I'm going to type in some key phrases: it's not me, it's you. Maybe a couple of specific things. Remember when we met at the movies? I wish you the best'. She's going to get the phone call and think, 'What a jackass, calling me on the phone to break up'.
"But then how is she going to feel when she finds out it was a robot calling? Happy or sad? I'm thinking sad."
Havens said he hated it when people said robots should replace humans in menial jobs, drawing on the example of his grandfather, a school custodian (caretaker).
"The second a robot can replace a custodian there's a fleet of people who will say, 'Fantastic, that menial labour should be replaced'," Havens said.
"But my grandfather was essentially a therapist to a lot of young kids in high school.
"They called him Uncle Phil because he wasn't a threat to them as an adult. He was a wonderful guy who also happened to be a custodian."
Havens was at Bond University as part of the Faculty of Law's Baker McKenzie Eminent Visitor Program. He spoke at a Centre for Professional Legal Education (CPLE) public seminar, participated in tutorials and panel discussions with Bond staff and students, and is the keynote speaker at the Financial Times Innovative Lawyers Forum and Awards, of which Bond is a co-sponsor, being held in Sydney tomorrow (31 May).