Five Lessons from Hollywood About Autonomous User Experience
It doesn’t take someone named Cipher to figure out that keeping riders calm will be the ultimate secret code.
September 21, 2018
Steve Tengler will be presenting at the WardsAuto User Experience Conference, Oct. 2, at the Suburban Collection Showplace in Novi, MI. Learn more at the event website.
Believe it or not, Hollywood can teach us things if we pay attention. Movies dramatize the silly and the extreme for effect, but that gives us a guide to what not to do when designing for the future. I have collected five examples from futuristic movies that provide a road map for future user experiences, along with the lessons they teach.
Lesson 1: Users’ Safety is Paramount
In “Minority Report,” Tom Cruise rides in a sleek autonomous vehicle down the face of a building. Like many drivers of fictional autonomous cars, he decides he no longer needs to stay behind the wheel.
But what makes the scene especially interesting is that he then exits the vehicle entirely, and the car keeps driving down the side of the building. Does it change course or speed to accommodate safety concerns? No. Not at all.
Saying “consider your users’ safety” is motherhood and apple pie; of course safety should always be paramount. But it is worth noting that standards such as Automotive SPICE do not explicitly require user-experience requirements or testing. And if we have collectively learned anything from Tesla’s beta testing with live drivers, it is that getting drivers to re-engage when they have checked out is difficult at best. So usability testing and A-SPICE requirements must include safety-related user testing that captures the unpredictable things real humans do.
All that said, Tom Cruise has appeared in 46 movies and has never died in a car crash, so maybe Lexus doesn’t need to worry too much about his autonomous safety UX.
Lesson 2: Voice Input Must Understand Natural Language
In a few scenes from “Total Recall” to which we can all relate, Arnold Schwarzenegger finds himself in a “Johnny Cab,” an autonomous ride-share vehicle driven by a robot named Johnny. During those scenes, Arnold’s character (Douglas Quaid) issues multiple questions and commands, and Johnny fails to understand every one of them. Johnny’s fate is sealed when Quaid finally gets angry and rips him out of his floorboard connections.
The reason many of us can relate: callers often opt for call-center agents when they get frustrated with automated speech systems, because human agents can adjust to environmental variables in real time and handle the dynamics of conversation intuitively and with ease. Yet this is the direction technology is going (ComScore projects 50% of all searches will be by voice by 2020), with artificial-intelligence adoption growing 60% in the last year. And voice likely will be an integral part of autonomous vehicles, since the rider may be seated anywhere in the cabin rather than at a fixed set of controls.
The key will be making the underlying intelligence able to understand natural language with accuracy. If Douglas Quaid exclaims “start driving,” the vehicle must not remain stationary awaiting a destination. And when he asks, “How did I get in this taxi?” rolling your robot eyes is naturalistic (and funny), but not helpful or great UX.
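To make the point concrete, here is a minimal sketch of what graceful intent handling might look like; the intent names, default behaviors and responses are hypothetical illustrations, not drawn from any production voice stack. The idea is simply to act on a command even when an optional detail is missing, and to answer an off-script question usefully instead of shrugging.

```python
# Hypothetical sketch of graceful voice-intent handling in an autonomous taxi.
# Intent names, defaults and responses are illustrative only.

def handle_utterance(intent: str, slots: dict) -> str:
    if intent == "start_driving":
        # Don't block on a missing destination; start moving and ask en route.
        destination = slots.get("destination")
        if destination is None:
            return "Pulling away now. Where would you like to go?"
        return f"Heading to {destination}."
    if intent == "ask_how_got_here":
        # An off-script question deserves a useful answer, not robot eye-rolling.
        return "You boarded this cab two minutes ago. Would you like the trip log?"
    # Fall back to clarification rather than failing silently.
    return "I didn't catch that. You can name a destination or ask about your ride."

print(handle_utterance("start_driving", {}))  # moves first, asks for destination second
print(handle_utterance("start_driving", {"destination": "the Last Resort"}))
```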
Lesson 3: Emotional AI Will Make a Great UX
In the mediocre movie “Hot Tub Time Machine 2,” the main characters interact with a “smart car” that reacts to compliments and emotion. At the end of a rather pointless scene, Lou (played by Rob Corddry) storms off in anger because the smart car hasn’t responded in a way that suits him. Ironically, emotion is often cited as part of what will define great AI.
As best stated by designer/writer Michael Greenwood, “Learning is an emotional process. There’s the joy of discovery, the agony of defeat and the shared experience of gaining new knowledge. AI should be in sync with these types of emotions to generate these feelings, acknowledge them and even reward them at times. UX designers who can emulate these emotional connections will take us another step further into the types of AI experiences that we’ve longed for in popular culture since the first science fiction story was written, put onto the silver screen or broadcasted into our living rooms via our television sets.”
Lesson 4: There Always Will Be a Desire to Drive Manually
In the movie “I, Robot,” Will Smith interacts with a multimodal AI interface, working with no steering wheel in sight. He is requesting and reading reports while his car speeds through a long tunnel. But when he notices a curious series of events, he decides it’s time to regain control of the vehicle.
This mirrors what studies to date have shown: drivers will want to retain the ability to drive in the future, with “a sense of freedom” and “love of driving” being the top two reasons cited. In a U.S.-based study from market research company AutoPacific, more than half of those surveyed said they would never give up driving, and driverless cars “… would take all the fun out of driving.” In fact, the number one reason people want a driverless option is NOT to give up driving, but rather for the vehicle to park itself after arriving at the destination.
As noted by the Forbes Technology Council, this means the autonomous vehicle must have an enhanced safety system that accounts for both self-driving and driverless modes. I guess you could say Will had a lot of fun driving in this clip, although I’m not sure about “enhanced safety.”
Lesson 5: Trust Will Be an Ongoing Issue
In “The Fate of the Furious,” Charlize Theron plays Cipher, a hacker who turns ordinary cars into an army of autonomous drones and unleashes them on a police motorcade. In a memorable scene, the vehicles under assault drive down Seventh Avenue while, conveniently, Cipher has control of a bevy of vehicles stored in the penthouse of a neighboring building. She literally orders those vehicles to rain down on the motorcade, creating a massive pile-up. The autonomous vehicle is portrayed as an evil, untrustworthy weapon.
In three studies conducted by J.D. Power between 2012 and 2018, 80% of respondents said they do not want an autonomous car, mostly due to a lack of trust in the technology (a 2017 MIT study put that number at 48%). In a 2018 study, 36% of those surveyed said driverless vehicles are less safe than human-driven vehicles. And 67% of Americans are concerned about cyberthreats to driverless vehicles (versus 18% for self-driving vehicles). A 2017 University of Michigan study showed fear for personal safety due to cybersecurity was a significant issue.
All of this despite a McKinsey & Co. report estimating autonomous vehicles could reduce fatalities 90% and save more than $190 billion in health-care costs per year. Trust is a significant issue, and finding the user experience that overcomes that fear will be a winning differentiator.
It doesn’t take someone named Cipher to figure out that keeping riders calm will be the ultimate secret code.
Steve Tengler has worked in the automotive industry on the connected car for more than a quarter of a century for some of the world's top brands: Ford, Honeywell, Nissan and OnStar. He now is a Principal at global consultancy Kugler Maag Inc. He has 30-plus publications to his name, and 50-plus patents. He has a BSE and MSE from the University of Michigan and previously taught at Wayne State University.