Engineering Explained: Trusting Driverless Vehicles

Posted by STEER On June 18, 2021

Everything we do is rooted in trust.

Consumers are fascinated by a driverless vehicle departing on command and returning when beckoned. For years, consumers have been hearing that self-driving cars are coming. Multiple revisions of expectations later, the Gartner hype cycle is course-correcting: the initial wave of exuberance has passed and given way to the trough of disillusionment.

Some bold market moves do exist and are challenging the status quo with what I call the “Sometimes Autonomous Vehicle”, or SAM for short. These SAMs, like Tesla’s Smart Summon, STEER’s Valet Park and Summon, or Hyundai’s Smart Park, deliver to passenger cars what driverless cars represent to the billions of future patrons of these technologies: convenience at consumer-friendly pricing. But the fascination that drives the media coverage and the market for self-driving vehicles is not, on its own, enough to promote industrial growth and acceptance of autonomous vehicles.

Consumers need to trust self-driving cars with the same conviction that they trust human drivers.

But how? This article series, Engineering Explained, aims to break down the technicalities of autonomous vehicles so that they are more easily understood and their behavior more predictable, bridging the gap between consumers and engineers.

Humans drive with sensibilities honed over many years, combined with the dual experience of often being pedestrians themselves. An occasional nod, eye contact, or a gesture (and the meaning behind it) are part of a broader sentiment and intent that we humans generally read very well but that machines will have a tough time interpreting to an actionable level. Yet for humans, it is second nature. In areas where pedestrians are more likely to be encountered, humans drive more slowly while simultaneously making fast, calculated decisions based on multiple layers of decision making, along with some calculated risks.

Let’s break down a simple, common scenario to compare what the consumer sees with what an engineer sees in terms of the car’s decision-making.

Take the instance of a car turning and backing into a parking spot:

What the Consumer Sees

As a vehicle is turning and backing into a spot, pedestrians quickly determine whether to walk by or wait. It’s not out of the ordinary for a passerby in a rush to dart past. Pedestrian actions are largely based on tactical information, like vehicle speed and distance, and on more perceptual cues, like the trust that comes from the vehicle’s appearance and the smoothness of its operation (whether driverless or not!). This builds another case for how human trust in automation plays an important role in successful interactions between humans and machines.

What the Engineer Sees

Every execution of a seemingly routine maneuver is different due to several factors, many of which come from the operating environment. Decision-making can get very complex depending on the forces at play. Yielding to pedestrians and other objects moving towards the vehicle is the top priority. As a vehicle turns and backs into a spot, correctly identifying objects in the periphery and classifying them, whether as other vehicles adjacent to an open spot or as people who could very shortly be in its path, is crucial. The former must be maneuvered around, while for the latter a decision must be made whether to yield. The distinction between the corner of another vehicle and a human standing at that corner has very different implications, as the sketch below illustrates.
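To make that distinction concrete, here is a minimal sketch in Python of how a reverse-parking maneuver might branch on object class and proximity. The class names, threshold, and logic are hypothetical illustrations, not STEER’s actual planner code.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ObjectClass(Enum):
    VEHICLE = auto()
    PEDESTRIAN = auto()
    UNKNOWN = auto()


class Action(Enum):
    CONTINUE = auto()      # keep backing into the spot
    STEER_AROUND = auto()  # adjust the path around a static obstacle
    YIELD = auto()         # stop and wait for the person to pass


@dataclass
class TrackedObject:
    object_class: ObjectClass
    distance_m: float   # distance from the vehicle's planned reverse path
    approaching: bool   # is the object moving toward the vehicle?


def decide(obj: TrackedObject, safety_radius_m: float = 3.0) -> Action:
    """Hypothetical per-object decision during a reverse-parking maneuver."""
    # Anything moving toward the vehicle gets the right of way first.
    if obj.approaching:
        return Action.YIELD
    # People (or anything we cannot classify) near the path are yielded to, never dodged.
    if obj.object_class in (ObjectClass.PEDESTRIAN, ObjectClass.UNKNOWN):
        return Action.YIELD if obj.distance_m < safety_radius_m else Action.CONTINUE
    # Parked vehicles beside the open spot are static: plan around them.
    return Action.STEER_AROUND if obj.distance_m < safety_radius_m else Action.CONTINUE


# The corner of a parked car vs. a person standing at that corner:
print(decide(TrackedObject(ObjectClass.PEDESTRIAN, 2.0, approaching=False)))  # Action.YIELD
print(decide(TrackedObject(ObjectClass.VEHICLE, 2.0, approaching=False)))     # Action.STEER_AROUND
```

Same distance, same corner of the parking spot, but the classification alone flips the decision from steering around to stopping and waiting.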

Trust in human-machine interactions can be built organically by offering humans the right of way. Yielding also allows the vehicle to prune the decision tree significantly for subsequent actions, as the toy example below shows. The result is a perception of the vehicle as a courteous driver. We are seeing this phenomenon play out in our trials at train stations, airports, and private spaces.
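As a toy illustration of that pruning (again hypothetical, not production code): once the vehicle commits to yielding, every maneuver branch that assumes the pedestrian is still in the way can be dropped until the path clears.

```python
# All candidate maneuvers the planner might otherwise evaluate each cycle.
ALL_BRANCHES = ["continue", "steer_left", "steer_right", "creep", "yield"]


def prune_after_yield(pedestrian_in_path: bool) -> list:
    """Hypothetical pruning rule: committing to yield collapses the search."""
    if pedestrian_in_path:
        # Only one branch survives until the pedestrian has passed.
        return ["yield"]
    return ALL_BRANCHES


print(prune_after_yield(True))   # ['yield']
print(prune_after_yield(False))  # ['continue', 'steer_left', 'steer_right', 'creep', 'yield']
```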

“Most people have never seen a robot,” Elon Musk remarked on a recent episode of Lex Fridman’s podcast. As a result, most people don’t know how to react to one. What they can respond to is the robot’s behavior. The closer that behavior is to a human’s, the higher the initial trust. Just as first impressions between people set the foundation for relationships, first interactions with autonomous vehicles set the foundation for a consumer’s perspective on the technology.

What Does This Mean for the Future of Self-Driving?

The most important takeaway is that these vehicles are here. The driverless varieties will be Sometimes Autonomous. They will look like their ordinary counterparts, because 76% of Americans consider appearance when making new car purchase decisions. In STEER’s own experience, pedestrians have responded very positively to our autonomous vehicle’s familiar appearance.

To increase the acceptance of autonomous vehicles, it is vital for the consumer to understand how the vehicle behaves. This understanding leads to trust, which leads to acceptance. After all, courteous drivers are always welcome!

Watch the STEER car communicate in a live environment in this video.

Engineering Explained

STEER Tech’s Engineering Explained series serves to translate the technical jargon of your average automotive engineer into the language of your average consumer. Why is everything we do rooted in trust? Why were there speed bumps in rolling out Tesla’s Smart Summon feature? What hasn’t the driver thought about when looking at CASE vehicles? What are Deep Neural Networks, and why is cybersecurity important for your car? This series seeks to answer these questions and more. Have an idea you want to explore? Get in touch here.