Level 5 Autonomy: Is it AI or Just Deep Programming?

We need to keep checking AV programmers’ work and watch them with a Level 5 Autonomous Black Box. It needs to be at least as advanced as avionics flight data recorders.

Joel Hoffmann, Automotive Strategist, Intel Automotive Solutions Division

September 28, 2018

Waymo autonomous vehicle testing in Chandler, AZ.

Joel Hoffmann will be presenting at the WardsAuto User Experience Conference Oct. 2 in Novi, MI.

My daily commute consists of a 10-mile (16-km) loop from my home to my, uh, home office.

I ride my road bike to get some exercise when the heat is manageable in Chandler, AZ, the world headquarters of self-driving vehicle testing.

I pass all the tech icons: the $7 billion Intel Fab 42 plant, where they are struggling to get a 7-nanometer chip into volume production for autonomous cars; the Chandler city hall, where at least three Waymo cars drive past every time I go out; and the empty space where Uber used to test its self-driving cars.

Occasionally I see a regular traffic accident, and lots of police and rescue workers show up. A couple of weeks ago I noticed the police were pushing a disabled car by hand since no tow truck was there yet. I stopped to take a photo and my phone rang.

It was my friend Jake Segal, who runs Tome Software, a Detroit-area software company working with Ford on safety systems for the cyclists of the future. His inventions may be embedded in new production Trek bicycles, silently broadcasting signals so autonomous cars cannot miss detecting the rider.

I’m a little paranoid on my bike ever since an Uber self-driving car struck and killed a woman walking her bicycle across a street in Tempe this year.

We now know the Tempe death was not caused by a failure of the vehicle’s sensors or the Volvo’s safety hardware. It was caused by the programming. The experimental car’s electronics detected the bicycle and the person, but the software dismissed the detection as a false positive and the car drove on. Nobody will buy a self-driving car that stops for every false positive, so marginal detections tend to be ignored. How will we fix these problems in a production self-driving car?
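To make that failure mode concrete, here is a minimal sketch of confidence-threshold gating in a perception pipeline. The threshold value, field names and logic are hypothetical illustrations, not Uber’s actual software; they simply show how tuning a system to ignore false positives can also tune it to ignore a real cyclist.

```python
# Illustrative confidence-threshold gate in a perception pipeline.
# The threshold, field names and logic are hypothetical assumptions,
# not any automaker's code -- just the failure mode described above.

BRAKE_CONFIDENCE_THRESHOLD = 0.8  # tuned high to suppress phantom braking

def should_brake(detections):
    """Return True only if some detection is trusted enough to act on."""
    for detection in detections:
        if detection["confidence"] >= BRAKE_CONFIDENCE_THRESHOLD:
            return True
    # Anything below the threshold is treated as a false positive and
    # ignored -- including, in the worst case, a real person.
    return False

# A genuine cyclist detection scored at 0.6 is silently discarded.
print(should_brake([{"label": "cyclist", "confidence": 0.6}]))  # False
```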

Today, automakers embed tiny black boxes, or event data recorders (EDRs), in airbag safety systems to verify they do their job. Few people know this, and the devices are not mandated. Because of privacy concerns, the National Highway Traffic Safety Administration advises automakers to capture only small amounts of information, so as not to alarm consumers.
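For a sense of how little a conventional EDR holds, the sketch below models the kind of record NHTSA’s rule for vehicles that carry one (49 CFR Part 563) describes: a few seconds of low-rate pre-crash samples plus crash-pulse basics. The field names and structure are my illustration, not any automaker’s actual format.

```python
# Rough model of the little a conventional EDR records, loosely based
# on the data elements NHTSA specifies for vehicles that have one
# (49 CFR Part 563). Field names and structure are illustrative only.

from dataclasses import dataclass
from typing import List

@dataclass
class PreCrashSample:
    seconds_before_trigger: float  # the pre-crash window is only ~5 seconds
    speed_kph: float
    throttle_pct: float
    brake_applied: bool

@dataclass
class EdrRecord:
    delta_v_kph: float               # severity of the crash pulse
    driver_belted: bool
    airbag_deploy_ms: float          # time from trigger to deployment
    pre_crash: List[PreCrashSample]  # a handful of low-rate samples

# Note what is absent: no camera frames, no object detections, nothing
# showing what an autonomous car "saw" or why it decided as it did.
```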

The automakers keep the data secret and share it with the National Transportation Safety Board (the NTSB of “Sully” fame) if an investigation occurs. That’s what happened after the well-known Tesla crash in Mountain View, CA, except Tesla was a little loose-lipped about the data and spoiled the investigation. The NTSB removed Tesla as a party to the analysis and went on without the EDR data.

With a self-driving car accident, there will be no driver to blame, and the EDR data will be the only evidence. Can we trust the relationship between automakers and investigators to allow the truth to be revealed? How will we know what the car saw before a crash with a human-driven car or another bicyclist? I think we will need an objective third-party assessment with a full complement of data. At least four other major companies agree with me, and a prototype is now available.

It is based on Renesas’ ARM-based R-Car development platform running Automotive Grade Linux, with Western Digital SSD modules and a Tuxera automotive file system, all wrapped in a DiSTI functional-safety-capable (FuSa) human-machine interface designed to prove the truth is recorded correctly at autonomous speeds, with up to eight dedicated cameras and other automotive inputs.
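Conceptually, the core job of such a recorder is simple even if the engineering is not: continuously buffer timestamped inputs and guarantee the window leading up to any incident survives on storage. Here is a rough sketch of that idea; the class, field names and retention window are assumptions for illustration, not the prototype’s actual design.

```python
# Conceptual sketch of a black-box recorder's core loop: keep a rolling
# window of timestamped inputs in memory and persist it when triggered.
# Names and parameters are assumptions, not the prototype's design.

import json
import time
from collections import deque

class BlackBoxRecorder:
    def __init__(self, snapshot_path, retention_seconds=300):
        self.snapshot_path = snapshot_path
        self.retention = retention_seconds
        self.buffer = deque()  # rolling window of recent records

    def record(self, source, payload):
        """Append one timestamped input (camera frame ref, CAN signal, etc.)."""
        now = time.time()
        self.buffer.append({"t": now, "source": source, "data": payload})
        # Age out anything older than the retention window.
        while self.buffer and now - self.buffer[0]["t"] > self.retention:
            self.buffer.popleft()

    def snapshot(self):
        """On a trigger (crash, disengagement), persist the window to disk."""
        with open(self.snapshot_path, "w") as f:
            json.dump(list(self.buffer), f)

recorder = BlackBoxRecorder("blackbox_snapshot.json")
recorder.record("camera_1", {"frame_id": 42})
recorder.record("can_bus", {"speed_kph": 48.0, "brake": False})
recorder.snapshot()  # the last five minutes of inputs survive the event
```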

We hear a lot about artificial intelligence and how it is helping solve complex, human-like thinking problems. For cars, these solutions will be baked into future chips performing specialized functions like steering, stopping and saving lives. Is it genuinely human intelligence, or just disciplined programming by smart humans? You can decide when the last-ever accident occurs and the software is finally just right.

Until then, we need to keep checking the programmers’ work and watch them with a Level 5 Autonomous Black Box. It needs to be at least as advanced as avionics flight data recorders, which, regardless of how well planes can fly themselves, still record every second of every flight in high detail. AVs will be no different. We need to track their actions to build consumer trust. When consumers see the protection systems in place and are assured the programming keeps getting better, they will gradually climb inside and take their rides.

Moreover, it needs to be open, including the data collected, so the truth of autonomous driving is always available to make the next car even safer.

You can see the prototype Level 5 black box data recorder in suburban Detroit Oct. 2 at the WardsAuto User Experience Conference.

About the Author

Joel Hoffmann

Automotive Strategist, Intel Automotive Solutions Division, Intel
