THERE is a fear in some circles that artificial intelligence will eventually outsmart the human race and take over the world.
They might be right.
A pair of studies by the Insurance Institute for Highway Safety indicate that we may not be smart enough to keep up with AI in our cars.
One study found that the various system names manufacturers use for automated options “can send the wrong messages to drivers regarding how attentive they should be,” according to an IIHS report on the studies.
The other study “found that drivers don’t always understand important information communicated by system displays.”
IIHS President David Harkey noted that while vehicle automation could improve safety, “unless drivers have a certain amount of knowledge and comprehension, these new features also have the potential to create new risks.”
This is the hard thing about evolution: the beginning of a new era weeds out those less adaptable while the rest learn and move on.
Right now, we are in the early stages of a new era. Automation capabilities in vehicles sit at Level 1 or Level 2. Level 1 automation performs one task, such as adaptive cruise control, while Level 2 can handle two tasks at once.
The end game is to get vehicles to Level 5, complete automation.
The study on automated system names surveyed 2,000 people, asking them what they could do with certain automated vehicle systems. The results show how a simple name can cause confusion.
Regarding the Autopilot system in Tesla vehicles (participants were not told the vehicle models), participants were asked if it would be safe to take their hands off the wheel while the system was engaged.
Forty-eight percent thought it would be OK to take their hands off the wheel while using Autopilot.
It’s not, as some Tesla drivers have learned the hard way.
Wait, it gets even scarier.
“Autopilot also had substantially greater proportions of people who thought it would be safe to look at scenery, read a book, talk on a cellphone or text. Six percent thought it would be OK to take a nap while using Autopilot, compared with 3 percent for the other systems,” according to the report.
This has happened in the real world, with several fatal crashes involving Tesla drivers who thought their systems did more than advertised.
The second study focused on the automation displays inside vehicles.
The 80 participants in that study struggled to interpret the actions of some automation features, such as when a system stopped detecting a vehicle ahead because it had moved out of range.
“Understanding these displays is important because automated systems can behave unexpectedly and changing circumstances may require the driver to intervene,” the report noted.
The report suggested that training, such as at dealerships, would help drivers acclimate to the automated features.
The message here, while we’re still in the learning curve stage of vehicle automation, seems to be that if you have one of these vehicles, keep your hands on the wheel and eyes on the road.
Otherwise, that curve in the road ahead might be the last one you take.