
Lorraine Explains: Repeat after me: there is no such thing as an autonomous passenger vehicle

A new study from the Insurance Institute for Highway Safety (IIHS) has found that a stunning number of drivers with vehicles sporting the latest and greatest in driver assistance are treating these partially automated features as fully autonomous. File under: ‘No Kidding’.

The IIHS is sounding the alarm because a disturbing number of people, “53 per cent of Super Cruise users, 42 per cent of Autopilot users and 12 per cent of ProPILOT Assist users,” reported that they consider their rides to be fully self-driving. They are mistaken, as evidenced by numerous crashes and deaths that have occurred when drivers elected to hand driving duty over to the car.

If you would like a list of the top five myths about commercially available autonomous vehicles, just say, “autonomous vehicles don’t exist” five times in a row. Even just a few years ago, the talk about the pending arrival of totally autonomous cars was louder than the rhetoric surrounding electric vehicles. You could be forgiven for being confused that the latter has arrived in quantity long before the former. That’s not to downplay the brilliance of the rapid advancement of advanced driver-assistance systems (ADAS), though it really needs to be reinforced that all of these systems still require a driver.

There is a standardized measure of vehicle autonomy: SAE International defines six levels, from Level 0 (no automation) to Level 5 (full automation). Currently available vehicles, like those listed above, are ‘Level 2’. While many have the ability to change lanes, merge, or even bring the vehicle to a safe stop in the event of a driver emergency, all require a driver to be in control at all times.

So why do so many consider their vehicles to be self-driving when they aren’t? Who is at fault? “Early adopters of these systems still have a poor understanding of the technology’s limits,” IIHS President David Harkey said in a statement. If a great number of people have the wrong understanding of the safe limits of a deadly product they are using, it’s hard not to conclude that the purveyor of that product is doing a lousy job of making those limits clear.

It’s easy to blame social media for the yahoos who are purposely flying their stupid flag in videos that feature them abusing that technology.

“…we wrote about an absolute, unrepentant knob that was arrested for sitting in the back seat of his Tesla while it was under the partial control of Tesla’s Level 2 semi-automated Autopilot system. Said knob claimed to have not-driven over 40,000 miles in this irresponsible and dangerous manner,” wrote Jalopnik in May of 2021. These are the stories that make the Darwin headlines. They’re a problem, but they’re not the problem.

“Regular users of Cadillac Super Cruise, Nissan/Infiniti ProPILOT Assist and Tesla Autopilot said they were more likely to perform non-driving-related activities like eating or texting while using their partial automation systems than while driving unassisted,” the IIHS reports. The demographics of the study point in the direction you’d guess: Super Cruise users were mostly over 50, Autopilot users skewed under 35, and ProPILOT Assist users spanned the spectrum.

“Many of these drivers said they had experiences where they had to suddenly take over the driving because the automation did something unexpected, sometimes while they were doing something they were not supposed to.” This is the problem: regaining control of your vehicle should never be something you have to do suddenly. Why do so many drivers feel so confident relinquishing control in the first place?


Argo Lidar point cloud shows a busy city street in the Strip District. Photo by Argo AI

Proponents would have you believe that fully automated cars are right around the corner; look back a few years, and they were forecast to be here already. They aren’t, and they won’t be anytime soon. Too many factors are nowhere close to being ready, and manufacturers have to stop creeping ever closer to the message that they are.

An IIHS study from 2020 found what we already know: the more actions a car is capable of performing, the less a driver feels obligated to do. “Drivers were more than twice as likely to show signs of disengagement after a month of using Pilot Assist compared with the beginning of the study…[c]ompared with driving manually, they were more than 12 times as likely to take both hands off the wheel after they’d gotten used to how the lane centring worked.”

There’s obviously a difference between opening up a Snickers bar and playing a video game. In an effort to track exactly how often these Level 2 systems were involved in serious crashes, NHTSA last year began requiring manufacturers to report these incidents. The initial numbers look terrible for Tesla, but Tesla has more such vehicles on the road, and those vehicles can use Autopilot in more settings. Regardless, NHTSA expanded its investigation into Autopilot earlier this year. From TechCrunch: “Perhaps all Level 2 systems are more dangerous than human drivers alone, due to driver inattention. Or it could be the case that Tesla’s Autopilot as deployed is in fact less competent and more dangerous than rival ADAS technologies.”

The argument that more people would be dead without this not-ready-for-primetime autonomous technology is proving less durable than originally claimed. Predictive modelling in an IIHS study from 2020 shows that even fully automated, driverless vehicles won’t deliver the 100 per cent crashless future that has been predicted; the researchers found automation could have prevented only about a third of crashes, unless the systems were programmed to put safety over speed and convenience. Call it the human factor.

Cars still require drivers, as NHTSA continues to remind us. “No commercially available motor vehicles today [are] capable of driving themselves.”
