Here’s the thing about an autonomous vehicle without driver controls: It’s dumb.
Even in sci-fi stories with A.I.-driven vehicles, the A.I. is either just part of the interface, like Spooner’s collapsible Audi steering wheel in I, Robot, or a separate entity, like the Johnny Cabs from Total Recall. The point I’m making is that they’re defeatable, the same way a horse might not walk through fire or jump a ridge unless you cover its eyes.
Furthermore, it is human nature to self-determine and self-define. What if I know a quicker route to someplace, but it’s not on the map, or not even on a road? What if I want to jump over something? What if I want to pass someone?
What if the car behind me malfunctions past the speed limit?
What if MY car malfunctions past the speed limit?
The systems Google and Chevrolet are touting blatantly ignore the fact that not every emergency a driver encounters in the real world can be pre-programmed and quantified into an algorithm, unless one is including liability insurance in the car warranty. If nothing else, driver controls place the onus of liability on the driver, which insulates the company from all cases except malfunction. In an autonomous vehicle, the ONLY likely result of a malfunction, which is inevitable given that we have not yet perfected the technology, is tragedy, gridlock, and chaos. That sounds dramatic, but what does a highway full of A.I.-controlled cars, trucks, semis, and SUVs moving in perfect lockstep at 50+ mph around a turn do when one car at the front spins out?
Yeah, you keep your driverless Bolt; I’ll go invest in some hay and a carriage. And a motorcycle.