Is that nothing is infallible. And when the machines do make a mistake, whose fault is it, who pays out, and how do we correct it? The roads are the very opposite of a controlled environment. Even with autonomous cars, you will still have jaywalkers, erratic cyclists, bus stops, breakdowns, animals, debris, and existing crashes to avoid. When one or more of these scenarios go wrong, who gets the blame? Probably the manufacturer. Then who pays out? Do you need a driver's license for an autonomous car? Who pays for insurance, and how do you price it without spreading the cost across everyone, or dumping it all on the owner? What would a software patch cost to develop? Would the manufacturer issue a stop-sale if the problem recurs while the fix is still in development? And if humans are fallible, why should we trust humans to program the car? What hidden glitches could be lurking in the code? Just things to think about.