After driving the Tesla Model 3 a few times I’ve reached the conclusion that, if the car wanted to, it could basically drive me to the nearest police station and turn me in for the myriad of infractions I commit on a daily basis.
This is true of many new cars, which have dozens of cameras, internet connectivity, and even some self-driving tech that requires AI capable of telling humans apart from poles, or trucks from scooters.
The programming could determine all of my offenses on its own, and while the tech still has some discouraging misses (it confused my mom for a box truck this morning), there's little doubt it will only get better over time.
But I wonder if cars will tell on us for some offenses.
1. Speeding
The EU has already ruled that new car models from 2022 (and all new cars from 2024) will need to come with ISA, Intelligent Speed Assistance: a speed limiter that prevents drivers from exceeding the speed limit based on GPS and mapping data. But the system is, to an extent, only passive, because it doesn't brake the car so much as limit engine power.
So, what would happen if the driver coasts down a steep incline in neutral, doubling the speed limit? Should the car tell on the driver? And if a car is fitted with ISA and the driver still breaks the speed limit, should they pay a bigger or a smaller fine than a driver in a car without it?
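To make that loophole concrete, here's a toy Python sketch of how a passive ISA might clamp engine power against the mapped limit. Every name and threshold here is made up for illustration; real ISA systems are far more complex. The point is that a power limiter has no answer for a car coasting downhill in neutral:

```python
def isa_throttle(requested_throttle: float, speed_kmh: float, limit_kmh: float) -> float:
    """Toy passive-ISA power limiter (hypothetical, not any real system).

    Passive ISA only cuts engine power; it never applies the brakes,
    so it cannot stop a car from coasting past the limit in neutral.
    """
    if speed_kmh >= limit_kmh:
        return 0.0  # at or over the limit: cut power entirely
    headroom = limit_kmh - speed_kmh
    if headroom < 10:  # hypothetical 10 km/h taper band near the limit
        return requested_throttle * (headroom / 10)
    return requested_throttle
```

Downhill in neutral the function dutifully returns 0.0, yet the car keeps accelerating, which is exactly the gap the questions above poke at.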
2. Running red lights / not giving pedestrians the right of way
This is actually a very contentious topic in Mexico City, where some drivers argue that red lights should be switched off late at night (from 10 PM or midnight onward) to save time and avoid potential muggers.
Tesla's latest Autopilot version has traffic-light recognition, and even though it's passive, it could probably apply the brakes if the driver were about to blow through a light. Autopilot already has pedestrian recognition too, which matters here because at unsignalized pedestrian crossings, pedestrians tend to have the right of way.
But if a car detects that a pedestrian is about to cross, should it brake regardless of throttle input or whether it's in autonomous mode? Should it tell on a driver for not yielding, or for driving recklessly?
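A crude version of that "brake no matter what" policy fits in a few lines. To be clear, this is a hypothetical sketch, not Tesla's actual logic, and the class and field names are invented:

```python
from dataclasses import dataclass

@dataclass
class Perception:
    pedestrian_in_crossing: bool  # vision system sees someone in or entering the crossing
    crossing_signalized: bool     # is this crossing governed by a traffic light?

def should_brake(p: Perception, throttle_pressed: bool) -> bool:
    """Brake for a pedestrian at an unsignalized crossing.

    The throttle_pressed parameter is deliberately ignored: the whole
    point of the policy is that driver input doesn't override it.
    """
    return p.pedestrian_in_crossing and not p.crossing_signalized
```

Notice that the driver's throttle input never enters the decision; that one design choice is what turns a driver aid into an override.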
3. More serious crap
What if a car, equipped with all of these autonomous-driving features, detects that the driver is willfully running someone over? Should the car call the cops? Would it be pertinent for the car to lock itself and leave the culprit unable to escape? What if the car holds incriminating evidence? Should it stop the owner from deleting it?
Let's imagine the car detects a collision and locks itself. What if it really was an accident, and the driver needs to take the victim to a hospital to save their life?
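One way to square those two scenarios is an explicit emergency override in the lock policy. Again, this is a hypothetical sketch, not anything any carmaker ships:

```python
def should_lock_doors(collision_detected: bool, emergency_override: bool) -> bool:
    """Hypothetical post-collision lock policy.

    Trapping a hit-and-run driver sounds appealing, but a hard lock also
    traps the honest driver rushing a victim to the hospital, so any such
    policy needs a driver-reachable emergency override.
    """
    return collision_detected and not emergency_override
```

Of course, an override the driver can always reach is also an override the culprit can always reach, which is the dilemma in a single line of code.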
This shit doesn’t even include full autonomy.
While full autonomy is still many years away, I wonder whether all of these measures could be applied soon. I don't think that would be great as far as privacy goes, but it's undeniable that the tech will be ready to report on us very soon.