Great point. Btw: The problem is corporate irresponsibility:
When self-driving cars were first coming out, a professor of mine said "They only have to be as good as humans." It took a while, but now I can say why that's insufficient: human errors are corrected by discipline and justice. Corporations dissipate responsibility by design. When self-driving cars kill, no one goes to jail. Corporate fines are notoriously ineffective, just a cost of doing business.
And even without legal pressure, most people do try to drive well enough not to injure each other, which is a different calculus from prematurely taking products to market for financial gain.
The top 3 causes of death by vehicle accident in USA are [0]:
- DUI
- speeding
- distraction
In other words, all human errors. Machines don't drink, shouldn't speed if programmed correctly, and are never distracted fiddling with their radio controls or looking down at their phones. So if they are at least as good as a human driver in general (obeying traffic laws, not hitting obstructions, etc.), they will be safer than a human driver in these areas that really matter.
What do you care more about: that there is somebody specific to blame for an accident, or that there are fewer human deaths?
Under corporate control, safety spirals downward to increase profit. See: opioids, climate change, pesticides, antibiotic resistance, deforestation, and privacy. Fifty years from now, self-driving cars will be cheaper and more dangerous. Human driving misbehavior will still be disincentivized through the justice system, but corporations will avoid individual responsibility for dangerous programming.
They only have to be as good as humans because that's what society deems an acceptable risk.
I do think the point about how companies are treated vs humans is a good one. Tbh though, I'm not sure it matters much in the instance of driverless cars. There isn't mass outrage when driverless cars kill people because that (to us) is an acceptable risk. I feel whatever fines/punishments are employed against companies would only marginally reduce deaths, if that. I honestly think laws against drunk driving only marginally reduce drunk driving.
I'm not saying we shouldn't punish drunk driving... just that anything short of an instant death penalty for driving drunk probably wouldn't dissuade many people.
In my country, drunk driving is punished by losing your license and being banned from getting another one for half a year for a first offense, and for life for a second. And it's very effective; those cases are a rarity now.
> It took a while but now i can say why that's insufficient: human errors are corrected by discipline and justice.
If they did, we'd be living in utopia already.
But also, by the same token, generative AI errors are similarly "corrected" by fine-tuning and RLHF. In both cases, you don't actually fix it - you just make it less likely to happen again.