
> So, for example, when a Florida driver on Autopilot drops his phone and blows through a stop sign, hitting a car which then hits two pedestrians, killing one, Tesla will claim “this driver was solely at fault.” In that case, a judge agreed that the driver was mostly at fault, but still assigned 33% of blame to Tesla, resulting in a $243 million judgment against the company.

His foot was on the gas though

Looking at this author's other articles, he seems more than a bit unhinged when it comes to Tesla: https://electrek.co/author/jamesondow/ Has Hacker News fallen for clickbait? (Don't answer)



A couple of facts on the Florida case: it was a jury verdict, not a judge's ruling. The jury found Tesla 33% at fault for the 2019 Key Largo crash. Damages were $129M compensatory (Tesla responsible for 33% of that) plus $200M punitive, for roughly $243M total against the company.

The driver admitted he looked down after dropping his phone and blew through a stop sign; Tesla argued his foot was on the accelerator, but the jury still assigned partial fault because Autopilot was allowed to operate off limited-access highways and the company didn't do enough to prevent foreseeable misuse. The driver had already settled separately.
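
For anyone checking the headline figure, the arithmetic works out (a quick back-of-the-envelope in Python, using the figures as reported from the verdict):

```python
# Reported verdict figures: $129M compensatory, $200M punitive.
compensatory = 129_000_000
punitive = 200_000_000

tesla_share = 0.33 * compensatory          # Tesla's 33% of compensatory, ~$42.6M
total_against_tesla = tesla_share + punitive

print(f"${total_against_tesla / 1e6:.0f}M against Tesla")  # -> $243M against Tesla
```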


Is there any blame to be assigned to Tesla for its feature? What's the right percentage for you? 20%? 10%? 5%? 0%?

If the wheels of the car fell off, would Tesla bear any blame for that? If we had laid wires all along the road to enable automatic driving, and Tesla's software misread them and caused a crash, would it be to blame?

When is Autopilot safe to use? Is it ever safe to use? Is the fact that people can apparently trick Autopilot into ignoring its driver-attention mechanisms relevant at all?

If we have percentage-based blame, then it feels perfectly fine to share the blame here. People buy cars assuming that the features of the car are safe to use to one extent or another.

Maybe it is just 0%. Cruise control is a thing that exists, right? But I'm not activating cruise control anywhere near an intersection. Tesla calls their thing Autopilot, and their other thing FSD, right? Is there nothing there? Maybe there is no blame, but it feels like there's something there.


0%. This is entirely on the driver. He's someone who should spend a few years in prison, and then never be allowed to have a license again.

A foot on the gas overrides braking on Autopilot and causes a large message to flash up on the screen: "Autopilot will not brake / Accelerator pedal is pressed"
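
To make that behavior concrete, here's a minimal sketch of the override logic in Python; every name here is invented for illustration, and this is in no way Tesla's actual firmware:

```python
# Toy model of the pedal-override behavior described above.
# All names and the exact message text are assumptions, not Tesla's code.

def autopilot_brake_command(obstacle_detected: bool,
                            accelerator_pressed: bool) -> tuple[bool, str | None]:
    """Return (apply_brakes, on_screen_warning)."""
    if accelerator_pressed:
        # Driver input wins: suppress automatic braking and warn on screen.
        return False, "Autopilot will not brake / Accelerator pedal is pressed"
    if obstacle_detected:
        return True, None
    return False, None

# With a foot on the accelerator, no braking even with an obstacle ahead:
print(autopilot_brake_command(obstacle_detected=True, accelerator_pressed=True))
# -> (False, 'Autopilot will not brake / Accelerator pedal is pressed')
```

The point is that the override is deliberate: driver input takes priority over the automation, and the car tells you so.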


Yet almost any other car made after that Tesla (and much cheaper) will automatically brake if it's about to hit something; no AI involved, just radar obstacle detection.
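
To illustrate how little "AI" that takes, here's a sketch of the classic radar-only trigger: brake when time-to-collision falls below a threshold. The threshold and names are invented, not any manufacturer's values:

```python
# Toy radar-only AEB: brake when time-to-collision (TTC) drops below a threshold.
TTC_BRAKE_THRESHOLD_S = 1.5  # seconds; invented value, real systems tune per speed

def should_emergency_brake(range_m: float, closing_speed_mps: float) -> bool:
    """range_m: radar distance to the object ahead.
    closing_speed_mps: how fast that gap is shrinking (positive = closing)."""
    if closing_speed_mps <= 0:
        return False  # not closing on the object, nothing to do
    ttc_s = range_m / closing_speed_mps
    return ttc_s < TTC_BRAKE_THRESHOLD_S

# 20 m ahead, closing at 15 m/s (~54 km/h): TTC ~ 1.33 s -> brake
assert should_emergency_brake(20.0, 15.0)
assert not should_emergency_brake(60.0, 15.0)  # TTC = 4 s -> no action
```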


> almost any other car made after that Tesla (and much cheaper) will automatically brake if it's about to hit something

My 2015 Tesla Model S brakes if it detects something in its path using radar, and usually correctly identifies the object type (truck, car, motorcycle, cyclist, pedestrian) using the camera.


Good for you, but

1) didn't they drop the radar?

2) it clearly didn't work in this case


Also check the author's Bluesky: https://bsky.app/profile/jamesondow.bsky.social

It can't be healthy to be so obsessed with something/someone you dislike.



