This is key: there's an expectation, and some wiggle room, that human drivers will fuck up in predictable ways, and experienced drivers (usually) know how to avoid getting into incidents when that happens.
Self-driving cars are weird to drive around. They will absolutely stop in situations where no human would think to stop. As a motorcycle rider, I think about this: what if I'm committed to a corner I can't see around, and the software in a self-driving car decides it should just stop in the middle of the road past the apex? A human driver could do this too, but most know it's a dangerous place to stop and will try to get the car onto the shoulder, or at least minimize the time it's stuck there.
I don't know if this is something we need to tolerate (a temporarily increased incident rate while people get used to them being on the road), or if we need to make the software drive more like humans. The latter presumably means making its behavior sloppier than it can actually handle, so that its faster reaction time doesn't cause humans with slower reaction times to slam into it.