Hacker News

The problem is not whether the machine can see the human, or even whether the machine knows that the human has seen the machine.

The problem is that the human currently has no way to know whether the machine has seen the human.



The GP was talking about the driver knowing if the "human has seen the machine".

For the inverse problem, we could simply start adding screens (instead of windshields?) to self-driving cars that acknowledge pedestrians in a particular way (when there are only a few people around, in the Black Mirror realm, the cars would actually greet them by name using facial recognition and a universal database of everyone :).


What? I was absolutely talking about human cyclists and pedestrians needing to know that the machine sees the human, and not by blind faith, but through some active, explicit, demonstrable indication.


I believe my reply was to user LoLFactor (threads open up subthreads which open up more subthreads...).

If you are disagreeing with my use of "GP" (grandparent post), always go with the HN rule of "assume the best possible interpretation" :)


Of course there is: the machine's behavior should be the same as a driver's. The car-shaped object starts slowing down in a way that makes it clear it will stop before hitting me at the pedestrian crossing.
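The "slowing in a way that signals a stop" intuition reduces to simple kinematics: a pedestrian (or a planner) can check whether the observed deceleration is enough to halt the car before the crossing. A minimal sketch of that check (my own illustration, not anything proposed in the thread; names and parameters are made up):

```python
def stops_before_crossing(speed_mps: float, decel_mps2: float, distance_m: float) -> bool:
    """Return True if steady braking at decel_mps2 halts the car
    within distance_m, using the stopping-distance formula d = v^2 / (2a)."""
    if decel_mps2 <= 0:
        return False  # not braking (or accelerating): it won't stop
    stopping_distance = speed_mps ** 2 / (2 * decel_mps2)
    return stopping_distance < distance_m

# A car at 10 m/s braking at 3 m/s^2 needs about 16.7 m to stop:
print(stops_before_crossing(10.0, 3.0, 20.0))  # True: 20 m is enough
print(stops_before_crossing(10.0, 3.0, 15.0))  # False: 15 m is not
```

The point of the comment is that this signal is implicit: the pedestrian reads intent from the braking profile, with no explicit acknowledgment needed.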


This one scenario covers only a fraction of the situations that actually occur.

Some of the worst incidents happen when a stopped car starts moving, to take the most obvious example.



