Posted by c420 18 hours ago
  Self-driving cars are constantly subject to mini-trolley problems. By training on human data, the robots learn values that are aligned with what humans value. -- Ashok Elluswamy (VP AI/Autopilot at Tesla)
You probably get more honest answers by presenting a trolley problem and then requiring a response within a second. It's a great implicit bias probe.
Wonder why the title says "allegedly" but the article itself doesn't?
https://waymo.com/blog/2025/05/waymo-making-streets-safer-fo...
That said, Waymo should absolutely be held responsible for this and treated as if it were a human driver who hit the cat.
Also note that there are enormous issues of trust and dignity here.
By "trust" I mean: We have seen how data and statistics are created. They are useful on average, but trusting them on very important, controversial topics, when they come from the private entity that stands to benefit from them, is an unrealistic ask for many normal humans.
By "dignity" I mean: Normal humans will not stand the indignity of their beloved community members, family, or pets being murdered by a robot designed by a bunch of techies chasing profit in silicon valley or wherever. Note that nowhere in that sentence did I say that the techies were negligent - they may have created the most responsible, reliable system possible under current technology. Too bad normal humans have no way of knowing if that's the case. Especially humans who are at all familiar with how all other software works and feels. It's a similar kind of hateful indignity and disgust to when the culpable party is a drunk driver, though qualitatively different. The nature of the cause of death matters a lot to people. If the robot is statistically safer, but when it kills my family it's because of a bug, people generally won't stand for that. But of course we don't know why exactly, as observers of an individual accident - maybe the situation was truly unavoidable and a human wouldn't have improved the outcome. The statistics don't matter to us in the moment when the death actually happens. Statistics don't tell us whether specifically our dead loved one would have died at the hands of a human driver - only that the chances are better on average.
Human nature is the hardest thing for engineers to relate to and account for.
This is really only true for Waymo, who appear to be the only folks operating at scale who did the work properly. Robotaxi, Cruise, and all the others are in a separate bucket and should be counted separately in the statistics.
Waymo? How is this ambiguous? Waymo makes the car, writes the software, and operates the vehicle.