Well, this is Profoundly NOT Reassuring

It appears that the robot Uber that ran down and killed a pedestrian saw the woman but ignored her, because it had been programmed to.

Basically, Uber’s self-driving software is so crappy and has so many false positives that it was programmed to ignore actual human beings.

Uber is still Uber:

Uber has concluded the likely reason why one of its self-driving cars fatally struck a pedestrian earlier this year, according to tech outlet The Information. The car’s software recognized the victim, Elaine Herzberg, standing in the middle of the road, but decided it didn’t need to react right away, the outlet reported, citing two unnamed people briefed on the matter.

The reason, according to the publication, was how the car’s software was “tuned.” 

Here’s more from The Information:

Like other autonomous vehicle systems, Uber’s software has the ability to ignore “false positives,” or objects in its path that wouldn’t actually be a problem for the vehicle, such as a plastic bag floating over a road. In this case, Uber executives believe the company’s system was tuned so that it reacted less to such objects. But the tuning went too far, and the car didn’t react fast enough, one of these people said.

Let me translate this into English: Uber put a 4,000-pound death machine on the road with software that was incapable of telling the difference between a plastic bag and a human being.
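To make the "tuning" concrete, here is a minimal sketch of how a perception pipeline can suppress reactions to low-confidence detections. Everything in it is hypothetical (the names, the threshold, the confidence values); it is not Uber's actual code, just an illustration of how raising a false-positive threshold too far turns "ignore the plastic bag" into "ignore the person."

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # classifier's best guess: "pedestrian", "plastic_bag", ...
    confidence: float  # classifier's confidence in that guess, 0.0 to 1.0

# Hypothetical tuning knob: detections below this confidence are treated
# as false positives (e.g. a bag blowing across the road) and ignored.
# Raising it reduces phantom braking -- and, tuned too far, real braking.
FALSE_POSITIVE_THRESHOLD = 0.85

def should_brake(detection: Detection) -> bool:
    """Decide whether to react to an object detected in the vehicle's path."""
    if detection.confidence < FALSE_POSITIVE_THRESHOLD:
        # Classified as noise: the planner takes no action.
        return False
    return True

# A real pedestrian the classifier sees but isn't sure about:
pedestrian = Detection(label="pedestrian", confidence=0.70)
print(should_brake(pedestrian))  # False -- the system "saw" her and did nothing
```

The point of the sketch is that "ignore false positives" and "ignore a human being" can be the very same line of code; the only difference is where someone set the threshold.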

This is not just reprehensible; it might very well be criminal.
