This past week, an autonomous car in a pilot program run by Uber struck and killed a pedestrian. How should we evaluate whether Uber is morally culpable for this death? The crash occurred at night, and many have suggested that a human driver would also have hit the pedestrian. Let's assume for the sake of argument that this is true. It strikes me that this comparison is not the right way to assess Uber's culpability. But what then? It has also been suggested that current technology should have been sufficient for the autonomous car to detect the pedestrian. Does that make Uber culpable? If so, that seems to imply that if Uber's car had in fact been state of the art, Uber would be absolved--and that seems just as wrong to me.
