Daniel Reitberg, a tech writer, says the road to self-driving cars has hit a bump. A recent accident, in which a self-driving bus swerved to avoid a jaywalker and injured its passengers in the process, has sparked a heated debate about the ethics of autonomous-vehicle programming. The crux of the matter: when a collision is unavoidable, should the car prioritize the safety of pedestrians or of its passengers?

The accident exposes the moral minefield programmers must navigate when they write the decision-making code for self-driving cars. Picture yourself hurtling down the highway, trusting your self-driving car to make split-second choices. Should it swerve to miss a sudden obstacle, putting you at risk, or hold its course and endanger a pedestrian? These are the moral problems that keep coders up at night.

The technology promises a future without drunk driving and road rage, but it remains unclear who, or what, gets priority in an accident that cannot be avoided. This ethical tangle needs to be cleared up before self-driving cars become common. Only then can we look forward with confidence to a world where self-driving cars change everything.
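To make that tangle concrete, here is a deliberately toy sketch in Python of how a cost-based policy might encode the trade-off. Everything in it is invented for illustration: the Maneuver class, the risk numbers, and the choose_maneuver function are hypothetical, and no real autonomous-driving stack decides this simply. The point is only that the ethical question lands, in the end, as a handful of weights in somebody's code.

    # Hypothetical sketch only: the classes, weights, and risk numbers
    # below are invented for illustration, not taken from any real system.
    from dataclasses import dataclass

    @dataclass
    class Maneuver:
        name: str
        passenger_injury_risk: float   # estimated probability, 0.0 to 1.0
        pedestrian_injury_risk: float  # estimated probability, 0.0 to 1.0

    def choose_maneuver(options, pedestrian_weight=1.0, passenger_weight=1.0):
        """Pick the maneuver with the lowest weighted expected harm.

        The weights encode the ethical choice this article debates:
        raising pedestrian_weight favors people outside the car,
        raising passenger_weight favors the occupants.
        """
        def expected_harm(m):
            return (pedestrian_weight * m.pedestrian_injury_risk
                    + passenger_weight * m.passenger_injury_risk)
        return min(options, key=expected_harm)

    options = [
        Maneuver("brake straight", passenger_injury_risk=0.1,
                 pedestrian_injury_risk=0.6),
        Maneuver("swerve", passenger_injury_risk=0.5,
                 pedestrian_injury_risk=0.05),
    ]

    # With equal weights the policy swerves (weighted harm 0.55 vs 0.7);
    # weight passengers three times as heavily and it brakes straight
    # instead (0.9 vs 1.55). Same code, different ethics.
    print(choose_maneuver(options).name)                        # swerve
    print(choose_maneuver(options, passenger_weight=3.0).name)  # brake straight

Notice that the code itself is trivial; the hard part is the two weight parameters, which is exactly the question regulators, manufacturers, and the public have yet to settle.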