A recent survey has raised the question of whose lives our autonomous cars will prioritise.
What kind of ethical decisions will the first driverless cars make? It might sound like a flippant question, but it's one with serious implications.
If a computer-driven car is expected to prioritise the preservation of its passengers' lives, does that mean that it would choose to plough into a group of pedestrians in the event of an impending crash?
A recent Massachusetts Institute of Technology (MIT) study found that 76 percent of people surveyed would expect an autonomous car to do the opposite, prioritising the safety of a group of ten pedestrians over that of a single passenger.
However, the same survey also found that few people would actually want to own or travel in such an 'ethical' car.
It's an interesting conundrum that the current crop of autonomous car pioneers must face. "Most people want to live in a world where cars will minimise casualties, but everybody wants their own car to protect them at all costs," says Dr Iyad Rahwan, the study's author (via BT).
With such a huge potential benefit to road safety - it's calculated that autonomous cars could cut traffic accidents by 90 percent - we're unlikely to be able to avoid or defer a decision on such chilling calculations. It's a no-win situation for this new breed of car makers.
"Manufacturers of utilitarian cars will be criticised for their willingness to kill their own passengers," predicts Harvard University psychologist Professor Joshua Greene. "Manufacturers of cars that privilege their own passengers will be criticised for devaluing the lives of others and their willingness to cause additional deaths."