MIT is using a peculiar twist on the age-old ethics “Trolley Problem” in order to understand the decisions people believe self-driving vehicles should make.

Moral Machine is a simple site, operating on binary choices in difficult situations. The essential conceit is that of an autonomous vehicle suffering total brake failure, making a fatal accident inevitable. Your job is to choose which lives are more valuable over the course of the 13 given scenarios.

It’s a problem that could just as easily be applied to human drivers, but it is specifically geared toward helping researchers at MIT understand the decisions people would rather a robot make in the fraction of a second an AI would have to react.

The choices given are simple and direct, though some are easier than others. It’s not difficult to choose between a family of four and some dogs in an accident, but once the problems include the occupation, gender, or background of those involved, things get stickier.

Those distinctions can be weighed explicitly after you finish all 13 questions, if you choose to help MIT understand the reasoning behind your decisions by adjusting sliders for each factor. The relative value of humans to animals, men to women, young to old, and more is inferred from the choices you’ve made, but it can be adjusted to more clearly reflect your intent.

Furthermore, you can then write out your thoughts, answer similarly framed follow-up questions about your likelihood of purchasing an autonomous vehicle and your level of trust in the technology, and provide optional demographic details. Finally, the site compares your choices to the average for all users, including the characters you most often chose to save or kill.
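Moral Machine does not publish how it distills thirteen binary answers into those comparison sliders, but a minimal sketch of one plausible approach might look like the following Python, where every dimension name, vote, and population figure is invented for illustration:

```python
# Hypothetical sketch only: this is NOT Moral Machine's actual scoring code.
# It shows one naive way binary scenario choices could be reduced to the
# slider-style preference weights described above.

from collections import defaultdict

# Each answered scenario is recorded as (dimension, vote), where +1 means
# the user saved the first-named group and -1 means the second.
choices = [
    ("humans_vs_pets", +1),
    ("young_vs_old", +1),
    ("young_vs_old", -1),
    ("men_vs_women", -1),
]

def infer_weights(choices):
    """Average each dimension's signed votes into a weight in [-1.0, +1.0],
    where +1.0 means the first group was saved every time it appeared."""
    totals = defaultdict(list)
    for dimension, vote in choices:
        totals[dimension].append(vote)
    return {dim: sum(votes) / len(votes) for dim, votes in totals.items()}

# Invented population averages, standing in for the "all users" comparison
# shown on the results page.
population = {"humans_vs_pets": 0.9, "young_vs_old": 0.6, "men_vs_women": 0.0}

mine = infer_weights(choices)
for dim, weight in mine.items():
    print(f"{dim}: you {weight:+.2f} vs. average {population.get(dim, 0.0):+.2f}")
```

Averaging signed votes per dimension is just one simple way to produce a value a slider could represent; the real site may weight its scenarios very differently.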

It’s important to understand that vehicular fatalities will never be wholly eliminated, barring some technological revolution that eliminates the risk of mechanical failure or somehow guarantees that human error or intervention can never be deadly. The questions posed by Moral Machine are less “if” than “when,” and they will have to be answered in a future where we may not always be in control behind the wheel of the metal shells hurtling our families from place to place.

Follow Nate Church @Get2Church on Twitter for the latest news in gaming and technology, and snarky opinions on both.