Moral Machine

The Moral Machine is a thought experiment and online game developed by the MIT Media Lab that presents users with a series of driving dilemmas in which they must choose who should live and who should die.

Who created Moral Machine?

Moral Machine was developed by a team at the MIT Media Lab, led by Iyad Rahwan.

When was Moral Machine created?

Moral Machine went online in June 2016.

What morals should drive driverless cars?

The answer to this question depends on the particular context in which driverless cars will be used. In general, however, there are a few key moral principles that should guide the development of driverless car technology:

1. Safety: Driverless cars should be designed with safety as a top priority. This means that the technology should be able to avoid accidents and protect occupants in the event of a collision.

2. Responsibility: Driverless cars should be designed to be responsible users of the road. This means obeying traffic laws, yielding to pedestrians, and generally behaving in a way that minimizes the risk of accidents.

3. Respect: Driverless cars should be designed to respect the rights of all road users. This means giving pedestrians the right of way, avoiding aggressive maneuvers around other vehicles, and so on.

4. Fairness: Driverless cars should be designed to be fair users of the road. This means not jumping ahead in line, not cutting off other drivers, and so on.

5. Transparency: Driverless cars should be designed to be transparent in their operation. This means providing information to occupants about the car's current state and its planned actions, so that they can understand what is happening and why (see the sketch after this list).
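
As a rough illustration of the transparency principle, here is a minimal sketch of a status report a car could surface to its occupants. The field names and message format are assumptions invented for this example, not any manufacturer's actual interface:

```python
import json
from datetime import datetime, timezone

def status_report(state: dict, planned_action: str, reason: str) -> str:
    """Produce a human-readable status message for the occupants.

    The fields here are illustrative assumptions, not a real vehicle API.
    """
    return json.dumps({
        "time": datetime.now(timezone.utc).isoformat(),
        "state": state,
        "planned_action": planned_action,
        "reason": reason,
    }, indent=2)

print(status_report(
    state={"speed_kph": 42, "lane": 2},
    planned_action="slow_down",
    reason="pedestrian detected at crossing ahead",
))
```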

How do self-driving cars make decisions?

Self-driving cars use a combination of sensors and software to make decisions. The sensors provide data about the car's surroundings, and the software uses that data to make decisions about how to control the car.

The software is designed to handle a variety of situations, and it uses a set of rules to decide how to respond in each one. For example, it might be programmed to always yield to pedestrians, or to drive more aggressively in order to reach its destination faster.
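
A minimal sketch of what such a rule set could look like, assuming a toy sensor snapshot and a priority-ordered list of rules; everything here is illustrative, not a real vehicle's control logic:

```python
from dataclasses import dataclass

@dataclass
class Perception:
    """Toy stand-in for fused sensor data about the car's surroundings."""
    pedestrian_ahead: bool
    obstacle_distance_m: float   # distance to nearest obstacle, metres
    speed_limit_kph: float
    current_speed_kph: float

def decide(p: Perception) -> str:
    """Apply a fixed, priority-ordered rule set to one sensor snapshot."""
    if p.pedestrian_ahead:
        return "brake"            # always yield to pedestrians
    if p.obstacle_distance_m < 10.0:
        return "brake"            # keep a safe following distance
    if p.current_speed_kph > p.speed_limit_kph:
        return "slow_down"        # obey the posted speed limit
    return "maintain_speed"

print(decide(Perception(pedestrian_ahead=True,
                        obstacle_distance_m=50.0,
                        speed_limit_kph=50.0,
                        current_speed_kph=45.0)))  # -> brake
```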

The software is continually learning and improving as it gains more experience. For example, it might learn to better identify pedestrians, or to better predict the behavior of other drivers.
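
The "learning" described here is ordinary supervised machine learning. Below is a hedged sketch, assuming a pedestrian classifier that is updated incrementally as newly labelled sensor frames arrive; the feature vectors and labels are synthetic stand-ins for real sensor data:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for extracted sensor features (e.g. shape, motion cues),
# labelled 1 = pedestrian, 0 = not a pedestrian.
X_initial = rng.normal(size=(200, 8))
y_initial = (X_initial[:, 0] + X_initial[:, 1] > 0).astype(int)

clf = SGDClassifier(loss="log_loss", random_state=0)
clf.partial_fit(X_initial, y_initial, classes=[0, 1])

# Later, as more labelled frames are gathered, the model is refined
# incrementally rather than retrained from scratch.
X_new = rng.normal(size=(50, 8))
y_new = (X_new[:, 0] + X_new[:, 1] > 0).astype(int)
clf.partial_fit(X_new, y_new)

print("accuracy on new batch:", clf.score(X_new, y_new))
```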

Are moral machines possible?

It is possible to create a machine that can make moral decisions, but the machine would need to be programmed with a set of moral values to guide its decision making. Even if a machine is programmed with such values, it may not always make the "right" decision, as morality is often subjective.
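
One way to make "programmed with a set of moral values" concrete is to score each candidate action against weighted moral criteria and pick the highest-scoring one. The values, weights, and ratings below are purely hypothetical assumptions; choosing them is exactly the subjective part the answer above describes:

```python
# Hypothetical moral-value weights; setting them is the hard, subjective part.
WEIGHTS = {"lives_protected": 5.0, "law_abiding": 2.0, "fair_to_others": 1.0}

# Each candidate action is rated (0..1) against the same moral criteria.
ACTIONS = {
    "swerve_left": {"lives_protected": 0.9, "law_abiding": 0.2, "fair_to_others": 0.5},
    "brake_hard":  {"lives_protected": 0.7, "law_abiding": 1.0, "fair_to_others": 0.9},
    "continue":    {"lives_protected": 0.1, "law_abiding": 1.0, "fair_to_others": 1.0},
}

def moral_score(ratings: dict[str, float]) -> float:
    """Weighted sum of how well an action satisfies each moral value."""
    return sum(WEIGHTS[value] * rating for value, rating in ratings.items())

best = max(ACTIONS, key=lambda a: moral_score(ACTIONS[a]))
print(best)  # -> brake_hard under these particular weights
```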