Humans make decisions jointly with others and share responsibility for the outcome with their interaction partners. Today, the partner in a decision is increasingly not another human but a machine. Here we ask whether the type of partner, machine or human, affects our sense of responsibility, our perception of the choice, and the choice itself. As a workhorse we use a modified dictator game with two joint decision makers: either two humans, or one human and one machine. We find no treatment effect on perceived responsibility or guilt. We also find only a small and insignificant effect on actual choices.
"For the future, an open discussion of hybrid decision situations would be desirable. It might not only be important to address the technical question of what we can achieve by using artificial decision-making systems such as computers, but also how humans perceive them in different situations and how this influences human decision-making."
We examine whether decisions made under accountability differ for self and others, using lottery choice tasks that contain only positive amounts (gaining lottery), positive and negative amounts (mixed lottery), or mainly negative amounts (losing lottery). Accountability is ensured by having participants hold up a sign with their seat number and their decision after the experiment. Perhaps surprisingly, we find that participants are significantly less risk averse for others than for themselves in the gaining lottery, and slightly less risk averse for others in the mixed lottery. In the losing lottery, participants are equally risk averse for themselves and for others.
"People choosing for others are significantly less risk averse in the gaining domain but choose quite as risk averse as people who decide for their own outcome in the losing as well as in the mixed domain. A possible reason might be that people who decide for another person are more inclined to gamble when facing gains than people who decide for their own payoff."