Ars Technica


Common antidepressants can send our moral compasses spinning

Researchers find that they can alter individuals' responses to difficult …

Humans tend to flatter themselves by thinking they have the capacity to perform elaborate feats of moral reasoning, deeply considering possible consequences before arriving at an ethical decision. The reality is somewhat less flattering; a number of studies suggest we make moral decisions quickly and with a heavy reliance on our emotional response. Any reasoning that takes place tends to involve after-the-fact attempts to rationalize our decision, while everything from brain damage to neurotransmitter levels can alter our decisions in subtle and not-so-subtle ways. The latest findings in this area indicate that selective serotonin reuptake inhibitors (SSRIs), the class of drugs popularized by Prozac, can alter moral decision making, but only when the individual taking the drugs has a personal, emotional stake in the process.

This isn't the authors' first look at the impact of serotonin signaling. Two years ago, they reported that lowering the levels of the neurotransmitter serotonin changed subjects' behavior in the Ultimatum Game, which measures their willingness to punish peers for acting unfairly. With less serotonin around, individuals became more sensitive to offers that they perceived as unfair, and rejected them at elevated rates.

In the new work, the Ultimatum Game was used once again: a subject and a partner are given a lump of cash, and the partner gets to divide it in any proportion they wish. The subject then gets the chance to reject the offer, in which case nobody gets any money. Rejections start increasing once the split reaches about 70/30, and continue to climb as the split becomes more lopsided. Even though rejecting the money means they get nothing, people seem willing to do so in order to punish partners for making an offer they perceive as unfair.
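The game's payoff rules are simple enough to sketch in a few lines. The snippet below is purely illustrative (not from the paper), and the `reject_threshold` cutoff is a hypothetical stand-in for the ~70/30 point at which rejections reportedly start to climb:

```python
# Minimal sketch of the Ultimatum Game logic described above: a proposer
# splits a pot, the responder may reject, and rejection leaves both
# players with nothing.

def ultimatum_round(pot, proposer_share, reject_threshold=0.70):
    """Return (proposer_payoff, responder_payoff) for one round.

    `reject_threshold` is a hypothetical cutoff: this toy responder
    rejects any offer in which the proposer keeps more than that
    fraction of the pot.
    """
    if proposer_share > reject_threshold:
        return (0, 0)  # rejection: nobody gets any money
    keep = pot * proposer_share
    return (keep, pot - keep)

# A fair 50/50 split is accepted; an 80/20 split is rejected.
print(ultimatum_round(10, 0.5))  # (5.0, 5.0)
print(ultimatum_round(10, 0.8))  # (0, 0)
```

Real subjects, of course, reject probabilistically rather than at a hard threshold; the study's point is that SSRIs shift that willingness to reject.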

At least, they're willing to punish if their serotonin levels haven't been messed with. After a dose of an SSRI, serotonin levels rose while rejections of unfair offers declined. This wasn't because the offers were no longer perceived as unfair; a follow-up survey showed no change in how they were rated. People simply lost their taste for inflicting a punishment.

To figure out why, the authors gave the SSRIs to another set of subjects and hit them with a set of traditional moral dilemmas, like deciding whether to sacrifice a single person to save five. But they posed them in two very different ways: indirect, in which the person could make the choice by flipping a switch, or direct, in which they actually had to physically push someone in order to make their decision. Obviously, the latter situation is more emotionally challenging, and this is where increased serotonin had a significant effect: people declined to sacrifice someone when it required their personal involvement, even if doing so would save more people.

Between the two experiments, the authors conclude that elevated serotonin makes people less willing to make a personal, emotional commitment to a moral decision, such as punishing unfairness or pushing someone under a train. Detach the person a bit by removing their personal involvement—have them throw a switch instead of giving a push—and the impact of serotonin goes away.

To confirm this, the subjects were given a survey that measured their level of empathy; these results were then compared to performance on the tests. Those with the highest levels of empathy were more likely to be affected by serotonin, reinforcing the role of emotional investment in the effect.

Beyond using a placebo as a control, the authors also gave a set of subjects a drug that blocked a neurotransmitter associated with executive decision making. That had no statistically significant effects in any experiment, so the authors conclude that high-level decision making doesn't play a significant role here. That means, in their view, emotions aren't competing with decision making when people face these moral dilemmas.

The work seems solid, but it may be arriving under a bit of a dark cloud. One of the authors is Marc Hauser, who has recently been found to have committed scientific misconduct. According to the list of author contributions on the paper, though, Hauser wasn't involved in gathering any of the data and simply helped write it up, which may help ease its acceptance.

PNAS, 2010. DOI: 10.1073/pnas.1009396107
