Recently, I came across this clip online, and it was just too interesting not to share (which just goes to show that not all my time spent mindlessly browsing the internet is a waste; the full talk can be found here). In this experiment (Brosnan & de Waal, 2003), five female capuchin monkeys are trained to do a simple task: return rock tokens to the experimenter in exchange for a food reward. The monkey gives the experimenter a token, and it gets a slice of cucumber. Typically, the monkey will then eat the snack without a fuss, which isn’t terribly surprising. However, if the monkey getting the cucumber then witnesses another monkey doing the same task but getting rewarded with a much more desirable food item – in this case, a grape – the initial monkey not only gets visibly agitated when it receives a cucumber again, but actually starts to refuse the cucumber slices. While the cucumber had been an acceptable reward for the task mere moments ago, it now appears to be insulting.
For the sake of comparison, when both monkeys were exchanging a token for a slice of cucumber, they would fail to return the token or reject the food only about 5% of the time. When the other monkey was getting a grape for the same exchange, the female getting the cucumber would now reject the food or not return the token roughly 45% of the time. Things got even worse if the monkey getting the grape was getting it for free, without having to return a token at all; in that case, the rate of rejections and non-returns jumped to almost 80%. It should be noted, however, that the monkey that got the grapes would happily munch away on them, rather than reject them out of concern over their unfair, but beneficial, outcome (van Wolkenten et al., 2007).
It would seem monkeys have a sense of what they ought to be getting from the exchange, and that sense is based, in part, on comparing their payoff to what others are getting for similar tasks. A monkey might get the sense that it deserves more, that the payoff isn’t fair, if it isn’t getting as much as another. Further, when a monkey’s sense of what it ought to be getting is violated, it seems to behave as if it were being harmed. Of course, giving a monkey a cucumber isn’t a harm; it’s a previously acceptable benefit that seems to get re-conceptualized as a harm once that benefit isn’t as large as someone else’s.
Humans show a similar pattern of results in ultimatum games: when confronted with an offer they don’t find fair enough, people will often reject it, ensuring that they (and their partner) leave the lab with nothing. Of course, if you just gave these subjects that same amount of money outside the context of the ultimatum game, rejecting it outright would be rare behavior, no doubt because free money isn’t normally conceptualized as harmful. As I wrote in part 2 of this series, dictator and ultimatum games seem to evoke different responses to the same offer, as judged by the messages receivers write to selfish proposers: in ultimatum games, receivers write many more negative messages to selfish proposers than they do in dictator games. While those contexts might not be exactly analogous to the monkeys’, we can imagine situations that are, such as finding out that you’re being paid less at work for doing the same job as a co-worker. Even if you were previously happy with your salary, your satisfaction with it would likely now drop, and public complaints would begin. However, if you found out you were actually making more than your co-workers, it would probably be rare for you to march into your boss’s office and demand to be paid less to make things fair. I imagine it would be equally rare for you to make that new-found knowledge as public as possible.
So how can we explain these results? Simply stating that some monkeys and people have a “concern for fairness” is clearly not enough. Not only would that not really be an explanation, but it would miss a key finding: concerns for fairness only seem to appear in the context of being disadvantaged. The monkeys receiving a grape did not throw it back at the experimenter because their partner was not receiving one as well. To drive this point home, consider some research by Pillutla & Murnighan (1995). In one experiment, subjects played an ultimatum game, dividing sums of money that ranged from a low of ten dollars to a high of seventy. However, the receiver of these offers either had complete information (knew how much money was being divided) or incomplete information (did not know the size of the pot). This gave the subjects making the offers a potential strategic advantage. Did they use it? Out of 33 offerers, 31 exploited that information asymmetry, so, “yes”. Much like the monkeys, people only seemed to have a concern for fairness when it benefited them.
van Wolkenten et al. (2007) propose the following explanation: these ostensible fairness concerns can be better conceptualized as a solution to cooperative effort problems – contexts in which organisms cooperate with each other in the service of achieving some goal without knowing ahead of time how the payoffs will be divided. One model for this might be a variant of the Stag Hunt. In this example, two hunters each have to decide between hunting rabbits or a stag. While each hunter can successfully hunt rabbits alone, the same cannot be said of the stag, and rabbits offer a smaller payoff (say, 1 per hunter). If the hunters work together, they can bring down a stag, which offers the larger payoff (say, 6). However, in the context of cooperative effort problems, the two hunters won’t know how the stag will be divided until after the kill. If one hunter monopolizes most of the payoff (taking 4 of the 6), the hunter who got the smaller share can exert pressure on their partner by refusing to cooperate again.
Now, while the hunter who got the smaller share might still be better off cooperating and hunting stag rather than rabbits, by refusing to hunt stag until he gets a more equitable division he can impose an even greater loss on his partner. The hunter who got the smaller share only loses 1 unit of payoff by not cooperating (earning 1 from rabbits instead of his 2-unit share of the stag), while his partner loses 3 because of that refusal (earning 1 instead of 4). This asymmetry in relative payoffs and subsequent losses can be leveraged in social bargaining to net a better long-term payoff by suffering a short-term cost. As I have written about previously, depression might have a similar function.
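The bargaining arithmetic above can be sketched in a few lines of code. This is a minimal illustration using the payoff values from the example (1 per hunter for rabbits, 6 for the stag, a 4/2 split); the function name and structure are my own, not part of the original studies.

```python
# Payoff values taken from the Stag Hunt example in the text.
RABBIT = 1  # payoff each hunter can secure alone by hunting rabbits
STAG = 6    # joint payoff from cooperatively hunting a stag

def refusal_losses(my_share):
    """Given my share of the stag from the last hunt, return the loss I
    suffer and the loss I impose on my partner if I refuse to hunt stag
    again and we both fall back on hunting rabbits alone."""
    partner_share = STAG - my_share
    my_loss = my_share - RABBIT          # what I give up by walking away
    partner_loss = partner_share - RABBIT  # what my refusal costs my partner
    return my_loss, partner_loss

# With the 4/2 split from the text, the short-changed hunter (share = 2)
# gives up only 1 unit by refusing, while imposing a loss of 3 on the
# partner who took the larger share.
print(refusal_losses(2))  # (1, 3)
```

The asymmetry is the whole point: because refusal is cheap for the short-changed hunter and expensive for the greedy one, the threat of non-cooperation becomes credible bargaining leverage.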
That would imply that while it’s in the interests of an individual to maximize their own payoff, that same individual has an interest in maintaining the cooperation of others in the service of that goal. This requires balancing two competing sets of demands: appearing fair, so as to maintain cooperation, while simultaneously being as inequitable as possible. The importance of the “as possible” part of that last sentence really cannot be overstated. The greater the inequity in resource allocation, the greater the chance you might lose support in future ventures by triggering the fairness concerns of others. The costs of being too selfish might not end at ostracism either; selfishness might also invite violent retribution by others who feel cheated. Like all matters relating to selection, there are tradeoffs to be made; risks and rewards to be balanced. In some cases, a more proximately cooperative strategy can end up being more ultimately selfish.
Thankfully (or not so thankfully, depending on your place in a given situation), this feeling of what one ought to get from the payoff – what one deserves – normally depends on many fuzzy variables. For instance, not all cooperators necessarily put in the same amount of effort towards achieving a joint goal. If one cooperator takes on greater risks, or another attempts to free-ride by not investing as much energy as they could have, intuitions about how much an individual deserves tend to change accordingly. This leaves open multiple avenues for attempts to convince others that you deserve more, or others deserve less, as well as counter-adaptations to defend against such claims. It’s a delicate balancing act that we all engage in, playing multiple roles over time and circumstance, and one that’s guaranteed to generate more than a fair share of hypocrisy.
References:
Brosnan, S., & de Waal, F. (2003). Monkeys reject unequal pay. Nature, 425(6955), 297-299. DOI: 10.1038/nature01963
Pillutla, M., & Murnighan, J. (1995). Being fair or appearing fair: Strategic behavior in ultimatum bargaining. Academy of Management Journal, 38(5), 1408-1426. DOI: 10.2307/256863
van Wolkenten, M., Brosnan, S. F., & de Waal, F. B. (2007). Inequity responses of monkeys modified by effort. Proceedings of the National Academy of Sciences, 104(47), 18854-18859. PMID: 18000045