
The Evolution of Cooperation

  • Author: Robert Axelrod
  • Language: English
  • Genre: Philosophy, sociology
  • Publisher: Basic Books
  • Publication date: December 5, 2006
  • Publication place: United States
  • Media type: Hardback, paperback, audiobook
  • Pages: 241
  • ISBN: 0-465-00564-0
  • OCLC: 76963800
  • Dewey Decimal: 302 14
  • LC Class: HM131.A89 1984

This article is about the book The Evolution of Cooperation, which expands on the ideas in the article of the same name.

The Evolution of Cooperation is an academic work for non-specialist readers by Robert Axelrod, a professor of political science and public policy at the University of Michigan and an expert on game theory, artificial intelligence, mathematical modeling, and complexity theory.[1] The book attempts to explain how cooperation can be advantageous for essentially selfish actors, drawing on a tournament in which professional and amateur game theorists submitted strategies to an iterated prisoner’s dilemma game.[2] Axelrod then explores the significance of his findings, applying them to a wide variety of situations including biology, World War I trench warfare, business relationships, and nuclear proliferation.

Widely regarded as an important work for explaining how cooperative behavior can arise between selfish individual actors, The Evolution of Cooperation is an influential and frequently cited text across academic fields. Its ideas were popularized in the bestseller The Selfish Gene.

Background

Game Theory and the Prisoner’s Dilemma

Game theory is "the study of mathematical models of conflict and cooperation between intelligent rational decision-makers".[3] It attempts to find optimal decisions or strategies in scenarios where the outcome depends on the choices of others. It first addressed zero-sum games but is now applied to a much wider variety of scenarios.

The Prisoner’s Dilemma is the main game-theoretic scenario used in The Evolution of Cooperation.[4] Two members of a criminal gang are arrested and kept in isolation, with no means of communication. Each is offered the same bargain: betray the other prisoner by testifying, or cooperate with the other prisoner by remaining silent. If they betray each other, they both get 2 years in prison. If one betrays and the other does not, the betrayer goes free and the prisoner who stayed silent gets 3 years. If both remain silent, they each get 1 year. The best outcome for the prisoners as a group is mutual cooperation, but the best choice for each individual prisoner is betrayal: either your partner cooperates and you go free, or your partner betrays and you get 2 years instead of 3. Since both prisoners reason this way, the result is a double betrayal, a worse outcome for both than mutual cooperation. An iterated version of this scenario, where two “prisoners” meet many times and can react to each other’s previous decisions, forms the basis of Axelrod’s tournament.[5]
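
A minimal sketch of this one-shot dilemma in Python (the move names and table are illustrative, not from the book) makes the dominance argument concrete:

```python
# Sentences in years (lower is better), indexed by (A's move, B's move).
SENTENCES = {
    ("silent", "silent"): (1, 1),   # mutual cooperation
    ("silent", "betray"): (3, 0),   # A is suckered, B goes free
    ("betray", "silent"): (0, 3),   # A goes free, B is suckered
    ("betray", "betray"): (2, 2),   # mutual defection
}

for a in ("silent", "betray"):
    for b in ("silent", "betray"):
        years_a, years_b = SENTENCES[(a, b)]
        print(f"A {a:>6}, B {b:>6} -> A serves {years_a}, B serves {years_b}")

# Whatever B does, A serves less time by betraying (0 < 1 and 2 < 3),
# so two "rational" prisoners both betray and serve 2 years each,
# worse than the 1 year apiece that mutual silence would have cost.
```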

The Article

Before publishing The Evolution of Cooperation, Axelrod co-wrote a 1981 article of the same name with W. D. Hamilton, published in Science. In this article he presents the results of a Prisoner’s Dilemma tournament in which a variety of strategies were submitted and those with the highest scores “reproduce” over many generations until the population reaches an equilibrium.[6] He applies this model to biology to explain how cooperation could benefit “selfish” genes. He emphasizes strategies that are stable over time and resistant to invasion by other strategies, because only these persist without going extinct. These ideas are republished and further expanded in chapter 5 of the book.

Content

Computer Tournaments

The Evolution of Cooperation centers on the results of two computer-based tournaments. For each tournament, Axelrod solicited strategies from a variety of participants. Each strategy was a program that repeatedly makes a simple binary choice: COOPERATE or DEFECT. Paired strategies make their choices simultaneously. If both COOPERATE, each gets 3 points. If one chooses COOPERATE and the other DEFECT, the cooperator gets 0 points and the defector gets 5. If both DEFECT, each gets 1 point. Strategies are allowed to remember their opponent’s previous actions and change their choices accordingly. Some strategies were highly complex pieces of programming that analyzed the entire history of decisions; others were very simple.
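
A minimal sketch of these scoring rules in Python, assuming the standard payoffs Axelrod used (T=5, R=3, P=1, S=0); the function and variable names here are illustrative, not taken from Axelrod's original programs:

```python
C, D = "COOPERATE", "DEFECT"

PAYOFF = {  # (my move, opponent's move) -> my points for the round
    (C, C): 3,  # reward for mutual cooperation
    (C, D): 0,  # sucker's payoff
    (D, C): 5,  # temptation to defect
    (D, D): 1,  # punishment for mutual defection
}

def play_match(strategy_a, strategy_b, rounds=200):
    """Play one iterated match; each strategy sees only the other's past moves."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)  # choices are made simultaneously:
        move_b = strategy_b(history_a)  # neither sees the other's current move
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b
```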

For the first tournament, game theory experts and other academics from psychology, sociology, and political science provided the strategies.[7] Each pair of strategies played 200 rounds. The winner was a very simple strategy submitted by Anatol Rapoport called "TIT FOR TAT", which cooperates on the first move and subsequently echoes (reciprocates) what the other player did on the previous move.[8] TIT FOR TAT never did better than its partner in any pairing, but it accumulated a higher total score than any other strategy.[9]
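
TIT FOR TAT is short enough to state in two lines. A sketch using the C/D constants and play_match from the scoring example above; the (199, 204) and (600, 600) results follow directly from the payoff table:

```python
def tit_for_tat(opponent_history):
    # Cooperate first, then echo the opponent's previous move.
    return C if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return D

print(play_match(tit_for_tat, always_defect))  # (199, 204)
print(play_match(tit_for_tat, tit_for_tat))    # (600, 600)

# TIT FOR TAT loses its pairing against ALWAYS DEFECT by a few points,
# but unlike ALWAYS DEFECT it earns the full 600 whenever it meets
# another reciprocator -- which is how it tops the overall standings.
```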

Axelrod analyzed his results and presented them to a much larger group of participants for a second tournament.[10] All the participants were aware of the strategies and rankings from the first tournament. Despite the presence of a wider variety of strategies, TIT FOR TAT won the second tournament as well.

In both tournaments, all of the high-scoring strategies were nice, meaning they never chose DEFECT first. Many competitors went to great lengths to gain an advantage over the "nice", and usually simpler, strategies, but to no avail: tricky strategies fighting for a few points generally could not do as well as nice strategies working together. TIT FOR TAT, along with the other high-scoring nice strategies, "won, not by doing better than the other player, but by eliciting cooperation [and] by promoting the mutual interest rather than by exploiting the others' weakness".[11]

The high-scoring strategies were also “retaliatory”, meaning they would respond to an opponent’s DEFECT with at least one DEFECT.[12] Without being retaliatory, a strategy runs the risk of being suckered by a “mean” strategy like ALWAYS DEFECT. Finally, they were all “forgiving”: they would respond to an opponent’s COOPERATE with a COOPERATE, even after a history of defection. This allowed them to benefit from encounters with strategies that mostly COOPERATE but occasionally DEFECT.[13]

Axelrod also ran a variety of scenarios in which he changed the parameters of the competition but not the strategies. The most important of these was a generational tournament, where each strategy reproduced in proportion to its score, changing the composition of the population over time.[14] He also ran a tournament where each strategy was assigned a geographical position and competed against its neighbors, expanding or contracting over generations based on its score.[15] These scenarios were used to create models for biology and political science.
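
A minimal sketch of the generational idea, using a discrete replicator dynamic over pairwise 200-round scores. The score table follows from the payoffs above; the setup illustrates the mechanism and is not a reconstruction of Axelrod's program:

```python
SCORE = {  # SCORE[a][b] = a's total against b over 200 rounds
    "TFT":  {"TFT": 600, "ALLD": 199, "ALLC": 600},
    "ALLD": {"TFT": 204, "ALLD": 200, "ALLC": 1000},
    "ALLC": {"TFT": 600, "ALLD": 0,   "ALLC": 600},
}

shares = {name: 1 / 3 for name in SCORE}
for generation in range(60):
    # Each strategy's fitness is its average score against the current mix;
    # its share of the next generation grows in proportion to that fitness.
    fitness = {a: sum(shares[b] * SCORE[a][b] for b in SCORE) for a in SCORE}
    mean = sum(shares[a] * fitness[a] for a in SCORE)
    shares = {a: shares[a] * fitness[a] / mean for a in SCORE}

print({name: round(share, 3) for name, share in shares.items()})
# ALWAYS DEFECT briefly profits from the unconditional cooperators, but
# starves once its prey thins out; TIT FOR TAT ends up with most of the
# population, and pure defection all but disappears.
```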

Principles of Cooperation

Axelrod developed four basic principles that correlated with success in the tournament and may also apply to a wide variety of real-life scenarios.

  1. Don’t be envious: Game theory was originally designed to analyze zero-sum games, where one participant’s success meant another’s loss. Despite the tournament structure, the iterated prisoner’s dilemma game was not zero-sum. TIT FOR TAT never “won” a single encounter with another strategy (the small round robin after this list makes this concrete). Axelrod argues that “most of life is not zero-sum” and “asking how well you are doing compared to how well the other player is doing is not a good standard unless your goal is to destroy the other player.”[16]
  2. Don’t be the first to defect: Some strategies tried very sophisticated methods of exploiting other strategies for a few extra points by defecting. This would often begin a chain reaction of DEFECT choices that lowered the scores of both strategies. One betrayal can destroy a once-profitable relationship. Axelrod makes an important caveat on this point: if the likelihood of another meeting is low, it can pay to DEFECT.[17]
  3. Reciprocate both cooperation and defection: TIT FOR TAT was the ultimate reciprocator. By copying an opponent’s move it was both forgiving and retaliatory. Axelrod suggests that, depending on the environment, some strategies can outperform TIT FOR TAT by being more forgiving, for example by ignoring a defection 10% of the time or by allowing two defections before retaliating.[18]
  4. Don't be too clever: The sophisticated strategies did no better than the simple ones in the tournament. “A common problem with these rules is that they used complex methods of making inferences about the other player – and these inferences were wrong”.[19] They appeared random to other strategies and failed to account for the chains of defection they provoked. Strategies that were easy to read and followed predictable patterns were more likely to elicit cooperation from other strategies.
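
To make principle 1 concrete, here is a small round robin reusing play_match and the strategies above, plus an illustrative "grudger": TIT FOR TAT never outscores a partner head-to-head, yet it ties for first on total points.

```python
def grudger(opponent_history):
    # Cooperates until the opponent's first defection, then never forgives.
    return D if D in opponent_history else C

field = {"TIT FOR TAT": tit_for_tat, "ALWAYS DEFECT": always_defect,
         "GRUDGER": grudger}
totals = {name: 0 for name in field}
for a in field:
    for b in field:
        if a < b:  # play each distinct pairing once
            score_a, score_b = play_match(field[a], field[b])
            totals[a] += score_a
            totals[b] += score_b
            print(f"{a} vs {b}: {score_a} - {score_b}")
print(totals)
# ALWAYS DEFECT "beats" both partners (204-199 each time) yet finishes
# last with 408 points; TIT FOR TAT and GRUDGER each total 799.
```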

How to Promote Cooperation

Axelrod uses the Prisoner’s Dilemma game to make three proposals for promoting cooperation in a variety of real-world contexts:

  1. Enlarge the shadow of the future: “No form of cooperation is stable when the future is not important enough relative to the present.”[20] Axelrod argues that there are two ways to enlarge the shadow of the future (a worked example follows this list):
    1. Interactions can be made more frequent, so that a single opportunity for defection will be outweighed by numerous lucrative opportunities in the near future.
    2. Interactions can be made more durable, so that each partner is certain they will meet again.
  2. Change the payoffs: “If you avoid paying your taxes, you must face the possibility of being caught and sent to jail.”[21] If an outside party creates incentives to cooperate, or disincentives to defect, the Prisoner’s Dilemma scenario is invalidated. Returning to the original scenario, Axelrod argues that real-world gangs enforce cooperation through the knowledge that a defector will receive a much worse punishment than a few extra years in prison.
  3. Teach people to care about each other: “An excellent way to promote cooperation in a society is to teach people to care about the welfare of others.”[22] Axelrod argues that socialization can be effective, but he also warns that unconditional kindness may encourage cheaters and ultimately harm society. It is better to teach people to be initially cooperative but to retaliate against those who try to take advantage; quickly reciprocating defections teaches others to cooperate. In short, he argues that the ideal stance to teach is TIT FOR TAT: “Reciprocity is a better foundation for morality than is unconditional cooperation.”[23]
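
In the book's formal treatment, the shadow of the future is a discount parameter w, the chance that two players meet again, and Axelrod shows that TIT FOR TAT can sustain cooperation only when w is large enough relative to the payoffs. A quick check with the tournament values (a sketch of that condition, not a full derivation):

```python
# Payoffs: temptation, reward, punishment, sucker's payoff.
T, R, P, S = 5, 3, 1, 0

# Axelrod's condition for TIT FOR TAT to be collectively stable:
# w must be at least (T-R)/(R-S), to deter alternating exploitation,
# and at least (T-R)/(T-P), to deter permanent defection.
w_min = max((T - R) / (R - S), (T - R) / (T - P))
print(w_min)  # 0.666...: with these payoffs, cooperation holds only if
              # the odds of another meeting are at least 2/3
```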

Applications to Biology

Axelrod spends a chapter reiterating his Science article with the evolutionary biologist W. D. Hamilton. By putting the tournament strategies in a generational scenario, where higher-scoring strategies produce more offspring and lower-scoring strategies go extinct, he found that TIT FOR TAT is an evolutionarily stable strategy.[24] From a small starting population it can overcome other strategies, and it does well among a large number of copies of itself. Once the population is exclusively TIT FOR TAT, no other strategy can “invade” and do better.

There are other evolutionarily stable strategies. If present in large enough numbers, ALWAYS DEFECT will kill off any isolated cooperative strategy.[25] With no one to cooperate with, the cooperative strategy is exploited on the first move, starting a chain of DEFECT moves in which the first defector stays slightly ahead; all scores remain low in this scenario. However, Axelrod found that if TIT FOR TAT invades in a cluster, so that it has a fair chance of meeting a copy of itself and beginning a chain of cooperation, it will quickly supplant ALWAYS DEFECT.[26] This effect is even more pronounced if the cluster is geographically close and TIT FOR TAT can rely on cooperative neighbors.[27]
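
A sketch of the cluster argument, assuming discounted match values with continuation probability w and a cluster whose members meet each other a fraction p of the time (the variable names are illustrative; with w = 0.9 the result, roughly 5%, is in line with the book's cluster example):

```python
T, R, P, S = 5, 3, 1, 0
w = 0.9  # probability that any pair of players meets again

v_tft_tft = R / (1 - w)            # TFT vs its own kind: 3/0.1 = 30
v_tft_alld = S + w * P / (1 - w)   # TFT suckered once, then mutual defection: 9
v_alld_alld = P / (1 - w)          # the natives playing each other: 10

# The cluster invades when p*30 + (1 - p)*9 exceeds the natives' 10:
p_min = (v_alld_alld - v_tft_alld) / (v_tft_tft - v_tft_alld)
print(p_min)  # ~0.048: a cluster whose members meet each other even 5%
              # of the time can overgrow a population of ALWAYS DEFECT
```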

Axelrod concludes that there are three traits a strategy requires to become prevalent in biological systems:

  1. Robustness: The strategy can thrive in a variety of environments composed of other strategies.
  2. Stability: The strategy can resist invasion from other strategies.
  3. Initial Viability: The strategy can gain a foothold even when it starts out in small numbers.

These ideas help resolve a conundrum in biology: cooperation is clearly better for the group, but genes are selected at the individual level.[28] The Evolution of Cooperation argues that cooperation is an advantageous and evolutionarily stable strategy so long as there are other individuals to cooperate with, there is a way to recognize and punish cheaters, and there is a sufficiently high chance that the cooperators will meet again in the future. If cooperation is paired with retaliation, even one cluster, or family, of cooperators can invade a population of non-cooperators and become the dominant strategy.[29]

Applications to Warfare

Axelrod uses events from World War I trench warfare as an example of cooperation between enemy combatants, framing trench warfare as an iterated prisoner’s dilemma. Each day, combatants could choose to shoot to kill or to ignore each other. Since high command would at some point demand a bloody push, the benefits of thinning the enemy’s numbers were great; but a unit that attacked the enemy in earnest would be retaliated against just as harshly. Under these conditions, with units facing each other for weeks on end, each capable of dealing great harm, temporary truces and tacit agreements developed between the combatants. As in the classic prisoner’s dilemma scenario, this cooperation required neither friendship nor lines of communication between the participants.[30]

Applications to Business

The Evolution of Cooperation also frames cooperation between businesses as an iterated prisoner's dilemma. Businesses work together smoothly when they anticipate a long and profitable relationship and each has some power to harm the other's profits. Axelrod uses a failing business as an example of cooperation breaking down: because the business will soon disappear, it loses the ability to punish defectors by ending the profitable relationship. This leads its former partners to defect, suddenly complaining about defective products and refusing to honor contracts.[31]

According to Axelrod, cooperation is not always desirable in society; price fixing is a form of mutually beneficial cooperation between businesses that hurts consumers. He argues that his findings can also be used to discourage cooperation: by disrupting stable relationships, changing payoffs to favor short-term gains, and removing partners' ability to punish each other for defection, we can discourage unwanted collusion.[32]

Applications to Politics

Axelrod briefly applies his ideas to the Cold War and nuclear disarmament. He notes that cooperation is possible when the relationship is durable and both sides have the ability to retaliate. The relationship does not have to be friendly; both sides need only be confident that they will be dealing with each other in the future. Applying the principles of cooperation derived from the tournaments, he argues that it is ideal to be initially friendly and forgiving, but also quick to retaliate against any provocation. This behavior should encourage cooperation without inviting exploitation, allowing peaceful relationships between powers in the absence of a central authority.[33]

Reception

In 1984 Axelrod estimated that there were "hundreds of articles on the Prisoner's Dilemma cited in Psychological Abstracts"[34] and that citations to The Evolution of Cooperation alone were "growing at the rate of over 300 per year".[35] As of 2015, The Evolution of Cooperation had been cited more than 37,000 times across a variety of fields, making it extremely influential in academia.[36] Axelrod’s work was also summarized in the bestseller The Selfish Gene.[37]

Criticism

Boyd and Lorberbaum criticized Axelrod's conclusion that TIT FOR TAT is an evolutionarily stable strategy, arguing that no single strategy is truly evolutionarily stable in a game that depends on the strategies of others.[38] TIT FOR TAT can be invaded by any "nice" strategy, such as TIT FOR TWO TATS, which forgives twice before retaliating. The two strategies are equally successful against each other; the only difference is how they react to other invaders in the future. TIT FOR TAT does better against meaner strategies like ALWAYS DEFECT, but worse against strategies that mostly COOPERATE and always answer DEFECT with DEFECT, yet have a small chance of an unprovoked DEFECT. Boyd and Lorberbaum conclude that no pure strategy can be stable in all circumstances. Bendor and Swistak address this debate by defining two kinds of stability: strong stability, in which all invaders decline in frequency, and weak stability, in which no invading group smaller than the native population can increase in frequency at the native population's expense.[39] They agree that no pure strategy is strongly evolutionarily stable, but point out that a wide variety of strategies can be weakly stable. These strategies vary in how many copies must be present before they can be weakly stable, a trait Bendor and Swistak term robustness.
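
TIT FOR TWO TATS, the more forgiving invader in this argument, is easy to express using the C/D constants and play_match from the tournament sketch above (an illustrative sketch):

```python
def tit_for_two_tats(opponent_history):
    # Defect only after two consecutive defections by the opponent.
    return D if opponent_history[-2:] == [D, D] else C

print(play_match(tit_for_tat, tit_for_two_tats))  # (600, 600)
# Against each other the two strategies simply cooperate, so selection
# cannot tell them apart -- the neutral drift that Boyd and Lorberbaum's
# invasion argument exploits.
```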

Subsequent Work

Axelrod later wrote a subsequent book, The Complexity of Cooperation,[40] which he considers a sequel to The Evolution of Cooperation. Other work on the evolution of cooperation has expanded to cover prosocial behavior generally,[41] other mechanisms for generating cooperation,[42] the iterated Prisoner's Dilemma under different conditions and assumptions,[43] and the use of other games, such as the Public Goods and Ultimatum games, to explore deep-seated notions of fairness and fair play.[44] It has also been used to challenge the rational and self-regarding "economic man" model of economics,[45] and as a basis for replacing Darwinian sexual selection theory with a theory of social selection.[46]

Notes

  1. ^ See Axelrod's personal website for his curriculum vitae: http://www-personal.umich.edu/~axe/
  2. ^ Axelrod 1984, p. 8
  3. ^ Myerson 1997
  4. ^ Axelrod 1984, p. 8
  5. ^ Axelrod 1984, p. 8
  6. ^ Axelrod & Hamilton 1981
  7. ^ Axelrod 1984, p. 31
  8. ^ Axelrod 1984, p. 31
  9. ^ Axelrod 1984, p. 112
  10. ^ Axelrod 1984, p. 41
  11. ^ Axelrod 1984, p. 32
  12. ^ Axelrod 1984, p. 44
  13. ^ Axelrod 1984, p. 42
  14. ^ Axelrod 1984, pp. 49–54
  15. ^ Axelrod 1984, p. 161
  16. ^ Axelrod 1984, pp. 110–111
  17. ^ Axelrod 1984, p. 113
  18. ^ Axelrod 1984, p. 118
  19. ^ Axelrod 1984, p. 110
  20. ^ Axelrod 1984, p. 129
  21. ^ Axelrod 1984, p. 133
  22. ^ Axelrod 1984, p. 134
  23. ^ Axelrod 1984, p. 136
  24. ^ Axelrod 1984, p. 96
  25. ^ Axelrod 1984, p. 96
  26. ^ Axelrod 1984, p. 96
  27. ^ Axelrod 1984, p. 160
  28. ^ Dawkins 1977
  29. ^ Axelrod 1984, pp. 94–101
  30. ^ Axelrod 1984, pp. 73–87
  31. ^ Axelrod 1984, pp. 179–183
  32. ^ Axelrod 1984, pp. 18–19
  33. ^ Axelrod 1984, pp. 184–187
  34. ^ Axelrod 1984, p. 28
  35. ^ Axelrod 1984, p. 3
  36. ^ See Google Scholar for current citation statistics http://scholar.google.com/citations?user=c_aA0BwAAAAJ&hl=en
  37. ^ Dawkins 1977
  38. ^ Boyd & Lorberbaum 1987
  39. ^ Bendor & Swistak 1995
  40. ^ Axelrod 1997
  41. ^ Bowles 2006
  42. ^ Nowak 2006
  43. ^ Axelrod & Dion 1988
  44. ^ Nowak 2000
  45. ^ Camerer 2006
  46. ^ Roughgarden 2006

References

  • Axelrod, Robert (1984), The Evolution of Cooperation, Basic Books, ISBN 0-465-02122-0
  • Axelrod, Robert (1997), The Complexity of Cooperation: Agent-Based Models of Competition and Collaboration, Princeton University Press, ISBN 0-691-01567-8
  • Axelrod, Robert; Hamilton, William D. (27 March 1981), "The Evolution of Cooperation", Science, 211: 1390–1396, doi:10.1126/science.7466396
  • Bendor, Jonathan; Swistak, Piotr (1995), "Types of Evolutionary Stability and the Problem of Cooperation", Proceedings of the National Academy of Sciences, 92: 3596–3600, doi:10.1073/pnas.92.8.3596
  • Bowles, Samuel (8 December 2006), "Group Competition, Reproductive Leveling, and the Evolution of Human Altruism", Science, 314: 1569–1572, doi:10.1126/science.1134829
  • Boyd, Robert; Lorberbaum, Jeffrey (7 May 1987), "No Pure Strategy is Evolutionarily Stable in the Repeated Prisoner's Dilemma Game", Nature, 327: 58–59, doi:10.1038/327058a0
  • Camerer, Colin; Fehr, Ernst (6 January 2006), "When Does "Economic Man" Dominate Social Behavior?", Science, 311: 47–52, doi:10.1126/science.1110600
  • Dawkins, Richard (1977), The Selfish Gene, Oxford University Press, ISBN 0-19-857519-X
  • Myerson, Roger (1997), Game Theory: Analysis of Conflict, Harvard University Press, ISBN 978-0674341166
  • Nowak, Martin (8 December 2006), "Five Rules for the Evolution of Cooperation", Science, 314: 1560–1563, doi:10.1126/science.1133755
  • Roughgarden, Joan; Oishi, Meeko; Akcay, Erol (17 February 2006), "Reproductive Social Behavior: Cooperative Games to Replace Sexual Selection", Science, 311: 965–969, doi:10.1126/science.1110105
  • Williams, George (1996), Adaptation and Natural Selection: A Critique of Some Current Evolutionary Thought, Princeton University Press, ISBN 0-691-02615-7