Question: Consider a simultaneous game in which both players choose between the actions “Cooperate”, denoted by C, and “Defect”, denoted by D.
A: Suppose that the payoffs in the game are as follows: If both players play C, each gets a payoff of 1; if both play D, both players get 0; and if one player plays C and the other plays D, the cooperating player gets α while the defecting player gets β.
(a) Illustrate the payoff matrix for this game.
(b) What restrictions on α and β would you have to impose in order for this game to be a Prisoners’ Dilemma? Assume from now on that these restrictions are in fact met.
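Parts (a)–(b) can be checked numerically. The Python sketch below writes α and β for the two unspecified payoffs and tests the standard Prisoners’ Dilemma requirements: D strictly dominates C for each player, and mutual cooperation still beats mutual defection. The function names and the example values α = −1, β = 2 are illustrative assumptions, not part of the problem.

```python
# Payoff matrix entries, with my payoff listed first:
# (C,C) -> 1, (D,D) -> 0, (C,D) -> alpha for the cooperator, (D,C) -> beta for the defector.
def payoff(me, opp, alpha, beta):
    """My payoff when I play `me` and the opponent plays `opp`."""
    return {("C", "C"): 1, ("C", "D"): alpha,
            ("D", "C"): beta, ("D", "D"): 0}[(me, opp)]

def is_prisoners_dilemma(alpha, beta):
    """True if D strictly dominates C (the game is symmetric, so checking one
    player suffices) and mutual cooperation beats mutual defection."""
    d_dominates = (payoff("D", "C", alpha, beta) > payoff("C", "C", alpha, beta) and
                   payoff("D", "D", alpha, beta) > payoff("C", "D", alpha, beta))
    cc_beats_dd = payoff("C", "C", alpha, beta) > payoff("D", "D", alpha, beta)
    return d_dominates and cc_beats_dd

print(is_prisoners_dilemma(alpha=-1, beta=2))   # example values that do satisfy the restrictions
print(is_prisoners_dilemma(alpha=0.5, beta=2))  # here C is a best response to D, so not a PD
```

Trying candidate (α, β) pairs this way makes the required restrictions easy to conjecture before proving them.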
B: Now consider a repeated version of this game in which players 1 and 2 meet 2 times. Suppose you were player 1 in this game, and suppose that you knew that player 2 was a “Tit-for-Tat” player — i.e. a player that does not behave strategically but rather is simply programmed to play the Tit-for-Tat strategy.
(a) Assuming you do not discount the future, would you ever cooperate with this player?
(b) Suppose you discount a dollar in period 2 by δ, where 0 < δ < 1. Under what condition will you cooperate in this game?
(c) Suppose instead that the game was repeated 3 rather than 2 times. Would you ever cooperate with this player (assuming again that you don’t discount the future)?
(d) In the repeated game with 3 encounters, what is the intuitive reason why you might play D in the first stage?
(e) If player 2 is strategic, would he ever play the “Tit-for-Tat” strategy in either of the two repeated games?
(f) Suppose that each time the two players meet, they know they will meet again with probability γ > 0. Explain intuitively why “Tit-for-Tat” can be an equilibrium strategy for both players if γ is relatively large (i.e. close to 1) but not if it is relatively small (i.e. close to 0).
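Parts B(a)–(c) can be explored by brute force: against a programmed Tit-for-Tat opponent (who opens with C and thereafter copies your previous move), player 1’s discounted payoff depends only on her own action sequence, so all 2^T sequences can be enumerated. In this sketch, α and β stand for the two unspecified one-shot payoffs (the example uses α = −1, β = 2, which satisfy the Prisoners’ Dilemma restrictions), δ is the discount factor from part (b), and all function names are illustrative.

```python
from itertools import product

def payoff(me, opp, alpha, beta):
    """One-shot payoffs: (C,C) -> 1, (D,D) -> 0, cooperator gets alpha, defector beta."""
    return {("C", "C"): 1, ("C", "D"): alpha,
            ("D", "C"): beta, ("D", "D"): 0}[(me, opp)]

def value_vs_tit_for_tat(my_moves, alpha, beta, delta=1.0):
    """Discounted payoff of playing `my_moves` against Tit-for-Tat, which plays
    C in period 1 and thereafter repeats my previous move."""
    total, opp = 0.0, "C"
    for t, me in enumerate(my_moves):
        total += delta ** t * payoff(me, opp, alpha, beta)
        opp = me  # Tit-for-Tat copies my current move next period
    return total

def best_response(periods, alpha, beta, delta=1.0):
    """Enumerate all 2^T action sequences; return the best one and its value."""
    return max(((seq, value_vs_tit_for_tat(seq, alpha, beta, delta))
                for seq in product("CD", repeat=periods)),
               key=lambda pair: pair[1])

# With alpha = -1, beta = 2 and no discounting (delta = 1):
print(best_response(2, alpha=-1, beta=2))
print(best_response(3, alpha=-1, beta=2))
```

Under these example payoffs, the enumeration shows when it pays to cooperate early and defect only in the final encounter, and re-running it with smaller δ illustrates the cooperation condition part (b) asks for.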