# The Art of Strategy

## Metadata

* Author: Avinash K. Dixit and Barry J. J. Nalebuff
* ASIN: B001FA0NOM
* Reference: https://www.amazon.com/dp/B001FA0NOM
* [Kindle link](kindle://book?action=open&asin=B001FA0NOM)

## Highlights

A game is a situation of strategic interdependence: the outcome of your choices (strategies) depends upon the choices of one or more other persons acting purposely. — location: [2182](kindle://book?action=open&asin=B001FA0NOM&location=2182) ^ref-49907 --- The decision makers involved in a game are called players, and their choices are called moves. — location: [2183](kindle://book?action=open&asin=B001FA0NOM&location=2183) ^ref-312 --- The interests of the players in a game may be in strict conflict; one person’s gain is always another’s loss. Such games are called zero-sum. — location: [2184](kindle://book?action=open&asin=B001FA0NOM&location=2184) ^ref-31673 --- The moves in a game may be sequential or simultaneous. — location: [2187](kindle://book?action=open&asin=B001FA0NOM&location=2187) ^ref-23537 --- In a game of sequential moves, there is a linear chain of thinking: If I do this, my rival can do that, and in turn I can respond in the following way. Such a game is studied by drawing a game tree. — location: [2188](kindle://book?action=open&asin=B001FA0NOM&location=2188) ^ref-59293 --- Rule 1: Look forward and reason backward. — location: [2190](kindle://book?action=open&asin=B001FA0NOM&location=2190) ^ref-64451 --- In a game with simultaneous moves, there is a logical circle of reasoning: I think that he thinks that I think that…and so on. — location: [2191](kindle://book?action=open&asin=B001FA0NOM&location=2191) ^ref-62266 --- To tackle such a game, construct a table that shows the outcomes corresponding to all conceivable combinations of choices. — location: [2192](kindle://book?action=open&asin=B001FA0NOM&location=2192) ^ref-63100 --- Begin by seeing if either side has a dominant strategy—one that outperforms all of that side’s other strategies, irrespective of the rival’s choice. — location: [2194](kindle://book?action=open&asin=B001FA0NOM&location=2194) ^ref-13328 --- Rule 2: If you have a dominant strategy, use it. — location: [2195](kindle://book?action=open&asin=B001FA0NOM&location=2195) ^ref-54998 --- If you don’t have a dominant strategy, but your rival does, then count on his using it, and choose your best response accordingly. — location: [2196](kindle://book?action=open&asin=B001FA0NOM&location=2196) ^ref-8543 --- Next, if neither side has a dominant strategy, see if either has a dominated strategy—one that is uniformly worse for the side playing it than all the rest of its strategies. — location: [2197](kindle://book?action=open&asin=B001FA0NOM&location=2197) ^ref-5760 --- Rule 3: Eliminate dominated strategies from consideration. — location: [2199](kindle://book?action=open&asin=B001FA0NOM&location=2199) ^ref-38646 --- If during the process any dominant strategies emerge in the smaller games, they should be chosen. If this procedure ends in a unique solution, you have found the prescriptions of action for the players and the outcome of the game.
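Rules 2 through 4 can be checked mechanically on a small payoff table. Here is a minimal sketch of iterated elimination of strictly dominated strategies; the payoff numbers and strategy names (Top, Middle, Bottom, Left, Center, Right) are invented purely for illustration, not taken from the book. When the procedure ends with a single surviving cell, that cell is the unique prescription the highlight describes, and it is also an equilibrium in the sense of Rule 4.

```python
# Minimal sketch: iterated elimination of strictly dominated strategies.
# payoff_row[i][j] and payoff_col[i][j] are the two players' payoffs when the
# row player picks strategy i and the column player picks strategy j.
# All numbers below are invented purely for illustration.

payoff_row = [
    [3, 1, 2],   # Top
    [2, 0, 1],   # Middle (strictly dominated by Top)
    [4, 2, 0],   # Bottom
]
payoff_col = [
    [2, 3, 1],
    [1, 2, 0],
    [0, 4, 2],
]
row_names = ["Top", "Middle", "Bottom"]
col_names = ["Left", "Center", "Right"]


def find_dominated(own, others, payoff_of):
    """Return a strategy index in `own` that some other strategy in `own`
    strictly beats against every surviving choice in `others`, else None."""
    for s in own:
        for t in own:
            if t != s and all(payoff_of(t, j) > payoff_of(s, j) for j in others):
                return s
    return None


rows, cols = list(range(3)), list(range(3))
while True:
    r = find_dominated(rows, cols, lambda r, c: payoff_row[r][c])
    if r is not None:
        print("Row player drops", row_names[r])
        rows.remove(r)
        continue
    c = find_dominated(cols, rows, lambda c, r: payoff_col[r][c])
    if c is not None:
        print("Column player drops", col_names[c])
        cols.remove(c)
        continue
    break

print("Survivors:", [row_names[i] for i in rows], "vs", [col_names[j] for j in cols])
```

For these invented payoffs the survivor is (Bottom, Center), and it is easy to verify that each of those strategies is the best reply to the other.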
— location: [2199](kindle://book?action=open&asin=B001FA0NOM&location=2199) ^ref-20395 --- Finally, if there are neither dominant nor dominated strategies, or after the game has been simplified as far as possible using the second step, — location: [2202](kindle://book?action=open&asin=B001FA0NOM&location=2202) ^ref-25305 --- Rule 4: Look for an equilibrium, a pair of strategies in which each player’s action is the best response to the other’s. — location: [2203](kindle://book?action=open&asin=B001FA0NOM&location=2203) ^ref-57024 --- If there is a unique equilibrium of this kind, there are good arguments why all players should choose it. — location: [2204](kindle://book?action=open&asin=B001FA0NOM&location=2204) ^ref-20643 --- If there are many such equilibria, one needs a commonly understood rule or convention for choosing one over the others. If there is no such equilibrium, that usually means that any systematic behavior can be exploited by one’s rivals, which indicates the need for mixing one’s plays, — location: [2204](kindle://book?action=open&asin=B001FA0NOM&location=2204) ^ref-43521 --- There is no end to this circle of logic.* — location: [2229](kindle://book?action=open&asin=B001FA0NOM&location=2229) ^ref-9054 --- When the two choose opposite sides, the kicker’s success rate is almost the same whether the side is natural or not; the only reason for failure is a shot that goes too wide or too high. Within the pair of outcomes when the two choose the same side, the kicker’s payoff is higher when he chooses his natural side than when he chooses his non-natural side. All of this is quite intuitive. — location: [2266](kindle://book?action=open&asin=B001FA0NOM&location=2266) ^ref-24156 --- we will refer to the strategies originally specified—Left and Right for each player—as the pure strategies. — location: [2277](kindle://book?action=open&asin=B001FA0NOM&location=2277) ^ref-22143 --- Many people’s raw intuition about games, derived from their experience of sports just like this one, is that each game must have a winner and a loser. However, in the general world of games of strategy, such games of pure conflict are relatively rare. — location: [2280](kindle://book?action=open&asin=B001FA0NOM&location=2280) ^ref-24166 --- Games in economics, where the players engage in voluntary trade for mutual benefit, can have outcomes where everyone wins. Prisoners’ dilemmas illustrate situations where everyone can lose. And bargaining and chicken games can have lopsided outcomes in which one side wins at the expense of the other. So most games involve a mixture of conflict and common interest. — location: [2282](kindle://book?action=open&asin=B001FA0NOM&location=2282) ^ref-6074 --- constant-sum, as in the present case, where the two players’ payoffs always sum to 100. — location: [2287](kindle://book?action=open&asin=B001FA0NOM&location=2287) ^ref-1246 --- Can you do better? Suppose you choose Left or Right at random in proportions of 50:50. For example, as you stand ready to run up and kick, you toss a coin in the palm of your hand out of the goalie’s sight and choose Left if the coin shows tails and Right if it shows heads. If the goalie chooses his Left, your mixture will succeed 1/2 × 58 + 1/2 × 93 = 75.5 percent of the time; if the goalie chooses his Right, your mixture will succeed 1/2 × 95 + 1/2 × 70 = 82.5 percent of the time. If the goalie believes you are making your choice according to such a mixture, he will choose his Left to hold your success rate down to 75.5 percent. 
But that is still better than the 70 you would have achieved by using the better of your two pure strategies. — location: [2295](kindle://book?action=open&asin=B001FA0NOM&location=2295) ^ref-14715 --- An easy way to check whether randomness is needed is to ask whether there is any harm in letting the other player find out your actual choice before he responds. When this would be disadvantageous to you, there is advantage in randomness that keeps the other guessing. — location: [2301](kindle://book?action=open&asin=B001FA0NOM&location=2301) ^ref-3296 --- Is 50:50 the best mixture for you? No. Try a mixture where you choose your Left 40 percent of the time and your Right 60 percent of the time. — location: [2303](kindle://book?action=open&asin=B001FA0NOM&location=2303) ^ref-19777 --- Now the success rate of your mixture against the goalie’s Left is 0.4 × 58 + 0.6 × 93 = 79, and against the goalie’s Right it is 0.4 × 95 + 0.6 × 70 = 80. The goalie can hold you down to 79 by choosing his Left, but that is better than the 75.5 percent you could have achieved with a 50:50 mix. — location: [2306](kindle://book?action=open&asin=B001FA0NOM&location=2306) ^ref-14736 --- Observe how the successively better mixture proportions for the kicker are narrowing the difference between the success rates against the goalie’s Left and Right choices: from the 93 to 70 difference for the better of the kicker’s two pure strategies, to the 82.5 to 75.5 difference for the 50:50 mix, to the 80 to 79 difference for the 40:60 mix. — location: [2308](kindle://book?action=open&asin=B001FA0NOM&location=2308) ^ref-50821 --- It should be intuitively clear that your best mixture proportion achieves the same rate of success whether the goalie chooses his Left or his Right. — location: [2310](kindle://book?action=open&asin=B001FA0NOM&location=2310) ^ref-15989 --- That also fits with the intuition that mixing moves is good because it prevents the other player from exploiting any systematic choice or pattern of choices. — location: [2311](kindle://book?action=open&asin=B001FA0NOM&location=2311) ^ref-9737 --- minimax theorem, — location: [2323](kindle://book?action=open&asin=B001FA0NOM&location=2323) ^ref-39737 --- The theorem states that in zero-sum games in which the players’ interests are strictly opposed (one’s gain is the other’s loss), one player should attempt to minimize his opponent’s maximum payoff while his opponent attempts to maximize his own minimum payoff. When they do so, the surprising conclusion is that the minimum of the maximum (minimax) payoffs equals the maximum of the minimum (maximin) payoffs. — location: [2326](kindle://book?action=open&asin=B001FA0NOM&location=2326) ^ref-45770 --- RULE 5: In a game of pure conflict (zero-sum game), if it would be disadvantageous for you to let the opponent see your actual choice in advance, then you benefit by choosing at random from your available pure strategies. The proportions in your mix should be such that the opponent cannot exploit your choice by pursuing any particular pure strategy from the ones available to him—that is, you get the same average payoff when he plays any of his pure strategies against your mixture. — location: [2342](kindle://book?action=open&asin=B001FA0NOM&location=2342) ^ref-45143 --- When one player follows this rule, the other cannot do any better by using one of his own pure strategies than another. 
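As a sanity check on the penalty-kick arithmetic above, here is a minimal sketch using the success percentages quoted in the book (58, 95, 93, 70). It reproduces the 50:50 and 40:60 calculations and then solves for the mix that gives the kicker the same success rate whichever side the goalie covers, which is the equalization logic behind Rule 5; the last line draws an independent random sequence in those proportions, since mixing is not the same as alternating.

```python
import random

# Kicker's success percentages from the table quoted in the book:
# (kicker's side, goalie's side) -> percent of kicks that score.
success = {("L", "L"): 58, ("L", "R"): 95,
           ("R", "L"): 93, ("R", "R"): 70}

def mix_rate(p_left, goalie_side):
    """Kicker's success rate when he kicks Left with probability p_left."""
    return p_left * success[("L", goalie_side)] + (1 - p_left) * success[("R", goalie_side)]

for p in (0.5, 0.4):
    vs_left, vs_right = mix_rate(p, "L"), mix_rate(p, "R")
    print(f"{p:.0%} Left mix: {vs_left:.1f} vs goalie Left, {vs_right:.1f} vs goalie Right, "
          f"goalie can hold kicker to {min(vs_left, vs_right):.1f}")

# The best mix equalizes the two rates:  p*58 + (1-p)*93 = p*95 + (1-p)*70.
p_star = (93 - 70) / ((93 - 70) + (95 - 58))      # = 23/60, about 38.3% Left
print(f"equalizing mix: {p_star:.1%} Left, success {mix_rate(p_star, 'L'):.1f} either way")

# Mixing means independent draws in these proportions, not strict alternation.
print("sample kicks:", " ".join("L" if random.random() < p_star else "R" for _ in range(10)))
```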
— location: [2346](kindle://book?action=open&asin=B001FA0NOM&location=2346) ^ref-18406 --- “Critics of this strategy insist that there is no such thing as a random throw. Human beings will always use some impulse or inclination to choose a throw, and will therefore settle into unconscious but nonetheless predictable patterns. The Chaos School has been dwindling in recent years as tournament statistics show the greater effectiveness of other strategies.” — location: [2382](kindle://book?action=open&asin=B001FA0NOM&location=2382) ^ref-3778 --- treatment of experimental economics declared flatly: “Subjects in experiments are rarely (if ever) observed flipping coins.” — location: [2398](kindle://book?action=open&asin=B001FA0NOM&location=2398) ^ref-37892 --- However, the fact remains that even when the laboratory games were structured to be similar to soccer penalty kicks, where the value of mixing moves is evident, the subjects do not seem to have used randomization either correctly or appropriately over time. — location: [2409](kindle://book?action=open&asin=B001FA0NOM&location=2409) ^ref-19420 --- Randomization does not mean alternating between the pure strategies. If a pitcher is told to mix fastballs and forkballs in equal proportions, he should not throw a fastball, then a forkball, then a fastball again, and so on in strict rotation. — location: [2414](kindle://book?action=open&asin=B001FA0NOM&location=2414) ^ref-59547 --- Psychologists have found that people tend to forget that heads is just as likely to be followed by heads as by tails; therefore they have too many reversals, and too few strings of heads, in their successive guesses. — location: [2423](kindle://book?action=open&asin=B001FA0NOM&location=2423) ^ref-57752 --- The player who throws Paper thrice in succession is looking for the opponent to think that a fourth Paper is unlikely, and the player who leaves out one of the throws and mixes among just the other two in many successive plays is trying to exploit the opponent’s thinking that the missing throw is “due.” — location: [2428](kindle://book?action=open&asin=B001FA0NOM&location=2428) ^ref-39665 --- To avoid getting caught putting order into the randomness, you need a more objective or independent mechanism. — location: [2430](kindle://book?action=open&asin=B001FA0NOM&location=2430) ^ref-35119 --- THE EIGHTFOLD PATH TO CREDIBILITY — location: [3119](kindle://book?action=open&asin=B001FA0NOM&location=3119) ^ref-36007 --- The first principle is to change the payoffs of the game. The idea is to make it in your interest to follow through on your commitment: turn a threat into a warning, a promise into an assurance. This can be done through two broad classes of tactics: Write contracts to back up your resolve. Establish and use a reputation. Both these tactics make it more costly to break the commitment than to keep it. — location: [3133](kindle://book?action=open&asin=B001FA0NOM&location=3133) ^ref-15826 --- A second avenue is to change the game by limiting your ability to back out of a commitment. In this category, we consider three possibilities: 3. Cut off communication. 4. Burn bridges behind you. 5. Leave the outcome beyond your control, or even to chance. — location: [3137](kindle://book?action=open&asin=B001FA0NOM&location=3137) ^ref-39526 --- If a large commitment is broken down into many smaller ones, then the gain from breaking a little one may be more than offset by the loss of the remaining contract. Thus we have: 6. Move in small steps. 
— location: [3140](kindle://book?action=open&asin=B001FA0NOM&location=3140) ^ref-39326 --- A third route is to use others to help you maintain commitment. A team may achieve credibility more easily than an individual. Or you may simply hire others to act in your behalf. 7. Develop credibility through teamwork. 8. Employ mandated agents. — location: [3142](kindle://book?action=open&asin=B001FA0NOM&location=3142) ^ref-21578 --- Contracts A straightforward way to make your commitment credible is to agree to pay a penalty if you fail to follow through. If your kitchen remodeler gets a large payment up front, he is tempted to slow down the work. But a contract that specifies payment linked to the progress of the work and penalty clauses for delay can make it in his interest to stick to the schedule. The contract is the device that makes the remodeler’s promise of completion credible. — location: [3146](kindle://book?action=open&asin=B001FA0NOM&location=3146) ^ref-29550 --- Reputation If you try a strategic move in a game and then back off, you may lose your reputation for credibility. — location: [3237](kindle://book?action=open&asin=B001FA0NOM&location=3237) ^ref-42371 --- Cutting Off Communication Cutting off communication succeeds as a credible commitment device because it can make an action truly irreversible. — location: [3260](kindle://book?action=open&asin=B001FA0NOM&location=3260) ^ref-20396 --- Burning Bridges behind You Armies often achieve commitment by denying themselves an opportunity to retreat. Although Xenophon did not literally burn his bridges behind him, he did write about the advantages of fighting with one’s back against a gully.15 Sun Tzu recognized the reverse strategy, namely the advantage of leaving an opponent an escape route to reduce his resolve to fight. The Trojans, however, got it all backward when the Greeks arrived in Troy to rescue Helen. The Trojans tried to burn the Greek ships. They did not succeed, but if they had succeeded, that would simply have made the Greeks all the more determined opponents. The strategy of burning bridges (or boats) was used by several others. William the Conqueror’s army, invading England in 1066, burned its own ships, thus making an unconditional commitment to fight rather than retreat. Hernán Cortés followed the same strategy in his conquest of Mexico, giving orders upon arrival that all but one of his ships be burned or disabled. Although his soldiers were vastly outnumbered, they had no choice but to fight and win. “Had [Cortés] failed, it might well seem an act of madness…. Yetit was the fruit of deliberate calculation. There was no alternative in his mind but to succeed or perish.”16 — location: [3301](kindle://book?action=open&asin=B001FA0NOM&location=3301) ^ref-30562 --- Leaving the Outcome beyond Your Control or to Chance Returning to Dr. Strangelove, President Merkin Muffley invites the Soviet ambassador into the Pentagon war room to let him see the situation with his own eyes and be convinced that this was not a general U.S. attack on his country. — location: [3331](kindle://book?action=open&asin=B001FA0NOM&location=3331) ^ref-30758 --- However, this strategic advantage does not come without a cost. There might be a small accident or unauthorized attack, after which the Soviets would not want to carry out their dire threat but have no choice, as execution is out of their control. 
— location: [3346](kindle://book?action=open&asin=B001FA0NOM&location=3346) ^ref-6123 --- Moving in Steps Although two parties may not trust each other when the stakes are large, if the problem of commitment can be reduced to a small enough scale, then the issue of credibility will resolve itself. — location: [3361](kindle://book?action=open&asin=B001FA0NOM&location=3361) ^ref-53368 --- Teamwork Often others can help us achieve credible commitment. Although people may be weak on their own, they can build resolve by forming a group. The successful use of peer pressure to achieve commitment has been made famous by Alcoholics Anonymous (AA) and diet centers. — location: [3378](kindle://book?action=open&asin=B001FA0NOM&location=3378) ^ref-41849 --- Mandated Negotiating Agents If a worker says he cannot accept any wage increase less than 5 percent, why should the employer believe that he will not subsequently back down and accept 4 percent? Money on the table induces people to try negotiating one more time. — location: [3399](kindle://book?action=open&asin=B001FA0NOM&location=3399) ^ref-56148 --- UNDERMINING YOUR OPPONENT’S CREDIBILITY If you stand to gain by making your strategic moves credibly, then similarly you will benefit by preventing other players from making their strategic moves credible. Right? No, not so fast. — location: [3418](kindle://book?action=open&asin=B001FA0NOM&location=3418) ^ref-59032 --- Here are a few suggestions for practicing that art. — location: [3430](kindle://book?action=open&asin=B001FA0NOM&location=3430) ^ref-23517 --- Contracts: Mr. Russo in our story had two selves, one before chocolate éclairs appear on the dessert trolley (BCE) and the other after (ACE). The BCE self sets up the contract to defeat ACE’s temptation, but the ACE self can render the contract ineffective by proposing a renegotiation that will benefit all the parties that are present at that point. The BCE self would have refused ACE’s proposal, but BCE is no longer there. If all the distinct parties to the original contract are still present, then to get around a contract you have to propose a new deal that will be in the interests of everyone at that point. Gaining unanimous consent is difficult, but not impossible. Suppose you are playing a repeated prisoners’ dilemma game. An explicit or implicit contract says that everyone should cooperate until someone cheats; after that, cooperation will break down and everyone will choose the selfish action. You can try to get away with cheating once by pleading that it was just an innocent error and that all the available gains from future cooperation should not be wasted just because the contract says so. You cannot hope to pull this trick too often, and even the first time others may be suspicious. — location: [3432](kindle://book?action=open&asin=B001FA0NOM&location=3432) ^ref-15152 --- Reputation: You are a student trying to get a deadline extension from your professor. He wants to maintain his reputation and tells you: “If I do this for you, I will have to do the same for everyone in the future.” You can come back with: “No one will ever know. It is not in my interest to tell them; if they write better assignments… — location: [3442](kindle://book?action=open&asin=B001FA0NOM&location=3442) ^ref-10698 --- Communication: Cutting off communication may protect the player making a strategic move by making his action irreversible. 
But if the other player is unavailable to receive the information about the opponent’s commitment or threat in the first place, the strategic move is pointless. A parent’s threat—“If you don’t stop crying you won’t… — location: [3448](kindle://book?action=open&asin=B001FA0NOM&location=3448) ^ref-24643 --- Burning Bridges: Recall the advice of Sun Tzu: “When you surround an enemy, leave an outlet free.”20 One leaves an outlet free not so that the enemy may actually escape but so that the enemy may believe there is a road to safety.* If the enemy does not see an escape outlet, he will fight with the courage of desperation. Sun Tzu aimed to deny the… — location: [3452](kindle://book?action=open&asin=B001FA0NOM&location=3452) ^ref-9815 --- Moving in Steps: The credibility of mutual promises can be enhanced by breaking large actions into a sequence of small ones. But you can try to destroy the credibility of an opponent’s threat by going against his wishes in small steps. Each step should be so small in relation to the threatened costly action that it is not in the interests of the other to invoke it. As previously discussed, this method is called salami tactics; you defuse the threat one slice at a time. The best example comes from Schelling: “Salami tactics, we can be sure, were invented by a child…. Tell a child not to go in the water and he’ll sit on the bank and submerge his bare feet; he is not yet ‘in’ the water. Acquiesce, and he’ll stand up; no more of him is in the water than before. Think it over, and he’ll start wading, not going any deeper; take a moment to decide whether this… — location: [3457](kindle://book?action=open&asin=B001FA0NOM&location=3457) ^ref-40797 --- Mandated Agents: If the other player aims to achieve credibility for an inflexible negotiating position by using a mandated agent, you might simply refuse to deal with the agent and demand to speak directly to the principal. A channel of communication must be open between… — location: [3468](kindle://book?action=open&asin=B001FA0NOM&location=3468) ^ref-9625 --- Tell It Like It Is? Why can’t we just rely on others to tell the truth? The answer is obvious: because it might be against their interests. — location: [3584](kindle://book?action=open&asin=B001FA0NOM&location=3584) ^ref-21320 --- Much of the time, people’s interests and communications are aligned. When you order a steak medium rare, the waiter can safely assume that you really want the steak medium rare. The waiter is trying to please you and so you do best by telling the truth. Things get a bit trickier when you ask for a recommended entrée or advice on wine. Now the waiter might want to steer you to a more expensive item and thereby increase the likely tip. — location: [3586](kindle://book?action=open&asin=B001FA0NOM&location=3586) ^ref-5694 --- The greater the conflict, the less the message can be trusted. — location: [3593](kindle://book?action=open&asin=B001FA0NOM&location=3593) ^ref-32144 --- IS THE QUALITY GUARANTEED? Suppose you are in the market to buy a used car. You find two that seem to have the same quality, as far as you can judge. But the first comes with a warranty and the second does not. You surely prefer the first, and are willing to pay more for it. — location: [3659](kindle://book?action=open&asin=B001FA0NOM&location=3659) ^ref-43226 --- You believe that things are less likely to go wrong with the car under warranty in the first place. Why? To answer that, you have to think about the seller’s strategy. 
The seller has a much better idea of the quality of the car. If he knows that the car is in good condition and not likely to need costly repairs, offering the warranty is relatively costless to him. However, if he knows that the car is in poor condition, he expects to have to incur a lot of cost to fulfill the warranty. Therefore, even after taking into account the higher price that a car under a warranty may fetch, the worse the quality of the car, the more likely the warranty is to be a losing proposition to the seller. Therefore the warranty becomes an implied statement by the seller: “I know the quality of the car to be sufficiently good that I can afford to offer the warranty.” — location: [3663](kindle://book?action=open&asin=B001FA0NOM&location=3663) ^ref-9312 --- Actions that are intended to convey a player’s private information to other players are called signals. — location: [3673](kindle://book?action=open&asin=B001FA0NOM&location=3673) ^ref-4450 --- For a signal to be a credible carrier of a specific item of information, it must be the case that the action is optimal for the player to take if, but only if, he has that specific information. Thus we are saying that offering a warranty can be a credible signal of the quality of the car. — location: [3674](kindle://book?action=open&asin=B001FA0NOM&location=3674) ^ref-24948 --- Screening comes into play when the less-informed player requires the more-informed player to take such an information-revealing action. — location: [3685](kindle://book?action=open&asin=B001FA0NOM&location=3685) ^ref-2056 --- SCREENING AND SIGNALING You are the chief personnel officer of a company, looking to recruit bright young people who have natural-born talent as managers. Each candidate knows whether he or she has this talent, but you don’t. Even those lacking the talent look for jobs in your firm, hoping to make a good salary until they are found out. A good manager can generate several million dollars in profits, but a poor one can rack up large losses quickly. Therefore you are on the lookout for evidence of the necessary talent. Unfortunately, such signs are hard to come by. Anyone can come to your interview wearing the right dress and professing the right attitudes; both are widely publicized and easy to imitate. Anyone can get parents, relatives, and friends to write letters attesting to one’s leadership skills. You want evidence that is credible and hard to mimic. What if some candidates can go to a business school and get an MBA? It costs around $200,000 to get one (when you take into account both tuition and foregone salary). College graduates without an MBA, working in an environment where the specialized managerial talent is irrelevant, can earn $50,000 per year. Supposing people need to amortize the expense incurred in earning an MBA over five years, you will have to pay at least an extra $40,000 a year—that is, a total of $90,000 a year—to a candidate with an MBA. However, this will make no difference if someone who lacks managerial talent can get an MBA just as easily as someone with this talent. Both types will show up with the certificates, expecting to earn enough to pay off the extra expense and still get more money than they could in other occupations. An MBA will serve to discriminate between the two types only if those with managerial talent somehow find it easier or cheaper to earn this degree. 
Suppose that anyone possessing this talent is sure to pass their courses and get an MBA, but anyone without the talent has only a 50 percent chance of success. Now suppose you offer a little more than $90,000 a year, say $100,000, to anyone with an MBA. The truly talented find it worthwhile to go and get the degree. What about the untalented? They have a 50 percent chance of making the grade and getting the $100,000 and a 50 percent chance of failing and having to take another job for the standard $50,000. With only a 50 percent chance of doubling their salary, an MBA would net them only $25,000 extra salary on average, so they cannot expect to amortize their MBA expenses over five years. Therefore they will calculate that it is not to their advantage to try for the MBA. — location: [3764](kindle://book?action=open&asin=B001FA0NOM&location=3764) ^ref-34277 --- ONE REASON TO GET AN MBA: A prospective employer may be concerned about hiring and training a young woman only to find that she leaves the labor force to have children. Whether legal or not, such discrimination still arises. How does an MBA help solve the problem? An MBA serves as a credible signal that the person intends to work for several years. If she was planning to drop out of the labor force in a year, it would not have made sense to have invested the two years in getting an MBA. She would have done much better to have worked for those two years and one more. Practically speaking, it likely takes at least five years to recover the cost of the MBA in terms of tuition and lost salary. Thus you can believe an MBA when she says that she plans to stick around. — location: [3804](kindle://book?action=open&asin=B001FA0NOM&location=3804) ^ref-2303 --- Signaling via Bureaucracy — location: [3831](kindle://book?action=open&asin=B001FA0NOM&location=3831) ^ref-36827 --- According to Stan Long, CEO of Oregon’s state-owned Workers’ Compensation insurer, “If you run a system where you give money to everybody who asks, you are going to get a lot of people asking for money.”4 — location: [3836](kindle://book?action=open&asin=B001FA0NOM&location=3836) ^ref-64358 --- People often think of bureaucratic delays and inconveniences as proof of the inefficiency of government, but they may sometimes be valuable strategies to cope with informational problems. — location: [3845](kindle://book?action=open&asin=B001FA0NOM&location=3845) ^ref-58013 --- Economists usually argue that cash is superior to transfers in kind, because the recipients can make their own optimal decisions to spend cash in the way that best satisfies their preferences, but in the context of asymmetric information, in-kind benefits can be superior because they serve as screening devices.5 — location: [3849](kindle://book?action=open&asin=B001FA0NOM&location=3849) ^ref-60539 --- Signaling by Not Signaling “Is there any point to which you would wish to draw my attention?” “To the curious incident of the dog in the night-time.” “The dog did nothing in the night-time.” “That was the curious incident,” remarked Sherlock Holmes. In the case of Sherlock Holmes in “Silver Blaze,” the fact that the dog didn’t bark meant that the intruder was familiar. In the case where someone doesn’t send a signal, that, too, conveys information. Usually it is bad news, but not always. — location: [3852](kindle://book?action=open&asin=B001FA0NOM&location=3852) ^ref-63858 --- College students can take many courses for a letter grade (A to F) or on a pass/fail (P or F) basis. 
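Circling back to the MBA screening example above, a small sketch of the self-selection arithmetic, assuming for this back-of-the-envelope check that anyone who tries for the degree bears the full $200,000 cost:

```python
# Back-of-the-envelope check of the MBA screening numbers discussed above.
MBA_COST = 200_000          # tuition plus foregone salary
BASE_SALARY = 50_000        # salary without the MBA
MBA_SALARY = 100_000        # salary offered to MBA holders
YEARS = 5                   # horizon over which the degree must pay for itself

def expected_gain(p_pass):
    """Expected extra earnings over the horizon for someone whose chance of
    completing the MBA is p_pass (failing means the base salary)."""
    expected_salary = p_pass * MBA_SALARY + (1 - p_pass) * BASE_SALARY
    return YEARS * (expected_salary - BASE_SALARY)

for label, p in [("talented (passes for sure)", 1.0), ("untalented (50% pass)", 0.5)]:
    gain = expected_gain(p)
    verdict = "worth it" if gain >= MBA_COST else "not worth it"
    print(f"{label}: expected extra earnings {gain:,.0f} vs cost {MBA_COST:,} -> {verdict}")
```

At a $100,000 salary the degree pays off only for the talented type, which is exactly the separation the passage describes.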
Many students think that a P on their transcript will be interpreted as the average passing grade from the letter scale. With grade inflation as it now exists in the United States, this is at least a B+, more likely an A–. Therefore the pass/fail option looks good. Graduate schools and employers look at transcripts more strategically. They know that each student has a pretty good estimate of his or her own ability. Those who are so good that they are likely to get an A+ have a strong incentive to signal their ability by taking the course for a letter grade and thereby distinguishing themselves from the average. With many A+ students no longer taking the pass/fail option, the group choosing pass/fail loses much of its upper end. The average grade over this limited pool is no longer an A–, but, say, only a B+. Then those who know they are likely to get an A acquire more of an incentive to distinguish themselves from the herd by taking the course for a letter grade. The pool of pass/fails loses more of its upper end. This process can continue to a point where mostly only those who know they are likely to get a C or worse will choose the pass/fail option. That is how strategic readers of transcripts will interpret a P. — location: [3860](kindle://book?action=open&asin=B001FA0NOM&location=3860) ^ref-40908 --- Countersignaling You would think, based on the previous section, that if you have the ability to signal your type, you should. That way, you differentiate yourself from those who can’t make the same signal. And yet, some of the people most able to signal refrain from doing so. As Feltovich, Harbaugh, and To explain: The nouveau riche flaunt their wealth, but the old rich scorn such gauche displays. Minor officials prove their status with petty displays of authority, while the truly powerful show their strength through gestures of magnanimity. People of average education show off the studied regularity of their script, but the well educated often scribble illegibly. Mediocre students answer a teacher’s easy questions, but the best students are embarrassed to prove their knowledge of trivial points. Acquaintances show their good intentions by politely ignoring one’s flaws, while close friends show intimacy by teasingly highlighting them. People of moderate ability seek formal credentials to impress employers and society, but the talented often downplay their credentials even if they have bothered to obtain them. A person of average reputation defensively refutes accusations against his character, while a highly respected person finds it demeaning to dignify accusations with a response.6 — location: [3877](kindle://book?action=open&asin=B001FA0NOM&location=3877) ^ref-32019 --- The larger point is simple. We have ways to figure out people’s types besides what they signal. The very fact that they are signaling is a signal that they are trying to differentiate themselves from some other type that can’t afford to make the same signal. In some circumstances, the most powerful signal you can send is that you don’t need to signal.* — location: [3901](kindle://book?action=open&asin=B001FA0NOM&location=3901) ^ref-45773 --- The feature common to these situations is that success is determined by relative rather than absolute performance. When one participant improves his own ranking, he necessarily worsens everyone else’s ranking. 
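The unraveling of the pass/fail pool described above can be mimicked with a toy simulation. The grade-point scale and the class's grade distribution below are invented for illustration; the only mechanism is that a student takes the letter grade whenever it beats how readers currently interpret a P, namely the average grade of those still choosing pass/fail.

```python
# Toy unraveling of the pass/fail pool. Grades are mapped to assumed grade
# points; each student knows his or her likely grade and switches to the
# letter grade whenever it beats the current reading of a "P".
GRADE_POINTS = {"A+": 4.3, "A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0,
                "B-": 2.7, "C+": 2.3, "C": 2.0, "D": 1.0}

# Assumed distribution of likely grades in the class (counts per grade).
class_counts = {"A+": 3, "A": 6, "A-": 8, "B+": 7, "B": 5, "B-": 3, "C+": 2, "C": 1, "D": 1}

pool = dict(class_counts)          # students currently choosing pass/fail
while True:
    total = sum(pool.values())
    avg = sum(GRADE_POINTS[g] * n for g, n in pool.items()) / total
    # Students whose likely grade beats the pool average take the letter grade.
    leavers = [g for g in pool if GRADE_POINTS[g] > avg]
    print(f"P is read as {avg:.2f}; {sum(class_counts[g] for g in leavers)} student(s) above that switch to letter grades")
    if not leavers:
        break
    for g in leavers:
        del pool[g]

print("grades left in the pass/fail pool:", sorted(pool, key=GRADE_POINTS.get))
```

With this particular toy distribution the pool unravels nearly to the bottom grade; the book's point is simply that it keeps shrinking until mostly weak grades remain, so strategic readers discount a P accordingly.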
— location: [4134](kindle://book?action=open&asin=B001FA0NOM&location=4134) ^ref-52945 --- The source of the problem of why (some) students study too much is that they do not have to pay a price or compensation to the others. — location: [4138](kindle://book?action=open&asin=B001FA0NOM&location=4138) ^ref-64437 --- There are two main ways to commute from Berkeley to San Francisco. One is driving over the Bay Bridge, and the other is taking public transportation, the Bay Area Rapid Transit train (BART). Crossing the bridge is the shortest route, and with no traffic, a car can make the trip in 20 minutes. But that is rarely the case. The bridge has only four lanes and is easily congested.* We suppose that each additional 2,000 cars (per hour) causes a 10-minute delay for everyone on the road. For example, with 2,000 cars the travel time rises to 30 minutes; at 4,000 cars, to 40 minutes. — location: [4155](kindle://book?action=open&asin=B001FA0NOM&location=4155) ^ref-14060 --- If, during rush hour, 10,000 commuters want to go from Berkeley to San Francisco, how will the commuters be distributed over the two routes? — location: [4162](kindle://book?action=open&asin=B001FA0NOM&location=4162) ^ref-61321 --- when there are 4,000 drivers on the Bay Bridge and 6,000 on the train, no one can gain by switching: the commuters have reached an equilibrium. — location: [4168](kindle://book?action=open&asin=B001FA0NOM&location=4168) ^ref-50080 --- Is this equilibrium good for the commuters as a whole? Not really. It is easy to find a better pattern. Suppose only 2,000 take the Bay Bridge. Each of them saves 10 minutes. The 2,000 who switch to the train are still spending the same time as they did before, namely 40 minutes. So are the 6,000 who were already taking the train. We have just saved 20,000 person-minutes (or almost two weeks) from the total travel time. — location: [4175](kindle://book?action=open&asin=B001FA0NOM&location=4175) ^ref-29622 --- How can the best pattern be achieved? Devotees of central planning will think of issuing 2,000 licenses to use the Bay Bridge. If they are worried about the inequity of allowing those with licenses to travel in 30 minutes while the other 8,000 must take the train and spend 40 minutes, they will devise an ingenious system of rotating the licenses among the population every month. A market-based solution charges people for the harm they cause to others. Suppose each person values an hour of time at $12, that is, each would be willing to pay $12 to save an hour. Then charge a toll for driving on the Bay Bridge; set the toll $2 above the BART fare. By our supposition, people regard an extra $2 cost as equivalent to 10 minutes of time. Now the equilibrium commuting pattern will have 2,000 cars on the Bay Bridge and 8,000 riders on BART. Each user of the Bay Bridge spends 30 minutes plus an extra $2 in commuting costs; each BART rider spends 40 minutes. The total effective costs are the same, and no one wants to switch to the other route. In the process we have collected $4,000 of toll revenue (plus an additional 2,000 BART fares), which can then go into the county’s budget, thus benefiting everyone because taxes can be lower than they would otherwise be. — location: [4186](kindle://book?action=open&asin=B001FA0NOM&location=4186) ^ref-9110 --- The invisible hand guides people to an optimal commuting pattern only when the good “commuting time” is priced. With the profit-maximizing toll on the bridge, time really is money. 
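The commuting example can be reproduced line by line from the numbers in the highlights: a 20-minute free-flow crossing, 10 extra minutes per 2,000 cars, a 40-minute train ride, and time valued at $12 per hour.

```python
# Reproducing the Berkeley-to-San Francisco numbers from the highlights.
TOTAL = 10_000                # rush-hour commuters
BART_MIN = 40                 # BART always takes 40 minutes
VALUE_PER_HOUR = 12.0         # dollars a commuter would pay to save an hour

def bridge_minutes(cars):
    """20 minutes with an empty bridge, plus 10 minutes per 2,000 cars."""
    return 20 + 10 * cars / 2000

# Equilibrium: no one can gain by switching, so bridge time rises to the BART time.
for cars in range(0, TOTAL + 1, 1000):
    if bridge_minutes(cars) >= BART_MIN:
        break
print(f"equilibrium: {cars} cars on the bridge ({bridge_minutes(cars):.0f} min), "
      f"{TOTAL - cars} riders on BART ({BART_MIN} min)")

# The better pattern from the text: only 2,000 drive.
cars_opt = 2000
saving = (BART_MIN - bridge_minutes(cars_opt)) * cars_opt
print(f"with {cars_opt} cars each driver saves {BART_MIN - bridge_minutes(cars_opt):.0f} min; "
      f"total saving {saving:,.0f} person-minutes")

# Pricing the congestion: a toll equal to the time difference, valued at $12/hour.
toll = (BART_MIN - bridge_minutes(cars_opt)) / 60 * VALUE_PER_HOUR
print(f"a toll ${toll:.0f} above the BART fare sustains {cars_opt} cars and raises ${toll * cars_opt:,.0f}")
```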
— location: [4197](kindle://book?action=open&asin=B001FA0NOM&location=4197) ^ref-23579 --- In the late 1800s, there was no standard pattern for the arrangement of letters on the typewriter keyboard. Then in 1873 Christopher Scholes helped design a “new, improved” layout. The layout became known as QWERTY, after the letter arrangement of the first six letters in the top row. QWERTY was chosen to maximize the distance between the most frequently used letters. This was a good solution in its day; it deliberately slowed down the typist, and reduced the jamming of keys on manual typewriters. By 1904, the Remington Sewing Machine Company of New York was mass-producing typewriters with this layout, and it became the de facto industry standard. — location: [4208](kindle://book?action=open&asin=B001FA0NOM&location=4208) ^ref-37256 --- Engineers developed new keyboard layouts, such as DSK (Dvorak’s Simplified Keyboard), which reduced the distance typists’ fingers traveled by over 50 percent. The same material can be typed in 5 to 10 percent less time using DSK than QWERTY.1 But QWERTY is the established system. Almost all keyboards use it, so we all learn it and are reluctant to learn a second layout. Keyboard manufacturers continue, therefore, with QWERTY. The vicious circle is complete.2 — location: [4213](kindle://book?action=open&asin=B001FA0NOM&location=4213) ^ref-33571 --- There is a lot of inertia, in the form of machines, keyboards, and trained typists, behind QWERTY. Is it worthwhile to retool? From the point of view of society as a whole, the answer would seem to be yes. During — location: [4219](kindle://book?action=open&asin=B001FA0NOM&location=4219) ^ref-29317 --- No individual user can change the social convention. The uncoordinated decisions of individuals keep us tied to QWERTY. — location: [4232](kindle://book?action=open&asin=B001FA0NOM&location=4232) ^ref-61608 --- The problem is called a bandwagon effect — location: [4233](kindle://book?action=open&asin=B001FA0NOM&location=4233) ^ref-57536 --- When the fraction using each technology is constant over time, we are at an equilibrium of the game. — location: [4240](kindle://book?action=open&asin=B001FA0NOM&location=4240) ^ref-36763 --- If the fraction of typists using QWERTY exceeds 72 percent, there is the expectation that an even greater fraction of people will learn QWERTY. The prevalence of QWERTY expands until it reaches 98 percent. At that point, the fraction of new typists learning QWERTY just equals its predominance in the population, 98 percent, and so there is no more upward pressure.* — location: [4243](kindle://book?action=open&asin=B001FA0NOM&location=4243) ^ref-33419 --- Conversely, if the fraction of typists using QWERTY falls below 72 percent, then there is the expectation that DSK will take over. Fewer than 72 percent of the new typists learn QWERTY, and the subsequent fall in its usage gives new typists an even greater incentive to learn the superior layout of DSK. Once all typists are using DSK there is no reason for a new typist to learn QWERTY, and QWERTY will die out. — location: [4248](kindle://book?action=open&asin=B001FA0NOM&location=4248) ^ref-38923 --- The adoption of QWERTY, gasoline engines, and light-water reactors are but three demonstrations of how history matters in determining today’s technology choices, though the historical reasons may be irrelevant considerations in the present. 
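A tiny sketch of the bandwagon dynamic in the QWERTY story. The response curve below is not the book's model; it is an invented function chosen only so that its fixed points sit at the 72 percent tipping point and the 98 percent resting point quoted above, with extinction of QWERTY as the other stable outcome.

```python
def next_share(current, k=2.0):
    """Invented response curve: QWERTY's share among the next cohort of typists,
    given its current share. Built so that 72% is an unstable tipping point and
    98% (or extinction) are the stable resting points quoted in the text."""
    proposed = current + k * current * (current - 0.72) * (0.98 - current)
    return min(1.0, max(0.0, proposed))

for start in (0.75, 0.70):
    share = start
    for _ in range(200):
        share = next_share(share)
    print(f"starting at {start:.0%} QWERTY, the share settles near {share:.0%}")
```

Starting just above the tipping point, lock-in completes; starting just below it, the better layout takes over, which is the head-start effect the lock-in highlight warns about.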
Typewriter-key jamming, hoof-and-mouth disease, and submarine space constraints are not relevant to today’s trade-offs between the competing technologies. — location: [4290](kindle://book?action=open&asin=B001FA0NOM&location=4290) ^ref-61827 --- The important insight from game theory is to recognize early on the potential for future lock-in—once one option has enough of a head start, superior technological alternatives may never get the chance to develop. — location: [4293](kindle://book?action=open&asin=B001FA0NOM&location=4293) ^ref-50969 --- If nobody is abiding by the law, then you have two reasons to break it too. First, some experts argue that it is actually safer to drive at the same speed as the flow of traffic.9 On most highways, anyone who tries to drive at fifty-five miles per hour creates a dangerous obstacle that everyone else must go around. Second, when you tag along with the other speeders, your chances of getting caught are almost zero. The police simply cannot pull over more than a small percentage of the speeding cars. As long as you go with the flow of traffic, there is safety in numbers.* As more people become law-abiding, both reasons to speed vanish. It becomes more dangerous to speed, since this requires weaving in and out of traffic. And your chances of getting caught increase dramatically. — location: [4299](kindle://book?action=open&asin=B001FA0NOM&location=4299) ^ref-45467 --- What can lawmakers learn from this if they want to encourage people to drive at the speed limit? It is not necessary to set the speed limit so high that everyone is happy to obey it. The key is to get a critical mass of drivers obeying the speed limit. Thus a short phase of extremely strict enforcement and harsh penalties can change the behavior of enough drivers to generate the momentum toward full compliance. The equilibrium moves from one extreme (where everyone speeds) to the other (where everyone complies). With the new equilibrium, the police can cut back on enforcement, and the compliance behavior is self-sustaining. — location: [4324](kindle://book?action=open&asin=B001FA0NOM&location=4324) ^ref-36624 --- More generally, what this suggests is that short but intense enforcement can be significantly more effective than the same total effort applied at a more moderate level for a longer time.10 — location: [4328](kindle://book?action=open&asin=B001FA0NOM&location=4328) ^ref-11463 --- IT CAN BE LONELY AT THE TOP — location: [4408](kindle://book?action=open&asin=B001FA0NOM&location=4408) ^ref-12955 --- At the time of the annual partnership decision, the abilities of the ten junior associates were rated from 1 to 10, with 10 being the best. The junior associates were told their rating privately. Then they were ushered into a meeting room where they were to decide by majority vote the cutoff level for partnership. They all agreed that everyone making partner was a good idea and certainly preferable to the old days when nobody made partner. So they began with a cutoff of 1. Then some high-rated junior associate suggested that they raise the cutoff to 2. He argued that this would improve the average quality of the partnership. Eight junior associates agreed with him. The sole dissenting vote came from the least able member, who would no longer make partner. Next, someone proposed that they raise the standard from 2 to 3. Eight people were still above this standard, and they all voted for this improvement in the quality of the partnership. 
The person ranked 2 voted against, as this move deprived him of partnership. What was surprising was that the lowest-rated junior associate was in favor of this raising of the standards. In neither case would he make partner. But at least in the latter he would be grouped with someone who had ability 2. Therefore, upon seeing that he was not selected, other law firms would not be able to infer his exact ability. They would guess that he was either a 1 or a 2, a level of uncertainty that would be to his advantage. The proposal to raise the standard to 3 passed 9:1. — location: [4412](kindle://book?action=open&asin=B001FA0NOM&location=4412) ^ref-19495 --- And so it went, until the standard was raised all the way up to 10. Finally, someone proposed that they raise the standard to 11 so that nobody would make partner. Everybody rated 9 and below thought that this was a fine proposal, since once more this improved the average quality of those rejected. Outsiders would not take it as a bad sign that they didn’t make partner, as nobody made partner at this law firm. The sole voice against was the most able junior associate, who lost his chance to make partner. But he was outvoted 9:1. — location: [4425](kindle://book?action=open&asin=B001FA0NOM&location=4425) ^ref-19792 --- Suppose the voters range uniformly over the spectrum. For concreteness, number the political positions from 0 to 100, where 0 represents radical left and 100 represents arch-conservative. If the incumbent chooses a position such as 48, slightly more liberal than the middle of the road, the challenger will take a position between that and the middle—say, 49. Then voters with preferences of 48 and under will vote for the incumbent; all others, making up just over 51 percent of the population, will vote for the challenger. The challenger will win. If the incumbent takes a position above 50, then the challenger will locate between that and 50. Again this will get him more than half the votes. — location: [4460](kindle://book?action=open&asin=B001FA0NOM&location=4460) ^ref-27628 --- This median is not necessarily the average position. The median position is determined by where there are an equal number of voices on each side, while the average gives weight to how far the voices are away. — location: [4467](kindle://book?action=open&asin=B001FA0NOM&location=4467) ^ref-34843 --- Would the excess homogeneity persist if there were three parties? Suppose they take turns to choose and revise their positions, and have no ideological baggage to tie them down. A party located on the outside will edge closer to its neighbor to chip away some of its support. This will squeeze the party in the middle to such an extent that when its turn comes, it will want to jump to the outside and acquire a whole new and larger base of voters. This process will then continue, and there will be no equilibrium. In practice, parties have enough ideological baggage, and voters have enough party loyalty, to prevent such rapid switches. — location: [4475](kindle://book?action=open&asin=B001FA0NOM&location=4475) ^ref-55032 --- The move from one equilibrium to a better one can be most effectively accomplished via a short and intense campaign. The trick is to get a critical mass of people to switch, and then the bandwagon effect makes the new equilibrium self-sustaining. — location: [4506](kindle://book?action=open&asin=B001FA0NOM&location=4506) ^ref-53321 --- In a Vickrey auction, all the bids are placed in a sealed envelope. 
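The law-partnership vote above can be replayed with a toy payoff model. The model is an assumption made for illustration: a partner gets a fixed bonus plus the partnership's average rating, while a rejected associate gets the average rating of the rejected pool, since that is all outside firms can infer about him. Each associate then votes for whichever cutoff gives him the higher payoff.

```python
RATINGS = list(range(1, 11))      # ten junior associates rated 1..10
PARTNER_BONUS = 10                # assumed value of making partner

def payoff(rating, cutoff):
    """Assumed payoffs: partners get the bonus plus the partnership's average
    rating; a rejected associate gets the average rating of the rejected pool."""
    partners = [r for r in RATINGS if r >= cutoff]
    rejected = [r for r in RATINGS if r < cutoff]
    if rating >= cutoff:
        return PARTNER_BONUS + sum(partners) / len(partners)
    return sum(rejected) / len(rejected)

cutoff = 1
while cutoff <= 10:
    proposal = cutoff + 1
    yes = sum(1 for r in RATINGS if payoff(r, proposal) > payoff(r, cutoff))
    print(f"raise the cutoff from {cutoff} to {proposal}: {yes} in favor, {len(RATINGS) - yes} against")
    if yes <= len(RATINGS) // 2:
        break
    cutoff = proposal

print("final cutoff:", cutoff, "(nobody makes partner)" if cutoff > 10 else "")
```

Every proposal passes 9 to 1, and the cutoff climbs until nobody makes partner, just as in the story.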
When the envelopes are opened to determine the winner, the highest bid wins. But there’s a twist. The winner doesn’t pay his or her bid. Instead, the winner only has to pay the second highest bid. — location: [4622](kindle://book?action=open&asin=B001FA0NOM&location=4622) ^ref-23842 --- What is remarkable, even magical, about this auction is that all the bidders have a dominant strategy: bid their true valuation. — location: [4624](kindle://book?action=open&asin=B001FA0NOM&location=4624) ^ref-52458 --- In a Vickrey auction, all you have to do is figure out what the item is worth to you and then write down that amount. — location: [4628](kindle://book?action=open&asin=B001FA0NOM&location=4628) ^ref-14672 --- TRIP TO THE GYM NO. 6 Imagine that you could find out how much the other bidders were submitting in a Vickrey auction before you had to put in your bid. Ignoring the ethical issues for a moment, how much would this be worth to you? — location: [4636](kindle://book?action=open&asin=B001FA0NOM&location=4636) ^ref-39364 --- If someone else bids $63 or $70 or anything else above $60, then both $50 and $60 are losing bids. Hence there is no difference between them. In both cases, you lose the auction and walk away with nothing. — location: [4639](kindle://book?action=open&asin=B001FA0NOM&location=4639) ^ref-62481 --- The $50 and $60 bids also lead to identical (but this time happier) outcomes if the highest other bid is below $50, say $43. If you bid $60, then you win and pay $43. If you had bid $50, you would also have won and paid $43. The reason is that in both cases you are the highest bidder and what you pay is the second highest bid, which is $43. Bidding $50 doesn’t save you any money (compared to bidding $60) when the second highest bid is $43 or anything below $50. — location: [4641](kindle://book?action=open&asin=B001FA0NOM&location=4641) ^ref-54129 --- There’s no difference if any rival bid is above $60 or all are below $50. The only remaining case is where the highest competitive bid is between $50 and $60, say $53. If you bid $60, then you will win and pay $53. If you were to have bid $50, then you would lose. Since your value is $60, you would rather win and pay $53 than lose. — location: [4646](kindle://book?action=open&asin=B001FA0NOM&location=4646) ^ref-39433 --- The auction houses tack on a 20 percent buyer’s premium. If you win the auction with a $1,000 bid, you will be expected to write them a check for $1,200. — location: [4671](kindle://book?action=open&asin=B001FA0NOM&location=4671) ^ref-52382 --- Okay, it isn’t the buyer who pays—it’s the seller. To get this result all we need to assume is that the buyer is aware of this rule and takes it into account when bidding. Put yourself in the position of a collector who is willing to pay $600. How high will you bid? Your top bid should be $500—as you can anticipate that saying $500 really means that you have to pay $600 after the buyer’s premium. You can think of the buyer’s premium as being nothing more than a currency conversion or a code. When you say $100, you really mean $120.* Each bidder scales back his bid accordingly. — location: [4673](kindle://book?action=open&asin=B001FA0NOM&location=4673) ^ref-1698 --- If everyone had to submit their proxy bids at the same time and once and for all, the game truly would be the same as a Vickrey auction and we could advise everyone to play it straight and bid their true value. Bidding the truth would be a dominant strategy. 
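The case analysis above (a $60 value against rival bids of $43, $53, or something above $60) can be tabulated directly. This sketch enumerates the second-price payoffs, including what happens if you overbid.

```python
VALUE = 60                       # what the item is worth to you

def payoff(my_bid, best_rival_bid):
    """Second-price rule: win if you have the top bid, pay the best rival bid."""
    if my_bid > best_rival_bid:
        return VALUE - best_rival_bid
    return 0

rival_bids = [43, 53, 63]
for bid in (50, 60, 70):
    print(f"bid {bid}: payoffs {[payoff(bid, r) for r in rival_bids]} against rival bids {rival_bids}")
```

Bidding your value never does worse than shading down to 50, and unlike bidding 70 it can never win the item at a loss, which is the sense in which truthful bidding is dominant.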
But the game isn’t quite played that way, and these little hiccups lead people to get fancy with their bids. — location: [4696](kindle://book?action=open&asin=B001FA0NOM&location=4696) ^ref-56768 --- The reason to snipe is to keep others in the dark about their own valuations. — location: [4731](kindle://book?action=open&asin=B001FA0NOM&location=4731) ^ref-54105 --- A powerful idea in game theory is the concept of acting like a consequentialist. — location: [4734](kindle://book?action=open&asin=B001FA0NOM&location=4734) ^ref-23172 --- It turns out that this perspective is critical in auctions and in life. It is the key tool to avoid the winner’s curse. — location: [4736](kindle://book?action=open&asin=B001FA0NOM&location=4736) ^ref-34102 --- To make this concrete, imagine that you ask someone to marry you. The person can say yes or no. If the answer is no, then there’s nothing for you to go through with. But if the answer is yes, then you are on your way to getting hitched. Our point is that you should presume the answer will be yes at the time you pop the question. — location: [4737](kindle://book?action=open&asin=B001FA0NOM&location=4737) ^ref-23816 --- You are a potential buyer for ACME. Because of your extensive knowledge of game theory, you will be able to increase ACME’s value by 50 percent, whatever it is. The problem is that you have some doubts as to the current value. After completing your due diligence, you place the value at somewhere between $2 million and $12 million. The average value is $7 million and your view is that all the options in the $2 to $12 million range are equally likely. — location: [4745](kindle://book?action=open&asin=B001FA0NOM&location=4745) ^ref-19562 --- When faced with this problem, most people reason as follows: On average the company is worth $7 million. I can make it worth 50 percent more, or $10.5 million. Thus I can bid up to $10.5 million and still not expect to lose money. — location: [4755](kindle://book?action=open&asin=B001FA0NOM&location=4755) ^ref-51972 --- Is $10.5 million where you came out? We hope not. — location: [4757](kindle://book?action=open&asin=B001FA0NOM&location=4757) ^ref-43668 --- If you offer $10.5 million and the owners say yes, then you’ve learned some bad news. You now know that the company is not worth $11 million or $12 million today. When the owners say yes to an offer of $10.5 million, the company will be worth somewhere between $2 million and $10.5 million, or $6.25 million on average. The problem is that even with your 50 percent increase in performance, that only brings the value up to $9.375 million, well below the $10.5 million you offered. — location: [4759](kindle://book?action=open&asin=B001FA0NOM&location=4759) ^ref-56904 --- An offer of $6 million just does the trick. You can anticipate that when the seller says yes, the company is worth between $2 million and $6 million, for an average value of $4 million. The 50 percent premium brings the value to you back up to $6 million or breakeven. The fact that the seller says yes is bad news but not fatal to the deal. You have to adjust down your offer to take into account the circumstances under which a seller will say yes to you. — location: [4766](kindle://book?action=open&asin=B001FA0NOM&location=4766) ^ref-45794 --- This idea of presuming you’ve won is a critical ingredient to making the right bid in a sealed-bid auction. 
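Here is the ACME arithmetic in code: the current value is uniform on $2 to $12 million, you add 50 percent to whatever you buy, and the sellers say yes to any offer at or above the current value, so acceptance itself is information.

```python
LOW, HIGH = 2.0, 12.0            # ACME's current value, uniform in $ millions
UPLIFT = 1.5                     # you can raise the value by 50 percent

def expected_profit(bid):
    """Expected profit from offering `bid`, given the seller accepts any offer
    at or above ACME's current value."""
    if bid <= LOW:
        return 0.0
    top = min(bid, HIGH)
    prob_accept = (top - LOW) / (HIGH - LOW)
    value_if_accepted = (LOW + top) / 2          # average value, given a yes
    return prob_accept * (UPLIFT * value_if_accepted - bid)

for bid in (10.5, 7.0, 6.0):
    print(f"offer {bid:>4}M: expected profit {expected_profit(bid):+.3f}M")
```

The function multiplies the chance an offer is accepted by the margin you expect given acceptance; $10.5 million loses money on average, and $6 million is the largest offer that does not.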
— location: [4773](kindle://book?action=open&asin=B001FA0NOM&location=4773) ^ref-64165 --- The tricky part of a sealed-bid auction is determining how much to bid. For starters, you should never bid your valuation (or worse, something more). If you do so, you are guaranteed to break even at best. — location: [4777](kindle://book?action=open&asin=B001FA0NOM&location=4777) ^ref-37382 --- This strategy is dominated by shading your bid to some amount below your valuation. That way, at least you have a chance to come out ahead.† — location: [4778](kindle://book?action=open&asin=B001FA0NOM&location=4778) ^ref-57533 --- But what they bid depends on what they expect you to bid. The key step to cutting through this infinite loop of expectations is to always bid as if you’ve won. — location: [4781](kindle://book?action=open&asin=B001FA0NOM&location=4781) ^ref-26090 --- When putting down your bid, you should always assume that all of the other bidders are below you. And then with that assumption, you should ask if this is your best bid. — location: [4782](kindle://book?action=open&asin=B001FA0NOM&location=4782) ^ref-23256 --- Of course, you will often be wrong when making that assumption. But when you’re wrong, it won’t matter—others will have outbid you and so you won’t have won the auction. But when you’re right, you’ll be the winning bidder and thus have made the correct assumption. — location: [4783](kindle://book?action=open&asin=B001FA0NOM&location=4783) ^ref-11041 --- What this all says is that when you are thinking about how much to bid, you should pretend that all of the other bidders are somewhere below your bid. Armed with this assumption, you then consider your best bid. — location: [4796](kindle://book?action=open&asin=B001FA0NOM&location=4796) ^ref-57176 --- In the Dutch auction, prices start high and fall until the first bidder indicates his or her participation. — location: [4806](kindle://book?action=open&asin=B001FA0NOM&location=4806) ^ref-42616 --- At your optimal bid, the savings from paying a lower bid is no longer worth the increased risk of losing the prize. — location: [4814](kindle://book?action=open&asin=B001FA0NOM&location=4814) ^ref-60422 --- When you write down your bid in a sealed-bid auction, you only find out later if you’ve won or not. But remember our guidance. In a sealed-bid auction, you are supposed to bid as if you’ve won. You’re supposed to pretend that all of the other bidders are somewhere below you. This is exactly the situation you are in when competing in a Dutch auction. — location: [4818](kindle://book?action=open&asin=B001FA0NOM&location=4818) ^ref-49041 --- The answer for how much to bid comes from one of the most remarkable results in auction theory: the revenue equivalence theorem. — location: [4827](kindle://book?action=open&asin=B001FA0NOM&location=4827) ^ref-34448 --- It turns out that when the valuations are private and the game is symmetric, the seller makes the same amount of money on average whether the auction type is English, Vickrey, Dutch, or sealed-bid.* — location: [4828](kindle://book?action=open&asin=B001FA0NOM&location=4828) ^ref-6721 --- What that means is that there is a symmetric equilibrium to the Dutch and sealed-bid auctions where the optimal bidding strategy is to bid what you think the next highest person’s value is given the belief that you have the highest value. 
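A quick Monte Carlo check of this bidding rule, assuming the same uniform-values setup the book's numerical example uses (values equally likely anywhere on a 0 to 100 scale): in the sealed-bid auction each bidder bids the expected highest rival value given that his own value is the highest, which works out to value times (n - 1)/n with n bidders.

```python
import random

def average_revenue(num_bidders, trials=100_000):
    """Compare a Vickrey auction with a first-price sealed bid in which each
    bidder bids value * (n - 1) / n, values drawn uniformly on [0, 100]."""
    shade = (num_bidders - 1) / num_bidders
    vickrey = sealed = 0.0
    for _ in range(trials):
        values = sorted(random.uniform(0, 100) for _ in range(num_bidders))
        vickrey += values[-2]         # winner pays the second-highest value
        sealed += values[-1] * shade  # highest value wins at a shaded bid
    return vickrey / trials, sealed / trials

for n in (2, 3, 4):
    v, s = average_revenue(n)
    print(f"{n} bidders: average revenue {v:.1f} (Vickrey) vs {s:.1f} (sealed bid)")
```

With 2, 3, and 4 bidders both formats raise roughly 33, 50, and 60 on average, which is the revenue equivalence the highlight states, and the shading factor reproduces the 30, 40, and 45 bids for a value of 60.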
— location: [4830](kindle://book?action=open&asin=B001FA0NOM&location=4830) ^ref-62336 --- For example, everyone might think that each bidder’s value is equally likely to be anything between 0 and 100. In this case, whether the auction is Dutch or a sealed-bid, you should bid what you expect the next highest bidder’s value to be given that all of the other values are below your own. If your value is 60, you should bid 30 if there is only one other bidder. You should bid 40 if there are two other bidders and 45 if there are three other bidders.* — location: [4832](kindle://book?action=open&asin=B001FA0NOM&location=4832) ^ref-35550 --- You can see that this would lead to revenue equivalence. In a Vickrey auction, the person with the highest value wins but only pays the second highest bid, which is the second highest valuation. In a sealed-bid auction, everyone bids what they think the second highest valuation is (given they are the highest). The person with the truly highest valuation will win and the bid will be the same on average as the result in a Vickrey auction. — location: [4837](kindle://book?action=open&asin=B001FA0NOM&location=4837) ^ref-47517 --- Did you bid 10 in NY and 9 for LA? If so, you’ve certainly won both auctions. But you’ve made no profit at all. — location: [4994](kindle://book?action=open&asin=B001FA0NOM&location=4994) ^ref-47062 --- Taking this perspective into account, you can see that bidding 10 for NY and 9 for LA is actually a case of a (weakly) dominated strategy. With this strategy, you are guaranteed to end up with zero. This is your payoff whether you win or lose. Any strategy that gives you a chance of doing better than zero while never losing any money will weakly dominate the strategy of bidding 10 and 9 right off the bat. Perhaps you bid 9 in NY and 8 for LA. If so, you’ve certainly done better than bidding 10 and 9. Based on our bid, you’ll win both auctions. (We won’t bid more than our valuations.) So, congratulations. How did you do? You made a profit of 1 in each city or 2 in total. The key question is whether you can do better. You obviously can’t do better bidding 10 and 9. Nor can you do better repeating your bids of 9 and 8. What other strategies might you consider? Let’s assume that you bid 5 and 5. (The way the game will play out for other bids will be quite similar.) Now it’s time for us to reveal our bid: we started with 0 (or no bid) in NY and 1 in LA. Given the way the first round of bidding has turned out, you are the high bidder in both cities. Thus you can’t bid this round (as there is no point in having you top your own bid). Since we are losing out in both cities, we will bid again. Think of the situation from our shoes. We can’t go back home empty-handed to our CEO and say that we dropped out of the auction when the bids were at 5. We can only go home empty-handed if the prices have escalated to 9 and 8, so that it isn’t worth our while to bid anymore. Thus we’ll raise our bid in LA to 6. Since we just outbid you, the auction is extended another period. (Remember that the auction is extended another round whenever someone bids.) What will you do? Imagine that you raise us in LA with a bid of 7. When it comes time for us to bid in the next round, we’ll bid in NY this time with an offer of 6. We’d rather win NY at 6 than LA at 8. Of course, you can then outbid us back in NY. You can see where this is all headed. Depending on who bids when, you will win both licenses at prices of 9 or 10 in NY and 8 or 9 in LA. 
This is certainly no better than the result when you just started out with a bid of 9 in NY and 8 in LA. It doesn’t appear that our experiment has led to any improvement in payoffs. That happens. As you try out different strategies you can’t expect them all to work. But was there something else you could have done that would have led to a profit greater than 2? Let’s go back and replay the last auction. What else might you have done after we bid 6 for LA? Recall that at that time, you were the high bidder in NY at a price of 5. Actually, you could have done nothing. You could have stopped bidding. We had no interest in outbidding you in NY. We were plenty happy to win the LA license at a price of 6. The only reason we bid again is that we couldn’t go away empty-handed—unless, of course, prices escalated to 9 and 8. If you had stopped… — location: [5001](kindle://book?action=open&asin=B001FA0NOM&location=5001) ^ref-59229 --- Before we walk away with no license, we will bid all the way up to 9 and 8. If you intend to deny us any license, you have to be prepared to bid a total of 17. Right now you have one license at a price of 1. Thus the true cost of winning the second license is 16, which is well in excess of your value. You have a choice. You can win one license at a price of 1 or two licenses at a combined price of 17. Winning one is the better option. Just because you can beat us in both auctions doesn’t mean that you should. — location: [5033](kindle://book?action=open&asin=B001FA0NOM&location=5033) ^ref-62657 --- You might also be wondering if this is collusion. Strictly speaking, the answer is no. While it is true that the two firms both end up better off (and the seller is the big loser), observe that neither party needs to make an agreement with the other. Each side is acting in its own best interest. — location: [5040](kindle://book?action=open&asin=B001FA0NOM&location=5040) ^ref-8461 --- Outbidding MCI on LA can raise the price in both LA and NY. The true cost of winning the second license is 16, more than its value. — location: [5044](kindle://book?action=open&asin=B001FA0NOM&location=5044) ^ref-23358 --- What we see here is often called tacit cooperation. — location: [5045](kindle://book?action=open&asin=B001FA0NOM&location=5045) ^ref-31686 --- The larger lesson here is that when two games are combined into one, this creates an opportunity to employ strategies that go across the two games. — location: [5049](kindle://book?action=open&asin=B001FA0NOM&location=5049) ^ref-35706 --- When Fuji entered the U.S. film market, Kodak had the opportunity to respond in the United States or in Japan. While starting a price war in the United States would have been costly to Kodak, doing so in Japan was costly to Fuji (and not to Kodak, who had little share in Japan). Thus the interaction between multiple games played simultaneously creates opportunities for punishment and cooperation that might otherwise be impossible, at least without explicit collusion. — location: [5050](kindle://book?action=open&asin=B001FA0NOM&location=5050) ^ref-23658 --- Moral: If you don’t like the game you are playing, look for the larger game. — location: [5054](kindle://book?action=open&asin=B001FA0NOM&location=5054) ^ref-20676 --- We begin by recapitulating the basic idea in the context of union-management negotiation over wages. To look forward and reason backward, it helps to start at a fixed point in the future, so let us think of an enterprise with a natural conclusion, such as a hotel in a summer resort. 
The season lasts 101 days. Each day the hotel operates, it makes a profit of $1,000. At the beginning of the season, the employees’ union confronts the management over wages. The union presents its demand. The management either accepts this or rejects it and returns the next day with a counteroffer. The hotel can open only after an agreement is reached. — location: [5068](kindle://book?action=open&asin=B001FA0NOM&location=5068) ^ref-63955 --- Now look at the day before the last day of the season, when it is the management’s turn to make an offer. It knows that the union can always reject this, let the process go on to the last day, and get $1,000. Therefore the management cannot offer any less. And the union cannot do any better than get $1,000 on the last day, so the management need not offer any more on the day before.† Therefore the management’s offer at this stage is clear: of the $2,000 profit over the last two days, it asks half. Each side gets $500 per day. — location: [5077](kindle://book?action=open&asin=B001FA0NOM&location=5077) ^ref-63005 --- looking ahead and reasoning backward leads to a simple and appealing rule: split the total down the middle. — location: [5091](kindle://book?action=open&asin=B001FA0NOM&location=5091) ^ref-6713 --- There is a second prediction of the theory: the agreement will occur on the first day of the negotiation process. Because the two sides look ahead to predict the same outcome, there is no reason why they should fail to agree and jointly lose $1,000 a day. — location: [5092](kindle://book?action=open&asin=B001FA0NOM&location=5092) ^ref-5271 --- The general idea is that the better a party can — location: [5114](kindle://book?action=open&asin=B001FA0NOM&location=5114) ^ref-9465 --- do by itself in the absence of an agreement, the larger its share of the bargaining pie will be. — location: [5114](kindle://book?action=open&asin=B001FA0NOM&location=5114) ^ref-50492 --- The first step in any negotiation is to measure the pie correctly. — location: [5116](kindle://book?action=open&asin=B001FA0NOM&location=5116) ^ref-7570 --- In this case, the best way to think about the size of the pie is that it is $200. — location: [5119](kindle://book?action=open&asin=B001FA0NOM&location=5119) ^ref-60040 --- More generally, the size of the pie is measured by how much value is created when the two sides reach an agreement compared to when they don’t. — location: [5120](kindle://book?action=open&asin=B001FA0NOM&location=5120) ^ref-38311 --- In the lingo of bargaining, the fallback numbers of $300 for the union and $500 for management are called BATNAs, a term coined by Roger Fisher and William Ury. It stands for Best Alternative to a Negotiated Agreement. — location: [5121](kindle://book?action=open&asin=B001FA0NOM&location=5121) ^ref-59503 --- Since everyone can get their BATNA without having to negotiate, the whole point of the negotiation is how much value can be created above and beyond the sum of their BATNAs. — location: [5124](kindle://book?action=open&asin=B001FA0NOM&location=5124) ^ref-38247 --- The simplest approach would be to split the airfare in two: $1,409 to each of Houston and San Francisco. — location: [5133](kindle://book?action=open&asin=B001FA0NOM&location=5133) ^ref-41405 --- It would have been cheaper for Houston to have paid for the round trip to Houston all by itself. That fare is only twice $666, or $1,332. Houston would never agree to such a split. 
— location: [5135](kindle://book?action=open&asin=B001FA0NOM&location=5135) ^ref-2839 --- Another approach is to have Houston pay for the NY–Houston leg, to have SF pay for the SF–NY leg, and for the two to split the Houston–SF leg. Under that approach, SF would pay $1,697.50 and Houston would pay $1,120.50. — location: [5136](kindle://book?action=open&asin=B001FA0NOM&location=5136) ^ref-42710 --- The two companies could also agree to split the total costs proportionately, using the same ratio as their two round-trip fares. Under this plan, SF would pay $1,835, about twice as much as Houston, who would pay $983. — location: [5138](kindle://book?action=open&asin=B001FA0NOM&location=5138) ^ref-59882 --- This is the key point: the extra cost of doing the two round-trips over the triangle route is $1,000. That is the pie. — location: [5143](kindle://book?action=open&asin=B001FA0NOM&location=5143) ^ref-46184 --- Each party saves $500 over the round-trip fare: Houston pays $832 and SF pays $1,986. You can see that this is a much lower number for Houston than any of the other approaches. — location: [5146](kindle://book?action=open&asin=B001FA0NOM&location=5146) ^ref-48662 --- It suggests that the division between two parties should not be based on the mileage or the relative airfares. — location: [5147](kindle://book?action=open&asin=B001FA0NOM&location=5147) ^ref-32734 --- In other cases, the BATNAs are not fixed. That opens up the strategy of influencing the BATNAs. — location: [5154](kindle://book?action=open&asin=B001FA0NOM&location=5154) ^ref-13973 --- When a strategic bargainer observes that a better outside opportunity translates into a better share in a bargain, he will look for strategic moves that improve his outside opportunities. Moreover, he will notice that what matters is his outside opportunity relative to that of his rival. He will do better in the bargaining even if he makes a commitment or a threat that lowers both parties’ outside opportunities, so long as that of the rival is damaged more severely. In our example, when the union members could earn $300 a day on the outside while the management could make a profit of $500 a day using scab labor, the result of the bargaining was $400 for the union and $600 for the management. Now suppose the union members give up $100 a day of outside income to intensify their picketing, and this reduces the management’s profit by $200 a day. Then the bargaining process gives the union a starting point of $200 ($300 minus $100) and the management $300 ($500 minus $200). The two starting points add up to $500, and the remaining $500 of daily profit from regular operation of the hotel is split equally between them. Therefore the union gets $450 and the management gets $550. — location: [5158](kindle://book?action=open&asin=B001FA0NOM&location=5158) ^ref-33621 --- The union’s threat of hurting both (but hurting the management more) has earned it an extra $50. — location: [5166](kindle://book?action=open&asin=B001FA0NOM&location=5166) ^ref-29158 --- Major League Baseball players employed just such a tactic in their wage negotiations in 1980. — location: [5167](kindle://book?action=open&asin=B001FA0NOM&location=5167) ^ref-30422 --- Although both sides may want the agreement to succeed, they may have different ideas about what constitutes success. The two parties do not always look forward and see the same end. 
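
Backing up to the shared-travel example, the split-the-pie division is mechanical: each party's fallback is its own round-trip fare, the pie is what the joint triangle route saves, and each pays its round-trip fare minus half the pie. A minimal sketch (Houston's $1,332 round trip is quoted above; San Francisco's $2,486 is not quoted and is inferred from the $500 savings):

```python
# Split-the-pie division of a shared trip: each party's fallback (BATNA) is
# its own round-trip fare, the pie is the joint saving from the triangle
# route, and each side pays its round-trip fare minus an equal share of the pie.

def split_the_pie(round_trip_fares, joint_cost):
    pie = sum(round_trip_fares.values()) - joint_cost      # total savings to divide
    share = pie / len(round_trip_fares)
    return {party: fare - share for party, fare in round_trip_fares.items()}

fares = {"Houston": 1332, "San Francisco": 2486}   # SF fare inferred, see above
print(split_the_pie(fares, joint_cost=2818))
# {'Houston': 832.0, 'San Francisco': 1986.0}
```
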
— location: [5184](kindle://book?action=open&asin=B001FA0NOM&location=5184) ^ref-8403 --- Since a side with a low waiting cost does better, it is to each side’s advantage to claim its cost is low. — location: [5186](kindle://book?action=open&asin=B001FA0NOM&location=5186) ^ref-35728 --- The way to prove one’s waiting costs are low is to begin incurring the costs and then show you can hold out longer, or to take a greater risk of incurring the costs—lower costs make higher risks acceptable. — location: [5187](kindle://book?action=open&asin=B001FA0NOM&location=5187) ^ref-65293 --- brinkmanship as the chance that both sides would fall together down the slippery slope. As the conflict continues, both sides risk a large loss with a small but increasing probability. It is this increasing exposure to risk that induces one side to back down. — location: [5213](kindle://book?action=open&asin=B001FA0NOM&location=5213) ^ref-7601 --- Instead of a small chance of a large loss, there is a large chance, even certainty, of a small loss when a strike begins. — location: [5216](kindle://book?action=open&asin=B001FA0NOM&location=5216) ^ref-61383 --- the weakness of negotiating while working under an expired contract. — location: [5222](kindle://book?action=open&asin=B001FA0NOM&location=5222) ^ref-57805 --- The workers would rather have health coverage than an extra $1,500 a year in wages, and the company would rather offer health coverage than an extra $1,500 in wages, too. It would seem that the negotiators should throw all the issues of mutual interest into a common bargaining pot, and exploit the difference in their relative valuations to achieve outcomes that are better for everyone. This works in some instances; for example, broad negotiations toward trade liberalization in the General Agreement on Tariffs and Trade (GATT) and its successor, the World Trade Organization (WTO), have had better success than ones narrowly focused on particular sectors or commodities. — location: [5235](kindle://book?action=open&asin=B001FA0NOM&location=5235) ^ref-22910 --- It turns out that for more than fifty years there has been a clever idea to virtually eliminate all of the waste of strikes and lockouts without altering the relative bargaining power of labor and management.6 Instead of a traditional strike, the idea is to have a virtual strike (or virtual lockout), in which the workers keep — location: [5254](kindle://book?action=open&asin=B001FA0NOM&location=5254) ^ref-557 --- working as normal and the firm keeps producing as normal. The trick is that during the virtual strike neither side gets paid. — location: [5257](kindle://book?action=open&asin=B001FA0NOM&location=5257) ^ref-18973 --- Our point is to replicate the costs and benefits of the negotiation to the parties involved while at the same time leaving everyone else unharmed. — location: [5281](kindle://book?action=open&asin=B001FA0NOM&location=5281) ^ref-46426 --- So long as the two sides have the same BATNAs in the virtual strike as they do in the real one, they have no advantage in employing the real strike over a virtual one. — location: [5282](kindle://book?action=open&asin=B001FA0NOM&location=5282) ^ref-43601 --- To predict the bargaining outcome, we start at the end and work backward. On the last day there is no value in continuing, so labor should be willing to accept any positive amount, say $1. On the penultimate day, labor recognizes that rejecting today’s offer will bring only $1 tomorrow; hence they prefer to accept $2 today. 
The argument continues right up to the first day of the season. Management proposes to give labor $101, and labor, seeing no better alternative in the future, accepts. This suggests that in the case of making offers, ’tis better to give than to receive. — location: [5300](kindle://book?action=open&asin=B001FA0NOM&location=5300) ^ref-4909 --- To the extent that labor cares not only about its payments but also how these payments compare to management’s, this type of radically unequal division will not be possible. But that does not mean we must return to an even split. — location: [5306](kindle://book?action=open&asin=B001FA0NOM&location=5306) ^ref-22142 --- The possibility that the other side could prove your analysis wrong makes this repeated version of the game different from the one-shot version. In the one-shot version of divide the $100, you can assume that the receiver will find it enough in his interest to accept $20 so that you can get $80. If you end up wrong in this assumption, the game is over and it is too late to change your strategy. Thus the other side doesn’t have an opportunity to teach you a lesson with the hope of changing your future strategy. In contrast, when you play 101 iterations of the ultimatum game, the side receiving the offer might have an incentive to play tough at first and thereby establish that he is perhaps irrational (or at least has a strong conviction for the 50:50 norm). — location: [5315](kindle://book?action=open&asin=B001FA0NOM&location=5315) ^ref-38992 --- What should you do if you offer an 80:20 split on day one and the other side says no? This question is easiest to answer in the case where there are only two days total so that the next iteration will be the last. — location: [5322](kindle://book?action=open&asin=B001FA0NOM&location=5322) ^ref-28241 --- If the other party says yes, he will get 200 for both days, for a total of 400. Even a cold, calculating machine would say no to 80:20 if he thought that doing so would get him an even split in the last period, or 500. But if this is just a bluff, you can stick with 80:20 in the final round and be confident that it will be accepted. The analysis gets more complicated if your initial offer was 67:33 and that gets turned down. Had the receiver said yes, he would have ended up with a total of 333 for two days, or 666. But now that he’s said no, the best he can reasonably hope for is a 50:50 split in the final round, or 500. — location: [5324](kindle://book?action=open&asin=B001FA0NOM&location=5324) ^ref-60856 --- In sum, what makes a multiround game different from the one-shot version, even if only one side is making all the offers, is that the receiving side has an opportunity to show you that your theory isn’t working as predicted. — location: [5330](kindle://book?action=open&asin=B001FA0NOM&location=5330) ^ref-10736 --- RUBINSTEIN BARGAINING — location: [5335](kindle://book?action=open&asin=B001FA0NOM&location=5335) ^ref-22653 --- We measure impatience by how much is left if one does the deal in the next round rather than today. — location: [5351](kindle://book?action=open&asin=B001FA0NOM&location=5351) ^ref-15327 --- We represent the cost of waiting by the variable δ. In this example, δ = 0.99. — location: [5353](kindle://book?action=open&asin=B001FA0NOM&location=5353) ^ref-42216 --- When δ is close to one, such as 0.99, then people are patient; if δ is small, say 1/3, then waiting is costly and the bargainers are impatient. Indeed, with δ = 1/3, two-thirds the value is lost each week. 
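
The highlights that follow pin down the smallest share either side will ever accept. As a numerical cross-check of where that argument lands, the fixed point L = δ(1 − δ(1 − L)) can be iterated and compared with the closed form δ/(1 + δ); a minimal sketch, with everything beyond those two expressions being our own scaffolding:

```python
# Rubinstein bargaining cross-check: if L is the least share you will ever
# accept, the alternating-offers logic gives L = delta * (1 - delta * (1 - L)),
# and the solution of that equation is delta / (1 + delta).

def lowest_acceptable_share(delta, tol=1e-12):
    L, prev = 0.0, None
    while prev is None or abs(L - prev) > tol:     # iterate the fixed-point map
        prev, L = L, delta * (1 - delta * (1 - L))
    return L

for delta in (0.99, 1/3):
    print(round(delta, 3), round(lowest_acceptable_share(delta), 4),
          round(delta / (1 + delta), 4))
# 0.99  -> 0.4975 either way: patient bargainers end up near an even split
# 0.333 -> 0.25 either way: an impatient responder settles for a quarter
```
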
— location: [5353](kindle://book?action=open&asin=B001FA0NOM&location=5353) ^ref-39271 --- δ — location: [5356](kindle://book?action=open&asin=B001FA0NOM&location=5356) ^ref-21123 --- So once you know that he will surely take δ tomorrow, that means you can count on 1–δ tomorrow, and so you should never accept anything less than δ(1–δ) today. — location: [5361](kindle://book?action=open&asin=B001FA0NOM&location=5361) ^ref-18192 --- What we are looking for is the number such that when everyone understands this is the least you will ever accept, it leads you to a position where you should accept nothing less. — location: [5365](kindle://book?action=open&asin=B001FA0NOM&location=5365) ^ref-13495 --- Assume that the worst (or lowest) division you will ever accept gives you L, where L stands for lowest. — location: [5367](kindle://book?action=open&asin=B001FA0NOM&location=5367) ^ref-39626 --- L > δ(1–δ(1–L)) — location: [5375](kindle://book?action=open&asin=B001FA0NOM&location=5375) ^ref-10321 --- You should never accept anything less than δ/(1 + δ), because you can get more by waiting and making a counteroffer that the other side is sure to accept. — location: [5377](kindle://book?action=open&asin=B001FA0NOM&location=5377) ^ref-31775 --- By the same logic, the other side will also never accept less than δ/(1 + δ). That tells us what the most you can ever hope for is. — location: [5379](kindle://book?action=open&asin=B001FA0NOM&location=5379) ^ref-52909 --- Think of it this way. The person making me an offer has a claim to all of the pie that will be lost if I say no. That gives him 1/2 right there. Of the half that remains, you can get half of that or 1/4 total, as this amount would be lost if he doesn’t accept your offer. — location: [5396](kindle://book?action=open&asin=B001FA0NOM&location=5396) ^ref-64745 --- Consider yourself a middle-of-the-roader: if it were in your hands, you would prefer a candidate who stands at the position 50 on our scale. But it may turn out that the country is a bit more conservative than that. Without you, the average is 60. — location: [5567](kindle://book?action=open&asin=B001FA0NOM&location=5567) ^ref-4314 --- you state your actual preference, the candidate will move to (99 × 60 + 50)/100 = 59.9. If, instead, you exaggerate and claim to want 0, the final outcome will be at 59.4. By exaggerating your claim, you are six times as effective in influencing the candidate’s position. — location: [5570](kindle://book?action=open&asin=B001FA0NOM&location=5570) ^ref-24738 --- The problem with this averaging approach is that it tries to take into account both intensity and direction of preferences. People have an incentive to tell the truth about direction but exaggerate when it comes to intensity. — location: [5575](kindle://book?action=open&asin=B001FA0NOM&location=5575) ^ref-40708 --- One solution to this problem is related to Harold Hotelling’s observation (discussed in chapter 9) that political parties will converge to the median voter’s position. — location: [5577](kindle://book?action=open&asin=B001FA0NOM&location=5577) ^ref-28899 --- No voter will take an extreme position if the candidate follows the preferences of the median voter—that — location: [5578](kindle://book?action=open&asin=B001FA0NOM&location=5578) ^ref-23217 --- To find the median point, a candidate could start at 0 and keep moving to the right as long as a majority supports this change. 
At the median, the support for any further rightward move is exactly balanced by the equal number of voters who prefer a shift left. — location: [5581](kindle://book?action=open&asin=B001FA0NOM&location=5581) ^ref-45916 --- When a candidate adopts the median position, no voter has an incentive to distort her preferences. Why? There are only three cases to consider: (i) a voter to the left of the median, (ii) a voter exactly at the median, and (iii) a voter to the right of the median. — location: [5583](kindle://book?action=open&asin=B001FA0NOM&location=5583) ^ref-61568 --- The challenge for the incumbent is much like the famous cake-cutting problem. In the cake-cutting problem, there are two kids who have to share a cake. The question is to develop a procedure for them to divide it up so as to ensure that each feels he has gotten (at least) half of the cake. — location: [5618](kindle://book?action=open&asin=B001FA0NOM&location=5618) ^ref-17388 --- The solution is “I cut, you choose.” — location: [5620](kindle://book?action=open&asin=B001FA0NOM&location=5620) ^ref-38740 --- Now for the real surprise. Across all convex sets, the incumbent, by locating at the center of gravity, can guarantee herself at least 1/e = 1/2.71828 ≈ 36 percent of the vote. The result even holds when voters are normally distributed (like a bell curve) rather than uniform. — location: [5644](kindle://book?action=open&asin=B001FA0NOM&location=5644) ^ref-36684 --- if a 64 percent majority is required to dislodge the status quo, then it is possible to find a stable outcome by picking the point that is the average of all the voters’ preferences. No matter the challenger’s position, the incumbent is able to attract at least 36 percent of the vote and thus remain in place. — location: [5646](kindle://book?action=open&asin=B001FA0NOM&location=5646) ^ref-49304 --- The goal is to pick the smallest majority size that ensures a stable outcome. It looks like two-thirds majority rule is just on the right side of 64 percent to do the trick. The U.S. Constitution got it right. — location: [5657](kindle://book?action=open&asin=B001FA0NOM&location=5657) ^ref-60647 --- The typical real estate agent commission is 6%, which is a linear incentive. How much of an incentive does your agent have to get a higher price? What would an extra $20,000 in the price bring in? Hint: The answer is not $1,200. How would you design a better incentive scheme? What might be some of the issues with your alternative scheme? — location: [5854](kindle://book?action=open&asin=B001FA0NOM&location=5854) ^ref-14050 --- Efficiency Wages You are considering hiring someone for a job in your firm. This job requires careful effort, and good work would be worth $60,000 a year to you. — location: [5913](kindle://book?action=open&asin=B001FA0NOM&location=5913) ^ref-60269 --- You can offer a contract of the following kind: “I will pay you some amount above your other opportunities so long as no shirking on your part comes to light. But if that ever happens, I will fire you, and spread the word of your misbehavior among all the other employers, with the result that you will never earn anything more than the basic $40,000 again.” How high does the salary have to be so that the risk of losing it will deter the worker from cheating? Clearly, you will have to pay more than $48,000. — location: [5920](kindle://book?action=open&asin=B001FA0NOM&location=5920) ^ref-25619 --- Suppose the worker does cheat one year. 
In that year he will not have to incur the subjective cost of effort, so he will have gained the equivalent of $8,000. But he will run a 25 percent risk of being found out and losing $X this year and every year after that. Is the one-time gain of $8,000 worth the prospective loss of 0.25X every year thereafter? That depends on how money at different times is compared—that is, on the interest rate. Suppose the interest rate is 10%. Then getting an extra X annually is like owning a bond with a face value of $10X (which at 10% pays X annually). The immediate gain of the equivalent of $8,000 should be compared with the 25 percent chance of losing $10X. If $8,000 < 0.25 × 10X, the worker will calculate that he should not shirk. This means $X > $8,000/2.5 = $3,200. If you offer the worker an annual salary of $48,000 + $3,200 = $51,200 so long as no shirking comes to light, he will not in fact shirk. It isn’t worth risking the extra $3,200 forever in order to slack off and get a quick $8,000 this year. And since good effort is worth $60,000 per year to you, it is in your interest to offer this higher wage. The purpose of the wage is to get the worker to put in the requisite effort and work more efficiently, and so it is called an efficiency wage. The excess above the basic wage elsewhere, which is $11,200 in our example, is called the efficiency premium. — location: [5926](kindle://book?action=open&asin=B001FA0NOM&location=5926) ^ref-14964 --- Multiple Tasks Employees usually perform multiple tasks. To take an example close to home, professors teach and carry out research. In such cases, the incentives for the different tasks can interact. — location: [5940](kindle://book?action=open&asin=B001FA0NOM&location=5940) ^ref-4649 --- The overall effect depends on whether the tasks are substitutes (in the sense that when the worker devotes more effort to one task, the net productivity of effort on the other task suffers) or complements (in the sense that more effort to one task raises the net productivity of the effort devoted to the other task). — location: [5942](kindle://book?action=open&asin=B001FA0NOM&location=5942) ^ref-47590 --- are teaching and research substitutes or complements? — location: [5968](kindle://book?action=open&asin=B001FA0NOM&location=5968) ^ref-19045 --- If they are substitutes, the two should be performed in separate institutions, as is done in France, where universities do mostly teaching and the research is done in specialized institutes. If they are complements, the optimal arrangement is to combine research and teaching within one institution, as is the case in major U.S. universities. — location: [5968](kindle://book?action=open&asin=B001FA0NOM&location=5968) ^ref-4034 --- The comparative success of these two organizational forms is evidence in favor of the complements case. — location: [5970](kindle://book?action=open&asin=B001FA0NOM&location=5970) ^ref-15559 --- The reason was that, unlike the Yale students, she thought ahead and started at the back of the book. — location: [5992](kindle://book?action=open&asin=B001FA0NOM&location=5992) ^ref-55653 --- Intrinsically rewarding tasks and do-good organizations need fewer or weaker material incentives. In fact, psychologists have found that the “extrinsic” monetary incentives can diminish the “intrinsic” incentives of workers in such settings. 
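
Returning to the efficiency-wage arithmetic a few highlights above, the no-shirking condition is simply that the one-time gain from shirking must be smaller than the detection probability times the capitalized value of the premium X. A minimal sketch with the book's numbers (the function name is ours):

```python
# Efficiency-wage arithmetic: the premium X must be large enough that the
# one-time gain from shirking is outweighed by a 25% risk of losing X forever,
# where losing X every year is like losing a bond worth X / interest_rate.

def minimum_premium(shirk_gain, detect_prob, interest_rate):
    # Deterrence requires shirk_gain < detect_prob * (X / interest_rate).
    return shirk_gain * interest_rate / detect_prob

outside_wage, effort_cost = 40_000, 8_000
X = minimum_premium(shirk_gain=8_000, detect_prob=0.25, interest_rate=0.10)
print(X)                                    # 3200.0
print(outside_wage + effort_cost + X)       # 51200.0, the efficiency wage
print(effort_cost + X)                      # 11200.0, the premium over the $40,000 base
```
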
— location: [6002](kindle://book?action=open&asin=B001FA0NOM&location=6002) ^ref-27172 --- The surprise was that the group with only 3¢ payment did the worst of all, getting only 23 right on average. Once money enters the picture, it becomes the main motivation, and 3¢ just wasn’t enough. It may also have conveyed that the task wasn’t that important. — location: [6010](kindle://book?action=open&asin=B001FA0NOM&location=6010) ^ref-6927 --- Multiple Owners In some organizations the control structure is not a pyramid. In places the pyramid gets inverted: one worker is responsible to several bosses. This happens even in private companies but is much more common in the public sector. Most public sector agencies have to answer to the executive, the legislature, the courts, the media, various lobbies, and so — location: [6021](kindle://book?action=open&asin=B001FA0NOM&location=6021) ^ref-40949 --- the effect is weakness of incentives in the aggregate. Imagine that one parent gives a reward for good grades and the other a reward for success on the athletic field. Instead of working synergistically, each reward is likely to offset the other. The reason is that as the kid spends more time studying, this will take some time away from athletics and thus reduce the chance of getting the sports award. — location: [6027](kindle://book?action=open&asin=B001FA0NOM&location=6027) ^ref-33264 --- The expected gain from an extra hour hitting the books won’t be, say, $1, but $1 minus the likely reduction in the sports prize. The two rewards might not totally offset each other, as the kid could spend more time studying and practicing with less time for sleeping and eating. — location: [6030](kindle://book?action=open&asin=B001FA0NOM&location=6030) ^ref-10581 --- In fact, mathematical models show that the overall strength of incentives in such situations is inversely proportional to the number of different bosses. — location: [6032](kindle://book?action=open&asin=B001FA0NOM&location=6032) ^ref-32997 --- Imagine you are the owner of a high-tech company trying to develop and market a new computer chess game, Wizard 1.0. If you succeed, you will make a profit of $200,000 from the sales. If you fail, you make nothing. Success or failure hinges on what your expert player-programmer does. — location: [6041](kindle://book?action=open&asin=B001FA0NOM&location=6041) ^ref-2272 --- The difference, or the bonus for success, should be just enough to make it in the employee’s own interest to provide high-quality effort. In this case, the bonus must be big enough so that the expert expects a high effort will raise her earnings by $20,000, from $50,000 to $70,000. Hence the bonus for success has to be at least $100,000: a 20 percent increase (from 60 to 80 percent) in the chance of getting a $100,000 bonus provides the necessary $20,000 expected payment for motivating high-quality effort. — location: [6059](kindle://book?action=open&asin=B001FA0NOM&location=6059) ^ref-59621 --- We now know the bonus, but we don’t know the base rate, the amount paid in the event of a failure. That needs a little calculation. Since even low effort has a 60 percent chance of success, the $100,000 bonus provides an expected $60,000 payment for low effort. This is $10,000 more than the market requires. — location: [6062](kindle://book?action=open&asin=B001FA0NOM&location=6062) ^ref-57330 --- Thus the base pay is –$10,000. You should pay the employee $90,000 for success, and she should pay you a fine of $10,000 in the event of failure.
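
The bonus and base-pay figures follow from two constraints: the bonus must make high effort worth the extra $20,000 the expert demands, and low effort must pay no more than her $50,000 market alternative. A minimal sketch of that arithmetic (variable names are ours):

```python
# Wizard 1.0 incentive scheme: choose the smallest bonus that makes high
# effort worthwhile, then set the base pay so that low effort earns no more
# than the programmer's market wage.

p_low, p_high = 0.60, 0.80       # chance of success with low vs. high effort
market_wage = 50_000             # what the programmer can earn elsewhere
extra_for_effort = 20_000        # added expected pay she requires for high effort

bonus = extra_for_effort / (p_high - p_low)   # 100,000
base = market_wage - p_low * bonus            # -10,000: a fine in the event of failure
pay_on_success = base + bonus                 # 90,000

expected_pay_high_effort = p_high * pay_on_success + (1 - p_high) * base
print(round(bonus), round(base), round(pay_on_success), round(expected_pay_high_effort))
# 100000 -10000 90000 70000
```
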
— location: [6064](kindle://book?action=open&asin=B001FA0NOM&location=6064) ^ref-38926 --- Thus, with this incentive scheme, the programmer’s incremental reward for success is $100,000, the minimum necessary for inducing quality effort. The average payment to her is $70,000 (an 80 percent chance of $90,000 and a 20 percent chance of –$10,000). — location: [6065](kindle://book?action=open&asin=B001FA0NOM&location=6065) ^ref-30678 --- In essence, this incentive scheme sells 50 percent of the firm to the programmer in exchange for $10,000 and her effort.* Her net payments are then either $90,000 or –$10,000, and with so much riding on the outcome of the project it becomes in her interest to supply high-quality effort in order to increase the chance of success (and her profit share of $100,000). The only difference between this contract and the fine/bonus scheme is in the name. — location: [6071](kindle://book?action=open&asin=B001FA0NOM&location=6071) ^ref-1760 --- But these solutions may not be possible, either because assessing a fine on an employee may not be legal or because the worker does not have sufficient capital to pay the $10,000 for her 50 percent stake. What do you do then? The answer is to go as close to the fine solution or equity-sharing as you can. — location: [6076](kindle://book?action=open&asin=B001FA0NOM&location=6076) ^ref-19167 --- Since the minimum effective bonus is $100,000, the worker gets $100,000 in the event of success and nothing upon failure. Now the employee’s average receipt is $80,000, and your profit falls to $80,000 (since your average revenue remains $160,000). With equity-sharing, the worker has only her labor and no capital to invest in the project. But she still has to be given a 50 percent share to motivate her to supply high-quality effort. — location: [6078](kindle://book?action=open&asin=B001FA0NOM&location=6078) ^ref-50875 --- The inability to enforce fines or get workers to invest their own capital means that the outcome is less good from your point of view—in this case, by $10,000. Now the unobservability of effort makes a difference. — location: [6081](kindle://book?action=open&asin=B001FA0NOM&location=6081) ^ref-23989 --- There are two envelopes, each containing an amount of money; the amount of money is either $5, $10, $20, $40, $80, or $160, and everybody knows this. Furthermore, we are told that one envelope contains exactly twice as much money as the other. — location: [6129](kindle://book?action=open&asin=B001FA0NOM&location=6129) ^ref-15481 --- Suppose Baba opens his envelope and sees $20. He reasons as follows: Ali is equally likely to have $10 or $40. Thus my expected reward if I switch envelopes is $(10 + 40)/2 = $25 > $20. For gambles this small, the risk is unimportant, so it is in my interest to switch. — location: [6133](kindle://book?action=open&asin=B001FA0NOM&location=6133) ^ref-59816 --- Both parties can’t be better off by switching envelopes, since the amount of money to go around is not getting any bigger by switching. — location: [6136](kindle://book?action=open&asin=B001FA0NOM&location=6136) ^ref-57695 --- Suppose that Ali opens her envelope and sees $160. In that case, she knows that she has the greater amount and hence is unwilling to participate in a trade. Since Ali won’t trade when she has $160, Baba should refuse to switch envelopes when he has $80, for the only time Ali might trade with him occurs when Ali has $40, in which case Baba prefers to keep his original $80.
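
The envelope example unravels mechanically from the top down: $160 never trades, so $80 should refuse, then $40, and so on down the list. A minimal sketch of that induction (the dictionary bookkeeping is ours, not the book's):

```python
# The two-envelope 'always switch' argument unravels by induction from the top:
# $160 never trades, so $80 should refuse, then $40, and so on down the list.

amounts = [160, 80, 40, 20, 10, 5]

agrees_to_switch = {}
for a in amounts:                       # work from the largest amount down
    if a == max(amounts):
        agrees_to_switch[a] = False     # the $160 holder knows she has the larger envelope
    else:
        # Switching can only pay off against a willing holder of 2a; by the time
        # we reach a, that holder has already refused, so agreeing never gains.
        agrees_to_switch[a] = agrees_to_switch[a * 2]

print(agrees_to_switch)
# {160: False, 80: False, 40: False, 20: False, 10: False, 5: False} -- no trade ever happens
```
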
— location: [6144](kindle://book?action=open&asin=B001FA0NOM&location=6144) ^ref-49775 --- His 1972 hit song “Doctor My Eyes” is still a classic. — location: [6155](kindle://book?action=open&asin=B001FA0NOM&location=6155) ^ref-11778 --- Here’s a case where look forward and reason backward would have made all the difference. The trick is to not choose the best place to sit independently of what others are doing. — location: [6158](kindle://book?action=open&asin=B001FA0NOM&location=6158) ^ref-16679 --- Everyone was given £20 worth of chips, and the person who had amassed the greatest fortune by evening’s end would win a free ticket to next year’s ball. When it came time for the last spin of the roulette wheel, by a happy coincidence, Barry led with £700 worth of chips, and the next closest was a young Englishwoman with £300. The rest of the group had been effectively cleaned out. Just before the last bets were to be placed, the woman offered to split next year’s ball ticket, but Barry refused. With his substantial lead, there was little reason to settle for half. — location: [6165](kindle://book?action=open&asin=B001FA0NOM&location=6165) ^ref-36259 --- Even betting her entire stake would not lead to victory at these odds; therefore, the woman was forced to take one of the more risky gambles. She bet her entire stake on the chance that the ball would land on a multiple of three. This bet pays two to one (so her £300 bet would return £900 if she won) but has only a 12/37 chance of winning. — location: [6172](kindle://book?action=open&asin=B001FA0NOM&location=6172) ^ref-27396 --- Barry should have copied the woman’s bet and placed £300 on the chance that the ball would land on a multiple of three. This would have guaranteed that he stayed ahead of her by £400 and won the ticket: either they both would lose the bet and Barry would win £400 to £0, or they both would win the bet and Barry would end up ahead £1,300 to £900. — location: [6177](kindle://book?action=open&asin=B001FA0NOM&location=6177) ^ref-15851 --- Her only hope was that Barry would bet first. If Barry had been first to place £200 on black, what should she have done? She should have bet her £300 on red. Betting her stake on black would do her no good, since she would win only when Barry won (and she would place second with £600, compared with Barry’s £900). — location: [6181](kindle://book?action=open&asin=B001FA0NOM&location=6181) ^ref-47942 --- But there was a major catch. The proposer would be required to vote for his own proposal. The voting would then proceed in clockwise order around the boardroom table. To pass, a proposal needed at least 50 percent of the total board — location: [6200](kindle://book?action=open&asin=B001FA0NOM&location=6200) ^ref-44427 --- Any person who made a proposal to change either the membership of the board or the rules governing how membership was determined would be deprived of his position on the board and his stock holdings if his proposal failed. — location: [6203](kindle://book?action=open&asin=B001FA0NOM&location=6203) ^ref-15359 --- he used the strategy of a two-tiered tender offer. A two-tiered bid typically offers a high price for the first shares tendered and a lower price to the shares tendered later. 
— location: [6236](kindle://book?action=open&asin=B001FA0NOM&location=6236) ^ref-50625 --- We can express the average payment for shares by a simple algebraic expression: if fewer than 50 percent tender, everyone gets $105 per share; if an amount X% ≥ 50% of the company’s total stock gets tendered, then the average price paid per share is a blend of the two tiers: the first 50 percent of the shares at the $105 first-tier price and the remaining (X – 50) percent at the lower second-tier price, so the average falls as more shares are tendered. One thing to notice about the way the two-tiered offer is made is that it is unconditional; even if the raider does not get control, the tendered shares are still purchased at the first-tier price. — location: [6243](kindle://book?action=open&asin=B001FA0NOM&location=6243) ^ref-29657 --- Tendering to the two-tiered offer is a dominant strategy. To verify this, we consider all the possible cases. There are three possibilities to check. (1) The two-tiered offer attracts less than 50 percent of the total shares and fails. (2) The two-tiered offer attracts some amount above 50 percent and succeeds. (3) The two-tiered offer attracts exactly 50 percent. If you tender, the offer will succeed, and without you it fails. — location: [6254](kindle://book?action=open&asin=B001FA0NOM&location=6254) ^ref-17518 --- Because tendering is a dominant strategy, we expect everyone to tender. — location: [6263](kindle://book?action=open&asin=B001FA0NOM&location=6263) ^ref-35213 --- attractive. In fact, Larry’s best strategy is to fire up in the air! In this case, Moe will shoot at Curly, and if he misses, Curly will shoot and kill Moe. Then it becomes the second round and it is Larry’s turn to shoot again. Since only one other person remains, he has at least a 30 percent chance of survival, since that is the probability that he kills his one remaining opponent. — location: [6305](kindle://book?action=open&asin=B001FA0NOM&location=6305) ^ref-13559 --- The moral here is that small fish may do better by passing on their first chance to become stars. We see this every four years in presidential campaigns. When there is a large number of contenders, the leader of the pack often gets derailed by the cumulative attacks of all the medium-sized fish. — location: [6308](kindle://book?action=open&asin=B001FA0NOM&location=6308) ^ref-11494 --- Your chances of survival depend on not only your own ability but also whom you threaten. A weak player who threatens no one may end up surviving if the stronger players kill each other off. — location: [6311](kindle://book?action=open&asin=B001FA0NOM&location=6311) ^ref-48627 --- The new recruit is maltreated, humiliated, and put under such immense physical and mental strain that the few weeks quite alter his personality. An important habit acquired in this process is an automatic, unquestioning obedience. There is no reason why socks should be folded, or beds made, in a particular way, except that the officer has so ordered. The idea is that the same obedience will occur when the order is of greater importance. Trained not to question orders, the soldier becomes a fighting machine; commitment is automatic. — location: [6338](kindle://book?action=open&asin=B001FA0NOM&location=6338) ^ref-44377 --- Because each company can anticipate making high profits on the back end, they are willing to go to extraordinary lengths to attract or steal customers. Thus laser printers are practically given away, as are most cell phones. — location: [6423](kindle://book?action=open&asin=B001FA0NOM&location=6423) ^ref-49889 --- Imagine that the extra driver, instead of crossing the bridge at 9:00 A.M., pulls his car over to the side and lets all the other drivers pass.
— location: [6474](kindle://book?action=open&asin=B001FA0NOM&location=6474) ^ref-15749 --- reason is straightforward. The total waiting time is the time it takes for everyone to cross the bridge. — location: [6478](kindle://book?action=open&asin=B001FA0NOM&location=6478) ^ref-55960 --- To start at the end, if Eli ever bids $2.50, he’ll win the dollar (and be down $1.50). If he bids $2.40, then John must bid $2.50 in order to win. Since it is not worth spending a dollar to win a dollar, an Eli bid of $2.40 will win if John’s current bid is at $1.50 or less. The same argument works if Eli bids $2.30. John can’t bid $2.40 and expect to win, because Eli would counter with $2.50. To beat $2.30, John needs to go all the way up to $2.50. Hence a $2.30 bid beats $1.50 and below. So does a $2.20 bid, a $2.10 bid, all the way down to a $1.60 bid. If Eli bids $1.60, John should predict that Eli won’t give up until the bidding reaches $2.50. Eli’s $1.60 is already lost, but it is worth his while to spend another 90 cents to capture the dollar. The first person to bid $1.60 wins, because that establishes a credible commitment to go up to $2.50. In our mind, we should think of $1.60 as the same sort of winning bid as $2.50. In order to beat $1.50, it suffices to bid $1.60, and nothing less will do. That means $1.50 will beat all bids at 60 cents and below. Even a bid of 70 cents will beat all bids at 60 cents and below. — location: [6496](kindle://book?action=open&asin=B001FA0NOM&location=6496) ^ref-56789 --- We expect that either John or Eli will bid 70 cents and the bidding will end. — location: [6506](kindle://book?action=open&asin=B001FA0NOM&location=6506) ^ref-43817 --- before, any child who fails to meet the quota is disinherited. The problem is what to do if all of them are below the quota. In that case, give all of the estate to the child who visits the most. This will make the children’s reduced visiting cartel impossible to maintain. We have put the children into a multiperson dilemma. The smallest amount of cheating brings a massive reward. A child who makes just one more phone call increases his or her inheritance from an equal share to 100 percent. The only escape is to go along with the parents’ wishes. — location: [6529](kindle://book?action=open&asin=B001FA0NOM&location=6529) ^ref-24855 --- In looking back, note that something unusual happened in the transition from a simultaneous-move to a sequential-move game. Criminals chose to forego what was their dominant strategy. In the simultaneous-move game it was dominant for them to carry guns. In the sequential-move game, they chose not to. The reason is that in a sequential-move game, their course of action affects the homeowners’ choice. — location: [6588](kindle://book?action=open&asin=B001FA0NOM&location=6588) ^ref-53790 --- John von Neumann and Oscar Morgenstern’s Theory of Games and Economic Behavior (Princeton, NJ: Princeton University Press, 1947), — location: [6626](kindle://book?action=open&asin=B001FA0NOM&location=6626) ^ref-28653 --- Thomas Schelling’s The Strategy of Conflict (Cambridge, MA: Harvard University Press, 1960) is more than just a pioneering book; it continues to provide instruction and insight. — location: [6627](kindle://book?action=open&asin=B001FA0NOM&location=6627) ^ref-63655 --- For an entertaining exposition of zero-sum games, J. D. Williams’s The Compleat Strategyst, rev. ed. (New York: McGraw-Hill, 1966) still cannot be beat. 
— location: [6629](kindle://book?action=open&asin=B001FA0NOM&location=6629) ^ref-36462 --- William Poundstone’s Prisoner’s Dilemma (New York: Anchor, 1993) goes beyond a description of the eponymous game to offer a first-rate biography of John von Neumann, the polymath who invented the modern computer along with game theory. — location: [6635](kindle://book?action=open&asin=B001FA0NOM&location=6635) ^ref-53720 ---