Picture a room full of people staring at each other, each trying to guess what the others are thinking. Nobody speaks. Everyone knows that everyone else is trying to figure out what everyone else is thinking. It sounds like a psychological thriller, but it’s actually how game theorists think about rational decision making.
This is the world of rationalizability, where what makes sense depends entirely on what you think makes sense to others who are thinking about what makes sense to you. Confused yet? Good. That’s where things get interesting.
The Problem With Being Smart
Consider a simple game. You and a stranger must each pick a number between zero and one hundred. The person who picks the number closest to two thirds of the average wins a prize. What number should you choose?
Your first instinct might be to pick fifty, right in the middle. But wait. If everyone picks fifty, the average is fifty, and two thirds of fifty is about thirty three. So maybe you should pick thirty three instead. But if everyone thinks this way, they’ll all pick thirty three, which means the average will be thirty three, and two thirds of that is about twenty two. You can see where this is going.
This logic keeps spiraling downward. Each step seems to make perfect sense, yet you end up somewhere completely different from where you started. The truly rational answer, if you follow this reasoning all the way down, is zero. But here’s the twist: when people actually play this game, almost nobody picks zero. Most people land somewhere in the twenties or thirties.
Why? Because being smart in a game isn’t just about logic. It’s about guessing how smart everyone else is and how many steps ahead they’re thinking. A strategy makes sense only if it makes sense given what you believe others will do, which depends on what they believe you’ll do, which depends on what you believe they believe, and so on into infinity.
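If you'd rather watch the spiral happen than imagine it, here's a minimal Python sketch. The starting guess of fifty and the number of steps are assumptions for illustration, nothing more.

```python
# Iterated reasoning in the two-thirds-of-the-average game.
# A level-0 player grabs the midpoint; each deeper level best-responds
# to a room full of players exactly one level shallower.

def reasoning_ladder(start=50.0, steps=8):
    """Yield the guess produced at each successive level of reasoning."""
    guess = start
    for level in range(steps + 1):
        yield level, guess
        guess = (2 / 3) * guess  # best response to everyone one level below

for level, guess in reasoning_ladder():
    print(f"level {level}: guess {guess:.2f}")
# level 0: 50.00, level 1: 33.33, level 2: 22.22 ... sliding toward zero,
# the only guess that survives every round of this reasoning.
```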
What Survives the Scrutiny
A rationalizable strategy is one that survives this hall of mirrors. It’s a strategy you can justify playing if you make some reasonable assumption about what others are doing, and they can justify their strategies with reasonable assumptions about you, and those assumptions don’t contradict each other all the way down.
Think of it like a house of cards. Each card supports others and is supported by others. A strategy is rationalizable if it can be part of a structure where every card has a reason to be there, even though the whole thing might look precarious from the outside.
Here’s what makes this concept powerful: it doesn’t require everyone to be perfectly rational or to have perfect information. It only requires that you can’t rule out a strategy based on pure logic. If there’s some way the world could be, some set of beliefs that aren’t obviously crazy, that would make a strategy sensible, then that strategy is rationalizable.
The Restaurant Game
Consider two restaurants across the street from each other. Both need to set prices for lunch. If one charges significantly more than the other, everyone goes to the cheaper place. If they charge the same, they split customers evenly. If both charge very high prices, they make great margins but fewer people eat out.
What’s a rationalizable price? Well, charging more than your competitor makes no sense if you know they’ll undercut you. But charging the absolute minimum makes no sense either because you’d be leaving money on the table. The rationalizable prices fall into a range where each restaurant might believe the other will charge something reasonable, and their own price is a sensible response to that belief.
Notice what doesn't happen here: neither restaurant needs to perfectly predict what the other will do. They just need to have beliefs that aren't self-contradictory. One might believe the other values market share, while the other believes the first values margins. Both can be rational within their own belief systems.
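Here's a back-of-the-envelope version of that range, with every number invented for the sketch. Suppose each restaurant's lunch demand is 20 minus twice its own price plus its rival's price, so undercutting attracts diners but some customers stay loyal.

```python
# Toy differentiated-pricing game. Demand for each restaurant is assumed
# to be 20 - 2*own_price + rival_price, with zero costs, so profit is
# price * demand. Maximizing profit gives the best response below.

def best_response(rival_price):
    # d/dp [p * (20 - 2p + rival)] = 0  =>  p = 5 + rival / 4
    return 5 + rival_price / 4

# Round zero: rule nothing out, any price from 0 to 20 might happen.
lo, hi = 0.0, 20.0
for rnd in range(1, 5):
    # A price survives a round only if it best-responds to some belief
    # that itself survived. Best response is increasing, so the
    # survivors always form an interval.
    lo, hi = best_response(lo), best_response(hi)
    print(f"round {rnd}: defensible prices run from {lo:.2f} to {hi:.2f}")
# round 1: 5.00 to 10.00, round 2: 6.25 to 7.50, and so on. After a few
# rounds of "they won't do anything crazy", a middling band survives.
```

The detail worth noticing: the extremes die first, and everything that survives is a best response to a belief that itself survives.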
This is what makes rationalizability different from predicting the future. It’s about what’s defensible, not what’s destined.
When Smart People Do Dumb Things
Here’s where things get counterintuitive. Sometimes the most rationalizable strategy is one that looks absolutely foolish from the outside.
Imagine an auction where the highest bidder wins but pays the second highest bid. These are real, and they’re called Vickrey auctions. The rational move is to bid exactly what the item is worth to you. Bidding less risks losing when you should win. Bidding more risks paying too much.
But suppose someone doesn't understand this. They bid higher than their true value, thinking it gives them a better shot at winning. Here's the subtle part: in a Vickrey auction, their confusion doesn't change what you should do. Bidding your true value stays weakly dominant no matter how wildly others bid. That insulation is exactly what the format was designed to provide. The worst the confused bidder can do is win the item at a loss to themselves, or knock you out of an auction you could only have won by overpaying.
Now drop that same confused bidder into an ordinary first-price auction, where the winner pays their own bid. Suddenly their behavior matters enormously. If you know an overbidder is in the room, bidding the way you would against rational opponents may no longer be rationalizable: you might need to bid more aggressively to have any chance, or walk away entirely. The fascinating part is that the confused person's strategy makes their own confusion self-fulfilling. By not playing rationally, they change what's rational for everyone else. The game becomes a different game entirely.
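The Vickrey half of that claim is easy to check numerically. Below is a minimal simulation under invented assumptions: the item is worth 10 to you, and a single confused rival bids one and a half times a random private value.

```python
import random

# A sketch of a second-price (Vickrey) auction against one confused
# rival who overbids by fifty percent. Every number here is an invented
# assumption for illustration.

random.seed(0)
MY_VALUE = 10.0

def my_payoff(my_bid, rival_bid):
    """You win if you bid highest, and pay the second price: their bid."""
    return MY_VALUE - rival_bid if my_bid > rival_bid else 0.0

def average_payoff(my_bid, trials=100_000):
    total = 0.0
    for _ in range(trials):
        rival_value = random.uniform(0, 15)
        rival_bid = 1.5 * rival_value        # the confused overbidder
        total += my_payoff(my_bid, rival_bid)
    return total / trials

for bid in (6.0, 8.0, 10.0, 12.0, 14.0):
    print(f"bid {bid:5.1f}: average payoff {average_payoff(bid):.3f}")
# Bidding your true value of 10 does at least as well as any shading or
# inflating, confused rival or not. That robustness is the whole point
# of the Vickrey design.
```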
The Infinite Regress Problem
Game theorists love to talk about common knowledge and infinite regress. You know something, you know that others know, you know that others know that you know, and so forth forever. It sounds like philosophical nonsense, but it matters enormously for what counts as rationalizable.
Take a coordination game where you and someone else are trying to meet but can’t communicate. You both know there are two possible meeting spots: the train station and the coffee shop. If you both pick the same place, you meet. If not, you don’t.
Which choice is rationalizable? Both are, because you might reasonably believe the other person will pick either location. There’s no logical way to eliminate one. But here’s the interesting bit: suppose one location is much more famous or obvious than the other. The train station is a major landmark while the coffee shop is hidden on a side street.
Now the train station becomes more likely not because it’s objectively better, but because you both know you both know it’s more prominent. You expect them to expect you to go there. This isn’t about the train station having better coffee or more seats. It’s purely about the structure of beliefs.
And here’s the kicker: even if you personally hate the train station and would much prefer the coffee shop, going to the train station might be your only rationalizable choice. The logic of common knowledge can trap you into doing something you don’t prefer, simply because you both know that you both know that you both know everyone sees it as obvious.
What Gets Eliminated
Understanding rationalizability also means understanding what doesn’t make the cut. Some strategies are so bad that no reasonable belief system can justify them.
Suppose you’re playing Rock Paper Scissors for money. If you always throw rock, no matter what, that’s not rationalizable. Why? Because anyone with half a brain would notice and always throw paper against you. You’d be choosing a strategy that guarantees you lose, which you could never justify with any belief about how your opponent plays.
But here’s where it gets subtle. Throwing rock sometimes is perfectly rationalizable. Throwing it often is fine too. The problem isn’t with the action itself, but with the mechanical predictability. A strategy becomes irrational not when it occasionally fails, but when it keeps choosing something that’s definitely worse than an available alternative.
Game theorists call these dominated strategies: strategies where some alternative would do better no matter what others do. Anything strictly dominated can't be rationalized, because no belief about your opponents can make it a best response.
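Here's what that elimination looks like mechanically, on a made-up two-player game. For simplicity the sketch only hunts for strategies beaten by a single pure alternative; catching strategies beaten only by mixtures takes a little more machinery.

```python
import numpy as np

# Iterated elimination of strictly dominated strategies on an invented
# game. payoff_A[i][j] and payoff_B[i][j] are the players' payoffs when
# A plays row i and B plays column j.

payoff_A = np.array([[3, 1, 4],
                     [2, 0, 3],
                     [1, 2, 0]])
payoff_B = np.array([[2, 3, 1],
                     [1, 0, 0],
                     [3, 1, 2]])

def dominated(M, mine, theirs):
    """Strategies of `mine` strictly worse than some other surviving
    strategy against every surviving strategy of `theirs`."""
    return {s for s in mine for s2 in mine
            if s2 != s and all(M[s2][t] > M[s][t] for t in theirs)}

rows, cols = {0, 1, 2}, {0, 1, 2}
while True:
    dead_rows = dominated(payoff_A, rows, cols)
    dead_cols = dominated(payoff_B.T, cols, rows)  # B picks columns
    if not dead_rows and not dead_cols:
        break
    rows -= dead_rows
    cols -= dead_cols

print("A's surviving strategies:", sorted(rows))
print("B's surviving strategies:", sorted(cols))
# Row 1 and column 2 die in the first round: obvious losers that no
# belief can save. Whatever remains has survived that round of scrutiny.
```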
The Beauty Contest of the Mind
Remember that guessing game about two thirds of the average? Researchers have run it with real people many times. The results tell us something profound about how we think about thinking.
Most people don’t go all the way to zero. They stop at some middle layer of reasoning. They think one or two or three steps ahead, but not infinitely far. This isn’t because people are stupid. It’s because people are realistic about how much thinking others are doing.
If you think everyone else is going to think one level deep, then thinking two levels deep is perfectly rational. You gain an edge. But thinking twenty levels deep? That’s just wasted brainpower because nobody else is there with you.
This creates a fascinating situation where the rationalizable strategies span a wide range. Zero is rationalizable if you believe everyone is infinitely logical. Thirty three is rationalizable if you believe most people will just pick the middle. Even fifty is rationalizable, if you believe the others will guess high, somewhere around seventy five.
None of these beliefs contradict themselves. They’re all internally consistent. They just assume different things about how much thinking is going on in the room.
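To see why stopping early can be the winning move, here's a toy population under loudly invented assumptions: 30 percent of players reason zero steps deep, 40 percent one step, 20 percent two, and 10 percent three.

```python
# Invented mix of reasoning depths in a 2/3-of-the-average contest.
levels = {k: 50.0 * (2 / 3) ** k for k in range(4)}       # level-k guesses
weights = {0: 0.30, 1: 0.40, 2: 0.20, 3: 0.10}            # assumed shares

average = sum(levels[k] * weights[k] for k in levels)
target = (2 / 3) * average
winner = min(levels, key=lambda k: abs(levels[k] - target))

print(f"average guess: {average:.2f}")                    # about 34.26
print(f"target (2/3 of average): {target:.2f}")           # about 22.84
print(f"closest level: {winner} with guess {levels[winner]:.2f}")
# Level 2 wins in this crowd. Reasoning two steps past the naive guess
# pays; reasoning twenty steps deep would be wasted brainpower.
```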
When Everyone Is Right and Wrong
Here’s perhaps the most counterintuitive aspect of rationalizable strategies: everyone can be playing rationally while the outcome is collectively absurd.
Picture a crowded subway platform during rush hour. As the train approaches, everyone edges closer to the yellow line. Nobody wants to be at the back because the train is packed and rear positions mean waiting for the next one. So everyone pushes forward. Soon people are uncomfortably close to the edge, some even risking danger.
Is each person being rational? Absolutely. Given that others are pushing forward, staying back means certain failure to board. Is the collective outcome rational? Absolutely not. Everyone would be better off if everyone stepped back from the edge and formed an orderly queue.
Every individual strategy is rationalizable given beliefs about what others are doing. Yet the system as a whole produces something nobody wanted. This isn’t a failure of rationality. It’s rationality working exactly as expected within a perverse set of incentives.
Game theorists sometimes call these coordination failures. Everyone coordinates on a bad outcome because changing your own behavior unilaterally makes things worse for you, even though coordinated change would help everyone.
The Role of Culture and Context
What makes a strategy seem sensible often depends heavily on context that has nothing to do with the formal structure of the game. Social norms, cultural expectations, and shared history all influence what beliefs seem reasonable.
In some business cultures, aggressive negotiation tactics are expected and anything less seems naive. In others, the same tactics would be seen as hostile and self-defeating. The actual game, the formal structure of offers and counteroffers, might be identical. But what counts as a rationalizable strategy shifts entirely based on cultural context.
This isn't relativism or wishy-washy thinking. It's recognizing that rationality operates within systems of shared understanding. You can't be rational in a vacuum. Your beliefs about what others will do must connect to reality somehow, and reality includes culture, history, and context.
A poker player in Vegas and a poker player in a friendly home game might face identical cards and odds, yet play very differently, and both be perfectly rational. The Vegas player can rationalize aggressive bluffing because they believe others believe this is standard play. The home game player can rationalize cautious play because they believe others believe friendship matters more than winning.
Making Peace With Uncertainty
The concept of rationalizability forces us to accept something uncomfortable: there often isn’t one right answer. Multiple strategies can all make sense simultaneously. The person who charges high prices and the person who charges low prices might both be rational. The aggressive negotiator and the cooperative one might both have defensible positions.
This doesn’t mean all strategies are equal or that anything goes. Far from it. Many strategies can be definitively ruled out as irrational. But once you’ve eliminated the clearly dominated options, you’re often left with a range of reasonable choices.
What separates winners from losers in these situations isn’t always who’s more rational. Sometimes it’s luck. Sometimes it’s who better reads the room. Sometimes it’s who has better outside options that change the game entirely.
The lesson isn’t that rationality doesn’t matter. It’s that rationality is necessary but not sufficient. A strategy must be rationalizable to succeed in the long run, but being rationalizable doesn’t guarantee success.
The Endless Echo
In the end, thinking about what makes a strategy make sense is like standing between two mirrors. You see yourself reflected infinitely, each reflection depending on all the others. Your rationality depends on your beliefs about others’ rationality, which depends on their beliefs about your rationality, forever.
Game theory just makes explicit what we already do implicitly: we act based on beliefs about others, checking that those beliefs don’t contradict themselves or basic logic. We don’t need perfect foresight or genius level intelligence. We just need our strategies to survive the scrutiny of reason.
And when a strategy survives that scrutiny, when it holds up no matter how many times you ask whether it really makes sense given what you believe about what others believe about what you believe, then you’ve found something rationalizable. Whether it’s optimal or successful is another question entirely. But at least you’re not playing yourself for a fool.
That’s not a small achievement. In a world where everyone is guessing what everyone else is thinking, having a strategy that makes sense is sometimes the best you can hope for.