Lecture 1: Overview and Introduction

Description: Will Ma gives an overview of the general topics and structure of the course, and begins the course by covering the basics of poker reasoning and play.

Instructor: Will Ma

The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

PROFESSOR: OK. So let's get started. So before I get into any poker I want to talk about the mentality I want everyone to try to have when analyzing poker in this class. So I call it the decision mentality.

I'm going to start with a story. Who here has heard of credit card roulette? It's a game poker players play at the end of a group dinner at a restaurant. So what happens is, instead of everyone paying their own bill, which is annoying, you have to keep track, you might have to Venmo people the exact amount afterwards, and sometimes the waiter or waitress doesn't want to split the bill per person.

So poker players get around this by just picking one person at random to pay the bill. And we like making this exciting. So what we do is we ask everyone to put in their credit cards. And then we pull out the credit cards one at a time. And if your credit card is pulled out then you're safe. And the last person in has to pay for the whole table.

So it's a pretty fun game. Yeah, I think I'm pretty lucky. The biggest one I lost was in Hong Kong. I once had to pay around 1,200 USD. It was a pretty big table. But overall I'm pretty good at this game. It's a game of skill for sure.

But sometimes this results in some funny stories with non-poker players. So this is something that happened to some poker players. Poker pro Matt goes to dinner with poker pro Steven, and Matt brings Emily, who's a close friend whom he also has a romantic interest in.

So when the bill comes, Matt's like, OK, I'm going to pay for Emily. So he puts in two credit cards. He's like, the second credit card is for Emily. And then Steven pays for himself, so Steven puts in one credit card.

So they play credit card roulette. And Matt, being a very lucky guy, has both of his credit cards pulled out before Steven's. And Steven ends up paying for all three of them.

So now the question is, who should Emily thank? Who would you thank if you were in Emily's shoes? Does anyone want to say she should thank Steven?

AUDIENCE: Yeah.

PROFESSOR: Because Steven actually paid for the meal. So I think it's a totally reasonable thing for a reasonable person to do, to thank Steven, who actually had to pull out his wallet. But in this class we want everyone to think in terms of expected results and not actual results.

So Emily should be thanking Matt because, on average, Matt is the one paying for Emily. Matt put in the card for Emily, so Matt is going to be paying the one third of the time that Emily's card would have been the one left in. But at the time, Emily thanked Steven for her meal and then didn't say anything to Matt. And then Matt was upset about it and told the entire poker community. That's how I found out about this story.

So we want to think in terms of whether your decision would have made money in expectation, on average. So roughly, the law of large numbers says that over your lifetime, the amount you end up paying in credit card roulette is the same as what you would have paid from splitting the bill. So why split the bill? You might as well save a lot of time by playing this fun game every single time, and over your lifetime you'll pay roughly the same amount.

So all randomness eventually averages out to its expected value. That's what this is saying. So what does eventually mean?

So basically, when I say a gamble is very risky, I'm not mathematically defining anything here. I just want to throw out some intuitive concepts. A risky gamble is a gamble where it takes a long time to converge to your expectation. But the point is, no matter how risky it is, eventually it will get you.
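As a rough illustration of this averaging-out idea, here is a short Python sketch (not from the lecture) that simulates credit card roulette and compares it to splitting the bill. The group size and bill amount are made-up numbers purely for illustration.

```python
import random

# A rough sketch: simulate credit card roulette vs. splitting the bill.
# Group size and bill amount are made-up numbers for illustration.
def simulate(num_dinners, group_size=6, bill=120.0, seed=0):
    rng = random.Random(seed)
    roulette_total = 0.0   # what you pay when the loser is chosen uniformly at random
    split_total = 0.0      # what you pay if everyone splits evenly
    for _ in range(num_dinners):
        if rng.randrange(group_size) == 0:   # you lose 1 / group_size of the time
            roulette_total += bill
        split_total += bill / group_size
    return roulette_total, split_total

for n in (10, 1_000, 100_000):
    roulette, split = simulate(n)
    print(f"{n:>7} dinners: roulette paid {roulette:>12,.0f}, split paid {split:>12,.0f}")
# As the number of dinners grows, the two totals get closer in relative terms,
# which is the law-of-large-numbers point: the randomness averages out.
```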

So there's a saying that death and taxes are the two things that eventually get you. As poker players, we like to think that three things eventually get you: death, taxes, and the law of large numbers. Eventually you're going to reach your [INAUDIBLE].

So here's another hypothetical situation. Let's say you get off at the wrong bus stop because you were distracted. And you're upset with yourself, and you analyze how not to be distracted in the future so you get off at the right bus stop. But then, after you get off at the wrong bus stop, you find $1,000 on the ground. And then immediately you're no longer upset and you marvel at your riches.

So this is sort of an absurd story. But situations like this happen all the time in poker. You're going to make a bad decision but bad decisions still get a good result 49% of the time. And if you make the right decision you're still going to get a bad result 49% of the time.

So it's very important to analyze your decisions without being biased by the actual outcome that occurred. So you really want to be obsessed with this self-improvement, analyzing your decisions. If you made $10,000 in a situation where you could have made $12,000, then that's not good enough.

So I want everybody to think in terms of what's the maximum you could have made, and analyzing what's the best decision you could possibly have made in every situation. And sometimes that's hard, because in poker the result isn't exactly correlated with the decision. If it were, you could just go back, look at the result, and know whether you made a good decision or not. That's why learning poker can sometimes be very, very hard: you don't have immediate feedback. You're not sure whether the decision you made is what caused you to make that money or you just got lucky.

So with that being said, now let's talk about some ways to reason about poker hands. So roughly there are three levels of reasoning about poker hands. Level one is my hand versus your hand.

By this I mean, you can see what your cards are. And you look into your opponent's eyes and you say, OK, I can tell your cards must be pocket kings or whatever, your hand must be this exact hand. And you play your hand exactly against your opponent's specific hand, because you have a soul read on them.

So let's see an example of this. We'll watch a clip from Poker After Dark here.

[VIDEO PLAYBACK]

[MUSIC PLAYING]

- Raise to 1,200.

- I think you would call it this time, Patrick.

- Button raises. Never anything.

NARRATOR: Contrary to what Patrick might think Jennifer has a real hand and it just got better. She's flopped top set. Patrick flopped a pair of tens with a gutshot straight draw.

- I've got two pair.

- Check.

- Full house.

- I can't beat that.

- I thought you had pocket kings.

- I almost thought I had you.

[END PLAYBACK]

PROFESSOR: So yes, this is a sort of well-known poker hand from back in the day, the Jennifer Tilly hand-- I'm sorry if I'm making fun of her-- but basically she put her opponent on a specific hand. She looked at Patrick Antonius and had a feeling that he had pocket kings for some strange reason. And then what happened was, she had pocket jacks here, which is a really, really good hand. It's a full house. And she just checked the turn and checked the river, instead of trying to get Patrick to put more money in, because she was so certain Patrick had pocket kings.

Just mathematically speaking, out of all the possible combinations of cards your opponent could have, to put your opponent specifically on pocket kings in this example is basically unfounded. So this gets to level two reasoning. Level two reasoning is my hand versus your range of hands, your probability distribution over hands. And another name for this is exploitative play.

So let's look at a different hand. And I'm going to show you how to reason about this hand using level two exploitative reasoning. So I'm not going to go through the whole hand. But this is actually a real hand I played in Macau five, six years ago.

So we get to the river and I had ace 10 in this situation. So the pot is 21,000 roughly. And my opponent bets 8,000-- sorry, so the pot is 21,000 after my opponent bets 8,000. So basically the pot was 13,000 and my opponent bet 8,000 making it 21,000.

And I have to decide whether to call with a pretty good hand. I have a pair of aces here. But I know my opponent is [? Rain ?] [? Kahn, ?] who is a very tight player and doesn't really like bluffing. So I don't think he's really ever bluffing. If I'm beating him here it's because he's betting a hand that he thinks is good but is actually worse than mine, like ace eight basically.

So I model my opponent's range as ace king through ace eight. I don't think he has something even stronger, like pocket nines, because he would have raised more earlier in the hand. So let me run through this calculation slowly for those of you seeing pot odds for the first time.

So here basically, I want to think of it like this. The pot, which includes the 8,000 he just put out, is 21,000. We're considering calling for 8,000, so if we lose, if we call and he turns up ace king, which is a better hand, then our net result from this decision is negative 8,000.

If we win, we get our 8,000 back as well as the 21,000 in the pot, so our net is plus 21,000. Therefore, our win to lose ratio needs to be at least 8,000 to 21,000 in order for calling to be positive expectancy.

So we'll do the exact calculation. I'll try to do it somewhat slowly to show an example of how to do this calculation. So essentially what I do is I count the combinations of each hand that exists. So ace king, there's eight combinations because there's two aces remaining that I haven't seen and there's four kings remaining. So it's two times four, there's eight combinations.

Same with ace queen, ace jack, and ace eight. Ace 10 and ace nine, there's only six combinations each because I have the 10, which makes him having ace 10 less likely. And for ace nine there's a nine on the board. So ace nine is also slightly less likely.

So with ace 10 we actually get half the pot. I'm just going to roughly make that equivalent to us winning three of those six combinations and losing the other three, which is equivalent in expectation. So basically, if you do this counting, there's 33 combinations that beat us and 11 combinations that we beat. And so our equity, which is basically our probability of winning, is equivalently the fraction of the pot that we own.

So our equity in this case is 25%. And the pot odds are 21 to 8. 21 to 8 is about 2.6 to 1, so we need equity of 1 over 3.6 to call, which is about 28%, which we don't have. So I folded.

So unfortunately, this calculation is a bit ugly. You just have to do it a bunch of times and convince yourself that you're doing the right calculation. Basically you've got to remember to add one when you're converting the odds ratio to an equity. So it should be fairly simple math.

So in this case, I folded because I didn't have the equity I needed. So are there any questions about that? That was maybe a bit confusing, but I don't know of a great way to show the exact conversion from pot odds into equity. I just want to point out that they're slightly different. One is a ratio of win to lose and one is basically a fraction, win over all possible outcomes.

So just don't get confused by this. That's all I'm saying. I'm going to quickly interchange between the two throughout the course. And you should try to get in the habit of being able to quickly convert between the two.
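Here is a short Python sketch of that arithmetic, using the combination counts and pot sizes described above for this hand. Treat it as an illustration of the method, not a transcript of the slide.

```python
# Combinations of each hand in the opponent's range, as counted above.
# Ace ten chops the pot, so it is split into half a win and half a loss.
combos = {"AK": 8, "AQ": 8, "AJ": 8, "AT": 6, "A9": 6, "A8": 8}
beats_us = combos["AK"] + combos["AQ"] + combos["AJ"] + combos["A9"] + combos["AT"] / 2
we_beat  = combos["A8"] + combos["AT"] / 2

equity = we_beat / (beats_us + we_beat)          # fraction of the pot we "own"
print(f"equity          = {equity:.1%}")          # about 25%

# Pot odds: the pot is 21,000 (including his 8,000 bet) and we must call 8,000.
pot, to_call = 21_000, 8_000
odds_ratio = pot / to_call                        # about 2.6 to 1
required_equity = to_call / (pot + to_call)       # the "add one" step: about 28%
print(f"pot odds        = {odds_ratio:.2f} to 1")
print(f"required equity = {required_equity:.1%}")
print("call" if equity >= required_equity else "fold")   # fold, as in the hand
```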

So this is an example of level two reasoning. And hand reading is about using your opponent's past actions and your knowledge of their tendencies to tweak your probability distribution over what you think their hand is. So hand reading is not about pegging your opponent on a specific hand. That's a marketing message. Poker players wear sunglasses, they have earplugs, to prevent people from reading their soul. It matters a bit, but really, things like whether I scratched my ear before I raised should affect your belief about my cards a very small amount, way less than what Lady Gaga and the media make it seem like.

That's how you reason about a hand exploitatively. You build a model for how your opponent behaves and you make the decision that's the most positive expectancy against that model. And you can go very, very far with level two reasoning. You can basically build a career out of level two reasoning. And it's best targeted towards individual opponents with specific tendencies that you're trying to take advantage of.

So this sounds very good. This sounds like a great way to play poker. I figure out what my opponent's doing, right? I just figure out the probability distribution and play in a way that makes the most money from him. So does anyone have any problems with this type of reasoning?

So I've got some $20 Amazon gift certificates from [? Acuma ?] Capital. If someone can point out something you potentially don't like about this type of reasoning, I'll give you this gift certificate.

AUDIENCE: You reveal a lot of information about what your hand is when you think like this.

PROFESSOR: Right. But what if your model incorporates the fact that they are going to behave based on what they think is in your hand. Yeah.

AUDIENCE: It might take you a long time to build this model. And if the players leave your table, or whatever, you spent like-- because before you have a model you can't use this line of reasoning. So it's expensive to do it.

PROFESSOR: That's a good point. So first of all, level two reasoning is not easy. Even though it has flaws, even playing very good level two poker is not easy. Building the model can be very hard and you could be wrong. But that's not really what I'm looking for. Colin, yeah.

AUDIENCE: Because if your opponent knows your reasoning then he could maybe bet a certain way to push it to 28%, where you'd only call if it's--

PROFESSOR: Right. OK. Yeah. I think that's good enough. I'm going to give a certificate. I'll give it to you at the end. So essentially, the problem is this assumption-- the fatal flaw in every plan is the assumption that you know more than your enemy.

Does anyone know where this is from, by the way? If you do I'll have another $20 gift certificate. Does anyone know where this phrase is from? Yeah.

AUDIENCE: Is it from Clausewitz?

PROFESSOR: No. Yeah.

AUDIENCE: The Art of War?

PROFESSOR: No. No one knows. I'm going to keep the gift certificate. That's OK. So essentially, the problem with this line of reasoning is your opponent does not play according to a fixed, static algorithm. They're also an intelligent human, maybe someone who's taking this class, who's building a model of you and adapting their strategy to beat you. So you're assuming your opponent plays with a fixed strategy, but what if they're doing the same to you?

Now I'm going to get to the third level of reasoning, which I call optimal play. So let's analyze the exact same hand using level three reasoning. So don't get me wrong. Level two reasoning is great. If you can do level two reasoning very well you can make a ton of money. But level three reasoning is a completely different way to analyze this hand that can also make you a lot of money.

So for level three reasoning I'm going to think like this. Level three is my range of hands versus your range of hands. So instead of looking at my specific cards, I'm actually going to also put a probability distribution over myself and ask, in my opponent's eyes, what is the probability distribution of my hands?

And in this specific spot, let's just say I decide that my range is ace jack through ace seven, not including ace nine. So it's basically ace jack, ace 10, ace eight, ace seven.

So this is the way to do the calculation. Let's just assume I know my opponent's propensity is to bet 1 over 1.6 of the pot on the river. So the pot was 13,000 on the river and I know that my opponent's going to bet 8,000. So I must call with a frequency such that their EV from bluffing is zero. That's essentially my goal with optimal play.

So essentially what I'm saying is, let's say they have a really bad hand. They have like jack 10 for jack high, which is almost certainly going to lose the pot if they don't bet to try to get me to fold. Then I want to call with the frequency such that with those hands, regardless of whether they bluff, their expectation is zero.

So how do I do that? I just make my call to fold ratio 1.6 to 1. Sorry, in all these computations, to make it easier, I'm assuming 1.6 is approximate for the ratio. It's actually 13 over 8. It actually turned out to be mostly Fibonacci numbers so the ratio is around 1.6 to 1 for all of them. [INAUDIBLE].

So my call to fold ratio needs to be 1.6 to 1, so that if they have a bad hand their expectation from bluffing is zero. It doesn't matter whether they bluff. So if I do this calculation, then ace 10-- sorry, there's a typo-- ace 10 is definitely in the top 61.5% of hands I can have, so I need to call.

So I'm not building a probability model for them. What I'm essentially saying is, if I'm not calling ace 10 here then they can exploit me by bluffing too much and I'm just folding all my hands here. And they're going to be able to make money off of me. To prevent this I must call ace 10, which is a 61.5 percentile hand in my range, so they can't exploit me.
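Here is a small sketch of that indifference calculation, using the same 13,000 pot and 8,000 bet. It computes the calling frequency that makes a pure bluff's expectation zero; the 1.6 to 1 figure is the lecture's approximation of 13 over 8.

```python
# Make a pure bluff zero EV: if we call with frequency c,
#   EV(bluff) = (1 - c) * pot - c * bet = 0   =>   c = pot / (pot + bet).
pot, bet = 13_000, 8_000

call_freq = pot / (pot + bet)
print(f"exact calling frequency  = {call_freq:.1%}")            # about 61.9%

# The lecture rounds 13/8 to 1.6, giving a call-to-fold ratio of 1.6 to 1.
ratio = 1.6
print(f"approx calling frequency = {ratio / (ratio + 1):.1%}")  # about 61.5%

# Sanity check: the opponent's EV from bluffing jack high at that frequency.
ev_bluff = (1 - call_freq) * pot - call_freq * bet
print(f"bluff EV at that frequency = {ev_bluff:.0f}")           # essentially zero
```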

So I'm going to make an analogy with RPS. Rock, paper, scissors, does everyone know how to play this game? So you either throw a rock, scissors, or paper. So there's no way that you would not know this game. I know people, if you grew up in a different background you might call it something else or you might have different rules.

But the rules I'm playing with are rock beats scissors, scissors beats paper, and paper beats rock. So with exploitative play you think like this. You say, since my opponent just played rock three times in a row, I think they're probably not going to play rock a fourth time in a row. So I'm going to play scissors, because then I know I can't be beaten.

Optimal play just says, I'm going to memorize a sequence of random bits and always play rock, paper, and scissors each with probability a third. So this is an analogy with rock, paper, scissors. One question I often get is, if when you play optimally you're making all your opponent's decisions equally good, then your opponent essentially is never making a mistake, because regardless of what they do it's the same. Then how do you make money playing optimally?

In rock, paper, scissors that's true. If my strategy is just to play rock, paper, scissors each with probability a third, I'm never going to beat you more often than I should. But in poker there are enough opportunities to essentially be inconsistent.

For example, sometimes you will see players fold seven six suited and then, because later they're bored and feel like playing a worse hand, they'll call with six five suited in the exact same situation, which is basically playing a strictly dominated strategy. Or check raising with a range that's theoretically wrong. Essentially, a theoretically optimal strategy will still extract money slowly from even the best players, because even the best players in poker right now have slight inconsistencies that are not theoretically optimal.

So the optimal strategy, another way to think about it, is a Nash equilibrium, if you've heard of the term. Because the best response to the optimal strategy is the optimal strategy itself. Whereas the best response to any exploitative strategy is going to be a different strategy. Whenever you play an exploitative strategy, you stand to be beaten by a different exploitative strategy that re-exploits you.

This is a defense of optimal play. We play each option with the same probability, so with optimal play you're indifferent to your opponent's move. With optimal play, you make a lot of money when your opponent does something strictly suboptimal. Whereas in exploitative play you make a lot of money when you're winning the mind games and you lose a lot when you're losing them.

So exploitative play is sort of like I know that you know that I know that you know that I know this. But you know that I know that you know that I know that you know that I know this. You get into these mind games. And if you're good at them then it's pretty good.

If you're playing rock, paper, scissors against a four-year-old child, probably you don't want to play optimally. Probably you want to look at their face and try and figure out what they're going to do and try to beat them. But if you're playing rock, paper, scissors against the rock, paper, scissors world champion, which there is one actually, then probably you just want to memorize the sequence of random bits and play each probability a third.
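A tiny sketch of the rock, paper, scissors point: against the uniform one-third strategy, every opponent response has expected value zero, so there is no counter-strategy, but we also never win more than our share.

```python
# Payoff to the opponent (win = +1, lose = -1, tie = 0) for each matchup.
# Rows: what the opponent throws. Columns: what we throw.
payoff = {
    "rock":     {"rock": 0,  "paper": -1, "scissors": +1},
    "paper":    {"rock": +1, "paper": 0,  "scissors": -1},
    "scissors": {"rock": -1, "paper": +1, "scissors": 0},
}

our_strategy = {"rock": 1/3, "paper": 1/3, "scissors": 1/3}   # the optimal mix

for opp_move, row in payoff.items():
    ev = sum(prob * row[ours] for ours, prob in our_strategy.items())
    print(f"opponent plays {opp_move:8s}: EV = {ev:+.2f}")
# Every line prints 0.00: the opponent is indifferent and cannot exploit us,
# which is exactly why the uniform mix can't itself win extra either.
```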

So another good thing about exploitative play is it's intuitive. We sort of grew up thinking in terms of exploitative play. I think that's a fair statement. In most things it's like what do you think might happen? OK, if that might happen, OK, I'll do this.

But optimal play is sort of a weird mentality, because it's sort of like, I'm just going to analyze what my opponent could potentially do-- you need an opponent essentially-- and then I'm just going to perform an action that makes it so that my opponent can't really beat me.

So that's enough about those two general concepts. I'm going to take a short break right here since it's good timing. So I'm going to continue and we'll get in some actual poker. But first we're going to play one last game.

We're going to play a game called Who's Taller? Anyone can join the contest for a dollar. We're not actually playing for money, so don't get out your wallets. And the tallest person who joins the game gets the entire pot. So out of all the people who join the game, we're going to measure who's the tallest, and that person gets everyone's money.

So let's play. Everyone, close your eyes so that you can't look at each other. Everyone close your eyes. No peeking. No peeking at how tall-- So I want everyone-- close your eyes. Are everyone's eyes closed?

So put up your hand if you want to join the contest. Put it nice and high so that I can see if you want to join the contest. So raise your hand high. It's OK. There's no embarrassment. The whole point of poker is where-- OK, cool.

So now everyone open your eyes. So I think five people joined. So you guys come down here. Let's see who-- [LAUGHTER] Let's see who won the pot.

AUDIENCE: [INAUDIBLE]

PROFESSOR: That's OK. It's totally OK if you lost. So I think the four of you. Sorry, what's your name?

AUDIENCE: Justin.

PROFESSOR: OK. Justin. So I think Justin won it. So good job Justin, you would have made $3. All right. Cool. So what's the point of this game? The point of this game is that, from a game theoretic point of view, no one really should be playing this game.

So this is sort of like the beauty contest game from game theory. So how tall are you, Justin?

JUSTIN: 6'5".

PROFESSOR: 6'5". So why did you play the game? Because he thought there's probably someone who's going to be 6'2" who might play the game, right? But if everyone knew that if you're only 6'2" you shouldn't be playing this game because someone might be 6'5" then you wouldn't play the game because someone else is only going to play the game if they were like 6'8" or something.

So let's say we played this again. Probably only Justin would play. Eventually this always devolves into a situation where basically no one is going to join. Because you know that no matter-- even if you're like seven feet tall, you're LeBron James, you would know that someone would only play this game if they were the tallest person in the world, essentially.

So basically, poker without blinds, which are the money that's put into the pot at the start of every hand, would essentially be like the Who's Taller game. So when you play poker you want the motivation of every hand to be stealing the blinds, stealing the money that was forced into the pot without the choice of the person. You would always fold king king, which is the second best hand in poker pre-flop, if there were no blinds.

Suppose we were to play Who's Taller again, but I told you that I'm going to force Lee Marie to play. OK. So now we've got a game, because now, even if you're not sure whether you're the tallest person in the room, if you're taller than Lee Marie then you have a chance of winning. So that's essentially why we need blinds to have a game.

So you always want to think about how many chips you have in terms of the blinds. So having $400 in front of you in a game where the blinds are $1 and $2, for our purposes, is equivalent to having $4,000 in front of you in a game where the blinds are $10 and $20. Because essentially, everything you're wagering is relative to stealing the blinds that were forced into the pot at the beginning. So in both of these situations we say that you have 200 bets, or 200 big blinds, or 200 BB.

So that's how to calculate your stack size. But what's actually important isn't stack size it's effective stack size. So effective stack size takes into account the stack sizes of the people remaining in the pot as well. So I'll give an example.

So in this case, we're the player with the ace jack. And we decided to go all in. So what is our exact stack size? The big blind is 2,000, the small blind is 1,000. And we wagered in total 42,000. So what's our stack size?

AUDIENCE: 20--

PROFESSOR: 21, right. It's around 21. It's around 21 big blinds. But we didn't really risk 21 big blinds here, did we? Can someone-- OK, I'll give a $20 gift certificate. How much did we actually risk in some sense. Yeah.

AUDIENCE: I can't see quite well, but I think 12 and one half big blinds?

PROFESSOR: Yeah, exactly. I'll remember it. I'll put it here. So we only risked 12 and a half big blinds, because everyone in front of us has already folded. And the two players behind, the small blind and the big blind, one of them only has 7 and a half big blinds and the other one only has 12 and a half big blinds. Can everyone see that?

So even though UTG plus two, who has 32 and a half big blinds, could have theoretically covered us and could have theoretically taken all our money, on this hand, because they already folded, we're essentially only risking at most 12 and a half big blinds. That's the most we can lose. So we can't be eliminated from the tournament this hand.

So here's another example. So here you see this guy that I'm going to call Low Jack for now, you see him go all in for 16 big blinds. So in this case, technically he could lose all his money because we have more than 16 big blinds and we do cover him. But in reality, I'd say he's probably not on average risking 16 big blinds because most of the people who could call him, other than us, have way less than 16 big blinds. Does that make sense?

So essentially, there's not like an exact formula for effective stack size. But you want to sort of think of it in terms of you look at the stack sizes of all the people who could potentially play the hand against you. And you want to look at roughly how much am I going to be risking in this hand. That's effective stack size. Does that make sense to everyone?
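There's no exact formula, as the professor says, but a common working definition against one opponent is simply the smaller of the two stacks, and against several opponents a rough proxy is the largest stack that can still play against you, capped by your own. Here is a sketch along those lines; the numbers match the ace-jack example above (we have about 21 big blinds, the blinds have 7.5 and 12.5).

```python
# A rough working definition of effective stack size, in big blinds.
# Against one opponent it is min(our stack, their stack); against several,
# a crude proxy is the biggest stack still able to play the hand, capped by ours.
def effective_stack(hero_bb: float, opponents_bb: list[float]) -> float:
    return min(hero_bb, max(opponents_bb))

# The ace-jack hand: everyone else folded, only the blinds remain.
print(effective_stack(21.0, [7.5, 12.5]))   # 12.5, the most we can actually lose
```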

So the second thing I want to talk about that's very important is position. Position is basically where you are at the table relative to the blinds. So how many players are remaining to compete with me for the blinds? Essentially, the fewer players that are left, the less strong my hand needs to be to attack the blinds.

So I'm going to give names for the positions. Basically the thing that matters is how far you are from the button. So in this example with ace jack, when everyone folded to you and you're the dealer, which is also known as the button, essentially to steal the blinds, you just need to get through the two blinds themselves. So that's essentially two players.

And then if you're one position earlier, when it was folded to the cutoff, then you have to get through three players. So you want to name everything relative to the dealer, essentially. So we can just quickly go through the names. They'll get more familiar as time goes on.

So the first person to act, we call them under the gun. So that's this guy here. And then we go around the table. It's under the gun plus one, under the gun plus two. There can be different names for the same position.

So in this specific hand, where there were only six players at the table, you could have called Low Jack under the gun because under the gun is the first person to act to the left of the big blind. But essentially, it's more clear to say Low Jack because when you say under the gun you have to say under the gun at a six-handed table. And then people will know that means you got to get through five hands. But if you say Low Jack it's very clear. You have to get through five hands.

So under the gun, under the gun plus one, under the gun plus two, and then Low Jack. You can also call under the gun plus two, Low Jack minus one, if you want. And then High Jack, cut off, button. Don't ask me where these names came from. I actually have no idea. And then small blind, big blind.

So that's position. And the importance of position is basically that the later you are, the fewer hands you've got to get through to steal the blinds, once everyone before you has folded.

So the third thing that matters in a poker game is equity. So we talked about stack size, we talked about position. The last thing is actually your cards. So the equity of your cards is basically your secret height for the Who's Taller game.

So you can think of it like that. Your cards are like your secret height. And the probability of you winning the pot, or equivalently, the fraction of the pot you would win once the remaining cards are dealt, is called your equity. So I'm going to give some examples of calculating equity.

So here's an example where we get it all in on the turn. And I have five four of spades here. So I'm not in a great position. But let's count how many outs I have. So for this one, I want you to just name a river card that would help me win the pot.

AUDIENCE: Seven of spades.

PROFESSOR: Yes.

AUDIENCE: Seven of spades.

PROFESSOR: Seven of spades. So that gives me a straight and a flush actually. So let's go along those lines. So how many spades are there left in the deck?

AUDIENCE: There should be nine spades.

PROFESSOR: Right, nine spades. And then how many cards help me make a straight here?

AUDIENCE: Six more.

AUDIENCE: Seven or a two.

PROFESSOR: Right. A seven or a two. So it's six more because there's eight sevens and twos. But I'm double counting the seven of spades and the two of spades. So that's nine plus six. And then I actually have to subtract one more card. Does anyone see what it is? Yeah.

AUDIENCE: The queen of spades.

PROFESSOR: Right. The queen of spades, because that's actually a disastrous card. It gives me a flush but gives my opponent a full house, and if it wasn't all in, I would probably put more money in on the river thinking I have a flush and lose to the full house. So it's 14 outs, 17 minus 3. Our equity is around 32%. It's 14 over the 44 cards that could still come.

Did that make sense to everyone? So that's one way of calculating equity. It's just a very simple probability distribution over the remaining cards in the deck that could come.
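Here is the same outs arithmetic in a short sketch: 9 flush cards plus 8 sevens and twos, minus the two double-counted spades and the queen of spades, out of the 44 unseen cards (52 minus our two cards, the four board cards, and the opponent's two shown cards).

```python
# Outs on the turn with five-four of spades, as counted above.
flush_outs = 9          # spades left in the deck
straight_cards = 8      # any seven or any two
double_counted = 2      # seven of spades and two of spades are already in the 9
bad_out = 1             # queen of spades: gives us a flush but our opponent a full house

outs = flush_outs + straight_cards - double_counted - bad_out   # 14
unseen = 52 - 2 - 4 - 2          # our cards, the board, and the opponent's shown cards
print(f"{outs} outs out of {unseen} unseen cards = {outs / unseen:.0%} equity")   # about 32%
```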

And if you're worried, what if someone folded the queen of spades so I shouldn't be counting that? Essentially, you just want to pretend that the cards that are in the muck, like the cards that were folded by other players, they're essentially irrelevant because they could have folded the queen of spades. But they could have also folded an irrelevant card, like the jack of hearts or something.

So essentially, for all the cards that you haven't seen, it's easier to just assume they don't affect anything. In theory they affect things a bit, because if you know that the dealer button folded two cards, even if you don't know what they are, they're probably more likely to be really crappy cards, like twos and threes rather than aces, because if they had an ace they would have played it. But I think not worrying about that is fine. You don't need to worry about it.

Another example of equity calculation. So it's the exact same example as before. But in this case, the probability distribution isn't over cards that could come. All the cards have already come.

The probability distribution is just over my opponent's hand. So this is the same calculation. We calculated our equity is 25%. And that's not a probability distribution over cards to come. That's a probability distribution over our beliefs of our opponent's hand.

So example three of calculating equity. So this one you can't really do by hand. Let's say you get it all in pre-flop, with ace king suited against pocket twos. And you basically need a calculator. There are certain websites that help you do this.

And ace king suited against pocket twos, if they don't have a two of your suit, is actually a very small favorite, which is cool. But if they have the two of your suit then you're actually a slight underdog. So just keep track of all the different things equity could mean.

Equity could mean a probability over river cards, or a probability over the unknown hands your opponent could hold. A very good calculator in general is PokerStove, which I have lots of examples of in my slides. And you can download it via this link here.

So this is an example of using PokerStove. You can put in exact calculations. Let's say you get it in with two of diamonds, two of spades.

So with pocket twos on a flop of 5, 3, 2, versus a range where you know your opponent is going to have a big pair, like pocket jacks plus, you can actually run it and your equity is 85%, which is actually surprisingly low, I think. Because it seems like you've got three of a kind and they've only got a pair. They have two chances to hit one of two cards. But there's a lot of random stuff that can happen: if they have aces they can make a straight, they can make a backdoor flush, the board could come two more fives, which counterfeits our three of a kind deuces. So this is another example of calculating equity.

So this is just a summary of the different situations where you might want to calculate equity. And I recommend you download PokerStove, but you don't really have to. And I guess one question you could ask is, how do you actually do this at a table? So essentially, if you do this a lot while you're studying hands and thinking about poker, eventually-- I've been able to-- you just memorize, for a lot of situations, roughly what the probabilities are, or at least what the correct decision is.

So now let's talk about raising to win the blinds and antes. So an ante is an extra small bet that each player must put into the pot each hand. And these sum to around a big blind. And they come in during the later stages of a tournament. And they're nonexistent in cash games.

I swept this under the rug in the earlier examples. So in-- let me go back-- like in here. Do you see the 1,200 in the middle? I actually swept this under the rug. But that comes from each player being forced to put in 200, which is 1/10 of the big blind, at the start of every hand.

And if you're playing tournaments then this would usually be the case. Antes will come in fairly soon. If you're playing cash games, where you just sit down, you can leave whenever, and you're playing for your own money, then there's usually no antes. But antes actually make a world of difference in terms of what you want to do.

It's not just that the stakes are bigger when there are antes. You don't want to think of it as simply extra money in the pot every hand, equivalent to the blinds being bigger. That's not quite right, because the ante comes from every player, and if the blinds were proportionally bigger to cover the antes, you would have to raise to a bigger size to try to steal them. But with antes you don't need to raise any bigger to try to steal the blinds.

So it's actually very action driving. It's very exciting. It makes it so that you basically want to play a very wide range of hands. And you really just want to be trying to win the blinds as often as possible, because winning one pot with the antes is so big. So this is what it looks like.

So the first thing I'm going to say, if you're going to start playing tournaments tonight, is that if no one has raised yet, you really don't want to just call. You want to raise, to give yourself a chance of winning the blinds without seeing a flop. I'd say the most common mistake I see beginner poker players make is not raising when no one has raised before them.

So in this case, the mistake would be just calling for 2,000 and trying to see a flop. The main issue is you're letting the big blind just check and see the flop for free. Whereas if you raise, you put them to a decision. And they might fold, and you might just win the pot for free without having to do anything, essentially.

So how much do you raise when I say raise? The minimum raise you can make is raising to two big blinds. However, this is a bit small, because you give the blinds fairly good odds to make a profitable call. Although it's not even that bad.

So let me just talk in general about raising big versus raising small. So the advantage of raising small is that you're risking less. Like let's say you raise and then the next person re-raises, and then the next person goes all in. So you know probably those two people have pretty good hands and you want to fold.

If you raise to a small size then you can fold and you don't lose that much. If I raise to five big blinds then I'm losing a lot more. But the benefit of raising big is you give other people worse odds to call. If I only raise to two big blinds-- let's actually do this calculation.

Let's say here, instead of going all in, I raise to two big blinds, which is 4,000. What odds am I essentially giving the big blind to call? In this very exact situation, this is a very common mistake I see beginners make, raising to only 4,000. Yeah.

AUDIENCE: So there's 8,000 that's in the pot and they have to call 2,000, so it's 4 to 1?

PROFESSOR: Right. So it's 4 to 1. But there's also a small blind. So there's actually going to be 9,000, essentially, in the pot. Yeah.

So it's going to be 8,800 in the pot. So approximately 9,000. And they only have to call 2,000 more to see the flop. Does that make sense to everyone?

So the odds are actually 4.5 to 1. And ace jack off-suit is a great hand. But there's no hand that ace jack off-suit is more than a 4.5 to 1 favorite against. So in some sense, basically, even if they have jack two, or whatever, ace two, think of the best possible case, you're still not a bigger favorite than the odds you're giving them.
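A quick sketch of that arithmetic. The ante total here is an assumption (nine players each posting 200); the point is just that a min-raise lays the big blind roughly 4.5 to 1.

```python
# If we min-raise to 4,000 at blinds of 1,000 / 2,000 with antes in play,
# what price does the big blind get to call?
raise_to    = 4_000
small_blind = 1_000
big_blind   = 2_000
antes       = 9 * 200        # assumption: nine players each posting a 200 ante

pot_before_call = raise_to + small_blind + big_blind + antes   # about 8,800
to_call = raise_to - big_blind                                  # they already posted 2,000

print(f"pot = {pot_before_call:,}, to call = {to_call:,}")
print(f"odds offered  = {pot_before_call / to_call:.1f} to 1")         # roughly 4.5 to 1
print(f"equity needed = {to_call / (pot_before_call + to_call):.1%}")  # under 20%
```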

On the other hand, though, it is risky to raise to much more than 2.25. So a reasonable rule of thumb, I'd say, is to raise to 2.25 big blinds in tournaments. I know it's pretty close to two. If you just raised to two it's probably not that bad. But roughly speaking, I think this is a reasonable rule of thumb.

Early on you could try to raise to more. I think that's sort of customary, although I don't think it's theoretically optimal. You'll often see players raise to like three big blinds, especially when you watch pro players. They'll raise bigger. But the main reason is that they're the better player by a lot and they're just trying to make big pots, which is reasonable.

If you're just trying to play big pots and win big pots, then raising to the bigger size is fine. But I don't think there's any theoretical reason to raise to more than 2.25 big blinds in most situations. Other than in cash games, where all players have a lot of money; then it's a bit different, because in cash games you're also not worried about risking more, since you can't be knocked out of a tournament.

So the other thing is, instead of raising, you should just go all in if the effective stack size is 12 big blinds or less. So recall, the rationale for raising big is to prevent others from calling for cheap, and the rationale for raising small is to lose less if we get re-raised and have to fold. But 12 big blinds is sort of the point where it's small enough that you never really want to fold after committing 2.25 big blinds.

So if I only have 12 big blinds, I raise to 2.25, I've only got 9.75 left after. OK, fine, if I get raised and re-raised maybe I'll fold. But even if I get re-raised once, if I'm raising in the first place my hand is going to be reasonable. If it's a Who's Taller game, I'm not going to raise in the first place if I'm 5 feet, or something. So your hand will be reasonable.

So beginner mistake number two is being too scared to go all in pre-flop. That's why in this ace jack hand, I actually cheated a bit. Technically here, the effective stack size is-- the big blind has 12 and a half big blinds but the small blind only has 7 and a half. So I roughly said the effective stack size is about 12 or less, and I just went all in.

So that's definitely beginner mistake number two in tournaments, being too scared to go all in pre-flop. So what your goal essentially should be-- oh, sorry. I should say the rule is-- all of the numbers I said assume there are antes. If there are no antes you should change the threshold to 10 big blinds. The threshold for going all in should be lower, because when there's no antes you want to be risking less, since the pot is smaller.
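Here is that rule of thumb written out as a tiny function; the 12 and 10 big blind thresholds and the 2.25 raise size are the lecturer's suggestions, not an exact formula.

```python
# Rule of thumb from the lecture: with no raise in front of you, either open
# to about 2.25 big blinds or just go all in when the effective stack is short.
def open_action(effective_stack_bb: float, antes_in_play: bool) -> str:
    threshold = 12 if antes_in_play else 10
    if effective_stack_bb <= threshold:
        return "all in"
    return "raise to 2.25 big blinds"

print(open_action(12.0, antes_in_play=True))    # all in
print(open_action(40.0, antes_in_play=True))    # raise to 2.25 big blinds
```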

So overall, I talked about position and I talked about stack size. Essentially, the point I'm trying to get across is that if you're just starting out, players tend to make all decisions based on their cards. It's just like, I have a pair of jacks. I see my pair of jacks on this board. I have a pair, that's pretty good. And you sort of tend to ignore the effect of stack size, how much you're wagering, and position.

But for experienced players, the cards actually matter much, much less. In fact, if you're doing optimal play you don't care, in some sense, what your cards are. You just care what your range of cards is at that point. So experienced players are willing to raise the blinds with much weaker hands from good positions rather than early positions. And they're going to risk going all in a lot more frequently with a lot worse hands if their stack size is low. So I want you to think about that.

So if you're just starting out that's fine. If you're still trying to figure out whether you have a straight and stuff like that, that's fine. And it's totally fine if you try to play based just on your cards. But I want everyone's goal by the end of the class to be able to play based on these other factors more so than your cards.

So that being said, I'm going to give some concrete suggestions, for those of you who might want to start playing tonight, of what hands you should be playing from each position. So most of the tournaments will have nine players per table. And this is going to be roughly the range I recommend you open. So these are the hands you should be playing from the worst position, under the gun at a nine-handed table.

So there's eight players left behind you, so you want to stick to really good hands. It's aces, kings, queens, jacks; ace king suited; ace king off-suit; 10s; ace queen suited; ace queen off-suit. So that's basically the list of hands.

These are only premium hands. This is like 6.2% of hands, I think. It's a very premium range. So note how tight this is. Because the thing to remember is only the best hand out of nine hands at the table is going to get the pot.

When there's nine hands, even though the average hand is really bad, the best hand out of nine hands is always going to be pretty good. So to even think that you have a chance, you need to start out with a very premium holding.
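To see what a percentage of hands means, here is a sketch that counts combinations: a pair has 6 combinations, a suited hand 4, and an offsuit hand 12, out of 1,326 total two-card starting hands. The range listed is my reading of the hands named above, so the printed figure is approximate and need not match the slide's 6.2% exactly.

```python
from math import comb

TOTAL_COMBOS = comb(52, 2)          # 1,326 possible starting hands

# Combination counts: a pair has 6 combos, a suited hand 4, an offsuit hand 12.
COMBOS = {"pair": 6, "suited": 4, "offsuit": 12}

# The under-the-gun range as described above (an approximation of the slide).
utg_range = [
    ("AA", "pair"), ("KK", "pair"), ("QQ", "pair"), ("JJ", "pair"), ("TT", "pair"),
    ("AKs", "suited"), ("AKo", "offsuit"),
    ("AQs", "suited"), ("AQo", "offsuit"),
]

total = sum(COMBOS[kind] for _, kind in utg_range)
print(f"{total} combos = {total / TOTAL_COMBOS:.1%} of all starting hands")
```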

Roughly, what do you add as you go around the table? So for the second position-- I'm going to post all these slides before the tournament starts, by the way, so don't feel that you have to scramble to write all this down. I'm going to post it.

From the second position, I put in black the hands that I would open from the previous position. And I've put in red the hands that I would open from that position in addition.

So in second position there's one less player. OK, I'll gamble with sevens, ace jack, king queen, ace 10 suited. So we'll go around the table and the range will slowly increase.

Also note that these are very conservative ranges. If you watch high stakes poker, probably the players will open a lot more hands than what I'm recommending here. But my general experience has been when you're just starting out to play poker it's much easier to err on the side of being too tight than being too loose. Because it's in some sense a lot easier to play pocket aces than seven five suited.

With pocket aces you're always going to have a pair of aces or better. You can just bet it. Whereas with seven five suited, you need to be able to bluff very carefully, you sometimes need to be able to try to get value when you've just hit a pair of sevens. And it's just a lot harder.

So the ranges I'm suggesting are very tight by most poker standards. By tight I mean conservative. But I think that's the right side to err on when you're just starting. So these are the ranges.

As we get to the later positions, so this is Low Jack, we're opening more hands now. Any two suited Broadways. So that's hands like jack 10 suited, queen 10 suited. It's basically any two cards that are the same suit and both 10 or higher.

So maybe I should just run through this quickly. So why is having your cards be suited good?

AUDIENCE: You get a flush.

PROFESSOR: Right. So if your hand is suited you're more likely to make a flush. But you could argue, if I have two unsuited cards then there are two different flushes I could make. If you have the queen of diamonds and the 10 of spades, you could make a diamond flush or a spade flush.

Essentially, that argument is wrong, and there's two reasons why. One is that it's not twice as likely, because with just one card of a suit you need four more of that suit on the board to make a flush. And two is, when you make the flush that way it's also more obvious that you have the flush.

If you only have one diamond, even if it's the ace of diamonds, once there's four diamonds on the board it's way more likely that you have a flush, so you're going to get paid off less. Whereas if you have two diamonds in your hand and there's only three diamonds on the board, it's less obvious that you have a flush.
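Here is a small combinatorial check of that point. With a suited hand you need at least three more of your suit among the five board cards; with an offsuit hand you need at least four of one of your suits. The sketch assumes nothing else about the hands.

```python
from math import comb

BOARD = 5
UNSEEN = 50                     # 52 cards minus our two hole cards

def prob_at_least(needed: int, suit_left: int) -> float:
    """P(at least `needed` cards of a suit with `suit_left` remaining appear on the board)."""
    total = comb(UNSEEN, BOARD)
    return sum(comb(suit_left, k) * comb(UNSEEN - suit_left, BOARD - k)
               for k in range(needed, BOARD + 1)) / total

# Suited hand: 11 of our suit remain, and we need 3 or more of them on the board.
suited = prob_at_least(3, 11)

# Offsuit hand: 12 of each of our two suits remain; we need 4 or more of one of them.
# (A five-card board can't have 4+ of both suits, so adding the cases is safe.)
offsuit = 2 * prob_at_least(4, 12)

print(f"flush using both suited cards : {suited:.1%}")    # roughly 6%
print(f"flush using an offsuit card   : {offsuit:.1%}")   # roughly 2%
```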

So High Jack is three to the button. This is what I would recommend. Any pair, pairs are pretty good. Any suited ace, so suited ace means any hand with an ace and another card of the same suit. Any suited connector, and by that I mean basically two cards that are next to each other in rank and also the same suit, like 10 of spades, nine of spades. Or any two unsuited Broadway cards.

So I do use a lot of terminology here. So yeah, please stop me if something is unclear. You can google most of these, I think.

And suited connectors are good. So a hand like seven six suited is often better than a hand like 10 six suited, even though the ten is bigger than the seven, because ten is a small enough card that the fact that it's a bigger card is not that relevant. But the fact that seven six is connected and can hit a lot more straights is very relevant.

So if we're in the cutoff, I'm just going to show you on PokerStove, because there are too many hands to list. But the thing I want you to notice is the percentage: 30%. So remember, under the gun I recommended playing 6% of hands. So we've multiplied the fraction of hands we'll play by five.

And I think this is what I really want you to try to do, and what a lot of new players don't do enough: they always play the same cards regardless of their position. And then, even crazier, on the button, if everyone in front of you folds, I recommend playing about 55% of hands.

This is huge. I want you to play jack three suited. Who thinks jack three suited is a good hand? Or king four off-suit, or queen six off-suit.

What hands do we play from the small blind if it's folded to you? So let's compare opening from the small blind to opening from the button. So for the small blind it's a bit different now. I should mention this: when you're raising from the button, from the dealer, if you get called, you get to act last post-flop.

So I'm going to talk more about this in future lectures. But this is basically called having position post-flop. And having position is basically good. Yeah.

AUDIENCE: For all these hands, are you assuming that everyone's folding behind you?

PROFESSOR: Yes, yes. If people haven't folded, essentially-- I'll talk more about this in future classes, but essentially what you need to do is, let's say this guy raised and you're on the button. You need to consider what his range is and calculate your equity against his range, which you can do--

AUDIENCE: The calculation is completely different.

PROFESSOR: Right. So the calculation is essentially completely different. So essentially the first guy who raised determines what the barrier to entry to the pot is. Because essentially the first guy that raises, if it's from under the gun, he's telling you, my range is pocket eights plus. Even if he's sort of loose it's still going to be pocket fours plus and queen 10 suited plus.

Basically he's saying, if you're playing eight six suited then you're basically an idiot, because my range is way better than that. So essentially, the first guy who plays the hand sort of sets the tone for how good your hand needs to be. Thanks for that question. I'll talk more about this in a future lecture, but essentially that's what you need to do.

So from the small blind, let's compare opening from the small blind to opening from the button. The issue with opening from the small blind is, if the big blind calls, you actually don't have position post-flop. The big blind gets to act after you post-flop. So it's actually a lot worse in some sense than opening from the button, because you don't get to act last.

But it is better in the sense that there's one fewer person to get through. And also, you have to put in less to make the raise, which is actually really relevant. From the button, if the blinds are-- OK, let's just make it concrete.

So if the blinds are 20 and 40, if you want to steal the blinds from the button and you want to raise to 2.25 big blinds, you need to put in 90. You raise to 90 and you need to wager 90. But if you're raising from the small blind, you've already posted 20, so you only need to put in 70 more. So you're actually paying a smaller price to try to steal the blinds.

But the fact that you're out of position is a very important negative. So I'd say all these factors balance out, and you can roughly play the same range of hands from the small blind as you would from the button. I think that's a reasonable rule of thumb.

But the fact that you're out of position hurts a lot less as stacks get shallower, essentially as effective stack sizes get a lot smaller. So you could really drastically increase this range if you only have 10 big blinds from the small blind. So let me just give a few caveats about this.

So actually, I'll leave it to you guys. If someone comes up with a very good complaint about these recommendations I'm giving you I'll give up the last $20 gift card. So what are some problems? Normally I get lots of complaints in this section. You already got one but you can answer. Yeah.

AUDIENCE: They're very predictable so it's unusable.

PROFESSOR: Good. That's a very good point. A lot of people ask me, why would you follow these? If I follow these, everyone knows what I'm doing. Well, the point is, for most of these, even this range, it sort of encompasses enough hands. So this is where I'm talking about optimal, balanced play.

I think even though humans haven't solved poker, I think roughly, if you want to ask, I would guess that roughly the optimal range of hands you open from this position is something like this. It's going to be looser. But it's something roughly like this.

Probably you want to probabilistically play some smaller cards, just so when the flop comes 2, 2, 4, you can potentially have a two. But essentially you're not doing badly here, because if the flop comes three small cards then you were going to be doing pretty well with all your big pairs. And if the flop comes big cards, you've still got ace king, ace queen, ace jack in your range.

But that is a good complaint. So in theory, it's optimal to play seven seven some fraction of the time, six six some fraction of the time, and two two some fraction of the time. But that's making things really complicated.

I don't want to add two two to the list, because if you play two two every single time then that's way too loose. You're just spewing money by raising with two two with this many players behind. So ideally, maybe with two twos you should use a random number generator and play them 10% of the time. That would probably be theoretically optimal. OK, good. That would have gotten this. But OK, I'll give it to-- Yeah.

AUDIENCE: I think another problem with playing really tight all the time is that you are also very easily bullied around the table. If someone bets and you really want to play, you really want to stick to your tight range, you'll fold and you'll maybe miss out on something that you shouldn't have depending on your position, and things like that.

PROFESSOR: OK. Good. So yes, these are pretty tight in that it does give the other people incentive to raise a lot because everyone's going to be folding a lot because you are just waiting for these really good hands. OK. That's good. Are there any other questions about these suggestions I gave you guys? Or are there other potential complaints?

I'm going to give this gift certificate to you unless someone thinks that they have another complaint. So I'll give this to you at the end.

This is one thing that I didn't really address. I talked about going all in as well as just raising. So how do these ranges differ? The answer essentially is they don't really differ.

So let's say, here, these are the hands I told you to play. Let's say your stack is small enough, you only have 12 big blinds with antes, and you're supposed to go all in instead of raise. Which of these hands do you go all in with? It's roughly the same range.

I think that's a decent rule of thumb. If you just go all in with the same range you're not going to be doing badly. But when the effective stack size is much smaller, like let's say you only have three big blinds, then you can drastically increase this range.

So when your effective stack size is, say, five big blinds instead of 10 or 12, intuitively you might think, I can go all in with a much wider range of hands. So here's the argument. The argument is, I'm only risking five big blinds. In the other case I was risking 12 big blinds. I should be way more aggressive with five big blinds than with 12 big blinds. So that's the argument.

But the reason why that argument is wrong is because with five big blinds you're going to get called every time. With five big blinds you can't just go all in with seven two off-suit because your opponent has such good odds to call, they're going to call you. Whereas with 12 big blinds, even though you're risking more you also have more chances of getting everyone to fold.

So even though it seems like you're risking way more, in some sense you're not. Because the more chips you are risking the more likely it is that everyone else folds. Does that argument make sense to everyone?

So make sure you don't get tricked by that because it is easy to get tricked. But the fact that when you're risking more you're getting people to fold more means you're actually not risking as much as you think you are. And that's a common fallacy.

So we're near the last part of the class. So are there any other questions? I know some of you probably want to start playing tonight, and I definitely did not address everything you need. But are there any other questions roughly, in terms of opening and going all in that you might want to know before playing tonight? Because I do think we have maybe five extra minutes. So I can answer some questions.

Does everyone know what the term "bluff" means? I said bluff a bunch of times but I realized I never explained it. So bluffing essentially means you have a bad hand, and because you have such a bad hand, the only way of winning is to get your opponent to fold. So basically you bet big hoping that your opponent folds.

If you have a somewhat better hand, like an average hand, often you actually don't want to be betting because there's no point. When you bet you're essentially just getting adversely selected: your opponent calls you when their hand is better, so you lose more money to a better hand, and your opponent folds when their hand is worse, so you're not making any money from a worse hand. And then with your best hands, you want to bet to try to get your opponent to call and increase the amount that you win. And we call this value betting.

So I'm going to get more into this in future lectures. But essentially in poker, the paradigm you want to play is: you want to value bet your very best hands, and then you want to basically check and try to get to showdown, like try to show your hand on the river, with your medium hands. And also bet with your bad hands to try to get your opponent to fold better hands.
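(As a caricature of that paradigm, here is a tiny sketch; the thresholds and the 0-to-1 "hand strength" score are invented just to make the structure concrete, since real decisions depend on board, position, and ranges.)

```python
def river_paradigm(hand_strength):
    """Caricature of the value bet / check / bluff paradigm.
    `hand_strength` is a made-up 0-to-1 score, not a real poker metric."""
    if hand_strength > 0.8:
        return "bet (value)"
    if hand_strength > 0.3:
        return "check (try to get to showdown)"
    return "bet (bluff, hoping better hands fold)"

for strength in (0.9, 0.5, 0.1):
    print(strength, "->", river_paradigm(strength))
```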

I'm going to get to the last part now. So, quick test. Here, what would you do? It's folded to you. You're in the hijack. You have ace, five of clubs. So does someone want to suggest what you would do? What would your play be?

AUDIENCE: Call.

AUDIENCE: Bet.

AUDIENCE: Raise.

PROFESSOR: Raise. OK, good. Raise. So not call; that was mistake number one to avoid. You always want to raise, rather than just call the blinds, if it's folded to you. But it's fine, it's fine. That's fine. We're all here to learn about these hands.

Now should I go all in, or should I just raise to 450, which is 2.25 big blinds? I should go all in. OK. I think it's not terrible if you just raise to 450. But I said the threshold is 10 big blinds: there are no antes, so you go all in with 10 or less. And you have 10, so going all in I think is fine.
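(To unpack the sizing: if 450 chips is 2.25 big blinds, the big blind must be 200, and a 10-big-blind stack is therefore about 2,000 chips. The chip amounts here are inferred from the quoted numbers, not stated in the lecture.)

```python
# Inferred from "raise to 450, which is 2.25 big blinds".
big_blind = 450 / 2.25          # 200 chips
stack_in_big_blinds = 10
stack_in_chips = stack_in_big_blinds * big_blind

print(big_blind)        # 200.0
print(stack_in_chips)   # 2000.0
```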

So we decide to go all in. So there are Nash calculators. They're complicated. But if you run this through a Nash equilibrium calculator this is roughly what you should be going all in with. So we go all in.

So this range is a bit different from what I showed you before. This is an actual Nash calculation, not just my recommendation. It's going to be wider than what you saw before, but that's fine.

So now you're this guy. You're seat nine. So I'm going to show you how to do a quick calculation of what equity you need to call here.

A straight up pot odds calculation says you're getting 23 to 20 or 1.15 to 1. So roughly you need 46.5% equity to call as this player. But there's two players behind.
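(That 46.5% falls straight out of the pot odds: you put in 20 to win the 23 already out there, so you need to win at least 20 out of every 43 times. A one-line check, using only the numbers quoted above:)

```python
pot_before_call = 23   # what is already out there to win
cost_to_call = 20      # what you have to put in

required_equity = cost_to_call / (pot_before_call + cost_to_call)
print(round(required_equity, 3))   # 0.465, i.e. roughly 46.5%
```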

There is a graphics glitch: seat one has not folded. Seat one and seat two both still have cards. So you actually need more than 46.5% equity, because the calculation of 46.5% assumes seat one and seat two can't wake up with pocket aces.

PokerStove is very good for this. And you can do this right now; I've taught you everything you need to do these calculations. PokerStove is also easy to figure out how to use. You can essentially just plug the hands in.

You look at pocket fives: it has 48%. You need 46.5%, and I said you need a bit more than 46.5% because there are two players behind, but pocket fives is good enough. Ace nine suited as well, I'll call.

So I'm just showing you some calculations on PokerStove. It really is a good tool, so you can do these calculations yourself. So basically they call. And the hands I showed you here are actually sort of the worst hands that are plus EV to call with: pocket fives, ace nine suited plus. Ace nine suited plus means ace nine suited, ace 10 suited, ace jack suited, all the hands that are strictly better than ace nine suited. Ace 10 off-suit, king queen suited.

As the last player here, do you call with king queen suited? Just a rough guess: how many of you intuitively would call with king queen suited? Two players have already gone all in. So most people would fold. Oh, you guys would call.

So it turns out, at least according to the Nash calculator-- So here you calculate the pot odds. And this is an exact calculation, because there are no more players behind. So you just calculate: do I have 29.5% equity?

Basically if I assume that player one and player two are playing according to Nash equilibrium or optimal ranges, you actually have way more than enough. You actually have 34% where you only needed 25%. So this is a trick question I set up.

The trick was that king queen suited doesn't seem like a good hand. But against two other hands it's actually, relatively speaking, a very good hand. Against one other hand, they could just have a pair and you're behind, or they could have an ace, and a lot of the time you're behind.

But against two other hands you actually have really good equity, because when you hit a king or a queen you make a big pair. You can also hit straights and flushes with king queen suited. And your equity against two other hands is very good. So you have way more than enough, so they call. So we're going to get this all in preflop.
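(The king queen suited call is the same pot-odds comparison, just with a pot built by two all-ins. A minimal sketch, where the 10-into-30 call size is illustrative only and the 34% equity figure is the one quoted above from the Nash calculator:)

```python
def required_equity(cost_to_call, pot_before_call):
    """Minimum win probability needed to break even on a call."""
    return cost_to_call / (pot_before_call + cost_to_call)

# Illustrative numbers only: calling 10 big blinds into a pot of roughly
# 30 big blinds gives a threshold near the 25% quoted in the lecture.
threshold = required_equity(cost_to_call=10, pot_before_call=30)
equity_vs_two_ranges = 0.34   # king queen suited vs. the two ranges, per the lecture

print(round(threshold, 2))               # 0.25
print(equity_vs_two_ranges > threshold)  # True, so calling is +EV
```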

So who are the people I gave these cards to? The people I gave these gift certificates to, I mean. You guys can come up to the front and I'm going to hand you your cards.

AUDIENCE: Do you want me to stay up here?

PROFESSOR: Yeah. Stay up here. So what I want you guys to do-- these are the hands people went all in with. Who was the first person to answer? OK. So you get to pick. Which hand do you want?

AUDIENCE: I'll go with jack jack.

PROFESSOR: OK. So Colin is going with jack jack. Sorry, what's your name?

AUDIENCE: Kevin.

PROFESSOR: Kevin. Kevin, which hand do you want? So he took jack jack. Which of these remaining-- you were second, right? Yeah, you were-- OK.

AUDIENCE: I was last.

PROFESSOR: OK. Which hand do you want?

AUDIENCE: I want king queen.

PROFESSOR: OK. So he's going to take king queen. So you're going to take ace five. This is sort of mean, but I want to show you guys how poker works. You've each already won $20. Now you guys are going to put in your hands, and whoever wins this hand is going to get all $60. [LAUGHTER]

We're going to go for it. So these are the preflop equities. So it is pre-determined. So it looks like you guys did very well.

You guys picked correctly. You got to pick first, and you picked the best hand, with equity of 34.73%. You got 32.95%. And you're only a bit behind with 31.1%. So the flop is 6, 7, 8.

So let's look at the probabilities now. Jacks are still ahead, ace five has picked up a lot of outs. King queen of diamonds is not looking too good, but they did pick up one diamond.

Notice the nine of hearts. So, sorry, king queen. I'm sorry. So you're out of the running.

AUDIENCE: He needs a 10.

PROFESSOR: So you two get to cheer. Is there any card you're hoping for? What are you hoping for?

AUDIENCE: I think a 10.

AUDIENCE: A 10 or a heart.

PROFESSOR: So you want to be screaming. I'm going to press it on the count of three. You want to be screaming 10 or a heart. And you want to be screaming--

AUDIENCE: Anything else.

PROFESSOR: So I guess you got all three of us. There's going to be more of these good card stories in class. So participate in class, you're going to get gift certificates courtesy of [? Acuma ?] Capital. All right. Thank you guys.

[APPLAUSE]
