User talk:Valoem/Poker probability (Texas hold 'em)

From Wikipedia, the free encyclopedia
WikiProject Gambling: Poker (NA-class)
WikiProject iconThis page is within the scope of WikiProject Gambling, a collaborative effort to improve the coverage of Gambling on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This page does not require a rating on Wikipedia's content assessment scale.
This page is supported by WikiProject Poker.
An editor has requested that an image or photograph be added to this page.
WikiProject Gambling To-do:

Things you can do

  • Current collaborations:
Improve an article to FA
Improve an article to A
  • Help with the Gambling articles needing attention.
  • Tag the talk pages of Gambling-related articles with the {{WikiProject Gambling}} banner.
  • The link to the Missouri gambling site is now out of date and needs to be updated.
  • Japan section reads as though it was written by the gambling industry - quotes of 160% returns are 'citation needed'.

Opening comment


You know, a big problem with this page is that I read "barring the miracle flush or straight" a lot. Why would we bar it? It's a part of the game and needs to be figured into the odds.....

Values in "odds" columns


Is it just me, or are all the numbers in the "odds" column 1 lower than what they should be?

I will change them, but if someone else points out where my math is going wrong, then my apologies.

I can assure you, it was correct before. It's okay, it's a common mistake. The formula is: the odds are defined by (1/p) − 1 : 1, where p is the probability. So, if p = 1/2, the odds should be 1:1, not 2:1. What you're forgetting is that it's the relative frequency of winning to losing, not winning to total action. Revolver 07:34, 5 Jul 2004 (UTC)
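
For anyone who wants to check this quickly, here is a minimal sketch of the probability-to-odds conversion described above (Python; the helper name is just for illustration):

```python
# Minimal sketch: "odds against" from a probability p, as described above.
def odds_against(p):
    return (1.0 / p) - 1.0   # the X in "X : 1 against"

print(odds_against(1 / 2))   # 1.0 -> 1 : 1, not 2 : 1
print(odds_against(1 / 3))   # 2.0 -> 2 : 1 against
```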

I'd like to see...

  • Also, if you flop a (say) heart flush, the odds that someone has a single bigger heart (drawing to a bigger flush), by number of opponents. Brian Alspach has some interesting stats about losing flushes (assuming all opponents see the river). [1] I was thinking about putting his findings in, but sans the derivations (too complicated for the article). Think it would be a good addition?--Toms2866 02:25, 11 May 2006 (UTC)[reply]
  • What I miss a little bit is the probability of hitting something on the flop. E.g. with JTs, how likely is it to have a straight draw (open-ended and/or gutshot), or a flush draw, a pair, two pair,... I found some useful tables at Mike Caro's University of Poker Library (XVIII-XXVI for Hold 'em) with some numbers (e.g. 8.14:1 against having a flush draw). WhoCares01 21:43, 10 January 2007 (UTC)[reply]
    And even some simpler odds have been left out. For instance, if you have any unpaired hand, the chance of hitting a pair or better (using at least one of your hole cards) is about 32.4%. I'd like to see other useful odds, such as the odds of hitting two pair or better on the flop, starting with an unpaired hand and using both of your hole cards. Deepfryer99 (talk) 17:29, 26 August 2008 (UTC)[reply]

Starting hands


I think the formula {52 \choose 2} = 1326 is going to be pretty incomprehensible to 99% of readers. Would it not be clearer to say 52 times 51 divided by 2 = 1326? We non-mathematicians can understand that the first card may be any of 52, that for each first card the second may be any of 51, and that we divide by 2 because each combination may be produced by either Card A followed by Card B or Card B followed by Card A.

In any case, for holdem, 169 is the magic figure. Distinguishing hands such as 5C 3H from 5C 3S is irrelevant and misleading. Obviously those two examples play exactly the same and have the same chance of winning. I've changed this, but my derivation is pretty clunkily worded, and probably not necessary anyway. Stevage 17:23, 3 December 2005 (UTC)[reply]

I changed the discussion to include both {52 \choose 2} and 52 × 51 ÷ 2 = 1,326. I think by discussing both means of representation, the combinatorial math can be introduced in a way that makes it comprehensible to at least most of the other 99% of readers. It gets really messy trying to show the calculations without binomial coefficients once you get beyond choosing two from a set. I also expanded, and hopefully made less clunky, the explanation of the 169 different strength starting hands. – Doug Bell talkcontrib 07:23, 6 February 2006 (UTC)[reply]
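
For readers who want to verify both figures, here is a minimal sketch (Python; names are illustrative) that counts the 1,326 two-card combinations and collapses them to the 169 strategically distinct starting hands:

```python
from itertools import combinations

ranks = "23456789TJQKA"
suits = "cdhs"
deck = [r + s for r in ranks for s in suits]

# All distinct two-card starting hands
hands = list(combinations(deck, 2))
print(len(hands))            # 1326, i.e. C(52,2) = 52*51/2

# Collapse to the 169 strategically distinct shapes: pair, suited, offsuit
shapes = set()
for a, b in hands:
    hi, lo = sorted([a[0], b[0]], key=ranks.index, reverse=True)
    if hi == lo:
        shapes.add(hi + lo)          # pair, e.g. "AA"
    elif a[1] == b[1]:
        shapes.add(hi + lo + "s")    # suited, e.g. "AKs"
    else:
        shapes.add(hi + lo + "o")    # offsuit, e.g. "AKo"
print(len(shapes))           # 169 = 13 pairs + 78 suited + 78 offsuit
```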

What does the "any specific (no/)pair" phrase mean in the starting hands table? Does that mean that any pair in your starting hand as the same odds as AA? That doesn't seem to make sense. Revise the numbers please and link this to combinatorial game theory. 70.111.251.203 23:28, 11 February 2006 (UTC)[reply]

That wording existed in the article before I began editing it. The word specific is the key to understanding the meaning. AA and KK have the same odds. AK and T2 have the same odds. AKs and 78s have the same odds. Each of these is an example of a specific hand with the same characteristics (hand shape). – Doug Bell talkcontrib 11:47, 23 February 2006 (UTC)[reply]

Calculations for probability of facing larger pocket pairs from multiple opponents


The equations I entered are wrong. I will fix them soon, but please leave them there for the moment unless you want to fix them. They are actually reasonably close approximations. The function cannot use (1 − p_single)^players as the events are not independent. The calculation and explanation need to use (players × p_single) − p_multiple, where players is the number of opponents faced, p_single is the probability that a single opponent has a higher pair, and p_multiple is the probability that multiple opponents have a higher pair. – Doug Bell talkcontrib 23:08, 6 February 2006 (UTC)[reply]

OK, I fixed the equation and the results table. – Doug Bell talkcontrib 01:28, 8 February 2006 (UTC)[reply]
There are very significant errors in the tables for 1 or more opponents having a higher pair, and for 2 or more opponents having a higher pair. The equations given are correct, but they were not followed correctly to obtain the table numbers. Also, computing the P's and summing them is not necessary. It requires extra work, and the equations for computing them are not given. These P's could be computed using a generalized form of inclusion-exclusion, but this is not necessary as the numbers in each table can be computed directly without the P's using inclusion-exclusion.
Instead of P's, define pk as the probability that k SPECIFIC opponents have higher pairs. Then we compute the probability of 1 or more of n opponents having a higher pair as
P = C(n,1)*p1 - C(n,2)*p2 + C(n,3)*p3 - C(n,4)*p4 + ... C(n,n)*pn
by inclusion-exclusion, where the final term is added for odd n or subtracted for even n. Note that C(n,k) is combinations of n things taken k at a time. This is the number of ways to choose the k opponents with higher pairs from the n opponents. Of course, while this gives the exact answer, only enough terms need be computed to obtain the answer to the required precision. Note also that
p1 = (84-6r)/1225, where r is the numerical rank of your pair (r = 2 for deuces, ..., 14 for aces).
For the probability of 2 or more of n opponents having a larger pair, we compute
P = C(n,2)*p2 - 2C(n,3)*p3 + 3C(n,4)*p4 - 4C(n,5)p5 + ... (n-1)C(n,n)*pn
by a generalization of inclusion-exclusion, where the final term is subtracted for odd n, and added for even n.
Below is shown the correct way to compute the table entries for a pair of deuces vs. 9 opponents. These methods are simpler than the article's methods, and produce greater accuracy for less work.
Here is the probability of 1 or more opponents having a higher pair with 9 opponents when you hold a pair of deuces:
9*72/C(50,2) -
C(9,2)*72*67/C(50,2)/C(48,2) +
C(9,3)*72*(66*62+1*66)/C(50,2)/C(48,2)/C(46,2) -
C(9,4)*72*(66*(60*57+2*61)+1*66*61)/C(50,2)/C(48,2)/C(46,2)/C(44,2) +
C(9,5)*72*(66*(60*(54*52+3*56)+2*(60*56+1*60))+1*66*(60*56+1*60))/C(50,2)/C(48,2)/C(46,2)/C(44,2)/C(42,2) -
...
=~ 41.9%
Not 36.33% as the table shows. The first 2 terms above are already above 40%, and this is a lower bound. Actually 36.33% may have been erroneously computed from
(72/1225 * 9) - 2*P2
but this is wrong. We would want to subtract P2 + 2*P3 if we were using the article's method.
The above may look complicated, but the terms follow a pattern. To get each next term, replace each final *n in the previous term with
*(6m(n-5) + (n-6m)(n-1))
where 6m is the multiple of 6 just less than n. When n is a multiple of 6, just multiply by another factor of (n-5). Then multiply by the next combinatoric factor, divide by an extra combinatoric factor, and alternate + and -.
Here is the probability of 2 or more opponents having a higher pair with 9 opponents when you hold a pair of deuces. This requires generalized inclusion-exclusion, which here means the factors of 1,2,3... starting each line.
C(9,2)*72*67/C(50,2)/C(48,2) -
2*C(9,3)*72*(66*62+1*66)/C(50,2)/C(48,2)/C(46,2) +
3*C(9,4)*72*(66*(60*57+2*61)+1*66*61)/C(50,2)/C(48,2)/C(46,2)/C(44,2) -
4*C(9,5)*72*(66*(60*(54*52+3*56)+2*(60*56+1*60))+1*66*(60*56+1*60))/C(50,2)/C(48,2)/C(46,2)/C(44,2)/C(42,2)
...
=~ 9.5%
Not 14.484% as the table shows. That is larger than just the first term above, which is an upper bound.
These calculations have been confirmed by simulation.
Brucezas (talk) 19:05, 17 October 2011 (UTC)[reply]
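
A rough Monte Carlo sketch of the kind of simulation mentioned above, for the pair-of-deuces-versus-9-opponents example (Python; the dealing and counting logic is illustrative, not the exact simulation Brucezas ran):

```python
import random

RANKS = range(2, 15)                      # 2..14, ace high
DECK = [(r, s) for r in RANKS for s in range(4)]

def higher_pairs_dealt(n_opponents=9, my_rank=2, rng=random.Random(1)):
    """Deal 2 cards to each opponent; count how many hold a pair above my_rank."""
    deck = [c for c in DECK if not (c[0] == my_rank and c[1] < 2)]   # remove my two deuces
    rng.shuffle(deck)
    count = 0
    for i in range(n_opponents):
        a, b = deck[2 * i], deck[2 * i + 1]
        if a[0] == b[0] and a[0] > my_rank:
            count += 1
    return count

trials = 100_000
results = [higher_pairs_dealt() for _ in range(trials)]
print(sum(r >= 1 for r in results) / trials)   # should land near the ~41.9% derived above
print(sum(r >= 2 for r in results) / trials)   # should land near the ~9.5% derived above
```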

Latest changes


Just dropping by to say great work on the latest changes. This article could become one of the best within the WikiProject when completed. Look forward to seeing it progress! Essexmutant 00:04, 10 February 2006 (UTC)[reply]

Combinatorial game theory and complexity


Added them to the related links, since they are part of it. 128.6.175.60 20:24, 20 February 2006 (UTC)[reply]

Pictures


Though I love what's going on with this article, there seems to be an overdose of pictures. I'm on a DSL connection and the pictures don't all load within a short time. Is there a way we can reduce the amount while still keeping all the good information? Perhaps just a single picture file that has the whole chart, instead of a chart with a lot of pictures? 128.6.175.60 20:35, 20 February 2006 (UTC)[reply]

First, some of it depends on the general responsiveness of Wiki. The "pictures" are all the math equations. If you set your preferences under the "Math" tab to "HTML if possible or else PNG", many of the equations will be rendered as HTML instead of images. There will still be a lot of images, but probably less than half. Try this and let me know how it works for you. – Doug Bell talkcontrib 21:22, 20 February 2006 (UTC)[reply]

3 mistakes so far (please check)


The 4th formula in the chapter "starting hands against multiple opponents" seems incorrect: "... and against n opponents is H =..." The passage reads "50-2k"; it should be "52-2k" or it does not work out.

I think you are forgetting that 2 cards are already in the player's hand, leaving only 50 cards remaining in the deck to be distributed. – Doug Bell talkcontrib 18:42, 8 March 2006 (UTC)[reply]

Chapter "Pocket Pairs": The Link "Probabilites during play" does not work!

Fixed. I renamed the section and forgot to change the link, thanks for pointing that out. – Doug Bell talkcontrib 18:42, 8 March 2006 (UTC)[reply]

The chapter "Hands with one ace": The formula contains a "*2" in the second half ("3/50 * (13-x)*4*2/49"). Damn, where does this *2 come from? Without it, it should be in the end: 3/1225 + [6*(13-x)/1225].

Added an explanation for the *2. – Doug Bell talkcontrib 18:42, 8 March 2006 (UTC)[reply]

Correct me if I am wrong. I'd be glad if (in case these are mistakes) they get corrected soon.

Thanks.

Sam

Germany

Thank you for your comments, please feel free to either provide additional feedback or simply edit any problems you find in the article. – Doug Bell talkcontrib 18:42, 8 March 2006 (UTC)[reply]

Head-to-head probabilities for different starting hand matchups


I added a section on head-to-head matchup probabilities. It doesn't have the mathematical rigor of the other sections, but it may be the most useful section in the article from a practical "at the table" point of view. It certainly seems like something a reader might be looking for in this article. My personal opinion is that adding math rigor for this topic would consume an inordinate amount of space with little added value for 99% of readers. --Toms2866 00:07, 24 March 2006 (UTC)[reply]

Well this is certainly open for discussion, but my thinking on the matter is that if all you want is tables of odds, there are many places on the Web to get those. So my philosophy in developing the article was to link the math and the probabilities. After all, the name of the article is "Poker probability" not "Poker odds tables". So I haven't put anything in without a discussion of the math.
However, the math for complete head-to-head comparisons is not practical. These situations are pretty much only determined through brute force, so I'm fine with the section you added and appreciate the contribution. I will probably tweak it a bit; in particular, I've kept all the odds in an X : 1 format so that they can be easily compared. —Doug Bell talkcontrib 02:25, 24 March 2006 (UTC)[reply]

Anyone wanna play me look Brian Maberry up on facebook bring it —Preceding unsigned comment added by 70.91.32.217 (talk) 20:43, 28 September 2010 (UTC)[reply]

References


Not related to the above particular calculations, but I noted with pleasure that the chapter "Flopping overcards when holding a pocket pair" matches similar calculations by Brian Alspach: Overcard Calculations. He has a number of other interesting poker calculations that may be worthy of inclusion in this article. Examples include probabilities of straight completion by starting hand, probabilities of making a losing flush by starting hand, board suit and rank distributions, etc. See Poker Calculations by Brian Alspach. --Toms2866 02:44, 23 March 2006 (UTC)[reply]

Thanks, that is a good site—better than the other references in the article. I've been having trouble finding quality references. The calculation for overcards is one of the simpler calculations, but it's nice to have independent verification. —Doug Bell talkcontrib 02:54, 23 March 2006 (UTC)[reply]

Move


I suggest that we move this article to Texas hold 'em probability. It is a simpler name and therefore better in my opinion. --Maitch 16:58, 1 May 2006 (UTC)[reply]

There is a poker probability article, so this one should definitely stay as it is to be consistent. 2005 19:40, 1 May 2006 (UTC)[reply]

I'm aware of that article, but subarticles don't have to use parentheses. I would actually say it is more normal not to. History of France is a subarticle of France and it doesn't have the name France (history). --Maitch 19:52, 1 May 2006 (UTC)[reply]

It is certainly much more common to do it the way it is, as there are more than a dozen examples in Category:Poker gameplay and terminology that are structured with (poker) parentheses. I don't see any reason to go non-standard here, although I certainly agree it reads better as Texas Hold 'em probability. If nobody objects to going a non-standard way within a few days, go ahead and move it if you want. 2005 20:02, 1 May 2006 (UTC)[reply]
I don't have a strong preference either way, but my slight preference is to leave it named as it is for the reasons 2005 states. —Doug Bell talkcontrib 22:24, 1 May 2006 (UTC)[reply]

Well, in my opinion there is a difference between e.g. Aggression (poker) and Poker probability (Texas hold 'em). First of all, Aggression (poker) is not a subarticle of Aggression. Secondly, all the other articles you call standard use "(poker)", which is different from "(Texas hold 'em)", so this article is really alone in that category. --Maitch 22:38, 1 May 2006 (UTC)[reply]

I'm inclined toward not renaming the article. I think the current naming scheme makes it clearer that this article is a variant-specific discussion of the topic more generally covered in Poker probability. That said, my opinion is not strongly held.--Toms2866 23:32, 1 May 2006 (UTC)[reply]

Texas hold'em hands


Texas hold 'em hands should be included in this article. That chart they have, along with the formula below it, would be handy in the probability department. 70.111.244.69 14:49, 29 July 2006 (UTC)[reply]

Shoudn't "\times" ("") be replaced by "\cdot" ("")?


It is my opinion as a LaTeX amateur that the symbol "\times" ("×") should be replaced by "\cdot" ("·"). It would be mathematically more correct. The symbol "\times" appears nearly 30 times throughout the article, so I thought I'd ask before changing it! --NicApicella 20:40, 20 August 2006 (UTC)[reply]

The x symbol is probably more easily recognised by those without a maths background. That's not a "no", but something to consider. Stevage 20:46, 20 August 2006 (UTC)[reply]
That was my reasoning in using the symbol. —Doug Bell talkcontrib
I slightly prefer the "\cdot". Any reader with adequate mathematical background to understand the equations will be familiar with the symbol. The symbol can be a awkward in equations using the variable "x".--Toms2866 17:02, 18 October 2006 (UTC)[reply]
I slightly prefer the · myself, but I really worked hard to keep the math in the article approachable for everyone—the × was just one small component of that effort. And while I agree in principle with the "awkward with 'x'" argument, TeX does at least make the two rather easy to distinguish. I suppose one option would be to replace x with some other variable in the equations that use it. —Doug Bell talkcontrib 17:29, 18 October 2006 (UTC)[reply]

Miscalculations in section "Head-to-head starting hand matchups"


"Probability" doesn't equal "Odds for" in this section. - Jack's Revenge 23:59, 9 December 2006 (UTC)[reply]

Actually it does. The confusing part with this table (one of the few that I didn't add to the article) is that the odds displayed are odds for the event happening. This contrasts with the rest of the odds in the article, which are the odds against the event happening. Odds for are the inverse of odds against (i.e. 3 : 1 odds against an event happening are 1 : 3 or ⅓ : 1 for the event). So the odds in that table are calculated by the formula p / (1-p) and are correct.
The reasoning (I presume) behind using odds for in this case is that the favorable outcome for the player is represented by the odds in the table. —Doug Bell talk 11:56, 10 December 2006 (UTC)[reply]
Sorry. - Jack's Revenge 20:33, 12 December 2006 (UTC)[reply]

Chance of two suited cards making a flush?


Somewhere in this article it would be nice to see the chance of:

  1. ...two suited cards meeting two more of the same suit on the flop
  2. ...meeting 3 more of the same suit on the flop
  3. ...either way, two suited cards completing to a flush by the river. Stevage 10:51, 16 February 2007 (UTC)[reply]
OK, I'll add that. Same with connectors making a straight draw, a pair making trips. May be a few days before I get to it. —Doug Bell talk 18:38, 16 February 2007 (UTC)[reply]

Flopping overcards when holding a pocket pair


I think we should also point out that these probabilities of flopping an overcard include flops that give the pocket pair three of a kind. For instance, there is a 0.5696 probability of an overcard flopping when holding the pocket pair JJ, but there is also a 0.1176 probability of another J flopping. So there is roughly a 0.5696 × 0.8824 ≈ 0.50 probability of an overcard flopping and no J flopping. --Teveten 14:21, 14 March 2007 (UTC)[reply]
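
For reference, the two quoted probabilities can be reproduced directly from the counts; a minimal sketch (Python, using math.comb):

```python
from math import comb

# Holding J-J: 50 unseen cards, 12 overcards (A, K, Q) and 2 more jacks.
flops = comb(50, 3)

p_overcard = 1 - comb(50 - 12, 3) / flops   # at least one A/K/Q on the flop
p_set      = 1 - comb(48, 3) / flops        # at least one more J on the flop

print(round(p_overcard, 4))   # 0.5696
print(round(p_set, 4))        # 0.1176 (flopping a set or better)
```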

I'm a bit confused: You compute the probability for an overcard to be in the flop assuming, for example, having 4 Aces in the deck and choosing 3 cards out of 50. But since it means no danger to me with my pocket pair to see an overcard in the flop under these assumptions, I wonder if it might be more interesting to assume that my opponent already had an Ace, which reduces the remaining Aces to 3 and the deck to 49 cards. --Stefan 20:36, 21 August 2007 (UTC) —The preceding unsigned comment was added by 89.61.250.45 (talk)
To Stefan: I struggled with whether to include the odds you suggest. While it's true that "flopping an overcard" vs. "flopping an overcard that pairs a card held by an opponent" are not the same probability, the ace on the board is still a danger to you even if your opponent doesn't have an ace since you now have to worry that he does. Your hand is weakened whether the overcard pairs an opponent or not—you are now more likely to fold to a bet or be less aggressive, reducing the amount you might expect to win had no overcard appeared. The problem with trying to compute presumptive odds is that the cases can multiply quickly. If you have pocket queens against one opponent, your worst-case overcard situation is that the opponent has A K. This of course reduces the chances of an overcard appearing. —Doug Bell 08:12, 30 November 2008 (UTC)[reply]
To Teveten: Yes, the probability that your hand also improves is worth noting. For this particular case it is fairly straight-forward. Sometimes it makes a real mess if you have to account for every situation where more than one hand improves. For example, when determining outs you should also allow for the cases where you get your out, but the other player also gets a card to improve their hand, thus rendering your out worthless. If I think you have trips when the board is 5♥ 8♥ J♣ and I'm holding A♥ J♥, I'm counting my outs as the nine remaining hearts. However, if I hit my heart and the board pairs, my heart becomes worthless. So I should reduce my outs by the chance of me hitting my out and you also hitting an out to beat my improved hand. Often these situations are hard to generalize, although perhaps it's worth trying. —Doug Bell 08:12, 30 November 2008 (UTC)[reply]

Discussion/derivation


This article is a bit too pedagogical for an encyclopaedia article, imho. This article should really be answering the question "What are the chances of X happening in a game of Texas hold 'em?" for various X. Discussion explaining how these figures are calculated is not very relevant, or could be moved down into a footnote, or an explanatory "Derivation" section at the bottom. In particular, what's the point of the "When calculating probabilities for a card game such as Texas Hold 'em, there are two basic approaches." paragraph in the intro? Sure, there are two different ways you can calculate them - but so what? The article is primarily about poker, not about maths. Stevage 07:06, 21 March 2007 (UTC)[reply]

Well, I disagree somewhat with your last statement—the article is about poker and math equally. Just because you're coming at it from the poker perspective doesn't make the math view of it any less valid or relevant. However, your argument is reasonable. If you want to move it to a note, that's fine—I won't move it back. As to using the <ref> tag, I wasn't planning on using those for the notes. I'm planning (or anyone else is free to do so) on making a pass at some point to add reference citations for parts of the discussion. Those would be different from the notes and would go in a separate section. —Doug Bell talk 07:29, 21 March 2007 (UTC)[reply]

Wrong Question


To add to Stevage's comments and draw generalizations from them and some of the others:

When playing poker, you're not only interested in the odds of getting a hand, you're interested in the odds of beating your opponents' hands. Given the cards you know about (hole cards and community cards), how strong is your hand? It's a comparison of the relative strength of your hand that's needed. So while I think the article provides important information, it's addressing the wrong question.

And I disagree on the mathematical validity point. The article is about applied mathematics. --71.202.189.23 23:44, 12 June 2007 (UTC)[reply]

Well of course a complete poker strategy needs both, but that doesn't make the information here irrelevant. Let's say, for example, one is drawing to a flush when the board is already paired; one needs to evaluate the situation as a whole, including how the opponent has bet, your knowledge of him and any possible tells, to decide whether or not the flush will be good if it hits. But once you decide that it will be, you still need to know your odds of hitting it to make the final decision. We're not offering the information here as if that were all one needs to know to play poker well--it's just one of many things one needs to know. --LDC 17:21, 15 June 2007 (UTC)[reply]
That's what a gaming or applied-math guide is for. This is an encyclopedia. The tone is great for a guide but wrong for an encyclopedia. Canuckle 03:45, 23 June 2007 (UTC)[reply]

Wrong statement about the nuts


I've added "(except for someone having a straight flush)" to the statement "if the flop comes with three 2s, any hand holding the fourth 2 has the nuts."

Is it wrong though? At that point, the best hand it is possible to make with the available community cards is quad 2's. So that hand is the nuts. Although it is possible to hit running cards to make better quads or a straight flush. It depends how you define "the nuts". Is "the nuts" the best hand possible at that point, or the best hand possible after the river is dealt? It is not possible to make a straight flush on the flop if it is 222, only a backdoor straight flush *draw*, so the revision is not correct regardless of your interpretation of the nuts. What do people think? Maybe a wording like: "if the flop comes with three 2s, any hand holding the fourth 2 has the nuts. There is however a small probability that a player may hit running cards to make a straight flush or better quads, which would then become the nuts."? 82.46.37.37 16:53, 3 August 2007 (UTC)[reply]
The "nuts" as used in this article means "the best hand currently possible." It does not mean that a better hand won't become possible with additional cards—defining the nuts as the best hand that ever will be possible is not particularly useful. —Doug Bell 07:26, 30 November 2008 (UTC)[reply]
A nut hand that can't be outdrawn by any remaining board cards is often called a lock hand GarethAdams (talk) 13:02, 14 April 2010 (UTC)[reply]

Intro


The methods given are not really mutually exclusive, nor are they exhaustive. As the later sections make clear, there are a number of methods depending on your needs.

You can use a pure stochastic (Monte Carlo) method by generating a random sample set and averaging the outcome. In some cases you can use combinatorics to figure out the number of ways of making a given hand. In some cases you need very little math because you have a dominated hand and just need to compare your outs.

The article is great; I just thought the intro did not really address the tools that are available to a mathematician or programmer when calculating probabilities.

Conditional


If you are holding suited cards, does that increase the odds that other players are holding suited cards too (in the other three suits)? By about how much? Oyster Jimmy (talk) 06:13, 22 December 2007 (UTC)[reply]

Yes it does, but I haven't bothered to figure out by how much. It's one of those things that interests me, but which is probably getting too far off the beaten path. I still might slip it in somewhere sometime. —Doug Bell 07:21, 30 November 2008 (UTC)[reply]

Too complex


So someone asked me today... what are the chances of getting a royal flush in hold 'em? And I'm thinkin' to myself, well, all I have to do is look up the odds on wiki and clip down the 40 possible straight flushes to 4.

Was it that simple? hell no!

Why? Because this page is entirely too complex. I like that, but it'd be nice if you had an introductory table with simple odds (flush, four of a kind, ...) and left the inside/outside stuff till later in the article. Cuz it took me too long to answer his question. —Preceding unsigned comment added by Sparkygravity (talkcontribs) 07:52, 20 August 2008 (UTC)[reply]

You could simply have looked on Poker probability at the 7-card royal flush probability, since without any additional context the odds in Hold 'em are no different than for any hand made from 7 cards. 67.52.145.34 (talk) 03:50, 15 October 2008 (UTC) (Doug Bell, not logged in.)[reply]
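
For what it's worth, the 7-card royal flush figure is easy to reproduce; a minimal sketch (Python):

```python
from math import comb

royal = 4 * comb(47, 2)      # pick the suit, then any 2 of the other 47 cards
total = comb(52, 7)          # all 7-card combinations

print(royal / total)         # ≈ 0.0000323, about 1 in 30,940
print(total / royal - 1)     # 30939.0, i.e. roughly 30,939 : 1 against
```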

Or if somebody was not overzealous in guarding the article, you could use the external link I provided to estimate the odds in a few seconds. 71.112.98.163 (talk) 04:17, 13 May 2009 (UTC)[reply]

Why did somebody remove the odds calculator external link I added?

It is a free program that shows the odds, and that is what many people request on the discussion page. —Preceding unsigned comment added by 71.112.17.46 (talk) 18:44, 2 May 2009 (UTC)[reply]

Odds of getting suited cards


Correct me if I'm wrong, but it seems to me that the odds of getting suited cards is 4.25:1, and not 3.25:1 as it appears in the table (Starting hands section, when dealt any hand). Uriy (talk) 05:17, 21 June 2009 (UTC)[reply]

The probability is 1 / 4.25, which makes the odds 3.25 : 1. Check the definitions of "probability" and "odds" to see the difference. --LDC (talk) 23:18, 21 June 2009 (UTC)[reply]
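
A quick check of both numbers; a minimal sketch (Python):

```python
from math import comb

p_suited = 4 * comb(13, 2) / comb(52, 2)   # 312 / 1326 suited starting hands
print(p_suited)                  # 0.2352941... = 1 / 4.25
print(1 / p_suited - 1)          # 3.25, i.e. odds of 3.25 : 1 against
```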

Incorrect Usage of Conditional Probability


The opening paragraph reads:

The second approach is to use conditional probabilities, or in more complex situations, a decision tree. There are 4 ways to be dealt an ace out of 52 choices for the first card resulting in a probability of 4/52 = 1/13. There are 3 ways of getting dealt an ace out of 51 choices on the second card after being dealt an ace on the first card for a probability of 3/51 = 1/17. The conditional probability of being dealt two aces is the product of the two probabilities: 1/13 × 1/17 = 1/221.

Although the calculation is correct, the usage of the phrase "conditional probability" in the last sentence is incorrect. The following is my suggested rewording. (I have not edited Wiki articles before, so I leave it to someone else to commit these changes.)

The second approach is to use conditional probabilities, or in more complex situations, a decision tree. There are 4 ways to be dealt an ace out of 52 choices for the first card resulting in a probability of 4/52 = 1/13. There are 3 ways of getting dealt an ace out of 51 choices on the second card after being dealt an ace on the first card for a probability of 3/51 = 1/17. This value is the conditional probability that the second card dealt is an ace given that the first card dealt is an ace. The joint probability of being dealt two aces is the product of the two probabilities: 1/13 × 1/17 = 1/221.

I agree that is better wording. It's been changed. —Doug Bell 07:48, 8 April 2012 (UTC)[reply]
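
Both routes to the 1/221 figure are easy to verify; a minimal sketch (Python, using exact fractions):

```python
from math import comb
from fractions import Fraction

print(Fraction(4, 52) * Fraction(3, 51))      # 1/221 (conditional-probability route)
print(Fraction(comb(4, 2), comb(52, 2)))      # 1/221 (counting route: 6 / 1326)
```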

Confusion on 'Starting Hands' section


The table lists 13 possible pocket pairs. But this number is then modified by what appears to be a calculation for the suit combinations. Since pairs can never have the same suit in a single standard deck of cards, I suspect that I am misreading the table. Could someone please clarify what the 6 means? Vivafelis (talk) 02:44, 10 January 2010 (UTC)[reply]

I'm not seeing the confusion. With one deck, a pair must have two different suits from the four suits of a particular rank. Gimmetrow 02:52, 10 January 2010 (UTC)[reply]
I see. 6 is the number of possible suit combinations available. Four on the first pick, three on the second pick (4*3)/2. Don't know why I just wasn't seeing it right away. Vivafelis (talk) 15:40, 10 January 2010 (UTC)[reply]
4 over 2 = 4!/(2!*(4-2)!) = 6 93.139.67.66 (talk) 16:42, 25 February 2010 (UTC)[reply]

Pocket Pairs


The formula for calculating the probability of facing at least one higher pocket pair before the flop is not well explained. Specifically, the formula for 'Pma' lacks detail. I cannot find the answer so far, but obviously when referring to P2 and P3 as the "probabilities of facing exactly 2 or 3 higher pairs", this depends on how many opponents we have, i.e. 1 to 9. Plus, is Pma a constant, or does it change according to n?

—Preceding unsigned comment added by 74.59.198.40 (talk) 00:59, 28 December 2010 (UTC)[reply] 
Yes, Pma depends on the number of opponents. Gimmetoo (talk) 03:43, 28 December 2010 (UTC)[reply]

Section on the Flop


Obviously there are not 50C3 possible flops because this would mean there is only one player, which is ridiculous. — Preceding unsigned comment added by Agimcomas (talkcontribs) 20:26, 28 December 2010 (UTC)[reply]

Without knowing what cards may have been dealt to others, yes there are "50C3" possible flops to go with any particular 2-card hand. Gimmetoo (talk) 23:45, 28 December 2010 (UTC)[reply]

Duplicated odds


This may just be a typo, but the table of starting hand odds is misleading: here's what it says now:

AKs (or any specific suited cards) -> odds 331 : 1

AA (.....)

AKs, KQs, QJs, or JTs (suited cards) -> odds 81.9 : 1


Also, these hands are there THREE TIMES:

"AA, KK, or QQ"


So AKs is there twice with different odds, and the top pairs are there with three different odds, whereas fairly essential odds like AQ, AJ, KJ, KT, and QT aren't in the table at all. — Preceding unsigned comment added by 62.199.160.7 (talkcontribs)

It seems that when that table says "X or Y" occurs with odds Z, it doesn't mean simply "X occurs with odds Z" and "Y occurs with odds Z". It means that if you look at the combined case of "either X occurs or Y occurs", that combined case occurs with some odds Z. So, "AKs (or any specific suited cards) -> odds 331 : 1" means that the probability for any specific suited cards (therefore non-pair) is 1/331.5 (odds 330.5:1). "AKs, KQs, QJs, or JTs (suited cards)" means one from that set of four specific named 2-card hands, so the probability is 4/331.5 (odds 81.875:1). "AA, KK or QQ" means any one from the set of those three specific named pairs, so it has probability 3× that of any one specific named pair. "Suited cards, jack or better" has to mean {AK, AQ, AJ, KQ, KJ or QJ} (one of those 6 only).
In an ideal random coin flip, the probability of any one side is 1/2, but the probability of getting "one side or the other" (i.e., heads or tails) is 2/2.
Yes, it's probably confusing, but I'm not sure how to rephrase it. Gimmetoo (talk) 00:54, 9 April 2011 (UTC)[reply]
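
A minimal sketch (Python) reproducing the two odds figures discussed above, which may help if the wording stays confusing:

```python
from math import comb

total = comb(52, 2)              # 1326 two-card combinations

p_one_suited  = 4 / total        # any ONE specific suited hand, e.g. AKs
p_four_suited = 4 * 4 / total    # AKs, KQs, QJs or JTs

print(1 / p_one_suited - 1)      # 330.5  -> "331 : 1" after rounding
print(1 / p_four_suited - 1)     # 81.875 -> "81.9 : 1"
```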

undo revision


Undid revision as it was placed in the wrong section and was of questionable validity. --RichardMills65 (talk) 04:06, 13 March 2012 (UTC)[reply]

I see it has returned. You were right to delete it, it pertains to strategy and not to probability. Gaohoyt (talk) 16:54, 13 March 2012 (UTC)[reply]

Odds of winning


If I have a pair of aces what is the probability that I will win? etc. — Preceding unsigned comment added by Strider22 (talkcontribs) 18:49, 24 November 2012 (UTC)[reply]

Probabilities for board


This edit raises some issues. These probabilities are not sourced (though they match [2]), nor is it entirely clear how the probabilities were calculated. I've asked the editor to document this edit. Gimmetoo (talk) 20:35, 23 May 2012 (UTC)[reply]

Flop counts are divided by 22100 = combin(52,3)

Turn counts are divided by 270725 = combin(52,4)

River counts are divided by 2598960 = combin(52,5)

(a,b) = combin(a,b)


turn - three or more of same suit = 44616 + 2860

44616 = (4,1)*(13,3)*(3,1)*(13,1)

2860 = (4,1)*(13,4)


river - three or more of same suit = 580008 + 111540 + 5148

580008 = (4,1)*(13,3)*(3,2)*(13,1)*(13,1)

111540 = (4,1)*(13,4)*(3,1)*(13,1)

5148 = (4,1)*(13,5)


river - four or more of same suit = 111540 + 5148

same as last two calcs in river - three or more of same suit


one of the other corrections was just a careless typo

the others were close and I calculated them more precisely


turn - three of a kind (not more) = 2496

2496 = (13,1)*(4,3)*(12,1)*(4,1)


river - three of a kind (not more) = 54912

54912 = (13,1)*(4,3)*(12,2)*(4,1)*(4,1)


turn - a pair (only) = 82368

82368 = (13,1)*(4,2)*(12,2)*(4,1)*(4,1)


river - a pair (only) = 1098240

1098240 = (13,1)*(4,2)*(12,3)*(4,1)*(4,1)*(4,1)


river - two pair (only) = 123552

123552 = (13,2)*(4,2)*(4,2)*(11,1)*(4,1)


three cards of consecutive rank (but not four) - turn = 32000 (the existing value is correct)

32000 = 10*4*4*4*41+2*4*4*4*45

three cards of consecutive rank (but not four) - river = 551040

551040 = 10*4*4*4*41*33/2+2*4*4*4*45*41/2


four cards to a straight (but not 5) - turn = 2816

2816 = 11*4*4*4*4

four cards to a straight (but not 5) - river = 114688

114688 = 9*4*4*4*4*40+2*4*4*4*4*44


three or more cards of consecutive rank and same suit - turn = 2000 + 44

2000 = 10*4*1*1*41+2*4*1*1*45

44 = 11*4*1*1*1

three or more cards of consecutive rank and same suit - river = 34440 + 1792 + 40

34440 = 10*4*1*1*41*33/2+2*4*1*1*45*41/2

1792 = 9*4*1*1*1*40+2*4*1*1*1*44

40 = 10*4*1*1*1*1

PokerFix (talk) 17:22, 28 May 2012 (UTC)[reply]
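
Counts like these can also be cross-checked by brute-force enumeration. A minimal sketch (Python) for the turn "three or more of same suit" entry above; it should print 47476 = 44616 + 2860:

```python
from itertools import combinations

DECK = [(rank, suit) for rank in range(13) for suit in range(4)]

count = 0
for board in combinations(DECK, 4):       # all combin(52,4) = 270725 turn boards
    suits = [0, 0, 0, 0]
    for _, s in board:
        suits[s] += 1
    if max(suits) >= 3:                   # three or more of the same suit
        count += 1

print(count)             # 47476 = 44616 + 2860
print(count / 270725)    # divided by combin(52,4), as described above
```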

Moved from article


This doesn't fit well in the article. It's too close to a "how-to", and it's not cited or formatted well. If anyone can figure out a way to incorporate this content, let's discuss it. Gimmetoo (talk) 06:32, 30 December 2012 (UTC)[reply]

Interesting cases to memorize


Percentages rounded up for memorization purposes.

Calculation performed under the following conditions:

Preflop, heads-up, approximation 3% (up to 6% towards the edge of the spectrum, i.e. 2, 3, 4 or Q, K, A).

Also known as the Athanasopoulos tables.


  • Pair v Overcards: 55% v 45%
  • Pair v Suited overcards: 55% v 45%*
  • Pair v Connected overcards (ie. QQ v AK): 55% v 45%*
  • Pair v Suited and connected overcards: 50% v 50%


  • Pair v One overcard (ie. 88 v 8J): 65% v 35%
  • Pair v One suited overcard (ie. 88 v 8Js): 65% v 35%*
  • Pair v One connected overcard (ie. 88 v 89 or KK v AK): 65% v 35%*
  • Pair v One suited and connected overcard (ie. 88 v 89s): 60% v 40%


  • Pair v Overcard and undercard (ie. KK v AQ): 70% v 30%
  • Pair v Suited overcard and undercard: 70% v 30%*


  • Pair v One undercard (ie. 88 v 85): 90% v 5% (tie 5%)
  • Pair v One suited undercard (ie. 88 v 85s): 85% v 10% (tie 5%)
  • Pair v One connected undercard (ie. 88 v 87 or AA v AK): 85% v 15%
  • Pair v One suited and connected undercard (ie. 88 v 87s): 80% v 15% (tie 5%)


  • Pair v Undercards: 85% v 15%
  • Pair v Suited undercards: 80% v 20%
  • Pair v Connected undercards (ie. AA v KQ): 80% v 20%*
  • Pair v Suited and connected undercards: 80% v 20%*


Connected add a 3% advantage on average

Suited add a 3% advantage on average

Suited and connected add a 6% advantage on average

(*Advantage added in these cases too; however, identical result returned due to rounding purposes)


  • Pair v Smaller pair: 80% v 20%

ie. KK v QQ: 80% v 20%; KK v 33: 80% v 20%


  • Overcards v Undercards or y1y2 v x1x2 (ie. AK v QJ): 65% v 35%
  • y1x1 v y2x2 (ie. AQ v KJ): 60% v 40%
  • y1x2 v y2x1 (ie. AJ v KQ): 55% v 45%

where y1 > y2 > x1 > x2


  • Undercard with stronger kicker or y1x v y2x (ie. K4 v 84): 70%-35% v 25% (tie 5%-40%); the stronger the underdog the more likely the tie.

where y1 > y2 > x


  • Overcard with stronger kicker or yx1 v yx2 (ie. AQ v A4): 70%-35% v 25% (tie 5%-40%); the stronger the underdog the more likely the tie.

where y > x1 > x2


  • Worst hand to deal 72?

72 v 42: 55% v 25% (tie 20%)
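
Rounded matchup figures like these can be sanity-checked with a quick Monte Carlo simulation. Below is a rough sketch (Python) with a naive best-five-of-seven evaluator; the function names and the QQ v AK example are illustrative only, and exact equities will differ slightly from the rounded table values.

```python
import random
from itertools import combinations
from collections import Counter

RANKS = range(2, 15)                       # 2..14, ace high
DECK = [(r, s) for r in RANKS for s in range(4)]

def rank5(cards):
    """Score one 5-card hand; higher tuples beat lower tuples."""
    ranks = sorted((r for r, _ in cards), reverse=True)
    groups = sorted(Counter(ranks).items(), key=lambda x: (x[1], x[0]), reverse=True)
    ordered = [r for r, _ in groups]
    flush = len({s for _, s in cards}) == 1
    uniq = sorted(set(ranks), reverse=True)
    straight = 0
    if len(uniq) == 5:
        if uniq[0] - uniq[4] == 4:
            straight = uniq[0]
        elif uniq == [14, 5, 4, 3, 2]:     # wheel
            straight = 5
    if flush and straight:                 return (8, straight)
    if groups[0][1] == 4:                  return (7, ordered)
    if groups[0][1] == 3 and groups[1][1] == 2: return (6, ordered)
    if flush:                              return (5, ranks)
    if straight:                           return (4, straight)
    if groups[0][1] == 3:                  return (3, ordered)
    if groups[0][1] == 2 and groups[1][1] == 2: return (2, ordered)
    if groups[0][1] == 2:                  return (1, ordered)
    return (0, ranks)

def best7(cards):
    return max(rank5(combo) for combo in combinations(cards, 5))

def equity(hand_a, hand_b, trials=10_000, rng=random.Random(1)):
    """Approximate preflop win/tie frequencies for hand_a vs hand_b."""
    live = [c for c in DECK if c not in set(hand_a) | set(hand_b)]
    wins = ties = 0
    for _ in range(trials):
        board = rng.sample(live, 5)
        a, b = best7(list(hand_a) + board), best7(list(hand_b) + board)
        wins += a > b
        ties += a == b
    return wins / trials, ties / trials

# Example: QQ v AK offsuit ("Pair v Connected overcards" above).
qq  = [(12, 0), (12, 1)]
ako = [(14, 2), (13, 3)]
print(equity(qq, ako))   # QQ should win somewhere in the mid-to-high 50s; the table rounds this to 55% v 45%
```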

I agree that this section isn't in the best format, nor is it particularly well written; I can't exactly edit-war with the silent IP person who thinks differently, though. JaeDyWolf ~ Baka-San (talk) 10:56, 9 January 2013 (UTC)[reply]

"Showdown" section: Is Pokergym unreliable?


I'd like to express a little concern with a few edits which reference a "Pokergym" website, which I don't think can be included as a reliable source. The section added to this article cites statistics that aren't actually present on the referenced page. There are also some edits made at List of visual mnemonics which describe a bizarre method of remembering that a flush beats a straight that involves bacteria. There is no indication of who has published this information at the source and these edits have all been made by User:Jungle2010. I'd like to request further advice but I don't think these edits should remain. JaeDyWolf ~ Baka-San (talk) 07:46, 4 April 2013 (UTC)[reply]

Nevermind. After sleeping on it I've reverted them anyway and left a notice on the user's talk page. JaeDyWolf ~ Baka-San (talk) 09:23, 5 April 2013 (UTC)[reply]

Tagging article, do not revert without reading and discussing here


I am a professor, a long-time contributor to Wikipedia, and a semipro poker player. I state this because I have no beef at all with poker articles, and because I know when to call something sourced versus original.

This is simply not an encyclopedic article, and stands in repeated violation of WP policies. It is, in essence, a continuing compendium of contributors' original research on the title topic of poker probabilities (judging by the material, the Talk page, and the lack of inline and real citations for content). As such, the whole of the article is essentially not verifiable. I do not mean the research is not reproducible, though from the Talk page it is clear that at times this (immaterial, and unconscionable) admission of non-reproducible original work also arises. We need to be citing examples of calculations and simulations done by others, not doing original research, in the way of calculations and simulations, ourselves. I would mark this article for deletion if I could. It at least needs a complete rewrite, with removal of anything not drawn from an independent (not editor self-generated) source. Le Prof Leprof 7272 (talk) 20:42, 2 March 2015 (UTC)[reply]

Math


Where's the math supporting some of this stuff? The thing that specifically caught my eye was the section on things that can happen on the flop (e.g. rainbow). If this is supposed to be an article about probability rather than poker strategy, I think the math that is being used should be there to support it. --Stevehim (talk) 03:23, 19 January 2017 (UTC)[reply]

Delete article?


I see no way to repair this article, and am willing to propose deletion. Anyone object here? Power~enwiki (talk) 22:54, 22 May 2017 (UTC)[reply]