During a conversation on Monday at the Goldman Sachs Technology and Internet Conference, Apple CEO Tim Cook said something that would turn the head of any student of statistics:

“We don’t believe in such laws as laws of large numbers. This is sort of, uh, old dogma, I think, that was cooked up by somebody and Steve [Jobs] did a lot of things for us for many years, but one of the things he ingrained in us [is] that putting limits on your thinking [is] never good.”

It’s likely that Cook was objecting to the belief, held by many critics, that large, mature companies like Apple can’t sustain high growth rates and surging stock prices.

However, the law of large numbers has nothing to do with large companies, large revenues, or large growth rates. The law of large numbers is a fundamental concept in probability theory and statistics, tying together theoretical probabilities that we can calculate to the actual outcomes of experiments that we empirically perform.

## Here’s what the law of large numbers actually means

Among the fundamental objects of study in probability theory are random processes that spit out numbers according to some probability distribution — a set of rules for how likely each outcome is. The law of large numbers states that the more times we repeat such a process, the closer the average of the randomly generated numbers comes to the theoretical average implied by the probability distribution.

To see this in action, consider randomly choosing among 1, 2, and 3, with each number having an equal 1/3 chance of being selected. Since we know these probabilities, we can calculate the “expected value” of our process: the long-run average value we should get from randomly drawing from these three numbers.

The expected value for a process is calculated by taking each outcome, multiplying it by the probability of that outcome, and adding together all those numbers. In our example, the expected value is 1 × 1/3 + 2 × 1/3 + 3 × 1/3 = 1/3 + 2/3 + 1 = 2. That makes sense, since we are randomly choosing equally from these three numbers, and 2 is in the middle of that set.
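That calculation can be written out in a few lines of Python (a sketch of the arithmetic above, using exact fractions to avoid floating-point noise):

```python
from fractions import Fraction

# Each of the three outcomes is drawn with probability 1/3.
outcomes = [1, 2, 3]
p = Fraction(1, 3)

# Expected value: sum of (outcome × probability) over all outcomes.
expected_value = sum(x * p for x in outcomes)
print(expected_value)  # 2
```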

The law of large numbers, then, says that 2 actually will be the long-run average result if we randomly pick from these three numbers a huge number of times. Using Excel, we can actually do this. Here’s the result of randomly choosing among 1, 2, and 3 a total of 1,000 times, keeping a running average as we go:

While the average bounces around during the first few trials, by the time we’ve drawn 100 numbers we get pretty close to our expected average of 2, and adding further trials doesn’t take us far from that expected average.
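The same experiment can be reproduced in a few lines of Python instead of Excel. This is a sketch; the seed is arbitrary and just makes the run reproducible:

```python
import random

random.seed(0)  # arbitrary seed, for reproducibility

# Draw 1,000 numbers, each equally likely to be 1, 2, or 3.
draws = [random.choice([1, 2, 3]) for _ in range(1000)]

# Keep a running average as we go.
running_total = 0
running_averages = []
for i, x in enumerate(draws, start=1):
    running_total += x
    running_averages.append(running_total / i)

print(running_averages[9])    # average after 10 draws: can be far from 2
print(running_averages[-1])   # average after 1,000 draws: very close to 2
```

Rerunning with different seeds changes the early wobble, but the final average lands near 2 every time — which is exactly what the law of large numbers predicts.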

The law of large numbers is why you can never beat the house in a casino. Casino games are usually designed to have a small negative expected value for the player, and therefore a small positive expected value for the house. Even though individual gamblers might win big from time to time, the law of large numbers all but guarantees that, across hundreds or thousands of gamblers and over months or years, the average outcome for the players will be a loss, giving the house its profit.
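As a sketch of that house edge, consider an even-money bet on red at European roulette: the player wins $1 with probability 18/37 and loses $1 otherwise, so the expected value per bet is (1)(18/37) + (−1)(19/37) = −1/37 ≈ −$0.027. Simulating a large number of such bets shows the average result settling near that figure:

```python
import random

random.seed(1)  # arbitrary seed, for reproducibility

def spin():
    """One even-money roulette bet: +$1 with probability 18/37, else -$1."""
    return 1 if random.random() < 18 / 37 else -1

n_bets = 100_000
average_result = sum(spin() for _ in range(n_bets)) / n_bets
print(average_result)  # close to -1/37, i.e. about -0.027 per bet
```

A single bet is nearly a coin flip, but averaged over 100,000 bets the small negative expected value reliably shows through — that’s the house’s profit.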

So Tim Cook saying that he doesn’t believe in the law of large numbers is kind of strange. This is a basic fact about how probability works. Not believing in the law of large numbers is equivalent to not believing that 2+2=4, or that 7 is a prime number.

The confusion seems to come from the fact that in financial circles, the intuitive tendency that it’s hard for a large company to keep growing at rapid rates has also been referred to as the law of large numbers. While that tendency makes sense — once you’ve sold a billion iPhones, finding customers for the second billion is likely to be a little trickier — calling it the law of large numbers is frustrating, since the probability theory law of large numbers has had that name for almost 200 years.

While Tim Cook can justifiably be proud of Apple’s continuing unexpectedly strong growth, mathematicians and statisticians might prefer that people stop dragging the name of one of the most important theorems in probability theory through the mud.