Do foul shooters choke in the last minute of close games?
Searching Google Scholar for studies about "choking," I came across an interesting one, a short, simple analysis of free-throw shooting in NBA games.
It's called "Choking and Excelling at the Free Throw Line," by Darrell A. Worthy, Arthur B. Markman, and W. Todd Maddox. (.pdf)
The authors looked at all free throws in the last minute of games in the three seasons from 2002-03 to 2004-05. They broke their sample down by score differential, and compared the success percentage to the players' career percentages.
They found that for most of the score differentials, the shooters converted less often than expected. Here's the data as I read it off the graph (but see the PDF for yourself). The score differential is from the shooting team's perspective, and the figures are differences from career averages, in percentage points.
-5 points: -3%
-4 points: -1%
-3 points: -1%
-2 points: -5% (significant at 5%)
-1 points: -7% (significant at 1%)
+0 points: +2%
+1 points: -5% (significant at 5%)
+2 points: +0%
+3 points: -1% ("also significant")
+4 points: +1%
+5 points: -1%
The authors conclude that choking occurs, especially when down by 1 point.
It may not be obvious at first glance from the chart, but there's a tendency to "choke" all the way down the list: 8 of the 11 categories are negative, and only 3 are at zero or better (and the negatives are generally bigger than the positives). Do players actually shoot worse in the last minute of close games?
I couldn't think of any statistical reason the results might be misleading ... but in an e-mail to me, Guy came up with a good one.
Suppose that career shooting percentage is not always a good indicator of a player's percentage that game. Maybe it varies throughout a career, somehow -- higher at peak age and lower elsewhere, or, even, increasing throughout a career. (It doesn't matter to the argument *how* it varies, just that it does.)
You might expect that the differences would just cancel out. However, the shooters whose career rate overestimates their rate that day would appear in each category more often than the underestimated shooters. Why? Because they miss the first shot more often, and so take a second shot *within the same score category*.
As an example, suppose two players have 75% career percentages, but, on this day, A is a 100% shooter and B is a 50% shooter. Suppose they each go to the line twice with the game tied. On their first shots, A makes both of his and B makes one of two. So far, their combined percentage is 75%, as expected. Perfect.
But only B gets to take a second shot with the game still tied. He does that once, on average: the one time in two that he misses the first shot. And he makes that shot half the time.
So, on average, you have these guys taking five shots, and making 3.5 of them. That's 70 percent -- 5 percentage points less than the career average would suggest.
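To make the bookkeeping explicit, here's a quick sketch of that calculation in Python (my own, not from the paper), using the same assumption as the example above: the second shot of a trip stays in the "tied" bucket only when the first shot misses.

```python
def tied_category_pct(true_rates, trips=2):
    """Expected FT% observed in the tied-score bucket, assuming each
    shooter's first shot of a trip is taken with the game tied, and the
    second shot stays in that bucket only if the first shot missed
    (a make changes the score, pushing the second shot into the +1 bucket)."""
    shots = makes = 0.0
    for p in true_rates:               # p = the shooter's true rate *that day*
        shots += trips                 # first shots: always in the tied bucket
        makes += trips * p
        shots += trips * (1 - p)       # second shots: only counted after a miss
        makes += trips * (1 - p) * p
    return makes / shots

# A shoots 100% today, B shoots 50%; both carry 75% career marks.
print(tied_category_pct([1.0, 0.5]))   # 0.70 -- five shots, 3.5 makes
```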
Now, the numbers I used here are not very realistic -- nobody's a 100% shooter, and hardly anyone is a 50% shooter. What if I change it to 80% and 70%?
Then, following the same logic, and if my arithmetic is correct, those two players combined would make 74.8% of their shots instead of 75%. It's still something, but not nearly enough to explain the results. Even so, I really like Guy's explanation.
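The same bookkeeping with the more realistic rates checks out (again, my own sketch, not the paper's calculation):

```python
# Same selection effect, but with 80% and 70% shooters instead of 100% and 50%.
shots = makes = 0.0
for p in (0.80, 0.70):
    shots += 1 + (1 - p)        # first shot, plus a second shot only after a miss
    makes += p + (1 - p) * p
print(round(makes / shots, 3))  # 0.748 -- the 74.8% figure above
```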
So, there you go: it does look like, for those three seasons, players shot worse in the last minute than expected. Can anyone think of an explanation, other than "choking" and luck, for why that might be the case? Has anyone done this kind of analysis for other seasons to confirm these results?
UPDATE: Maybe it's just fatigue! See comments.