Wednesday, May 14, 2008

The NFL passing premium revisited

There is a so-called "passing premium" in the NFL. A passing play, on average, will gain more yards than a running play, even after accounting for turnovers. But teams still call for the run, and fairly often. What can explain that?

In an article in the current JQAS, Duane Rockerbie takes a shot at the question. His study is called "The Passing Premium Puzzle Revisited."

Rockerbie argues that the effect is caused by risk aversion on the part of the coaches. Passes gain more than runs, but they are also riskier – sometimes they succeed spectacularly, and sometimes they fail badly. Rushing plays also vary, but not as much as passes.

Risk aversion is an established concept in finance. Investments with little or no variance, such as short-term government bonds, are expensive, yielding only a couple of percentage points. Investing in the stock market gives average returns that are much higher – but if you invest in equities, you take the risk of a decline or a crash.

Rockerbie argues that the same is true in football. Coaches are willing to accept fewer yards in exchange for less risk, and that's why they often call for the run. They care about the average gain, but they care about the risk, too.

So, as expected, strong rushing teams will call more rushing plays than weak rushing teams, and weak passing teams will call more rushing plays than strong passing teams. But, also, teams with *predictable* rushing results and varying passing results will call more rushing plays than teams with predictable passing and varying rushing.

Economists model risk aversion in terms of "diminishing marginal utility." What that means is that the more money you have, the less each additional dollar means to you. No matter how rich you are, you will never take an even gamble for $10, because the $10 you might lose would mean more to you than the $10 you might win. (To make you take the bet, I'd have to offer you, maybe, $11 or $12 against your $10.) Diminishing marginal utility is, on its own, enough to explain risk aversion.
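(To make that concrete: here's a minimal sketch, not from the paper, using a logarithmic utility function – a standard textbook choice for diminishing marginal utility – and a made-up $1,000 bankroll.)

```python
import math

def log_utility(wealth):
    # Logarithmic utility: each extra dollar adds less utility than the last.
    return math.log(wealth)

bankroll = 1000.0  # made-up starting wealth

# Even gamble: win $10 or lose $10 on a coin flip.
u_stand_pat = log_utility(bankroll)
u_even_bet = 0.5 * log_utility(bankroll + 10) + 0.5 * log_utility(bankroll - 10)
print(u_even_bet < u_stand_pat)   # True: the even gamble is declined

# Sweetened gamble: my $11 against your $10.
u_sweetened = 0.5 * log_utility(bankroll + 11) + 0.5 * log_utility(bankroll - 10)
print(u_sweetened > u_stand_pat)  # True: a big enough edge overcomes the aversion
```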

And so Rockerbie uses a standard economics utility function, one that mathematically captures the idea that dollars are worth less the more of them you already have. And he applies it to yardage – the more yards you've already gained, the less you care about the next few yards. (There's no evidence that football yards should follow the same mathematical marginal utility function as dollars, but I guess you have to start somewhere.)

Using that equation, and full regular-season data for the 2006 NFL season, Rockerbie sets out to figure out what proportion of running plays "should" have been called for each team. However, doing this requires him to estimate a mathematical parameter for risk aversion. Because the San Diego Chargers had the best record in 2006, he uses the parameter that makes the Chargers look like they made a perfect decision. (In Figure 3, he shows that choosing some other parameter wouldn't affect the results all that much.)
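(For what it's worth, here's a rough sketch of what that kind of optimization looks like. This is not Rockerbie's Equation 6 – it's just the textbook portfolio-style calculation, with a mean-variance utility, independent plays, and made-up team numbers – but it shows how the means, the variances, and the risk-aversion parameter combine into an "optimal" run proportion.)

```python
def optimal_run_share(mu_run, mu_pass, var_run, var_pass, risk_aversion):
    # Maximize p*mu_run + (1-p)*mu_pass - (lam/2)*(p^2*var_run + (1-p)^2*var_pass)
    # over the run proportion p, treating plays as independent (a big simplification).
    lam = risk_aversion
    p = (mu_run - mu_pass + lam * var_pass) / (lam * (var_run + var_pass))
    return min(max(p, 0.0), 1.0)  # clip to a legal proportion

# Made-up per-play numbers: passes gain more on average, but vary a lot more.
print(optimal_run_share(mu_run=4.0, mu_pass=6.0,
                        var_run=9.0, var_pass=60.0,
                        risk_aversion=0.2))   # about 0.72 -- mostly running
```

Shrink the risk-aversion parameter toward zero and the optimal mix collapses to all-pass, which is why the choice of that parameter matters so much – and why Rockerbie has to pin it down somehow, in this case by calibrating to the Chargers.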

The results: two teams ran more often than they "should have" – the Patriots and Saints – and the rest of the teams (not including the Chargers, who were modelled to be perfect) ran less often than they "should have."

Rockerbie shows that, generally, the farther a team was from its optimal run/pass proportion, the worse its record. That fact, he argues, supports the model.

But I don't think it does.

Actually, it's Brian Burke who doesn't think it does; I'm just agreeing with him. Brian notes (in parts 2 and 3 of a recent three-part posting) that it's not running that leads to wins, but the other way around. Teams with a fourth-quarter lead will call a lot of running plays, to reduce the risk of a turnover and run out the clock. And that explains the correlation more convincingly than risk aversion does.

Further, Brian points out, in a financial application, every dollar is of equal value. But in football, there are other constraints, like having to earn ten yards in four downs, or suffer a big loss. As Brian says,



"At the end of each year no one takes most of your money away if your mutual funds don't earn at least 10%. If they did, and you hadn't made your 10% by November, your risk tolerance would dramatically increase for the final 2 months of the year."

So there are situations in the game – third and 9, say – where you have to take more risk, and probably call for a passing play. And any analysis has to take those constraints into account.

Also, it's not just the down-and-yards situation that calls for an increase in passing plays – it's the score. The value of an additional yard may be reduced, on average, the more yards you have. But the goal of football is not to gain yards – it's to win the game. No matter how many yards you have, if you're down by 2 points on your own 20 late in the fourth quarter, you'd better try a few throws.

And so, the more points – or yards – the other team has, the riskier you have to be. So, as Brian notes, the better your defense, the more you can afford to run.

Brian's conclusion is that he doesn't think the study shows what it purports to show, and I think he's right on in his arguments.

--

A couple of additional things that bother me a bit:

First, the study uses the teams' raw numbers from the 2006 season. But diminishing marginal utility doesn't depend on the outcome; it depends on the *expectation* of outcomes. And observed outcomes are, on average, more extreme than their expectations – the guy who went 3-for-5 probably wasn't really a .600 hitter. You have to regress all the results to the mean, somewhat, to get the actual expectation. Rockerbie doesn't do this; he explicitly assumes that every team's observed performance is an unbiased estimate of its talent, which is false.

--

Secondly – and most important – shouldn't the risk-aversion model account for the fact that risk is reduced when you have multiple plays? The risk-return tradeoff is different for a single play than it is for fifty consecutive plays.

Suppose I offer to bet you on a coin flip, at favorable odds of my $15,000 against your $10,000. You recognize that the bet is weighted heavily in your favor, but, because you're risk averse, you reluctantly turn me down – the risk of losing $10,000 on a coin flip is too much for you.

But, now, suppose I offer to play the same game with you, not just once, but one hundred consecutive times. Now, the odds of you losing money are very small. To finish in the red, you'd have to lose more than 60 of the 100 coin flips, and the probability of that is under two percent. So you'd accept the bet and get rich.
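(A quick back-of-the-envelope check of that, using the dollar figures from the example:)

```python
from math import comb

n_flips = 100
# Each flip: win $15,000 or lose $10,000, 50/50.
# Break-even is 40 wins, since 40 * 15000 == 60 * 10000.
# You finish in the red only with 39 or fewer wins.
p_in_the_red = sum(comb(n_flips, k) for k in range(40)) / 2 ** n_flips
print(round(p_in_the_red, 3))   # about 0.018 -- under a 2% chance of losing money
```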

It seems to me, if I'm getting this right, that the analysis in this study is for a single play. (Specifically, in Equation 6, the sigma-squareds are those hypothesized for one play.) But over, say, 100 plays, the variance of the average play would be only 1/100th of what it is for a single play. Over a season, you could argue that the variances are close enough to zero that you should just go 100% for the play (probably passing) with the higher expected gain.
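(Here's the same mean-variance idea as in the sketch above, with the same made-up per-play numbers, showing what dividing the variance by the number of plays does to the run/pass comparison:)

```python
def play_score(mean_yards, var_yards, risk_aversion):
    # Mean-variance "certainty equivalent": penalize variance.
    return mean_yards - 0.5 * risk_aversion * var_yards

per_play = {"run": (4.0, 9.0), "pass": (6.0, 60.0)}  # made-up (mean, variance)
lam = 0.2

for n_plays in (1, 100):
    for play, (mu, var) in per_play.items():
        # Variance of the *average* gain over n independent plays is var/n.
        print(n_plays, play, round(play_score(mu, var / n_plays, lam), 2))
# With n = 1 the run scores higher (3.1 vs 0.0); with n = 100 the pass
# dominates (5.94 vs 3.99), because almost all the passing risk averages out.
```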

That is: for a single play, if you pass, there's the possibility you gain lots of yards (a completion), but the possibility that you lose lots of yards (an interception). Even though there's a higher chance of a completion than a turnover, it's still risky. But over a season, the chance of having more interceptions than completions is very close to zero. So shouldn't lots of passing be your best option, even if you're risk averse?

Am I missing something?

--

Finally, there are results in the table that confuse me. I may not understand how this really works, but, if you look at the chart, for the Raiders, Steelers, Falcons, Broncos, and Titans, (a) running has a higher expectation than passing, and (b) running has a lower variance than passing.

For those five teams, shouldn't a risk-aversion model predict that they should run 100% of the time? According to this model, why would you EVER pass if your expectation is lower and your risk is higher?




4 Comments:

At Wednesday, May 14, 2008 3:04:00 PM, Blogger JTapp said...

I think Prospect Theory (Kahneman and Tversky) sheds some light on this. People (and coaches) are not so much risk-averse as they are LOSS averse. Risk-aversion changes with the situation they are presented with. People are more risk-seeking when faced with a prospect of a loss.

For example, if you told a coach he could get 3 yards at 100% certainty with a run, but told him with a pass he could have an 80% chance of 5 yards and a 20% chance of zero, he’d likely just take the guaranteed 3 yards, even though his likely payoff would be 4 yards by passing. He’d be risk-averse.

But, if you told the coach he would 100% lose 3 yards by running, but if he passed he had a 20% chance of losing 0 yards and an 80% chance of losing 5 yards, he’d likely gamble with the pass even though he would likely lose 4 yards. He’d be risk-seeking.

The mathematical probabilities are the same in both examples, just flipped. There are a ton of examples in the research and in finance of this type of behavior.
My limited example illustrates what the Theory has shown; you can invent your own examples.

 
At Thursday, May 15, 2008 6:56:00 PM, Anonymous Anonymous said...

I have not read the study or the rebuttal yet, but the principal reason why football teams don't opt for the play that yields the most average yards (other than the yards needed for a first down, which is a critical determining factor of course) is game theory. You can't pass or run too much or the defense will key on that kind of play and eventually it won't be worth nearly as much as it starts out being worth.

For example, suppose a pass, given normal defense for the game situation, is worth 5 yards and a run is worth 3.5 yards. If you opted to pass all the time, the defense would adjust, key in on the pass only, and the pass would now be worth 4 yards and the run 4.5 yards (or whatever). THAT is why teams can't choose to pass or run simply according to how many yards each is worth given the defense and given how often they typically pass and run.

Without first figuring out the correct balance between running and passing according to game theory (and that is not easy), any analysis is useless.

I assume that the author addressed this issue, but you don't mention it in your critique.

MGL

 
At Thursday, May 15, 2008 7:21:00 PM, Blogger Phil Birnbaum said...

Actually, I did mention it originally, then deleted it ... probably should have left it in. The author actually never mentions the game theory aspect, which is a curious omission -- however, since he did come up with a mix of running/passing for all situations, maybe he thought it was obvious.

 
At Monday, May 19, 2008 11:27:00 AM, Anonymous Anonymous said...

I’ve never understood why it’s considered a “paradox” that the average yardage gains for passes and runs aren’t the same. Since the greatest imperative is to make a first down within 3 plays (in most cases), the probability distribution of gains will matter a lot.

Let’s take an extreme example: every run gains 3.5 yards, while pass plays yield a 10-yd completion 45% of the time and an incomplete 55% of the time. Passes have a much higher average gain (4.5 vs. 3.5). Yet the correct strategy is obviously to run every play, which will always yield a first down after 3 plays (and thus will always yield a touchdown). In contrast, a team that always passed would give up possession about 17% of the time.

We can't calculate the "right" frequency of running and passing without accounting for the likelihood of reaching a first down within 3 plays. And there’s no reason to think average yardage (even if interceptions and fumbles were properly accounted for) is an adequate proxy for that.

 
