Tango on the 1992-94 home run explosion
There was a large increase in major-league home run hitting in 1993 and 1994, and it wasn't a one-time blip. In 1992, there were 0.721 home runs per game; in 1994, there were 1.033. That higher rate has pretty much stayed with us to the present day.
What happened? There are various theories. One says that the 1993 expansion brought in a bunch of inferior pitchers, and the dilution of talent caused the numbers to jump. Another theory says that it's the ballparks – Coors Field entered the National League in 1993. A third theory says it's the ball: it was juiced up around that time, and remains juiced today.
In a post about a year ago, I argued that expansion couldn't have caused an effect as big as the one we saw. With reasonable assumptions, you can show that expanding by two teams should cause home runs to increase by only 3%. And you could also do the same for ballparks – even with Coors Field now in the equation, and even combining that with the 3% for expansion, there's still no way you can explain the 40% increase in home runs.
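As a quick check on that figure, the per-game rates quoted at the top of the post do work out to an increase of roughly that size. A minimal sketch (the two rates are taken from the post itself):

```python
# League-wide home run rates per game, as quoted in the post.
hr_per_game_1992 = 0.721
hr_per_game_1994 = 1.033

# Percentage increase from 1992 to 1994.
jump = hr_per_game_1994 / hr_per_game_1992 - 1
print(f"1992 -> 1994 increase: {jump:.0%}")  # about 43%
```

So "40%" is, if anything, a slight understatement of the jump.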
But for those who don't follow that logic, Tom Tango has an excellent study that should now win over the unconvinced.
Tango looked at the 1993 season, and compared it to 1992. But he stripped 1993 down, considering only players who played in 1992, in parks that were in existence in 1992. And he adjusted all the players' stats to give them equal playing time in 1992 and 1993.
The results: even among the incumbent players and parks, there was an 18% increase in home runs.
Repeating the study for 1993/1994, Tango found a second increase, this time of 20%.
Over two years, even without the effects of expansion, and the effects of new parks, there was a 42% jump in home runs.
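The 42% comes from compounding the two single-season increases rather than adding them. A one-line sketch, using the percentages from the study as quoted above:

```python
# Tango's single-season increases among incumbent players and parks.
increase_1993 = 0.18  # 1992 -> 1993
increase_1994 = 0.20  # 1993 -> 1994

# Successive percentage changes compound multiplicatively.
two_year = (1 + increase_1993) * (1 + increase_1994) - 1
print(f"Two-year increase: {two_year:.0%}")  # 42%
```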
So it can't be the parks. And it can't be expansion.
I don't think this finding is a big surprise, but it's so thorough, and so understandable, that even non-sabermetric fans should be convinced. You'd hope.
Anyway, if it wasn't the parks or expansion, was it the ball? Tango presents convincing evidence to say it was.
According to Dr. James Sherwood, MLB's ball tester, minor-league balls travel 391.8 feet under the same conditions in which major-league balls travel 400.5 feet – a difference of 8.7 feet. And, according to Greg Rybarczyk of Hit Tracker Online, if you eliminated all home runs in 2006 that cleared the wall by less than 8.7 feet … you'd have roughly the same home run rate as before the jump.
I think Tango has found the answer.
7 Comments:
There's an aspect of the home run increase that still needs a closer look after Tango's study: aging.
Hypothetically, if every batter in 1992 had been 24, we'd expect big power increases in 1993 and 1994 from their typical gains in power as they turned 25 and 26, without steroids, juiced balls, narrower strike zones, worse pitchers or anything else.
The gain and loss of power on the aging curve is not symmetrical, so gains from 24 to 25 would not be cancelled by, say, losses from 32 to 33. Without adjusting for age, I think Tango's matching method would be biased toward showing some increase in a skill like power hitting, which improves rapidly in young players and declines more slowly in old players, since all the matched players are a year older in year 2 of the match.
In fact 1992 did have a relatively high proportion of under-26 players, compared to later in the 90s (21.6% of PA by under-26 hitters in 1992, vs. 18.6% in 1993 and 16.7% by 1996).
The increase Tango found is so large that I think correcting for age would not deflate it too much.
However, his suggestion that we could attribute the HR increase to the ball alone depends in part on the assumption that the players were "the same."
Agreed: there's a hidden assumption there, that the pitchers and hitters aged at the same rate with respect to home run rates.
What you could do is repeat the study for different pairs of years, to get a baseline for what the age effect is, and subtract that out. But, as you say, the effect Tango found is so large that aging would explain only a small part of it.
I actually did do it for every year since 1957. The age bias, survivorship bias, and whatever other biases you would get would be basically the same every year. That is, if the 20% or 15% was due in large part to this bias, we'd see it every year.
The average and median matched-pair increase was 5%. And since 1995, it's 2%. For the non-matches, it's been 0% since 1957.
As for whether this means that HR hitters age better than pitchers age at preventing HRs – that's certainly possible. It of course comes nowhere close to explaining what we did see.
***
One of the largest increases was 1968/69 of 37% for the matched pairs, and 25% for the rest of the league.
The largest was 1976/77, with 59% for the matched pairs and 48% for the rest of the league.
The largest decrease was 1987/88, with a 30% drop for the matches and 28% for the rest of the league.
Good stuff!
Isn't it accepted that players develop more power as they age? I thought that was conventional wisdom. That would explain the 2-5% increase among the matched pairs.
Yes, my charts show HR hitting peaking in the late 20s, speed peaking in the early 20s, and walks peaking in the late 30s.
The question would be of the balance between hitters and pitchers. That is, if hitters peak in HR hitting at 29 and pitchers peak in HR preventing at 20 (an illustration), then you get the results I'm showing. If they both peaked at age 29, then you wouldn't get the results I'm showing.
Have you ever considered the overall progress of bat technology? In a study titled "Game Theory" done by Bill James, he suggests that much of the offensive explosion can be attributed to bat design rather than steroids. Any thoughts?
Almost all the increase came between 1992 and 1994. So if it were the bats, then almost all players would have had to switch bats in that two-year timespan. My guess is that that's not likely, and if it happened, we'd probably have heard about it.