Interceptions don't matter. Well, that's not really true. Interceptions and turnovers in general are major factors in wins and losses. It's just that the actual throwing of interceptions is, by and large, a random process. According to Football Outsiders, the year-to-year correlation coefficient in interceptions thrown was just .08 from 2007 to 2010. That is: only a tiny part of a quarterback's interceptions in any given year can be predicted by looking at the number of interceptions he threw the previous year.
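If you want to see what that correlation coefficient actually measures, here's a minimal Python sketch. The interception totals below are invented purely for illustration; the point is the mechanics of pairing each quarterback's total in one season with his total in the next and computing a Pearson correlation.

```python
# Illustrative only: these interception totals are invented, not real data.
# Pair each quarterback's Year N interception total with his Year N+1 total,
# then compute the Pearson correlation across the pairs.
year_n  = [12, 18, 9, 22, 14, 7, 16, 11]   # hypothetical Year N totals
year_n1 = [15, 10, 17, 13, 8, 19, 12, 20]  # same QBs, hypothetical Year N+1 totals

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

print(round(pearson(year_n, year_n1), 2))
# A value near zero (like the .08 Football Outsiders found) means last year's
# interception total tells you almost nothing about this year's.
```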
This should make some intuitive sense. Despite our constant abhorrence of awful decisions and terrible throws, the truth is that those represent only an exceedingly small fraction of the throws a quarterback attempts. Interception rates generally fluctuate between 2% and 4%, which means a few extra tipped balls or a few dropped interceptions can make a massive difference in a quarterback's interception total from season to season.
The implication is fairly simple: don't focus so much on Josh Freeman's two late-season, four-interception games. Those were by and large fluky and not indicative of the rest of the season, in which Freeman posted a very normal 1.9% interception rate, compared to his season-long average of 3.1%. Two games can make a big difference in our perceptions.
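To see how little it takes, here's a rough back-of-the-envelope sketch in Python. The attempt counts are assumed round numbers rather than official stats; only the 1.9% and 3.1% rates and the eight interceptions from those two games come from the argument above.

```python
# Hypothetical attempt counts; only the 1.9% / 3.1% rates and the two
# 4-interception games come from the article.
other_attempts = 475                     # assumed attempts outside the two bad games
other_ints = other_attempts * 0.019      # ~9 interceptions at a 1.9% rate
bad_game_attempts = 80                   # assumed attempts across the two 4-INT games
bad_game_ints = 8

season_rate = (other_ints + bad_game_ints) / (other_attempts + bad_game_attempts)
print(f"{season_rate:.1%}")              # ~3.1%: two games drag the whole season up
```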
Of course, there's also the other side of the argument. Football Outsiders' adjusted interception rate (which accounts for both tipped passes and dropped interceptions) correlates better from year to year, with a .33 correlation coefficient, and that measure had Freeman throwing interceptions at a somewhat ludicrous 4.3% rate. Interception rate isn't everything, as evidenced by Kevin Kolb, Blaine Gabbert and Sam Bradford all posting exceedingly low adjusted interception rates, but there's a point at which it becomes extremely harmful. And I'm fairly sure that when you're throwing an interception every 23 attempts, that point has been reached.
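As a quick sanity check, that "every 23 attempts" figure is just the 4.3% adjusted rate flipped upside down:

```python
# The 4.3% adjusted interception rate, inverted, works out to roughly one
# (adjusted) interception every 23 attempts.
adjusted_rate = 0.043
print(round(1 / adjusted_rate))   # -> 23
```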