Do Week 1 Results Predict Week 2 Results?

Do Week 1 results predict Week 2 results? Football science, statistics, the point spread, and NFL history can help make sense of betting trends.


Week 1 is in the books and Week 2 kicked off with a Baltimore Ravens’ win over the Pittsburgh Steelers Thursday night. Last Monday, we looked at what Week 1 can tell us about the season in general. This time, we’ll focus on what Week 1 can show us about Week 2 in particular.

Last time we needed to rely on preseason Team Totals to calibrate expectations, limiting our data to 1999-2013. This time, we’ll expand our data set – again using data from Pro-Football-Reference – to encompass all Week 1 games going back to 1978, so we’ll be looking at 1,061 Week 1 results. Scheduling oddities, such as Week 1 byes during the 1999-2001 seasons, reduce our dataset by a few games, which is why we have an odd number of games.

To refresh our memories, last time we found that teams that significantly overperformed or underperformed against the spread (“ATS”) in Week 1 went on to have substantially better or worse years than expected before the season. However, by Week 2 sportsbooks have had time to adjust their lines to Week 1 results, so it seems possible (probable?) that any “informational value” we obtained from Week 1 results is already priced in. A quick way to test this is to examine how much of a relationship, if any, exists between Week 1 results ATS and Week 2 results ATS.

Quick answer? Not much. Across 1,061 teams that played in both Weeks 1 and 2, the correlation between Week 1 and Week 2 ATS results is effectively zero (0.02, to be exact). There’s not much to be gleaned overall, but what might be learned from the real outliers?
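As a sketch of the computation involved – with made-up sample rows rather than the full 1,061-team dataset, and with helper names that are mine, not Pro-Football-Reference’s – the ATS margin is just actual margin minus the spread, and the headline number is the Pearson correlation between each team’s Week 1 and Week 2 ATS margins:

```python
# Sketch: correlation between Week 1 and Week 2 ATS margins.
# The rows below are a tiny illustrative sample; the real analysis
# would load ~1,061 team-seasons (1978 onward).

def ats_margin(actual_margin, spread):
    """Points by which a team beat (positive) or missed (negative)
    the spread, where `spread` is the points the team was favored by."""
    return actual_margin - spread

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical (team, week1_ats, week2_ats) rows, using the outliers
# mentioned in the text:
rows = [("CLE89", 49.0, 6.5), ("NYJ97", 40.5, -10.5), ("TB87", 40.0, -3.0)]
w1 = [r[1] for r in rows]
w2 = [r[2] for r in rows]
r = pearson_r(w1, w2)
```

On the full dataset this produces the near-zero 0.02 cited above; on a three-row sample like this it is meaningless, which is rather the point about small subsets.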

Our single best Week 1 result ATS was by the 1989 Cleveland Browns, who went on the road to Pittsburgh and won 51-0, thereby beating the spread by 49. Only the 1997 Jets (41-3 @ Seattle) and 1987 Buccaneers (48-10 vs. Atlanta) had results at least 40 points better than the spread. The Browns beat the spread by 6.5 in Week 2, the Jets failed to cover by 10.5, and the Bucs failed to cover by 3. Playing around with some arbitrary endpoints, we see that the 31 teams that covered by at least 27 points in Week 1 beat the spread by an average of 2.2 points in Week 2. However, expand that to the 36 teams that covered by at least 26 points, and the effect vanishes entirely. So such analysis must be treated with caution.
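The endpoint sensitivity described above is easy to check mechanically. This is a sketch with invented `(week1_ats, week2_ats)` pairs – not the real 31- and 36-team subsets – showing how the average Week 2 result is recomputed at each cutoff:

```python
# Sketch of an arbitrary-endpoint check: average Week 2 ATS margin
# for teams whose Week 1 ATS margin met a given threshold.
# The data points here are invented for illustration.

def mean_week2_ats(results, threshold):
    """Average Week 2 ATS margin among teams that covered by at
    least `threshold` points in Week 1; None if no team qualifies."""
    subset = [w2 for w1, w2 in results if w1 >= threshold]
    return sum(subset) / len(subset) if subset else None

results = [(49.0, 6.5), (40.5, -10.5), (40.0, -3.0),
           (27.5, 4.0), (26.0, -9.0)]
# One extra team admitted at a slightly lower cutoff can move the
# average noticeably -- the "+26 vs. +27" cliff in the text:
at_27 = mean_week2_ats(results, 27)
at_26 = mean_week2_ats(results, 26)
```

Sweeping the threshold and watching whether the average changes smoothly or jumps at one cutoff is exactly the robustness check applied to the underperformers below.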

What about the other end of the spectrum? How did the 1989 Steelers, the 1997 Seahawks, and the 1987 Falcons do in Week 2? The Steelers and Seahawks both failed to cover the next week, by 21 and 16.5 points respectively. The Falcons did, however, cover by 7.5. Similar arbitrary-endpoint analysis shows that the 77 teams that failed to cover by at least 21 points also failed to cover the next week by an average of 1.5 points, a fairly substantial margin in sports betting terms. Moreover, this effect is much more “robust” than the Week 1 ATS overperformances, showing no sudden cliffs like the one we saw between +26 and +27. In addition, the effect grows with the size of the underperformance: the 25 teams with Week 1 ATS results of -28 or worse went on to underperform by over 4 points in Week 2.

Consistent with what we saw in our earlier analysis comparing Week 1 under- and overperformers to their Team Totals, there is more to be learned from the underperformers ATS than the overperformers. Even then, it only applies to the very extreme cases. Only one 2014 Week 1 team, the St. Louis Rams (crushed 34-6 by the Vikings, failing to cover by 31.5 points), falls into this category. Historically speaking, teams like the Rams have consistently been overvalued going into Week 2 by almost a touchdown, underperforming in Week 2 by an average of 6.3 points.

Out of curiosity, I ran this same analysis for the Over/Under (the total number of points a sportsbook expects to be scored in a game). The overall correlation seen between a team’s Week 1 and Week 2 Over/Under results was effectively zero. However, while almost no effect was seen for the extreme Over teams, the teams that went extremely Under their Week 1 expectation tended to bounce back strongly in Week 2. This effect varied with the size of the Week 1 underperformance, but averaged a 4-point Week 2 overperformance for teams that went Under in Week 1 by at least 20 points; the lack of sharp cutoffs mitigates any arbitrary endpoint concerns we might have. This analysis flags two teams – Houston and Washington – as strong candidates for a Week 2 over.

Follow Konstantin on Twitter @kmedved.

Konstantin Medvedovsky analyzes data, looking at home-field advantage and weather, or whether early-season results are predictive.
