Elo ratings of the Premier League: mid-season review

This is the first of two posts looking at the mid-season performance of my football prediction and rating system, Predictaball. In this post I’m going to focus on the Elo rating system.

Premier league standings

I’ll first look at how the teams in the Premiership stand, both in terms of their Elo rating and their accumulated points, as displayed in the table below, ordered by Elo. Over-performing teams, defined as being at least 3 ranks higher by points than by Elo, are coloured green, while under-performing teams (the opposite) are highlighted in red.

Man City are dominating the Elo ranking, with 85 more points than second-placed Chelsea, which is to be expected given their 18 successive (often high-scoring) victories. The rating distribution is asymmetric, with 13 teams below the mean, emphasising the dominance of the top 6 (Everton in 7th are a long way behind 6th-placed Arsenal). The competitiveness of the top teams is highlighted by the fact that a mere 34 points separate second-placed Chelsea from 5th-placed Liverpool, less than half the gap from Man City to Chelsea.

| Elo rank | Team | Elo | Points | Points rank | Rank difference | Played |
|---|---|---|---|---|---|---|
| 1 | Man City | 1796 | 58 | 1 | 0 | 20 |
| 2 | Chelsea | 1711 | 42 | 3 | -1 | 20 |
| 3 | Tottenham | 1695 | 37 | 5 | -2 | 20 |
| 4 | Man Utd | 1687 | 43 | 2 | 2 | 20 |
| 5 | Liverpool | 1677 | 38 | 4 | 1 | 20 |
| 6 | Arsenal | 1633 | 37 | 5 | 1 | 20 |
| 7 | Everton | 1504 | 27 | 8 | -1 | 20 |
| 8 | Leicester | 1494 | 27 | 8 | 0 | 20 |
| 9 | Burnley | 1470 | 30 | 7 | 2 | 19 |
| 10 | Southampton | 1427 | 19 | 14 | -4 | 20 |
| 11 | Crystal Palace | 1425 | 18 | 15 | -4 | 20 |
| 12 | Bournemouth | 1407 | 17 | 18 | -6 | 20 |
| 13 | Watford | 1406 | 25 | 10 | 3 | 20 |
| 14 | West Ham | 1405 | 18 | 15 | -1 | 20 |
| 15 | Stoke | 1395 | 20 | 13 | 2 | 20 |
| 16 | Huddersfield | 1387 | 23 | 11 | 5 | 20 |
| 17 | West Brom | 1380 | 15 | 19 | -2 | 20 |
| 18 | Brighton | 1375 | 21 | 12 | 6 | 20 |
| 19 | Swansea | 1366 | 13 | 20 | -1 | 20 |
| 20 | Newcastle | 1357 | 18 | 15 | 5 | 19 |

Overall, Elo and points totals appear to be well correlated, with a few exceptions. For example, Tottenham are third in terms of Elo, but 5th in points. Likewise, Man Utd are second in the actual league, but only have the 4th highest Elo. Looking at the tail end of the league, we see similar phenomena: Brighton are 18th when ranked by Elo, but are 12th in the actual league, a difference of 6 ranks!

There are a number of reasons for this behaviour, the most obvious being that points don’t take the opponent’s strength into consideration, while Elo does. Winning a game against a team in the top 6 will result in more Elo points than beating a relegation candidate, but both wins are worth the same 3 league points. A strength of Elo is that by taking opponent strength into account it produces a fixture-independent table, whereas ranking by points isn’t entirely fair if a team has happened to play fewer games against the top 6 than others.
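
To make this concrete, here is a minimal sketch of a standard Elo update (Predictaball’s exact equations are in the Elo explanation post; the K-factor of 20 here is just an illustrative assumption):

```python
def expected_score(rating, opp_rating):
    """Standard Elo expected score: the higher-rated side is expected
    to take more of the available 'score'."""
    return 1 / (1 + 10 ** ((opp_rating - rating) / 400))

def elo_update(rating, opp_rating, score, k=20):
    """score is 1 for a win, 0.5 for a draw, 0 for a loss.
    k=20 is an assumed value, not Predictaball's actual parameter."""
    return rating + k * (score - expected_score(rating, opp_rating))

# A mid-table side (1400) gains far more Elo for beating a top-6
# side (1700) than for beating a relegation candidate (1350),
# even though both wins are worth 3 league points:
gain_vs_top = elo_update(1400, 1700, 1) - 1400     # ~17.0
gain_vs_bottom = elo_update(1400, 1350, 1) - 1400  # ~8.6
```

The asymmetry between the two gains is exactly why the Elo table and the points table can disagree.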

Due to the inclusion of margin of victory in the Elo update equation (see the Elo explanation post), a win by a larger score results in additional Elo points. This could partly explain why Stoke are ranked 15th by Elo but 13th by points, as they have the second worst goal difference in the league (-18). Another potential explanation for the discrepancy between ranks is how promoted teams are handled. Currently, only teams in the top 4 European leagues are tracked, so when a team is promoted up to the Premier League (or La Liga etc…), it is assigned the average rating of the relegated teams. Newcastle, Brighton, and Huddersfield were therefore all given the same rating (1350) at the start of the season, which isn’t entirely accurate, and it may take longer than 20 games for their ratings to converge on their actual values. This last point is quite important: Elo is a continuous score with only a soft reset each season, whereas points are obviously wiped clean each summer. A high rating doesn’t mean a team will be good now or in the future, only that they were good enough in the past, and a lot can change in the off-season.
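
One common way of folding margin of victory into an Elo update is to scale the usual step by a logarithmic multiplier; the sketch below uses that hypothetical form purely for illustration (Predictaball’s actual equation is in the Elo explanation post):

```python
import math

def mov_multiplier(goal_diff):
    """Hypothetical margin-of-victory term: the update grows
    logarithmically with the winning margin, so blowouts count for
    more but don't dominate."""
    return math.log(abs(goal_diff) + 1)

def elo_gain(expected, score, goal_diff, k=20):
    """Scale the usual k * (score - expected) step by the MOV term.
    k=20 and the log form are assumptions, not Predictaball's values."""
    return k * mov_multiplier(goal_diff) * (score - expected)

# Against an evenly matched opponent (expected = 0.5), a 4-0 win
# moves the rating further than a 1-0 win:
big_win = elo_gain(0.5, 1, 4)     # 20 * ln(5) * 0.5 ≈ 16.1
narrow_win = elo_gain(0.5, 1, 1)  # 20 * ln(2) * 0.5 ≈ 6.9
```

Under any such scheme, heavy defeats drain Elo faster than points, which is consistent with Stoke sitting lower by Elo than by points.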

The correlation between Elo rank and points rank is shown below (with a high r value). Teams above the line have a worse Elo than their points suggest (thereby over-performing in the real league), while teams below the line are scoring fewer points than their skill level would suggest (under-performing). The difference between the top and bottom of the league is clear: teams at the top show less variation between their two ranks, while teams in the lower half are more dispersed. The 3 promoted teams (Brighton, Newcastle, Huddersfield) are the 3 most over-performing teams (in that order), which suggests that setting their Elo equal to the average rating of the relegated teams isn’t entirely accurate, although in the absence of ratings for the lower leagues I can’t see a better way of handling this that still maintains a zero-sum system.
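
The promoted-team rule is simple, and it is easy to see why it keeps the system zero-sum: the same total rating that leaves with the relegated sides re-enters with the promoted ones. A quick sketch (the relegated teams’ final ratings here are illustrative numbers, not the actual 2016/17 values):

```python
def promoted_ratings(relegated, n_promoted=3):
    """Assign each promoted team the mean rating of the relegated
    teams. With equal numbers relegated and promoted, the league-wide
    rating total is unchanged, preserving the zero-sum property."""
    mean = sum(relegated) / len(relegated)
    return [mean] * n_promoted

# Illustrative relegated-side ratings averaging to the 1350 mentioned above:
print(promoted_ratings([1380, 1340, 1330]))  # → [1350.0, 1350.0, 1350.0]
```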

Expected goals

No football analytics post is complete without mentioning expected goals (xG), the stat so beloved by analysts and yet so poorly understood by football ‘experts’. I’m using the table of expected points found here, provided by Gracenote Sports and Simon Gleave, although note that it is one matchday behind. Under-performing teams are highlighted in red and over-performing teams in black, defined as a difference of at least 3 points from the expected total. It provides an alternative view of performance, looking at match-level data rather than just the result. Importantly, these two measures of over- and under-performance are not directly comparable: the Elo method is based purely on match outcomes, while the xG method compares outcomes with how the match was played.

There are a number of differences from my Elo ranking. Firstly, the top 6 are ordered differently, with Liverpool and Arsenal moving up into positions 2 and 3 respectively (although it looks like Arsenal are only ahead of Spurs on goal difference). Their model has identified Liverpool, Arsenal, and Spurs as under-performing, while according to Elo both Liverpool and Arsenal are ranked relatively fairly, although it agrees that Spurs are under-performing. Both systems agree that Man Utd are doing better than expected.

The biggest under-performer by xG is Crystal Palace, who are also among the biggest under-performers by Elo, along with Southampton and Bournemouth. The bottom half of the table doesn’t contain any under-performing teams. The over-performing teams as identified by xG are Burnley (first by quite some margin) and Huddersfield, both of which are considered to be over-performing by Elo but to a lesser extent, with Brighton the most over-performing according to Elo.

Rating change over the season

I’m interested to see how the teams’ ratings have changed over the course of the season, so I’ve plotted the temporal trend below. It can be hard to identify which team is which; any suggestions for how to better visualise 20 lines without much y-separation would be welcome!

The most immediate finding is the large separation between the top 6 and the bottom 14. This is slightly worrying as it leads to a sense of inevitability in games between a top-6 and a bottom-14 team, although from a more positive perspective it allows for potentially exciting games of football whenever the top 6 play each other. Man City’s fantastic season is shown here: they started the season ranked 3rd behind Chelsea and Spurs, but overtook both of them by the start of October and never looked back, while Spurs started to slide down the table. From my own perspective, I’m heartened to see Liverpool’s improvement following their 3-0 away win at Stoke at the end of November.

Looking at the bottom of the ratings, we can see Newcastle, Huddersfield, and Brighton starting the season at 1350 Elo, the lowest of all teams. Huddersfield immediately jumped up with their 3-0 away win at Crystal Palace before falling back into mid-table, where they are kept company by Brighton, while Newcastle started off well before going on a losing run from mid-November.

The table below displays the change in Elo across the half-season so far, once again demonstrating Man City’s superiority, having gained more than double the rating points of the second most improved team (Man Utd). Swansea’s dismal season also shows, having lost 72 points so far.

| Team | Current Elo | Δ Elo |
|---|---|---|
| Man City | 1796 | 145 |
| Man Utd | 1687 | 70 |
| Liverpool | 1677 | 59 |
| Burnley | 1470 | 56 |
| Huddersfield | 1387 | 29 |
| Chelsea | 1711 | 17 |
| Brighton | 1375 | 17 |
| Watford | 1406 | 11 |
| Arsenal | 1633 | 9 |
| Leicester | 1494 | 8 |
| Newcastle | 1357 | -1 |
| Tottenham | 1695 | -7 |
| Crystal Palace | 1425 | -26 |
| Everton | 1504 | -40 |
| Bournemouth | 1407 | -42 |
| Stoke | 1395 | -54 |
| Southampton | 1427 | -56 |
| West Ham | 1405 | -61 |
| West Brom | 1380 | -62 |
| Swansea | 1366 | -72 |

European leagues

Predictaball also tracks the other 3 major European leagues (La Liga, Serie A, and the Bundesliga), even though predictions for these matches aren’t tweeted. As this post is already longer than I expected, and because I know even less about these leagues than I do the Premiership (I don’t actually follow football that much, I just enjoy predictive modelling), I’m just going to display the Elo tables below without much commentary. Remember that the Elo systems are league-dependent, so scores from different leagues are not directly comparable.

La Liga

As with the Premier League, the league is effectively split into two tiers, with the team in 3rd separated from the remaining 17 teams by 126 points, and only 255 points separating the 4th-placed team from last. The standard deviation of Elo gives a measure of the spread of skill in a league, with a more competitive league having a smaller skill range. This value is 142 for the Premier League and 133 for La Liga, which isn’t a large difference.
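
These spread figures can be reproduced directly from the current ratings in the tables above, using the sample standard deviation:

```python
from statistics import stdev

# Current Elo ratings, copied from the Premier League and La Liga tables
premier_league = [1796, 1711, 1695, 1687, 1677, 1633, 1504, 1494, 1470,
                  1427, 1425, 1407, 1406, 1405, 1395, 1387, 1380, 1375,
                  1366, 1357]
la_liga = [1837, 1738, 1707, 1581, 1554, 1544, 1534, 1508, 1492, 1488,
           1455, 1450, 1434, 1423, 1403, 1400, 1393, 1369, 1363, 1326]

print(round(stdev(premier_league)))  # 142
print(round(stdev(la_liga)))         # 133
```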

There are also some discrepancies between the Elo ranking and the actual league position, the biggest of which by far is Girona, who lie in 8th in the league, but only have the 17th best Elo. On the other hand, Espanyol have the 11th best Elo but are placed in 16th.

| Elo rank | Team | Elo | Points | Points rank | Rank difference | Played |
|---|---|---|---|---|---|---|
| 1 | Barcelona | 1837 | 45 | 1 | 0 | 17 |
| 2 | Real Madrid | 1738 | 31 | 4 | -2 | 16 |
| 3 | Atletico Madrid | 1707 | 36 | 2 | 1 | 17 |
| 4 | Valencia | 1581 | 34 | 3 | 1 | 17 |
| 5 | Villarreal | 1554 | 27 | 6 | -1 | 17 |
| 6 | Sevilla | 1544 | 29 | 5 | 1 | 17 |
| 7 | Athletic Bilbao | 1534 | 21 | 11 | -4 | 17 |
| 8 | Real Sociedad | 1508 | 23 | 8 | 0 | 17 |
| 9 | Eibar | 1492 | 24 | 7 | 2 | 17 |
| 10 | Celta Vigo | 1488 | 21 | 11 | -1 | 17 |
| 11 | Espanyol | 1455 | 17 | 16 | -5 | 16 |
| 12 | Leganes | 1450 | 21 | 11 | 1 | 16 |
| 13 | Getafe | 1434 | 23 | 8 | 5 | 17 |
| 14 | Alaves | 1423 | 15 | 17 | -3 | 17 |
| 15 | Real Betis | 1403 | 21 | 11 | 4 | 16 |
| 16 | Malaga | 1400 | 11 | 19 | -3 | 17 |
| 17 | Girona | 1393 | 23 | 8 | 9 | 17 |
| 18 | Levante | 1369 | 18 | 15 | 3 | 17 |
| 19 | La Coruna | 1363 | 12 | 18 | 1 | 16 |
| 20 | Las Palmas | 1326 | 11 | 19 | 1 | 16 |

Serie A

Serie A is characterised by 2 dominant teams, Juventus and Napoli, separated by only 5 Elo points (and 1 league point). Roma also look strong, but the gap from them to the 4th-placed team is 96 points. The standard deviation of Elo for Serie A is 150, the highest of the 4 leagues, suggesting greater variability in team skill.

There are a number of over-performers, such as Sampdoria, who are placed 6th but have the 10th highest Elo, but interestingly very few under-performers, with no team rated 2 or more positions better by Elo than their actual standing. I’m also amazed to see Benevento having picked up a solitary point from 18 games, well deserving of the lowest Elo across all 4 leagues. This doesn’t necessarily mean that Benevento are the worst team in these leagues, but they are the furthest from their league’s average.

| Elo rank | Team | Elo | Points | Points rank | Rank difference | Played |
|---|---|---|---|---|---|---|
| 1 | Juventus | 1779 | 44 | 2 | -1 | 18 |
| 2 | Napoli | 1774 | 45 | 1 | 1 | 18 |
| 3 | Roma | 1730 | 38 | 3 | 0 | 17 |
| 4 | Inter | 1634 | 37 | 4 | 0 | 17 |
| 5 | Lazio | 1617 | 36 | 5 | 0 | 17 |
| 6 | Atalanta | 1575 | 27 | 6 | 0 | 18 |
| 7 | Fiorentina | 1553 | 26 | 8 | -1 | 18 |
| 8 | Torino | 1516 | 24 | 9 | -1 | 18 |
| 9 | Udinese | 1499 | 24 | 9 | 0 | 17 |
| 10 | Sampdoria | 1492 | 27 | 6 | 4 | 17 |
| 11 | Milan | 1483 | 24 | 9 | 2 | 18 |
| 12 | Bologna | 1454 | 24 | 9 | 3 | 18 |
| 13 | Sassuolo | 1426 | 20 | 14 | -1 | 18 |
| 14 | Chievo | 1412 | 21 | 13 | 1 | 18 |
| 15 | Cagliari | 1393 | 17 | 15 | 0 | 18 |
| 16 | Genoa | 1391 | 17 | 15 | 1 | 18 |
| 17 | Crotone | 1353 | 15 | 17 | 0 | 18 |
| 18 | SPAL | 1343 | 15 | 17 | 1 | 18 |
| 19 | Verona | 1340 | 13 | 19 | 0 | 17 |
| 20 | Benevento | 1237 | 1 | 20 | 0 | 18 |

Bundesliga

The German league looks to be the most competitive, with the top-rated team having close to the lowest Elo of these 4 leagues and the bottom-placed team having the highest. This is reflected in the standard deviation of Elo, at 98, far lower than the other leagues. A similar finding has been identified previously, where the bookies were less accurate at predicting Bundesliga matches than the 3 other leagues.

| Elo rank | Team | Elo | Points | Points rank | Rank difference | Played |
|---|---|---|---|---|---|---|
| 1 | Bayern Munich | 1780 | 41 | 1 | 0 | 17 |
| 2 | Borussia Dortmund | 1623 | 28 | 3 | -1 | 17 |
| 3 | Bayer Leverkusen | 1573 | 28 | 3 | 0 | 17 |
| 4 | Hoffenheim | 1559 | 26 | 7 | -3 | 17 |
| 5 | Schalke | 1554 | 30 | 2 | 3 | 17 |
| 6 | Leipzig | 1545 | 28 | 3 | 3 | 17 |
| 7 | Borussia Moenchengladbach | 1522 | 28 | 3 | 4 | 17 |
| 8 | Augsburg | 1504 | 24 | 9 | -1 | 17 |
| 9 | Hertha | 1480 | 24 | 9 | 0 | 17 |
| 9 | Ein Frankfurt | 1480 | 26 | 7 | 2 | 17 |
| 11 | Wolfsburg | 1475 | 19 | 12 | -1 | 17 |
| 12 | Werder Bremen | 1463 | 15 | 16 | -4 | 17 |
| 13 | Hannover | 1438 | 23 | 11 | 2 | 17 |
| 14 | Mainz | 1418 | 17 | 14 | 0 | 17 |
| 15 | Hamburg | 1413 | 15 | 16 | -1 | 17 |
| 15 | Freiburg | 1413 | 19 | 12 | 3 | 17 |
| 17 | Koln | 1382 | 6 | 18 | -1 | 17 |
| 18 | Stuttgart | 1380 | 17 | 14 | 4 | 17 |
