Let’s be quick, since I don’t want to waste too much time on this. We’ll start with how I performed last year. My first goal was to average roughly 6.5 correct tips per 9 games played, so I aimed for over 72% tipping accuracy. Sadly, I missed this by 3 tips, getting 147 correct instead of the 150 needed from the 207 games played. My other goal was a Mean Absolute Error (MAE) of below 30 points. I missed this by a bit more, finishing with an MAE of 30.85 and overshooting my target by 177 points in total across the season.
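For anyone curious how those two headline numbers are derived, here's a minimal sketch. The function names and the sample margins in the MAE example are hypothetical; only the 147-of-207 figure comes from the post.

```python
def tipping_accuracy(correct: int, games: int) -> float:
    """Fraction of games tipped correctly."""
    return correct / games


def mean_absolute_error(predicted, actual):
    """Average absolute difference between predicted and actual margins."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(predicted)


# Last season's headline accuracy: 147 correct tips from 207 games,
# which lands just under the 72% goal.
accuracy = tipping_accuracy(147, 207)  # ~0.710
```

The MAE works the same way on margins rather than tips: predict a winning margin for each game, take the absolute difference from the actual margin, and average over all 207 games.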
Now, how did the teams perform? Here are two handy tables to see where each team ended up using SccPR2.
Nothing looks too out of place (no prizes for guessing who the biggest risers and fallers were). North Melbourne ended up dropping to 10th place despite finishing 8th on the ladder. This is due to the recency weighting applied to the ratings. Since North Melbourne won the games they needed in the first half of the season and then dropped away spectacularly, they ended up much lower than teams who performed well at the back end of the season. When it comes to finals, inconsistencies can be explained with the sentence “Individual games are random”. If someone were to come up with a probability distribution for the finals system used last year, I’d wager that the Bulldogs had very slim chances of winning the premiership like they did. That said, labelling them as “lucky” would sell them short. It took a lot of character and skill to win those 4 games the way they did. They were injury-ridden, but they never gave up.
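To illustrate why a team like North Melbourne can rate lower than their ladder position, here's a hedged sketch of recency weighting. SccPR2's actual scheme isn't described in this post, so the exponential-decay form and the `decay` parameter are assumptions purely for illustration: each game is down-weighted the further back in the season it sits.

```python
def recency_weighted_rating(game_scores, decay=0.9):
    """Weighted average of per-game performance scores, earliest game first.

    Hypothetical scheme: a game k rounds before the most recent one
    contributes with weight decay**k, so late-season form dominates.
    """
    n = len(game_scores)
    weights = [decay ** (n - 1 - i) for i in range(n)]
    total = sum(w * s for w, s in zip(weights, game_scores))
    return total / sum(weights)


# Two teams with identical season totals but opposite trajectories:
fast_start = recency_weighted_rating([10, 10, 10, 0, 0, 0])  # strong early, faded
fast_finish = recency_weighted_rating([0, 0, 0, 10, 10, 10])  # strong late
```

Under any decay below 1, `fast_finish` rates higher than `fast_start` even though both teams banked the same raw results, which matches how a strong first half followed by a collapse drags a team below its ladder position.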
And that brings this post to an end. I’ll probably keep the same goals for this coming season, and hopefully the new and improved SccPR3 performs (at least) slightly better than the old system. I’m going to try not to tinker with the system too much during the season, but no promises.