Quote:
Originally Posted by Major Major
Lol, see. There's your slant. There is only one reality when looking at the standings and that is the standings. And the NHL only counts it one way.
I can't speak for Itse, but I don't see it as a slant either. The whole intent is to see the win-loss record and the underlying reasons behind it.
I guess he could have done it the way the NHL standings do, but then why not just go look at the standings? I don't mind it broken down into a basic win-loss record.
Maybe another column showing 'x points out of 20' or something would add some additional insight into the overall picture, but I wouldn't say that Itse is showing some slant or bias in his numbers. It really is a win-loss record, and come playoff time the way the information is presented might be a bit more predictive. Or maybe less predictive, hard to say.
I say keep it as 'win-loss', and maybe add another column for 'total points' (I figure we can assume a 20-point max given the 10-game segments here); there's a rough sketch of that calculation below. Could be interesting to see the overall points gained or lost and how they relate to the other data, to fine-tune the associations.
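To picture that extra column, here is a minimal sketch of how a 'points out of 20' figure could be computed from a basic win-loss-OTL record, assuming standard NHL scoring (2 points for any win, 1 for an OT/shootout loss, 0 for a regulation loss). The segment records in it are made up for illustration, not Itse's actual data.

```python
# Sketch: points earned per 10-game segment, assuming standard NHL scoring
# (2 points for any win, 1 for an OT/shootout loss, 0 for a regulation loss).
# The segment records below are hypothetical examples.

def segment_points(wins: int, reg_losses: int, ot_losses: int) -> int:
    """Points earned in a segment; the maximum is 2 * games played."""
    return 2 * wins + 1 * ot_losses

segments = [
    (7, 2, 1),   # hypothetical 10-game segment: 7-2-1
    (4, 5, 1),   # hypothetical 10-game segment: 4-5-1
]

for wins, reg_losses, ot_losses in segments:
    games = wins + reg_losses + ot_losses
    pts = segment_points(wins, reg_losses, ot_losses)
    print(f"{wins}-{reg_losses}-{ot_losses}: {pts} of {2 * games} points")
```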
Thanks for doing this, Itse. I like advanced stats, basic stats, and everything in between.
Also, regarding the original post, there MAY not be a decline defensively when facing increased shots (though in this case I think it is accurate), depending on how you look at it. I often don't care too much about shot totals in-game; I find they often misrepresent what is happening on the ice.
For instance, when Calgary was using Hartley's system, they would get out-shot frequently, yet they won most of those games that season (the playoff season, plus half the season before that, which is a pretty lengthy sample size). And if you really go back and remember many of those games, Calgary wasn't exactly tested: lots of outside shots, lots of contested shots from the high-danger areas, and the other team never really mustered much in the way of 'dangerous offence', so I would argue they were mostly contained. It didn't always happen, but I'd argue it happened often enough to skew the results and expectations of what CORSI tries to predict, which is why I don't really get all that excited about CORSI.
I just find shot totals (including high-danger chances) so subjective. They are not all the same: some teams allow certain shots through depending on their goaltending situation and defensive ability, and some teams have no finishing ability but are monsters along the boards. CORSI isn't a garbage stat, but I find it very underwhelming, and I believe the confidence interval around it is simply too wide to make it a reliable predictor. I am talking about more than just the 'bad luck' in a few games a season, though I am sure that adds up as well: pucks hitting posts, a ref making a bad goal call, and so on.
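For anyone not familiar with how CORSI is usually tallied, here is a minimal sketch assuming the common definition (every shot attempt counts: goals, shots on goal, misses, and blocks), with made-up attempt counts. It also shows the point above: the percentage alone says nothing about where the attempts came from or how dangerous they were.

```python
# Sketch of a Corsi-for percentage (CF%) calculation, assuming the common
# definition: every shot attempt (goal, shot on goal, miss, block) counts
# the same, regardless of shot quality or location.
# The attempt counts below are hypothetical illustrative numbers.

def corsi_for_pct(attempts_for: int, attempts_against: int) -> float:
    """CF% = attempts for as a percentage of all attempts, both teams."""
    total = attempts_for + attempts_against
    return 100.0 * attempts_for / total if total else 0.0

# Hypothetical game: the team is out-attempted 62-48, but if most of the 62
# are low-danger outside shots, CF% alone can't tell you that.
attempts_for, attempts_against = 48, 62
print(f"CF%: {corsi_for_pct(attempts_for, attempts_against):.1f}")  # 43.6
```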
Sometimes it is nice to just see a simpler data set and start digging into the underlying reasons behind it. Whether it is advanced or simple, I can always appreciate someone taking the time to present data in a different way; it gets you thinking a bit, so thanks Itse.