This article by Justin Bourne was posted recently and shared in a comment in another thread.
I find it exciting and think it is worth calling attention to.
It gives some detail on exactly the type of data that will help enhance the limited, simplistic xG stats in use today, and it even mentions your Calgary Flames!
We have had many discussions about the limitations of xGA this season.
Some things we have discussed, and that I have personally noted when talking about the Flames, are that the current methods do not capture:
- sufficient granularity in the play leading up to the opportunity
- the defending team’s defensive posture / coverage (Flames’ lapses!)
- what the goalie has to do to make the save
- where the shooter actually places the shot
Check it out. Very exciting and worthwhile, as they talk about how to address the first three of those four points:
https://www.sportsnet.ca/nhl/article...-shot-quality/
Some highlights: he starts by discussing Matt Murray after a .8xx sv% game, then pivots to the Flames.
The gist on Murray was that Murray thought he played fine and Keefe excused the goals:
Quote:
"I think he's been great. The numbers look terrible, but again, here tonight, I thought he looked really good. I mean, it's four tap-ins. They're three feet or less in front of the crease."
The pivot to the Flames:
Quote:
On the offensive side of this same quality of scoring chance conversation sit the Calgary Flames, who have done a remarkable job making unestablished goaltenders look like peak-era Dominik Hasek
(Lists these 12 relatively unaccomplished goalies)
One thing access to deeper data has allowed us to recognize is the Flames are not just some unlucky team that’s due some positive breaks, but rather one that creates a lot of shots that aren’t overly dangerous. They generate in bulk from outside the slot – the most in the NHL – but are in the bottom half of the league at getting them from the inner slot.
Then he goes on to discuss the limitations of current xGA methods:
Quote:
Analytics work that aims to quantify the quality of chances created has come a long way, and as a result, “expected goals” is now a reasonably common and accepted stat in the public discourse. It evaluates several factors such as shot location and considers influences such as “rush” or “rebound” to measure the likelihood of those attempts converting into goals. As a stat “expected goals” has its imperfections (as many stats do), but over larger samples there’s useful information there.
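As an aside, the kind of model that quote describes can be sketched as a simple logistic model over shot features. Everything below (the feature set, the function name, and every weight) is a made-up illustration of the general idea, not the article's model or any real published xG model, which are fit on thousands of historical shots:

```python
import math

# Toy sketch of how an "expected goals" model scores one shot attempt.
# All weights are hypothetical illustrative numbers, not a fitted model.
def xg(distance_ft, angle_deg, is_rush, is_rebound):
    """Return the modelled probability that this shot becomes a goal."""
    z = (1.2                      # intercept (hypothetical)
         - 0.09 * distance_ft     # farther from the net -> less dangerous
         - 0.02 * angle_deg       # sharper angle off centre -> less dangerous
         + 0.45 * is_rush         # "rush" influence, per the quote
         + 0.70 * is_rebound)     # "rebound" influence, per the quote
    return 1.0 / (1.0 + math.exp(-z))  # logistic link: score -> probability

# A tap-in from three feet (like the Murray goals) vs. a point shot.
tap_in = xg(distance_ft=3, angle_deg=5, is_rush=0, is_rebound=1)
point_shot = xg(distance_ft=55, angle_deg=10, is_rush=0, is_rebound=0)
```

The point is only that "comparable shots from comparable spots/scenarios" get comparable probabilities: the tap-in scores far higher than the point shot, which is exactly why a team generating shots in bulk from outside the slot can post a big raw shot count with a modest xG.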
Then he elaborates on what can be done with the tracking chips:
Quote:
I only point out the imperfections because expected goals is purely a proxy that relies on comparable shots from comparable spots/scenarios and how often those kinds of attempts typically go in. To focus measurements more finely, there are a bunch of factors that would require puck and player tracking…
…which we now have.
With microchips in pucks and jerseys we should eventually be able to:
• create a version of expected goals that allows us to show how a created chance is more valuable when it’s taken by a totally unmarked player, rather than one with two defenders in the immediate vicinity of the shooter.
• tell that a shot following a cross-ice laser pass is more dangerous than one off a pass that’s gently slid across the ice
• And of course, be able to tell that when a goalie’s been juked out of his jock with a shoulder fake or no-look pass, the resulting attempt on a wide-open net is more likely to go in than one where the goalie is square to the shooter.
Some of the factors that create quality chances are repeatable skills, but we haven’t yet had the information to build them into our models.
Now we do.
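Those three bullets would slot into the same kind of model as extra features. Again, this is a hypothetical sketch of the idea (the feature names and weights are mine, not from the article's tracking data):

```python
import math

# Sketch of folding puck/player-tracking features into a logistic xG model.
# Weights are hypothetical illustrations of the three bullets in the quote.
def xg_tracked(base_z, nearest_defender_ft, pass_speed_mph, goalie_beaten):
    """base_z is the pre-tracking score for the shot; tracking adds to it."""
    z = (base_z
         + 0.15 * min(nearest_defender_ft, 10)  # unmarked shooter -> more dangerous
         + 0.02 * pass_speed_mph                # cross-ice laser pass -> more dangerous
         + 1.50 * goalie_beaten)                # goalie juked out, wide-open net
    return 1.0 / (1.0 + math.exp(-z))

# Same spot on the ice, two very different chances:
covered = xg_tracked(base_z=-1.0, nearest_defender_ft=2,
                     pass_speed_mph=0, goalie_beaten=0)
wide_open = xg_tracked(base_z=-1.0, nearest_defender_ft=12,
                       pass_speed_mph=40, goalie_beaten=1)
```

That gap between the two probabilities is what the current location-only models can't see, and what the chips should finally let modellers measure.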
The great thing is that he says he will be reporting out based on this during the playoffs!
Has everyone seen this? I'd be interested in your thoughts.