Quote:
Originally Posted by FanIn80
Can you (or anyone) quickly explain why Elliott was expected to allow 2.83 GA60, while Allen was only expected to allow 2.53?
First of all, a single season isn't a huge sample size in terms of shot volume. If you really want to get a feel for a goalie, use a larger sample; Rob Vollman suggests three seasons, I believe.
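Just to show what I mean by pooling seasons, here's a quick Python sketch. The shot and goal totals are completely made up for illustration, not any particular goalie's numbers:

seasons = [
    (1500, 120),  # (shots faced, goals against) -- made-up season 1
    (1750, 140),  # made-up season 2
    (1600, 125),  # made-up season 3
]

# Pool the raw shots and goals instead of averaging single-season SV%
shots = sum(s for s, _ in seasons)
goals = sum(g for _, g in seasons)
print(f"pooled sample: {shots} shots, SV% = {1 - goals / shots:.3f}")

The point is just that the pooled shot count is roughly three times the single-season sample, so the resulting save percentage is a lot less noisy.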
Reasons might include:
-Because they played the bulk of their games in different stretches of the season (Allen was injured later in the year)
-with different opponents
-with different healthy teammates
-different "penalty-taking" stretches (we see it all the time, one month a team is going mentally wild, next month they're disciplined)
-shot blocking luck
-the two goalies play vastly different individual styles, particularly in terms of puck handling: choices such as whether to freeze pucks for faceoffs vs direct them to a teammate in the corner, or whether to clear a dump-in quickly before the forechecker arrives or stay in the net
-Elliott had a better save percentage, so he may have had a lead more often in games, and score effects can influence what a goalie is facing.
-Goalie weaknesses can affect the quality or quantity of shots teams are launching at them, though I don't see that meaning much against Elliott, who had an elite SV%
One thing I did examine was RBA (Rebounds Against).
Elliott allowed 2.12 per 60 minutes.
Allen allowed 1.83 per 60 minutes.
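The per-60 arithmetic behind those RBA figures is just events scaled to 60 minutes of ice time. Quick Python sketch with hypothetical totals (not either goalie's actual counts, just picked so the output lands near the 2.12 above):

def per_60(events, minutes):
    # Scale an event count to a rate per 60 minutes of ice time
    return events * 60.0 / minutes

# e.g. 35 rebounds allowed over 990 minutes
print(round(per_60(35, 990), 2))  # -> 2.12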
Hockey is too fluid a sport to capture exactly why, but as Elliott's actual GA shows, that's okay, because he exceeded statistical expectations anyway.
If you look at the actual Fenwick data (unblocked shot attempts faced), Elliott saw 41 per 60 while Allen saw 39.2, so even there Allen saw about two fewer pucks directed his way per game on the same team, though Fenwick doesn't account for shot quality the way xGA would.
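To make the shot-quality point concrete, here's a rough Python sketch of what an expectation built from volume alone looks like. The 0.055 goals-per-unblocked-attempt rate is my own assumption for illustration, not a published number, and a real xGA model would weight each attempt by location, shot type, rebounds, and so on:

ASSUMED_GOALS_PER_FENWICK = 0.055  # assumed league-average conversion rate, illustrative only

def naive_xga_per_60(fenwick_against_per_60):
    # Expected goals against per 60 from shot volume alone, no quality weighting
    return fenwick_against_per_60 * ASSUMED_GOALS_PER_FENWICK

print(naive_xga_per_60(41.0))   # Elliott's unblocked attempts faced per 60
print(naive_xga_per_60(39.2))   # Allen's

With that made-up rate, volume alone only separates them by about 0.1 GA/60 (roughly 2.26 vs 2.16), so whatever model produced the 2.83 vs 2.53 expectations in the question is presumably layering shot quality and other adjustments on top of raw volume.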