Yep. Like most stats it doesn't work on its own, so performance above and below xG needs to be looked at alongside SV%, and probably SH% too. What xG does tell you (to some degree) is how well the systems are working and being executed, but even then other stats plus the eye test complete the picture. No stat on its own tells you much. Just look at shooting %.
I have developed such a distaste for expected goals. Beyond the metric's inability to account for actual on-ice situations, people believe it is an account of luck. The Avs, Cats and Canes are not just super lucky. Conversely, the Habs, Hawks and Flyers have not been unlucky. Fun fact with expected goals: from 2014-2017 the Kings led the entire league in xGF% by a fairly substantial margin, but only made the playoffs once and Lombardi/Sutter were fired.
MoneyPuck.com - About and How it Works
Variables In Shot Prediction Model:
1.) Shot Distance From Net
2.) Time Since Last Game Event
3.) Shot Type (Slap, Wrist, Backhand, etc)
4.) Speed From Previous Event
5.) Shot Angle
6.) East-West Location on Ice of Last Event Before the Shot
7.) If Rebound, difference in shot angle divided by time since last shot
8.) Last Event That Happened Before the Shot (Faceoff, Hit, etc)
9.) Other team's # of skaters on ice
10.) East-West Location on Ice of Shot
11.) Man Advantage Situation
12.) Time since current Powerplay started
13.) Distance From Previous Event
14.) North-South Location on Ice of Shot
15.) Shooting on Empty Net
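To make that list concrete, here is a minimal sketch (my own illustration, not MoneyPuck's actual code or schema) of how a single unblocked shot might be turned into those kinds of inputs. All field and function names are invented, and the coordinate convention is an assumption:

```python
from dataclasses import dataclass
from math import atan2, degrees, hypot

# Hypothetical record of an unblocked shot and the event preceding it.
# Field names are invented for illustration; this is not MoneyPuck's schema.
@dataclass
class ShotEvent:
    x: float                    # north-south location of the shot (along the rink length)
    y: float                    # east-west location of the shot
    shot_type: str              # "SLAP", "WRIST", "BACKHAND", ...
    prev_event: str             # last event before the shot: "FACEOFF", "HIT", "SHOT", ...
    prev_x: float               # location of that previous event
    prev_y: float
    seconds_since_prev: float   # time since last game event
    opposing_skaters: int       # other team's number of skaters on the ice
    man_advantage: bool
    seconds_into_powerplay: float
    empty_net: bool

GOAL_X, GOAL_Y = 89.0, 0.0      # approximate net location in standard NHL coordinates

def shot_features(s: ShotEvent) -> dict:
    """Derive the kinds of inputs listed above from a single shot record."""
    shot_distance = hypot(GOAL_X - s.x, GOAL_Y - s.y)                  # 1) distance from net
    shot_angle = abs(degrees(atan2(s.y - GOAL_Y, GOAL_X - s.x)))       # 5) shot angle
    dist_from_prev = hypot(s.x - s.prev_x, s.y - s.prev_y)             # 13) distance from previous event
    speed_from_prev = dist_from_prev / max(s.seconds_since_prev, 0.1)  # 4) speed from previous event
    return {
        "shot_distance": shot_distance,
        "shot_angle": shot_angle,
        "shot_type": s.shot_type,
        "time_since_last_event": s.seconds_since_prev,
        "last_event_type": s.prev_event,
        "speed_from_previous_event": speed_from_prev,
        "distance_from_previous_event": dist_from_prev,
        "opposing_skaters": s.opposing_skaters,
        "man_advantage": s.man_advantage,
        "time_since_powerplay_start": s.seconds_into_powerplay,
        "shot_x": s.x,
        "shot_y": s.y,
        "empty_net": s.empty_net,
    }
```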
Model Description: Expected Goals Fabric
My animating assumption is that all of the skaters (between six and eleven, in total, usually) are working together both to generate shots for their team and to suppress the generation of shots by the other team. In principle, I consider all of the skaters equally able to affect both processes, in the long run, even if a given skater (for either team) might be only minimally involved with a given shot. All of play in all three zones leading up to and including the decision by a given player to shoot the puck, I understand to be the product of the (combined) ability of all of the skaters, and I call the goal likelihood of shots generated, at the moment the decision to shoot is made, the "expected goal probability", or xG for short. Then, in addition to the xG of the pattern to which a given shot conforms, the shooter themself can, in principle, affect the goal likelihood of the shot, by shooting the puck skilfully, or perhaps (as I might) by woefully butchering whatever chances they and their teammates (and opponents) have conspired to generate. This ability I call "finishing impact" or "shooting talent". Finally, a goaltender can, in principle, affect the goal likelihood of a shot after it is taken, by sneakily interposing their body between the puck and the goal, or, (again as I might do) contriving to fail to do so.
This three-fold division (all of the skaters collectively produce the shot, the shooter shoots, and the goalie endeavours to save) is the animating idea behind the model I describe here. Even at this very basic level these choices have important consequences. For instance, a player who is skilled at receiving passes in (relatively) less-dangerous shooting positions and then skating untrammelled with the puck to more-dangerous locations will score more often for that reason, and that skill will appear in my accounting in their impact on xG (which will increase, since their team is taking more dangerous shots) and not in their finishing impact (which will presumably decrease, since they are shooting from locations where goals are easier to come by). Similarly, including goaltender effects only on shots already taken prevents us from making any estimate of goaltenders' impact on xG, conceded or generated, from, say, their tendency to handle the puck.
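Read as a statistical model (my own paraphrase, assuming the logistic-regression-style structure that the design matrix described below suggests), the three-fold division says the goal probability of a given shot decomposes roughly on the log-odds scale as

log-odds(goal) = f(shot pattern produced by all skaters) + (shooter finishing term) + (goaltender term),

where xG is the goal probability implied by the first term alone, finishing impact lives in the second term, and goaltending impact in the third.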
Throughout this article, when I say "shot" I will mean "unblocked shot", that is, goals, saves, and misses (including shots that hit the post or the crossbar). All shots in all strength situations that are taken against a team with a goaltender on the ice are considered.
...
I use a design matrix X for which every row encodes a shot with the following columns:
- Two indicators for the shooter (in 2019-2020 there were 879 shooters):
  - one for tip/deflections;
  - another for all other shot types;
- An indicator for the goaltender (in 2019-2020 there were 86);
- A set of geometric terms and shot types, described below;
- An indicator for "rush shots", that is, in-zone shots for which the previous recorded play-by-play event is in a different zone and no more than four seconds prior;
- An indicator for "rebound shots", that is, shots for which the previous recorded play-by-play event is another shot taken by the same team no more than three seconds prior;
- An indicator for teams which are leading and another for teams which are trailing; to be interpreted as representing change in configurations surrounding shots compared to when teams are tied;
- Four indicators for different skater strength situations:
  - SH for short-handed shots, that is, ones taken by a team with fewer skaters;
  - PPv3 for shots taken by a team with more than three skaters against a team with exactly three skaters;
  - PP for all other power-play shots, that is, ones taken by a team with more skaters;
  - 3v3 for play where both teams have exactly three skaters (mostly in overtime).
  All shots are assigned exactly one of the above indicators, which should all be understood as the change compared to a similar shot at even strength, that is, all 4v4 and 5v5 shots gathered together.
- An interaction term for shots which are slapshots and also on the power-play of any kind; this term is meant to proxy for one-timers.
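For concreteness, here is a minimal sketch of how a design matrix with columns like those might be assembled and fit as a plain logistic regression. The toy data, the column names, and the use of pandas/scikit-learn are my assumptions for illustration, not the author's actual implementation:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical shot table; column names and values are invented for illustration.
shots = pd.DataFrame({
    "goal":        [0, 1, 0, 0],
    "shooter":     ["8478402", "8478402", "8477934", "8471685"],
    "is_tip":      [0, 0, 1, 0],
    "goalie":      ["8476945", "8476945", "8480382", "8480382"],
    "is_rush":     [0, 1, 0, 0],     # previous event in another zone, no more than 4 s prior
    "is_rebound":  [0, 0, 0, 1],     # previous event a shot by the same team, no more than 3 s prior
    "score_state": ["tied", "leading", "trailing", "tied"],
    "strength":    ["EV", "PP", "EV", "SH"],   # EV (4v4 and 5v5) is the reference level
    "is_slap":     [0, 1, 0, 0],
    "distance":    [35.0, 12.0, 8.0, 20.0],    # stand-ins for the geometric terms
    "angle":       [10.0, 35.0, 5.0, 25.0],
})

# One-hot encode the categorical columns, then drop the reference levels so the
# remaining indicators read as changes relative to a tied, even-strength shot.
X = pd.get_dummies(shots.drop(columns="goal"),
                   columns=["shooter", "goalie", "score_state", "strength"])
X = X.drop(columns=["score_state_tied", "strength_EV"], errors="ignore")

# Interaction term meant to proxy for one-timers: a slapshot on any kind of power play.
X["slap_x_pp"] = shots["is_slap"] * shots["strength"].isin(["PP", "PPv3"]).astype(int)

y = shots["goal"]
fit = LogisticRegression(max_iter=1000).fit(X, y)
xg = fit.predict_proba(X)[:, 1]    # per-shot expected goal probabilities
```

In a setup like this, the shooter and goaltender columns absorb finishing and goaltending effects while the remaining terms describe the shot pattern, mirroring the three-fold division described above.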
A New Expected Goal Model That is Better Than Corsi at Predicting Future Goals and Wins Above Replacement 1.1 and Expected Goals 1.1: Model Updates and Validation
My research has led me to different conclusions on the predictive power of expected goals, but before I get into that, I want to address my issue with this line of thinking. A part of me wishes that we had stuck to calling expected goal models "Shot Quality" models instead, because I think that the term "Expected Goals" implies that these models are solely predictive in nature, which isn't necessarily the case. Even if expected goal shares were completely useless for predicting future goals at the team level, expected goals would still be extremely useful for describing past events and telling us which teams relied heavily on goaltending and shooting prowess, or were weighed down by poor shooting and goaltending, and even which shots the goaltender deserved most of the blame for. So I disagree with the premise that hockey fans should stop using expected goals at the team level if they are not as predictive as Corsi.
...
I accounted for the following variables in my model:
- Shot distance and shot angle. (The two most important variables.)
- Shot type.
- The type of event which occurred most recently, the location and distance of this event, how recently it occurred, which team perpetrated it, and the speed at which the distance changed since this event. (The inclusion of the last variable was inspired by Peter Tanner of Moneypuck.)
- Whether the shooting team is at home.
- Contextual variables such as the score, period, and seconds played in the game at the time the shot was taken.
- Whether the shooter is shooting on their off-wing. (For example, a right-handed shooter shooting the puck from the left circle is shooting from the off-wing, and a left-handed shooter shooting from the same location is not.)
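As an example of how that last variable might be derived, here is a small sketch (my own construction, not the article's code) that flags off-wing shots from the shooter's handedness and the shot's east-west coordinate. The sign convention for the coordinate is an assumption and would need to be checked against the data source:

```python
def is_off_wing(shooter_hand: str, shot_y: float, attacking_right: bool) -> bool:
    """Flag off-wing shots: a right-handed shooter on the left side of the
    attacking zone (from the shooting team's point of view), or a left-handed
    shooter on the right side.

    shooter_hand    -- "L" or "R"
    shot_y          -- east-west coordinate of the shot; positive y is assumed to be
                       the left side of the rink when attacking the net at positive x
                       (check this convention against your data source)
    attacking_right -- True if the shooting team is attacking the net at positive x
    """
    # Normalise the coordinate so that positive y always means the shooter's left.
    y = shot_y if attacking_right else -shot_y
    if shooter_hand == "R":
        return y > 0   # right-handed shot taken from the left side: off-wing
    return y < 0       # left-handed shot taken from the right side: off-wing
```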
https://www.csahockey.com/what-we-do
CSA's proprietary methodology systematically catalogs every shot sequence resulting in a shot on goal for every game played in the NHL, using 34 individual standardized points of data, including:
- Passer
- Passer location
- Shooter
- Shooter location
- Offensive situation (i.e., man advantage/even strength, odd-man rush/settled offense, and faceoffs)
- Screens
- Deflections
- Broken plays
And of course RESULTS including:
- Rebounds
- Whistles
- Goals
This proprietary methodology allows CSA to accurately categorize each and every shot sequence resulting in a shot on goal by type, creating the definitive measure of a scoring chance: the actual probability of scoring. CSA has analyzed more than 250,000 shot sequences resulting in a shot on goal (more than 8 million individual points of data), creating a new generation of team and player performance metrics that will change how you see the game.
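For illustration, here is a rough sketch of what one such shot-sequence record might look like as a data structure. The field names and types are guesses based on the list above, not CSA's actual schema:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical shot-sequence record mirroring the data points listed above.
# Field names and types are guesses for illustration, not CSA's schema.
@dataclass
class ShotSequence:
    shooter: str                                       # shooter id
    shooter_location: Tuple[float, float]              # (x, y) on the ice
    passer: Optional[str] = None                       # None if no pass preceded the shot
    passer_location: Optional[Tuple[float, float]] = None
    situation: str = "EV"                              # e.g. "EV", "PP", "ODD_MAN_RUSH", "SETTLED", "FACEOFF"
    screened: bool = False
    deflected: bool = False
    broken_play: bool = False
    result: str = "SAVE"                               # e.g. "GOAL", "REBOUND", "WHISTLE", "SAVE"
```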
I'm not sure it's fair to say the expected goals models don't account for actual on-ice situations. Each model is built a little differently, but one similarity for all is that they're not simply based on shot location stripped of any context of what's occurring on the ice. Most models will account for what happened prior to the shot (e.g., rebound, rush chance, cross-ice puck movement, tip, power play, etc.). The private models that we're not privy to (except for occasional releases of stats) tend to account for more pre-shot occurrences, so you could certainly argue those are better (which is probably why at least 30 teams are using Sportlogiq's data to some extent or another). The public models aren't totally without value, however, in quantifying the quality of shots a team is generating or giving up in a game and over the course of a season.
To your point about luck versus skill, I won't dispute that the skill of the shooters on a team will have an impact on whether that team is over-performing or under-performing their expected goal totals. In that sense they can be useful, because over the course of the season teams that are over-performing their expected goal totals clearly have better finishing than teams that are under-performing their totals (e.g., the Kings). What I see when I look at the expected goal totals to date is that the Kings are doing a pretty good job generating the types of shots that historically go in the net but that they suck at finishing. Some people might say, "well, duh, we already knew that," but I think it's useful to have a quantifying data point to highlight that (shooting percentage alone lacks all of the context of what's actually happening on the ice).
The thing about expected goals is that it really doesn't seem to deviate all that much from shots for. Let's take the three-year stretch from 2018 to 2021: the top 3 teams in xGF% are Vegas, Carolina and Montreal, while the top 3 teams in SF% are Vegas, Carolina and Montreal. Last year the top 3 teams in SF% were Boston, Colorado and Florida, while the top 3 teams in xGF% were Colorado, Toronto and Boston (with Florida in 4th). So it just seems to me to be a glorified shots-for that doesn't account for skill.
Brown is on his last legs; I think he retires and becomes a Kings spokesperson for youth/charity etc. Is there a metric for 'expected failure'? Maatta would be #1 and Brown would be in the top 10. It is true that Brown mixes in some 'unexpected success' moments. He actually does. But the 'expected failures' are 8X those.
Why would it? How do you score?
There's a ton these models leave up to interpretation and I think they should be that way. The mistake isn't the stat, it's the application.
Haha, yeah I was more pointing out that despite all the work that has gone into developing these, you can get nearly the same amount of information by watching the shot tracker on a broadcast.
I definitely agree that there is nothing wrong with the stat; it's another piece of information to discuss. I just don't believe it is infallible, and charts like the one JFresh posted can lead people to think there is luck at play rather than skill divides. It's kind of like the PDO discussion when that stat was more popular: at the end of the day it really just shows you that some teams are good and some teams are bad.
free tkachev
Because the head coach is a dinosaur. I really don't get why Tkachev is in the minors; he outplayed every Kings winger, and I have a hard time believing that people like Grundstrom, Arvidsson or even Iafallo are an upgrade over Tkachev.