Ranking the Lynx

Now that the season is over, we naturally start wondering where the 2011 Lynx rank among WNBA champions. Could they be the best team the league has ever seen, or did they just catch some lucky breaks and end up with the trophy? How do we determine where they fit in the spectrum of champions?

Luckily, we have two methods available to determine which championship team is the best of the best.

First we'll use Kevin Pelton's method from Basketball Prospectus.

I ranked teams using a quick, dirty rating that combines performance in the regular season and playoffs, as measured by point differential. The key adjustment is also adding the average regular-season differential of each team's playoff opponents, so that I attempt to measure postseason performance by how the team would have fared against a group of league-average opponents. Lastly, I added a one-point bonus for each championship team.

He added a point for winning the championship because he was ranking all teams, not just champions. We'll forgo that bonus, since we're only ranking teams that won it all. A quick sketch of the computation is below, followed by the list.
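Pelton's description doesn't spell out exactly how the regular-season and playoff numbers are weighted, so this is only a minimal sketch in Python that simply sums the pieces as the quote describes; the function name, arguments, and example inputs are mine, not his.

```python
# A rough sketch of the Pelton-style rating described above; how the
# components are weighted isn't specified in the quote, so this simply
# sums them. All inputs are per-game point differentials.

def pelton_rating(reg_season_diff, playoff_diff, playoff_opp_diffs):
    # Adjust playoff performance by the average regular-season
    # differential of the playoff opponents, approximating how the
    # team would have fared against league-average opposition.
    opp_adjustment = sum(playoff_opp_diffs) / len(playoff_opp_diffs)
    # The one-point championship bonus is dropped, since every team
    # we're ranking won the title.
    return reg_season_diff + playoff_diff + opp_adjustment

# Made-up example: +8.0 per game in the regular season, +6.0 in the
# playoffs, against playoff opponents who averaged +2.5, +4.0, +5.5.
print(pelton_rating(8.0, 6.0, [2.5, 4.0, 5.5]))  # 18.0
```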

  1. 2000 Comets 19.0
  2. 2002 Sparks 13.1
  3. 2001 Sparks 13.0
  4. 1998 Comets 12.8
  5. 2006 Shock 11.8
  6. 1997 Comets 11.6
  7. 1999 Comets 10.2
  8. 2011 Lynx 10.2
  9. 2005 Monarchs 10.1
  10. 2007 Mercury 9.4
  11. 2008 Shock 8.2
  12. 2004 Storm 7.7
  13. 2010 Storm 7.2
  14. 2009 Mercury 5.9
  15. 2003 Shock 4.5

Two things jump out immediately from this list. First is the low ranking of last year's "Perfect Storm" team, which won 28 games and swept through the playoffs. They get dragged down by a mediocre playoff point differential and the weakest set of playoff opponents any champion has ever faced, thanks to the extreme weakness of the Western Conference last year. The 2010 Sparks had a huge negative point differential, the 2010 Mercury were nearly dead even in that category, and the 2010 Dream reached the Finals as a #4 seed.

The other issue is more troublesome. The top three teams all come from seasons when the league had 16 teams. Taking the top eight teams in a 16-team league will obviously produce better point differentials for the weakest playoff qualifiers than taking the top eight in a 12- or 13-team league. The league also had shorter playoffs through 1999 and a shorter Finals through 2004, both of which skew the results in favor of the older teams. None of these factors affected Pelton's NBA rankings, so we have no reliable guide for making adjustments. For now, we'll let these rankings stand as they are and move on to the other system we have.

John Hollinger ranked NBA finalists for ESPN.com this summer.

For both the regular season and playoffs, I looked at two factors: win-loss record and average scoring margin. Every regular-season win was worth two points, with the 1999 participants having their wins prorated to an 82-game season. Similarly, every playoff win was worth four points, but each playoff loss docked a team four points -- this helped differentiate between champions who went 15-2 (like the 1991 Bulls) and those who went 15-9 (like the 1988 Lakers).

For scoring margin, I took the team's season scoring margin and divided by 15; basically, a one-point-per-game increase was worth 5.47 points in this formula. For playoff scoring margin, I did the same thing but multiplied by four -- since most teams played about four times as many regular-season games as playoff games, this made the two virtually equal.
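Before adapting this for the WNBA, here's a minimal Python sketch of the scoring Hollinger describes, under the assumption that the margins are aggregate (total) point differentials rather than per-game figures; the names and the `margin_divisor` and `round_bonus` parameters are mine, added so the same function can be reused below.

```python
# A sketch of the Hollinger-style score quoted above. Margins are
# aggregate point differentials (per-game margin times games played),
# which makes a 1 ppg increase over 82 games worth 82/15 ≈ 5.47 points.

def hollinger_score(reg_wins, reg_margin, playoff_wins, playoff_losses,
                    playoff_margin, margin_divisor=15, round_bonus=0):
    score = 2 * reg_wins                          # two points per regular-season win
    score += 4 * (playoff_wins - playoff_losses)  # playoff wins add four, losses dock four
    score += reg_margin / margin_divisor          # regular-season scoring margin
    score += 4 * playoff_margin / margin_divisor  # playoff margin, weighted up ~4x
    score += round_bonus                          # bonus for shorter early playoff rounds
    return score
```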

Hollinger's method doesn't incorporate playoff opponents' point differential, so the expanded-league issues from above don't come into play. We do need to tweak it slightly, however, to compensate for the WNBA's shorter schedule. A one-point-per-game increase over a 34-game season adds 34 points of total differential, so dividing the scoring margin by six rather than 15 keeps it worth roughly the same 5.47 points (34 ÷ 6 ≈ 5.7). The WNBA also used a shorter season through 2002, so we'll prorate the regular-season wins and point differential for those early teams.

What about the shorter playoffs? This did come up in Hollinger's rankings, and he adjusted for it thus...

From there, only one other tweak was necessary: adjusting for those teams in the earlier years that didn't have as many early-round playoff games in which to rack up points. Teams that didn't play a first-round series got 12 extra points; teams that played a best-of-three got six points; teams that played a best-of-five got three points. That's an approximation, obviously, but it mirrored what other teams in their situation actually did.

Basically, he added three points for each additional playoff win the team would have needed, which makes for an easy fix in this system. The 1997 Comets get 15 bonus points, the 1998 and 1999 Comets get nine, and the champions from 2000-2004 each get three. The adapted formula is sketched below.
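Putting the tweaks together, the WNBA version might look like this; it reuses the sketch above, taking the season length as an input, and the bonus values are the ones worked out in the previous paragraph.

```python
# The WNBA adaptation sketched above: prorate short early seasons to
# 34 games, divide margins by 6 instead of 15 (34/6 ≈ 5.7 points per
# 1 ppg, close to Hollinger's 5.47), and add the playoff-round bonus.

def wnba_hollinger_score(reg_wins, games, reg_margin, playoff_wins,
                         playoff_losses, playoff_margin, round_bonus=0):
    prorate = 34 / games             # e.g. the 28-game 1997 season scales up
    return hollinger_score(
        reg_wins * prorate,          # prorated regular-season wins
        reg_margin * prorate,        # prorated aggregate scoring margin
        playoff_wins,
        playoff_losses,
        playoff_margin,
        margin_divisor=6,            # keeps 1 ppg worth roughly 5.47 points
        round_bonus=round_bonus,     # +15 (1997), +9 (1998-99), +3 (2000-04)
    )
```

For the modern champions, `games` is 34 and the proration is a no-op, so the only changes from Hollinger's NBA formula are the smaller divisor and the round bonus.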

The list...

  1. 2000 Comets 192.2
  2. 1998 Comets 181.5
  3. 2011 Lynx 176.7
  4. 2001 Sparks 175.7
  5. 2002 Sparks 167.5
  6. 2010 Storm 160.2
  7. 1999 Comets 157.0
  8. 2005 Monarchs 143.3
  9. 2004 Storm 130.5
  10. 2008 Shock 129.0
  11. 2007 Mercury 128.2
  12. 1997 Comets 122.8
  13. 2006 Shock 119.2
  14. 2009 Mercury 118.2
  15. 2003 Shock 108.8

That seems like a more reasonable list than the one produced by the Pelton method. There may still be some inflation from the 16-team league, as all three champions from that era rank in the top five, but it's not overwhelming. Any stat-based method is likely to put the 2000 Comets on top regardless of adjustments.

Does that mean the 2011 Lynx are the third best team ever? Not necessarily, but it's not out of the realm of possibility.