• 1/4/2016 – SSBMRank 15-11
  • 1/5/2016 – SSBMRank 10, 9, 8, 7
  • 1/6/2016 – SSBMRank 6, 5, 4
  • 1/7/2016 – SSBMRank 3, 2, 1
  • 1/8/2016 – Recap

The Process

If a player accomplished any of the following:

  • Top 64 at Apex 2015, CEO 2015, Evo 2015, Big House 5
  • Top 24 at Dreamhack Winter
  • Top 16 at a US Major
  • Top 8 at a European Major

then they were auto-nominated. Any other player had to be nominated through a distributed form that listed a reasonable set of accomplishments; this form was open for roughly three weeks. With the help of a few others, I added people to the nomination list, but also removed some auto-nominated players from the final ballot. One example was Lovage, who had only participated at Evo. It was tough to create a fully objective activity criterion, because no particular system seemed to capture what I deemed acceptable. Mostly, I went by an “eye test” to determine whether a person was active, but I was fairly conservative about removing people. I also removed some auto-nominated players who were very unlikely to make the top 100. This trimmed a list of 300 players down to about 150, making it easier for the panelists to rate the entire list.

I recognize there is some slight bias toward US players as opposed to other regions (Europe, Japan, Mexico). This was because the majority of panelists and larger tournaments were in the US. I would rather put a person on a “hidden boss/inactive” list than give a very inaccurate rating. However, I did compromise from previous years by allowing European Nationals to help determine activity, which wasn’t done in previous SSBMRanks. As more tournaments occur in Europe, I am fully confident there will be more than enough exposure to loosen the requirements on international players. To be consistent, many US players were also excluded if their Nationals/Majors attendance was sparse. A very strong and compelling argument was needed to include a player who didn’t meet the auto-nomination criteria.

Ballots were distributed to ~60 panelists from regions across the world. Panelists were selected based on their knowledge of the Smash scene. A strong effort was made to distribute ballots fairly by region, so that no single region dominated the majority of them. People who had submitted questionable ballots in the past were not asked this time around. The instructions given to the panelists were:

“Given the quality and quantity of work in 2015, if everyone entered 100 tournaments today, who on average would place the best?”

Based on this prompt, panelists gave each player a rating from 1 to 10, with the worst player on the nomination list getting a “1” and the best getting a “10”. Panelists did not have to rate players they did not know well enough, but were expected to rate at least 90 of the 150 (~60%) on the list.
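The coverage requirement above can be sketched as a simple validation step. This is a hypothetical helper for illustration only; the names, thresholds as parameters, and the code itself are my own sketch, not the actual tooling used to compile the rankings:

```python
# Hypothetical sketch of the ballot coverage rule described above:
# each rating must be on the 1-10 scale, and at least ~60% of the
# 150-player nomination list must be rated.

def ballot_is_valid(ratings, roster_size=150, min_coverage=0.6):
    """ratings: dict of player -> rating (or None if skipped)."""
    rated = [r for r in ratings.values() if r is not None]
    # Reject any rating outside the 1-10 scale.
    if any(not (1 <= r <= 10) for r in rated):
        return False
    # Require the minimum coverage fraction of the roster.
    return len(rated) / roster_size >= min_coverage

# A ballot rating 90 of 150 players meets the 60% threshold.
ballot = {f"player{i}": 7.0 for i in range(90)}
print(ballot_is_valid(ballot))  # True
```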

After the ~40 returned ballots were compiled, I further scrutinized the list. Any player that did not receive enough ratings was moved back into “inactive/hidden boss” status, as most panelists did not know how to rate players who did not attend much in 2015. Players who attended only 1-2 majors were also removed if they did not receive enough tallies from panelists.

Each player’s ratings were added together, with the maximum and minimum values removed to reduce variance. I also audited the ballots for anomalies and typos (e.g., a 5.9 instead of a 9.5 for PPMD). Any player deemed inactive, due to a combination of low tallies or few majors attended, was removed from the rankings. The remaining players were ordered by score to determine their final rank.
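The aggregation step above amounts to a trimmed sum: drop each player's single highest and lowest rating, total the rest, and sort. Here is a minimal sketch of that idea; the player names and numbers are purely illustrative, not actual ballot data:

```python
# Sketch of the trimmed-sum aggregation described above
# (illustrative only, not the actual SSBMRank tooling).

def trimmed_score(ratings):
    """Sum a player's ratings after dropping one max and one min
    to reduce variance."""
    if len(ratings) <= 2:
        return sum(ratings)  # too few ratings to trim anything
    return sum(sorted(ratings)[1:-1])

def rank_players(ballots):
    """ballots: dict of player -> list of panelist ratings.
    Returns players ordered best-first by trimmed score."""
    scores = {p: trimmed_score(rs) for p, rs in ballots.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Illustrative data only.
ballots = {
    "A": [9.5, 9.0, 10.0, 9.8],  # trimmed sum: 9.5 + 9.8 = 19.3
    "B": [8.0, 8.5, 7.5, 9.0],   # trimmed sum: 8.0 + 8.5 = 16.5
}
print(rank_players(ballots))  # ['A', 'B']
```

Trimming one rating from each end is a cheap way to blunt the effect of a single outlier ballot (like the 5.9-for-9.5 typo mentioned above) without discarding the rest of a panelist's input.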
