
CoasterForce's favourite coasters 2018

Bedholm

Roller Poster
23/25, just missing the new-for-2018 RMCs. It feels weird as an American to have all of the European rides that placed in the top 25, but not all of the American ones.
 

JoshC.

Strata Poster
The thing to remember with this as well is that this isn't a way of determining which coasters are the best, but simply a way of saying "here's a sample of X people, and these are their favourites". So of course, if X is large and only a small number of people have ridden certain rides, their ranking will be lower.

I don't know how the Mitch Hawker poll did the maths, but I assume they did something to combat that? Which I would guess is why it sometimes ended up providing questionable results?

There's probably a way to do it so that you can include people's top 20/25/whatever too, but that would add complications, since those who do that likely have higher counts (and you could then get an "experienced" bias). But again, there's no doubt a way to include it, though you're then stepping into complicated equation territory.
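
For illustration, here's a rough Python sketch of one way variable-length top lists could be folded in: convert each rank to a within-ballot percentile before averaging. The ballots are invented, and this is purely a sketch of the idea, not how the CF or Hawker rankings were actually calculated.

```python
# Purely illustrative: normalise ranks to within-ballot percentiles so that
# top-3 and top-25 lists can be averaged together. Ballots are invented; this
# is not the method actually used for the CF rankings.
from collections import defaultdict

ballots = {
    "rider_a": ["Steel Vengeance", "Taron", "Helix"],                   # top 3
    "rider_b": ["Taron", "Helix", "Nemesis", "Shambhala", "Wildfire"],  # top 5
}

percentiles = defaultdict(list)
for ranking in ballots.values():
    n = len(ranking)
    for position, coaster in enumerate(ranking):
        # 1.0 for first place, tending towards 0.0 for last place on that ballot
        percentiles[coaster].append(1 - position / n)

# Average percentile per coaster. Coasters that appear on only one or two
# ballots still get a score, which is exactly the small-sample problem above.
scores = {c: sum(p) / len(p) for c, p in percentiles.items()}
for coaster, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{coaster}: {score:.2f} ({len(percentiles[coaster])} ballot(s))")
```

Even with that normalisation, longer ballots still come mostly from better-travelled riders, so the "experienced" bias mentioned above doesn't go away.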
 

jayjay

Giga Poster
I didn't see this as some kind of industry-defining, rigorous ranking. Just people who like numbers making numbers for other people who like numbers and arguing over them. It's what I love about this fandom.

Anyway, I've been on 7, because I'm a scrub who left Europe once, 12 years ago.
 

Peet

Giga Poster
As a list of CoasterForce's favourite coasters, i.e. the ones that people talk about the most on the forum and rank highly in the many poll/list/rank topics we have, I'd say it's bang on. The bias towards Europe and lack of Asian representation reflect the users. We know it's not going to be the new Mitch Hawker poll, but I think it's done exactly what was intended. Great job!
 

Snoo

The Legend
JoshC. said:
The thing to remember with this as well is that this isn't a way of determining which coasters are the best, but simply a way of saying "here's a sample of X people, and these are their favourites". So of course, if X is large and only a small number of people have ridden certain rides, their ranking will be lower.

I don't know how the Mitch Hawker poll did the maths, but I assume they did something to combat that? Which I would guess is why it sometimes ended up providing questionable results?

There's probably a way to do it so that you can include people's top 20/25/whatever too, but that would add complications, since those who do that likely have higher counts (and you could then get an "experienced" bias). But again, there's no doubt a way to include it, though you're then stepping into complicated equation territory.

Yes, Mitch Hawker did have a way to eliminate some of the bias, but of course, in a voluntary poll, it's impossible to eliminate all of it. You can see that over the years, with some rides shooting up higher than the general population's opinion because of a trip that groups such as TPR or ACE took to a lesser-known park (the Far East coasters are great examples). I know he still has it on his website even though he hasn't run the poll in over 5 years. Good read-through. :)
 

Hyde

Matt SR
Staff member
Moderator
Social Media Team
It is very true: T Express, Eejanaika, and a number of other Asian coasters get pulverized when you consider ridership numbers. It is very difficult to statistically reconcile a roller coaster ranked #1 by a single rider with a roller coaster ranked #1-3 by 15 people. When is that CF Live Asia again, so we can boost ridership numbers? ;)

Given the way our methodology worked out and the caliber of the data we have, we are assuming the top-rated roller coasters are those with both high rankings and high ridership numbers.

JoshC. said:
I don't know how the Mitch Hawker poll did the maths, but I assume they did something to combat that? Which I would guess is why it sometimes ended up providing questionable results?

There's probably a way to do it so that you can include people's top 20/25/whatever too, but that would add complications, since those who do that likely have higher counts (and you could then get an "experienced" bias). But again, there's no doubt a way to include it, though you're then stepping into complicated equation territory.
The Mitch Hawker Poll actually operated in the inverse. The base calculation used is called a pairwise comparison, which compares the rankings of the roller coasters within a ballot, tallying wins and losses of rank.

That is to say, a roller coaster ranked #1 in a 10-coaster ballot would be awarded 9 wins (not counting itself), while a roller coaster ranked #2 would have 8 wins and 1 loss, etc. This is why rankings were rendered as pictured below, with W/L categories for each ride:

[Image: mutual-rider W/L table from the 2013 steel coaster poll]

Here's the full set of results from the 2013 steel coaster poll.

Since Mitch Hawker did not cap the number of roller coasters that could be entered in the poll (enthusiasts could enter every roller coaster they had ridden), this gave enthusiasts with larger coaster counts greater sway: a roller coaster ranked last on their ballot would pick up many more losses than the same roller coaster ranked last on the smaller ballot of a less-travelled enthusiast. So counting straight wins and losses still runs the risk of favoring the few enthusiasts with very large ridership numbers. It's still a better method than counting favorites only or going with the regional point-award system the Golden Tickets use, but not bulletproof. (Same as our rankings :p )
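
To make the counting concrete, here's a toy Python version of the within-ballot win/loss tally described above. It implements only the simplified counting in this post (the real poll's mutual-rider pairwise comparison across ballots was more involved), and the ballots are invented.

```python
# Toy version of the within-ballot win/loss tally described above. Ballots are
# invented; the real Mitch Hawker poll compared coasters across mutual riders,
# so treat this as a sketch of the counting idea only.
from collections import defaultdict
from itertools import combinations

ballots = [
    # Each ballot is ranked best-to-worst and can be any length.
    ["Steel Vengeance", "Maverick", "Millennium Force", "Top Thrill Dragster"],
    ["Maverick", "Steel Vengeance"],
]

wins = defaultdict(int)
losses = defaultdict(int)

for ballot in ballots:
    # combinations() yields pairs in ballot order, so `higher` outranks `lower`.
    for higher, lower in combinations(ballot, 2):
        wins[higher] += 1
        losses[lower] += 1

for coaster in sorted(set(wins) | set(losses),
                      key=lambda c: wins[c] - losses[c], reverse=True):
    print(f"{coaster}: {wins[coaster]}W / {losses[coaster]}L")

# Note the ballot-size effect: Top Thrill Dragster, last on the 4-coaster
# ballot, picks up 3 losses, while Steel Vengeance, last on the 2-coaster
# ballot, picks up only 1 - the uncapped-ballot sway described above.
```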
 

Snoo

The Legend
Hyde said:
It is very true: T Express, Eejanaika, and a number of other Asian coasters get pulverized when you consider ridership numbers. It is very difficult to statistically reconcile a roller coaster ranked #1 by a single rider with a roller coaster ranked #1-3 by 15 people. When is that CF Live Asia again, so we can boost ridership numbers? ;)

Given the way our methodology worked out and the caliber of the data we have, we are assuming the top-rated roller coasters are those with both high rankings and high ridership numbers.

The Mitch Hawker Poll actually operated in the inverse. The base calculation used is called a pairwise comparison, which compares the rankings of the roller coasters within a ballot, tallying wins and losses of rank.

That is to say, a roller coaster ranked #1 in a 10-coaster ballot would be awarded 9 wins (not counting itself), while a roller coaster ranked #2 would have 8 wins and 1 loss, etc. This is why rankings were rendered as pictured below, with W/L categories for each ride:

[Image: mutual-rider W/L table from the 2013 steel coaster poll]

Here's the full set of results from the 2013 steel coaster poll.

Since Mitch Hawker did not cap the number of roller coasters that could be entered in the poll (enthusiasts could enter every roller coaster they had ridden), this gave enthusiasts with larger coaster counts greater sway: a roller coaster ranked last on their ballot would pick up many more losses than the same roller coaster ranked last on the smaller ballot of a less-travelled enthusiast. So counting straight wins and losses still runs the risk of favoring the few enthusiasts with very large ridership numbers. It's still a better method than counting favorites only or going with the regional point-award system the Golden Tickets use, but not bulletproof. (Same as our rankings :p )

So what you're saying is you and @Pokemaniac are bringing back the Mitch Hawker Poll? :D
 

Hyde

Matt SR
Staff member
Moderator
Social Media Team
*Thinks about legwork needed to properly code out all roller coaster names*

 

Tomatron

Giga Poster
I've got 19 of the 25.

X2 and Lightning Rod were both spiteful on my last visits to their respective parks, and I've not been to Kolmården yet or back to the parks where the other RMCs have since opened.
 

DelPiero

Strata Poster
I have 20/25, only missing the Cali creds, I305, Twisted Timbers and Kärnan.
We have a very solid top 10, that's for sure.
 

caffeine_demon

Strata Poster
I've got 16/25

The remaining 9, in about the order I'd like to ride them:
1 - Steel Vengeance, Cedar Point (RMC, 2018) - 0,06
2 - Lightning Rod, Dollywood (RMC, 2016) - 0,14
15 - Storm Chaser, Kentucky Kingdom (RMC, 2016) - 0,48
12 - Outlaw Run, Silver Dollar City (RMC, 2013) - 0,38
11 - Fury 325, Carowinds (B&M, 2015) - 0,37
5 - Twisted Colossus, Six Flags Magic Mountain (RMC, 2015) - 0,25
22 - Twisted Timbers, Kings Dominion (RMC, 2018) - 0,78
9 - X2, Six Flags Magic Mountain (Arrow, 2002)- 0,29
8 - Voyage, Holiday World (PTC, 2006) - 0,29

And my ranking for the ones I've ridden:
6 - Skyrush, Hersheypark (Intamin, 2012) - 0,25
17 - Wicked Cyclone, Six Flags New England (RMC, 2015) - 0,55
3 - Taron, Phantasialand (Intamin, 2016) - 0,18
13 - Nemesis, Alton Towers (B&M, 1994) - 0,42
19 - Schwur des Kärnan, Hansa Park (Gerstlauer, 2015) - 0,58
4 - Maverick, Cedar Point (Intamin, 2007) - 0,25
23 - Top Thrill Dragster, Cedar Point (Intamin, 2003) - 0,80
18 - Expedition GeForce, Holiday Park (Intamin, 2001) - 0,56
24 - Montu, Busch Gardens Tampa (B&M, 1996) - 0,80
7 - Helix, Liseberg (Mack, 2014) - 0,27
21 - Wildfire, Kolmården (RMC, 2016) - 0,59
10 - El Toro, Six Flags Great Adventure (Intamin, 2006) - 0,31
14 - Shambhala, Port Aventura (B&M, 2012) - 0,43
16 - Millennium Force, Cedar Point (Intamin, 2000) - 0,51
25 - Intimidator 305, Kings Dominion (Intamin, 2010) - 0,81
20 - Boulder Dash, Lake Compounce (CCI, 2000) - 0,58
 

Hyde

Matt SR
Staff member
Moderator
Social Media Team
I thought it'd be fun to also compare my own roller coaster rankings against the aggregate, just to see how much agreement or disagreement I share. The highlighted roller coasters are those I've ridden (a whopping 12), followed by my personal ranking and the difference between my personal ranking and the community ranking.

1 - Steel Vengeance - #1, 0
2 - Lightning Rod - N/A
3 - Taron - N/A
4 - Maverick - #4, 0
5 - Twisted Colossus - N/A
6 - Skyrush - #7, -1
7 - Helix - N/A
8 - Voyage - #2, 6
9 - X2 - #5, 4

10 - El Toro - N/A
11 - Fury 325, #8, 3
12 - Outlaw Run, #6, 5

13 - Nemesis - N/A
14 - Shambhala - N/A
15 - Storm Chaser - #3, 12
16 - Millennium Force, #11, 5

17 - Wicked Cyclone - N/A
18 - Expedition GeForce - N/A
19 - Schwur des Kärnan - N/A
20 - Boulder Dash - N/A
21 - Wildfire - N/A
22 - Twisted Timbers - #13, 9
23 - Top Thrill Dragster - #12, 11

24 - Montu - N/A
25 - Intimidator 305 - #9, 14

Interesting that the first three coasters on the community list that I've ridden roughly mirror the community consensus. Skyrush was actually the only roller coaster of the Top 25 community rankings that I ranked lower than its actual finish. The biggest differences between my personal rankings and the community's were Intimidator 305, Storm Chaser, and TTD; totally the result of not having ridden other folks' top roller coasters.
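
Reading the list above, the difference column works out as community rank minus personal rank (positive meaning the coaster sits higher on my list than the community's). A trivial Python sketch using a few rows from the list:

```python
# The "difference" figures above are community rank minus personal rank
# (positive means I rank the coaster higher than the community does).
# A few rows copied from the list, just to show the arithmetic:
community = {"Steel Vengeance": 1, "Skyrush": 6, "Voyage": 8, "Intimidator 305": 25}
personal = {"Steel Vengeance": 1, "Skyrush": 7, "Voyage": 2, "Intimidator 305": 9}

for coaster, community_rank in community.items():
    personal_rank = personal[coaster]
    print(f"{coaster}: #{personal_rank}, {community_rank - personal_rank:+d}")
```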
 

MestnyiGeroi

Giga Poster
This is fantastic. I’m just sad that I still haven’t gotten my act together after such a busy 2018 (work busy and coaster busy) to compile my latest ranked list and post it here.

Now that it’s “too late” I’ll get around to doing it. :(
 