New Rides Survey


horsesboy

professional Pantheon observer
Advisory Panel
Silver Donor
Jun 16, 2013
2,747
5,775
113
Pretty sure that, after receiving a ton of surveys from areas where they didn't email it, they might just toss the whole thing.
Entirely possible, but we don't really know how they emailed them out or even if they are the ones compiling them. It's not uncommon for a company to hire an independent company to do the actual surveys and summarise the results, so Sea might not even know who is responding vs. who they sent them to.
 

GrandpaD

Curve Flattener.
Aug 3, 2017
4,344
9,333
113
Newport News, VA
I'm sure it's an independent company. But they'll have algorithms to match up surveyed locales and respondents. If it's out of whack with what they anticipated, they may just trash the whole thing or weed out the "bad".
 
Reactions: tursiops

Jonesta6

Glumble
Feb 14, 2019
2,685
2,162
113
From my experience doing this sort of thing, Virginians responding about Virginia parks are probably OK, but Virginians responding about SWO or BGT will probably raise eyebrows.

It does start off saying "our records indicate you've visited recently," so either that's a terrible piece of boilerplate, or we're all taking a survey after a code was required, meaning we're all essentially creating duplicate records for the same respondent and will thus be tossed out.
 
Reactions: tursiops and BGWnut

warfelg

Advisory Panel
Mar 16, 2016
6,112
9,731
113
My $0.02 on this:
How many people from here are doing it vs. how many it was sent to?

Say it was sent to 20,000 people and 15,000 do it, 75 of them from ParkFans; that's about 0.5% of the results and not really messing with anything.
 
Feb 3, 2019
2,460
3,953
113
I know these are demonstration numbers... but no survey has anywhere near a 75% response rate.

If they're lucky, maybe 7.5%.

But a realistic number is probably 0.75%.
 

warfelg

Advisory Panel
Mar 16, 2016
6,112
9,731
113
Common survey response is 30-40%. So at 20,000 sent out, that's 6,000-8,000 responses. Using my theoretical 75 number, that's 0.94-1.25% of the results; still a statistically small enough share of the responses that you couldn't call the survey compromised, so to speak.
 
Feb 3, 2019
2,460
3,953
113
Google begs to differ.

Internal surveys will generally receive a 30-40% response rate (or more) on average, compared to an average 10-15% response rate for external surveys.

I would call this an "external survey," wouldn't you?

So it's more like 2000 responses and now 75 is closer to 4%. Starting to make a bit of a difference at that percentage.
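The arithmetic in this exchange is easy to check; a quick sketch, using the hypothetical figures from the posts above (20,000 surveys sent, 75 ParkFans respondents):

```python
# Hypothetical figures from the discussion above: 20,000 surveys sent,
# 75 responses coming from ParkFans members.
sent = 20_000
parkfans = 75

# Internal (30%) vs. external (15%, 10%) response-rate scenarios.
for rate in (0.30, 0.15, 0.10):
    responses = sent * rate
    share = parkfans / responses * 100
    print(f"{rate:.0%} response rate -> {responses:,.0f} responses, "
          f"ParkFans share {share:.2f}%")
```

At a 30% response rate the ParkFans share is 1.25%; at 10% it rises to 3.75%, which matches the "closer to 4%" figure above.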
 

warfelg

Advisory Panel
Mar 16, 2016
6,112
9,731
113
Overall that's still a small impact if 75 ParkFans members took part. Based on responses to this thread, if that number is more like 15-20, the percentage of responses from us drops again. I doubt we're responding in big enough numbers to affect the survey and make them throw it out.

On top of that, they do ask what park you visited last, so I would expect that someone who's a member here would eventually get one anyway.
 

Jonesta6

Glumble
Feb 14, 2019
2,685
2,162
113
Back to what I was getting at: if we're somehow accessing the survey behind some sort of code wall (enter code x to take the survey), then SEAS would be seeing multiple responses on the same code (respondent), which likely means it's an invalid entry and the answers get tossed.

However, if it's just poor phrasing ("our records indicate you've been to one of our parks recently"), then we'd be a drop in the bucket of answers, depending on the surveying model they use.
 
Feb 3, 2019
2,460
3,953
113
This is a fair point..

And looking at the URL itself, we can suss out some interesting details:

Survey name:
s3/5435383/2021-Roller-Coaster-Survey?

Source?
utm_source=Cheetah&

How the respondent got the survey:
utm_medium=Email

Campaign name:
&utm_campaign=SWO_M_RideSurvey02052020_IP_Warming_8

tp is likely what @Jonesta6 is talking about. This would almost certainly be the identifier of who received the survey.
&tp=i-H43-I1-6k-4KQ-1o-NXHQ-1c-3dO-d1TvO
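For anyone curious, the query string above can be pulled apart with Python's standard urllib; the URL below is reassembled from the fragments quoted in this post, with a placeholder hostname since the real survey domain isn't quoted here:

```python
from urllib.parse import urlsplit, parse_qs

# Placeholder host; path and query are the fragments quoted above.
url = ("https://example.com/s3/5435383/2021-Roller-Coaster-Survey"
       "?utm_source=Cheetah&utm_medium=Email"
       "&utm_campaign=SWO_M_RideSurvey02052020_IP_Warming_8"
       "&tp=i-H43-I1-6k-4KQ-1o-NXHQ-1c-3dO-d1TvO")

# parse_qs maps each parameter name to a list of its values.
params = parse_qs(urlsplit(url).query)
for key, values in params.items():
    print(key, "=", values[0])
```

The three utm_* parameters are standard campaign-tracking tags; tp is the only one that varies per recipient, which is why it looks like the respondent identifier.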
 
Reactions: Alf33 and Zachary

warfelg

Advisory Panel
Mar 16, 2016
6,112
9,731
113
If SEAS made a survey that allows multiple go-rounds with the same access code and doesn't block second attempts, then that's on them for building a poor survey.
 

Jonesta6

Glumble
Feb 14, 2019
2,685
2,162
113
Yeah, that's a likely candidate - all the UTM stuff is for Google Analytics, which is interesting since the survey isn't being hosted on one of SEAS's main domains.

Per @warfelg's point, it's probably because the survey engine is passively reading the parameter; generally speaking, a cookie would be a far more useful way to restrict repeat access.
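If the survey engine did want to enforce one response per invite, a minimal server-side check keyed on the tp token might look like this. Purely illustrative - nothing is actually known about SEAS's setup, and the function name and storage are invented for the sketch:

```python
# Illustrative only: reject repeat submissions for the same invite token (tp).
# A real survey engine would persist this in a database, not in memory.
seen_tokens: set[str] = set()

def accept_response(tp: str, answers: dict) -> bool:
    """Record a response unless this invite token was already used."""
    if tp in seen_tokens:
        return False  # duplicate: same access code reused
    seen_tokens.add(tp)
    # ... persist `answers` keyed by tp here ...
    return True
```

With a check like this, a second submission on the same tp is simply refused instead of creating a duplicate record that has to be weeded out later.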
 
Aug 21, 2016
1,360
1,131
113
I said that BGW needs more good family rides (like ones that everyone can enjoy). I'd much rather see a Cobra's Curse type attraction after 2021 than another big thrill coaster.

I really don't think you should put down "giga" because at this point it's logistically very unlikely they get something like Fury or Orion.
 
Reactions: BenWilkerson