
SPECIAL: Crowdology Market Research Q & A


Administrator


Hi again everyone,

Hopefully I can answer all of your questions!

JD - Thanks for lots of good questions; I'll go through them one by one:

1. What form of quality control do you employ before releasing a survey to the general masses?

This varies from company to company, but at Crowdology we undertake a number of quality measures. I don't want to disclose too much, because we don't want people to trick the system, but our systems can automatically tell when people are completing a survey too quickly, have more than one account with us, or are completing the survey from a different country than needed.
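To give a feel for the kind of automated check described above, here is a minimal sketch of a "speeding" detector. The threshold rule (flag anyone faster than a fraction of the median completion time) and all names are illustrative assumptions, not Crowdology's actual system:

```python
from statistics import median

def flag_speeders(completion_times, fraction=0.4):
    """Flag respondents whose completion time falls below a fraction
    of the median time for the survey (an illustrative rule).

    completion_times: dict of respondent id -> seconds taken.
    Returns the set of flagged respondent ids.
    """
    cutoff = median(completion_times.values()) * fraction
    return {rid for rid, t in completion_times.items() if t < cutoff}

# Most people take around 5 minutes; r3 races through in 90 seconds.
times = {"r1": 300, "r2": 310, "r3": 90, "r4": 295}
print(flag_speeders(times))  # {'r3'}
```

A real system would combine several signals (duplicate accounts, geolocation, straight-lining answers), but a median-relative time cutoff like this is a common first filter.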

2. Some surveys have questions at the beginning like "Have you had oil added to your spark plugs?" I assume there is a reason for these offbeat questions, like seeing if you are paying attention. What is your take on these types of questions?

As a market researcher myself, this seems quite bizarre. Unless a survey has something to do with spark plugs or oil, I do not see the point. I have seen "trap" questions before that check whether the respondent is just rushing through a survey, but not in this form.

3. How do you guard against survey bias? In this case I am thinking about people liking a product more than they really do to please the people offering up the survey.

To reduce this bias, the company commissioning the survey will often have an external market research company undertake the research.

4. Do the questions where they say to select option A really work to weed out those just ploughing through a survey?

This can work, but it is not the only method of catching these people. Some systems can automatically tell if someone is doing the survey too quickly.

5. Sort of related to question 4: what is your overall rejection rate for surveys where you feel the survey taker did not do a thorough job in answering the questions? How many completed surveys do you need for a survey to be statistically meaningful?

By deleting panellists who consistently speed through surveys, we can keep quality quite high, which means the rejection rate is often very low. Regarding the second part of your question, this depends on what type of data needs to be analysed. A simple analysis usually requires between 100 and 200 completes; a proper analysis usually needs 1,000-2,000.
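As a rough illustration of why those sample sizes matter, the standard margin-of-error formula for a proportion shows how precision improves with more completes. This is textbook statistics, not a description of Crowdology's own analysis:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for an estimated proportion p
    from a simple random sample of n completes (z = 1.96 for 95%)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 200, 1000, 2000):
    print(f"{n} completes -> ±{margin_of_error(n) * 100:.1f} percentage points")
```

With 100 completes the estimate is only good to about ±9.8 points, while 1,000 completes tighten it to about ±3.1, which matches the simple-versus-proper analysis distinction above.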

6. How long does it take your company to create a survey for a company? Days? Weeks? Just talking on average here.

This can be so varied! From my experience, the process usually takes a week or two, and usually involves our team and the client passing improvements to the survey back and forth until the client confirms it's what they want.

Galomorro - I totally agree with you about the "not having enough choices" part of your argument. This is very bad practice and actually goes against the code of conduct of the UK (where I am based) Market Research Society (MRS), which states that respondents should be "able to provide information in a way that reflects the view they want to express, including don't know/prefer not to say where appropriate." Unfortunately, mistakes can slip through. If you feel that you cannot answer the questionnaire, please decline to answer it and contact the survey company in question.

Often survey companies work through a number of other companies to get a higher number of interviews. This can make it harder for respondents to reach the relevant person when a problem occurs. To help the survey company identify the survey you had an issue with, always keep track of the date, the time, and, if possible, a survey code shown at the start of the survey or in the email. Failing that, the subject of the survey can also help.

Regarding your questions on rewards: I suggest that you research different survey companies' rewards programs before signing up. Reviews on Survey Police can often give a good indication of whether or not a company pays well. Like any company, survey sites try to minimise costs, so there needs to be a balance between keeping panellists happy with rewards and staying competitive with other survey sites. At Crowdology we balance this by offering approximately the same as minimum wage in the US and UK for completing surveys: a 5-minute questionnaire = $0.50 or 40p.
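The arithmetic behind that figure is easy to check. A quick sketch of the implied hourly rate (the reward and duration values are the ones quoted above):

```python
def hourly_rate(reward, minutes):
    """Implied hourly rate for a survey paying `reward` for `minutes` of work."""
    return reward * 60 / minutes

# $0.50 for a 5-minute questionnaire:
print(round(hourly_rate(0.50, 5), 2))  # 6.0 (dollars per hour)

# 40p for the same 5 minutes:
print(round(hourly_rate(0.40, 5), 2))  # 4.8 (pounds per hour)
```

So a 5-minute survey at those rates works out to roughly $6.00 or £4.80 per hour, which is the "approximately minimum wage" ballpark the post refers to.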

footfree - I believe ksedwar has answered your comments on the percentage taken off rewards. Regarding the amount you have to accrue before taking out money, I believe this is again because of transaction costs - processing a small amount instantly after each survey would be expensive for the company in question and would also reduce the amount you get. At Crowdology, ours is quite low at $8 or £4. I understand it must get frustrating when you are close to your payout point and keep being screened out of surveys, but I'm sure that if you persist you will eventually be able to take the money out :)

1957horses - Survey companies offering Amazon vouchers to their respondents are likely to have a business account with Amazon. Because it involves a large number of vouchers being sent, I believe there may be a charge for this. Check a survey company's "rewards" section on its website and it should explain.

Thanks again for all your questions.

I look forward to tomorrow!

Andrew

Crowdology UK and USA


Sorry somelady, I've only just spotted your questions - you must have posted while I was writing!

I'll answer the questions for you:

Why have survey companies started paying so little for the surveys and charging us to get our rewards?

It's a shame that you feel this way. At Crowdology we feel that the amount we give is fair; however, we would not suggest using us as your main source of income. The market research industry is very competitive, so if we raised the prices of our research we wouldn't win any projects, and then I wouldn't have a job! :P

Why do they ask questions where we have to choose which of several options is more important, when none of the options are what we really choose in life and all or most of the options are equally important?

I believe I have already addressed this in another question. This is bad practice and the questionnaires should include a "don't know" or "none of the above". You should complain to the survey provider if this occurs.

We don't shop that way; we - or at least I - take everything into consideration, and if I have to choose like that I won't buy it, I'll go for a product that has all the options I want. How can you get accurate results with those kinds of questions?

As researchers, we often like to try out different "decision making" models to see if any fit well with certain consumers. This can help companies market their products.

Why do surveys ask how many of the product we would buy the first time buying a new product when they should know that most of us would only buy one to see if we like it?

This varies for different consumers. Different people have different purchasing processes.

Why do they understate the time a survey will take?

And even if you tell them they never change the time or incentive?

At Crowdology we take the median time the survey takes to complete, and this is what our incentive is based on. Companies that say "10 minutes" when it's really 30 to 60 minutes are breaking the market research code, as this misleads the respondent. You should complain if this occurs.

Many thanks,

Andrew


Just started at the site and have 2 questions:

1. The site loads OK in Firefox 24, but not in Internet Explorer 8. In IE8 the area with About, FAQs, etc. is all the way on the right-hand side. The problem is it obscures everything on that side of the screen.

2. In the profile section for Occupation, if the respondent indicates that they do not work (as I did), why go through the remainder of the questions, since they all pertain to someone working? Granted, the option is there for each to answer "I don't work", but there is really no need to have to do that.


Well I got my first survey and then I get this!! Not a good start!

This is somewhat embarrassing, isn’t it?

It seems we can’t find what you’re looking for. Perhaps searching, or one of the links below, can help.


One thing that puzzles me is when a survey company sends out the same survey to you repeatedly. I have had cases where a survey on alcoholic beverages was sent to me at least once a week (sometimes 2-3 times a week) and the initial question was "Do you drink alcohol?". I don't, and answered "no". Not likely to change that suddenly. Also, I have had multiple times per week (3 times in one day, once) where a company sent surveys on erectile dysfunction and low testosterone. Why do they do this?

In reference to another poster's post about ridiculous questions: I recently had a survey with a page of questions, the last of which was "Can you juggle four oranges at once?". It had nothing to do with the survey.


Hi everyone,

We're coming to the end of this weekly Q & A, but I will finish off by answering the questions from episemion and danielle1234.

Firstly, answering episemion:

One thing that puzzles me is when a survey company sends out the same survey to you repeatedly

This is likely to be because of an error in the mailing system. Having a person answer the same survey more than once would mean that their opinions would be doubled in the data, making the data less robust.

In some cases, we commission a "tracking study", which asks very similar questions to the same respondents over a period of time (eg. once a month) to track how their thoughts and opinions change over time.

Secondly, I am rather intrigued by questions such as "Can you juggle four oranges at once?", and I have an idea why this may occur. In my time as a researcher, I've found that in order to get a quote from an online panel, you need something called an "incidence rate" - the percentage of people you expect to be relevant to the questionnaire. For example, the incidence rate for interviewing just males would be 50%. These random questions at the end may be the online panel's way of testing what the incidence rate is (e.g. the percentage of competent jugglers in the population - this sounds bizarre, but it's possible!) so that it can quote appropriately.
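An incidence rate like the one described above is just the qualifying fraction of screened respondents. A toy sketch, with illustrative data (the answer format and numbers are assumptions, not real panel data):

```python
def incidence_rate(screener_answers):
    """Fraction of respondents who qualify for the main survey,
    e.g. answered 'yes' to a screener question such as
    'Can you juggle four oranges at once?'."""
    qualified = sum(1 for answer in screener_answers if answer == "yes")
    return qualified / len(screener_answers)

# 2 qualifiers out of 10 screened respondents:
answers = ["no", "no", "yes", "no", "no", "no", "no", "no", "no", "yes"]
print(incidence_rate(answers))  # 0.2, i.e. a 20% incidence rate
```

A low incidence rate means the panel must screen many people to find one qualified respondent, which is why it feeds directly into the price quoted for fieldwork.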

danielle1234 - I wasn't sure exactly what you were asking, but I think you mean one of two things, so I will answer them both:

Do survey companies use the information given by a panellist even if the person is disqualified?

1) If by "disqualified" you mean the person was caught either speeding through the questionnaire or giving answers that were believed to be unreliable: at Crowdology we do not keep this data; it is deleted, as it is seen as unreliable.

2) If by "disqualified", you mean "screened out" - yes, we keep the data so we can find out the "incidence rate" which I mentioned earlier. This can be helpful if we wish to run a similar survey in the future.

That brings us to the end of our Q & A.

I hope I have answered your questions fully and given you some insight into the real people behind survey sites.

Personally, I have thoroughly enjoyed answering your questions as it helps me understand our panellists and what your worries and thoughts are. If you enjoyed this session, I'm quite happy to repeat it sometime in the future if you have more questions.

Many thanks to Survey Police and all their forum members,

Andrew White

Crowdology USA and UK

P.s. Feel free to join our panel and check out our profile on survey police! http://www.surveypolice.com/crowdology


The week is over, so this thread is now locked.

Thanks again to Andrew from Crowdology for volunteering to share his expertise on market research. Thanks also to all of you for your great questions!!!

