Polvey1

Survey Police Top Sites

Recommended Posts

ErgoProxy

I've been doing pretty well with MyPoints and Swagbucks (both run by the same company). I've racked up hundreds of dollars' worth of points between the two of them.

  • Like 2

Cliche01

I wasn't sure about Point Club at first (I only joined in 2019), but it's grown on me; it's just rather slow in rewarding points. The longer you stay with it, the more it rewards, which is a gimmick they have built in. Beyond that, runner-up Pinecone has always done well by me, though it has really tapered off lately in terms of the number of surveys I get.

  • Like 1

schludermann

I think these survey sites are so highly rated by Survey Police because Survey Police has a business relationship with them, and Survey Police is transparent on that point. Most of these sites have dubious reputations, like PointClub "invalidating" surveys months after you completed them so you never get the points. Tellwut is okay, so long as you don't take the bait on high-point surveys that have a 99.8% late screen-out rate. One Opinion has a long and documented history of persistently abusing panelists and inhibiting rewards redemption. Paidviewpoint and Prolific are good panels that are run with integrity.

  • Like 1

highbids

Where can we go to find independent, honest reviews of the survey sites?

schludermann

Reviews have to be corroborated, so there's some work involved, but it's worth it to stay away from junk panels, and the majority are junk. When scanning reviews, ignore posts with no detail; they are likely shills, and if a post is from a real person who can't communicate, it's junk anyway. Only consider reviews no more than a few months old, and check out the BBB listings for the site. Survey Police reviews are skewed: the summary ratings are not arithmetically accurate, they are derived from stale content, they are very low on detail, and SP also blocks reviewers for "griping", meaning the write-ups are biased to appear more positive. Survey Police probably does this because they get money from the panels for referrals, placements, and ads.

  • Sad 1

Administrator

Hi Schuldermann,

Sorry you feel that way about our rankings. We wish to clarify a few points that you've made which are factually incorrect:

The reviews are not 'skewed', nor are the summary ratings arithmetically inaccurate. We use a sophisticated ranking algorithm based on a Dirichlet distribution model. First-page results must have a minimum number of reviews, which is why a panel with a high rating but few reviews isn't on the first page. An explanation is provided on the rankings page under 'How is this calculated?'

The algorithm does not count old reviews; although dated reviews are shown on listings, they are not actually counted towards the rankings themselves.
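
To make that concrete, here is a minimal sketch of how a Dirichlet-style smoothed rating with a recency cutoff and a minimum-review threshold could work. The 1-5 scale, prior counts, 12-month cutoff, and 20-review threshold are illustrative assumptions, not SurveyPolice's actual parameters.

```python
from datetime import datetime, timedelta

# Illustrative only: prior pseudo-counts for a 1-5 star scale. A Dirichlet
# prior acts like a handful of "phantom" reviews spread across the scale,
# pulling panels with few reviews toward a neutral score.
PRIOR_COUNTS = {1: 1, 2: 1, 3: 1, 4: 1, 5: 1}
MAX_AGE = timedelta(days=365)         # assumed recency cutoff
MIN_REVIEWS_FOR_RANKING = 20          # assumed first-page threshold

def smoothed_rating(reviews, now=None):
    """reviews: list of (stars, posted_at) tuples."""
    now = now or datetime.now()
    counts = dict(PRIOR_COUNTS)
    recent = 0
    for stars, posted_at in reviews:
        if now - posted_at > MAX_AGE:
            continue                  # stale reviews are shown but not counted
        counts[stars] = counts.get(stars, 0) + 1
        recent += 1
    total = sum(counts.values())
    score = sum(stars * n for stars, n in counts.items()) / total
    eligible_for_first_page = recent >= MIN_REVIEWS_FOR_RANKING
    return round(score, 2), eligible_for_first_page

# Two recent 5-star reviews score above neutral, but the panel is not
# eligible for first-page ranking until it has enough recent reviews.
print(smoothed_rating([(5, datetime.now()), (5, datetime.now())]))
# -> (3.57, False)
```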

It is very rare that we block negative reviews. There are literally thousands of negative reviews for panels posted on our site. Your implication is that panel ratings should be even lower than they are, which is actually a bias in itself.

If you'd like to learn more about how our site actually works, please reach out to us directly. Posting this type of misleading and false information about how SurveyPolice operates is not helpful.

schludermann
On 1/30/2020 at 2:10 PM, Administrator said:

The reviews are not 'skewed', nor are the summary ratings arithmetically inaccurate. [...] It is very rare that we block negative reviews. There are literally thousands of negative reviews for panels posted on our site.

My assertions about the efficacy of the ratings on Survey Police are based on what I've observed and on having my own reviews censored. Blocking reviews based on a perception creates a bias.

 

I spent 15 minutes gathering the illustrations below of long-standing, ongoing inaccuracies in the reputation ratings. Since the rating system as constructed is cumbersome to peruse, I used simpler examples to illustrate what is happening.

 

SP rating errors:
https://www.surveypolice.com/microsoft-playtest-research
6 stars from 1 reviewer, 3 stars displayed
https://www.surveypolice.com/medsurvey
5.75-star average from 3 reviewers, 3 stars displayed
https://www.surveypolice.com/iquestion
2-star average from 2 reviews, 3 stars displayed
https://www.surveypolice.com/technology-advisory-board
1 star from one review, 3 stars displayed
https://www.surveypolice.com/research-participants-institute
One 6-star review, 3 stars displayed

 

ksedwar
On 1/30/2020 at 10:01 AM, schludermann said:

Reviews have to be corroborated, so there's some work involved,

The corroboration is done by algorithm.

On 1/30/2020 at 8:12 PM, schludermann said:

My assertions about the efficacy of the ratings on Survey Police are made from what I've observed

Not trying to argue here, but SP reviews are not a popularity contest. What you see is an average score, not a one-for-one tally of ratings and reviews. You cannot work out the algorithm just by looking at the rankings and reviews; you could only do that by knowing exactly how the algorithm works, and that is normally guarded information. If it were all public it would be easy to manipulate artificially, at least in the short term, but the algorithm helps defeat artificial manipulation.

On 1/30/2020 at 8:12 PM, schludermann said:

and having my reviews censored.

Maybe it would help to know which reviews were censored? They do not always go live upon submission. They are manually approved, and that may take more time than you feel is appropriate, but unless a review is against the TOS it should eventually be viewable.

On 1/30/2020 at 8:12 PM, schludermann said:

5.75-star average from 3 reviewers, 3 stars displayed.

A panel getting an overall rating of 5 stars from only 1 or 2 reviewers would be biased. Three stars overall from 1 reviewer, or from 3 reviewers giving 4.5 to 5 stars, is the algorithm working out an average score just as it should based on the number and age of reviews. The more information an algorithm has, the more accurate it gets, so panels with only a few reviews may look off; there just isn't enough information yet to get an accurate result.
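
To put rough numbers on that (the 3-star prior and its weight of 10 are guesses for illustration, not SurveyPolice's real values), a simple Bayesian average shows why one glowing review barely moves the displayed score, while a larger body of reviews does:

```python
def bayesian_average(ratings, prior_mean=3.0, prior_weight=10):
    """Blend a neutral prior with the observed ratings.

    prior_mean and prior_weight are illustrative guesses, not the values
    SurveyPolice actually uses.
    """
    n = len(ratings)
    if n == 0:
        return prior_mean
    sample_mean = sum(ratings) / n
    return (prior_weight * prior_mean + n * sample_mean) / (prior_weight + n)

# A single 6-star review: the plain mean is 6.0, but the smoothed score
# stays close to the neutral prior.
print(round(bayesian_average([6]), 2))           # 3.27
# Thirty reviews averaging 4.5 stars: the data now outweighs the prior.
print(round(bayesian_average([4, 5] * 15), 2))   # 4.12
```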

