Microworkers vs. facebook: The impact of crowdsourcing platform choice on experimental results

Authors:

Gardlo, B. (Dept. of Telecommun. & Multimedia, Univ. of Zilina, Zilina, Slovakia); Ries, M.; Hossfeld, T.; Schatz, R.

Abstract:

Subjective laboratory tests represent a proven, reliable approach to multimedia quality assessment. Nonetheless, in certain cases novel, progressive quality of experience (QoE) assessment methods can lead to better results or enable test execution in more cost-effective ways. In this respect, crowdsourcing can be considered an emerging method that enables researchers to better explore end-user quality perception when a large panel of subjects is required, particularly for Web application usage scenarios. However, the crowdsourcing platform chosen for recruiting participants can have an impact on the experimental results. In this paper, we examine the platform's influence on QoE results by comparing the MOS scores of two otherwise identical subjective HD video quality experiments, one executed on a paid and one on a non-paid crowdsourcing platform.
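Since the comparison described in the abstract comes down to contrasting MOS values collected on two platforms, a minimal sketch of that kind of per-platform MOS computation is given below. The ratings, variable names, and confidence-interval approximation are illustrative assumptions for a hypothetical 5-point ACR test; they are not the authors' data or analysis pipeline.

    # Minimal sketch: compare MOS (mean opinion score) values from two
    # crowdsourcing platforms. The ratings below are hypothetical 5-point
    # ACR scores, NOT data from the paper.
    from statistics import mean, stdev
    from math import sqrt

    paid_platform = [4, 3, 4, 5, 3, 4, 4, 2, 5, 4]      # assumed paid-platform ratings
    nonpaid_platform = [3, 3, 4, 2, 3, 4, 3, 2, 4, 3]   # assumed non-paid-platform ratings

    def mos(ratings):
        # MOS is the arithmetic mean of the individual ratings.
        return mean(ratings)

    def ci95(ratings):
        # Approximate 95% confidence-interval half-width (normal approximation).
        return 1.96 * stdev(ratings) / sqrt(len(ratings))

    for name, ratings in [("paid", paid_platform), ("non-paid", nonpaid_platform)]:
        print(f"{name}: MOS = {mos(ratings):.2f} +/- {ci95(ratings):.2f}")

A platform effect would then show up as a consistent gap between the two MOS values that exceeds their confidence intervals; the paper's actual comparison is of course based on its own experimental data.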

Published in:

2012 Fourth International Workshop on Quality of Multimedia Experience (QoMEX)

Date of Conference:

5-7 July 2012