Subjective laboratory tests represent a proven, reliable approach to multimedia quality assessment. Nonetheless, in certain cases novel quality of experience (QoE) assessment methods can yield better results or enable more cost-effective test execution. In this respect, crowdsourcing can be considered an emerging method that enables researchers to better explore end-user quality perception when a large panel of subjects is required, particularly for Web application usage scenarios. However, the crowdsourcing platform chosen for recruiting participants can affect the experimental results. In this paper, we examine the platform's influence on QoE results by comparing the MOS scores of two otherwise identical subjective HD video quality experiments, one executed on a paid and one on a non-paid crowdsourcing platform.
Date of Conference: 5-7 July 2012