Recruiting Participants With Programming Skills: A Comparison of Four Crowdsourcing Platforms and a CS Student Mailing List

Abstract

Reliably recruiting participants who have programming skills is an ongoing challenge for empirical studies involving software development technologies, often leading to the use of crowdsourcing platforms and computer science (CS) students. In this work, we use five existing survey instruments to explore the programming skills, privacy and security attitudes, and secure development self-efficacy of university CS student participants and participants from four crowdsourcing platforms (Appen, Clickworker, MTurk, and Prolific).

We recruited 613 participants who claimed to have programming skills and assessed the recruitment channels with regard to cost, data quality, programming skill, and privacy/security attitudes. We find that 27% of crowdsourcing participants, 40% of self-reported developers among crowdsourcing participants, and 89% of CS students answered all programming-skill questions correctly. CS students are the cheapest recruitment channel and rate themselves lower than crowdsourcing participants in secure development self-efficacy.

Publication
To appear in The ACM Conference on Human Factors in Computing Systems (CHI)