Abstract
Obtaining high-quality data from crowds can be difficult if contributors do not give tasks sufficient attention. Attention checks are often used to mitigate this problem, but, because the roots of inattention are poorly understood, checks often compel attentive contributors to complete unnecessary work. We investigated a potential source of inattentiveness during crowdwork: multitasking. We found that workers switched to other tasks every 5 minutes, on average. There were indications that increasing switch frequency negatively affected performance. To address this, we tested an intervention that encouraged workers to stay focused on our task after multitasking was detected. We found that our intervention reduced the frequency of task switching. It also improved on existing attention checks because it did not place additional demands on workers who were already focused. Our approach shows that crowds can help to overcome some of the limitations of laboratory studies by affording access to naturalistic multitasking behavior.
Original language | English |
---|---|
Article number | 19 |
Number of pages | 29 |
Journal | ACM Transactions on Computer-Human Interaction |
Volume | 23 |
Issue number | 3 |
Early online date | 30 Jun 2016 |
DOIs | |
Publication status | Published - Jul 2016 |
Keywords
- Human-centered computing
- HCI theory, concepts and models
- interruptions
- multitasking
- cuing
- crowdsourcing
- online experimentation
- methodology
- human performance
- data entry
- transcription