Benjamin Hanrahan, David Martin, Jutta Willamowski, John Carroll
The Journal of Collaborative Computing and Work Practices

We developed TurkBench to better understand the work of crowdworkers on the Amazon Mechanical Turk (AMT) marketplace. While we aimed to reduce the amount of invisible, unpaid work that these crowdworkers performed, we also probed the day-to-day practices of crowdworkers.
Through this probe we encountered a number of previously unreported difficulties, representative of those that crowdworkers face both in building their own tools and in working on AMT. In this article, we contribute insights into 1) a number of breakdowns occurring on AMT and 2) how crowdworkers appropriate the AMT platform in ways that mitigate some of these breakdowns while exacerbating others.

The breakdowns we specifically discuss in this paper are: the increasing velocity of the market (good HITs are claimed within seconds); the high degree of flexibility that requesters can and do exercise in specifying their HITs; and the difficulty crowdworkers have in navigating the market due to the large variation in how requesters construct HITs. When the velocity of the market is combined with a poor search interface, wide variation in how HITs are constructed, and little infrastructural support for workers, the resulting work environment is frustrating and difficult to thrive in.
