
HCOMP 2016 Takeaways

by Kevin Dodds

Information Evolution was proud to sponsor the recent AAAI Conference on Human Computation and Crowdsourcing. Speakers and presenters from around the world brought deep academic insight into the vast AI training work cloud labor handles for digital economy leaders like Twitter, LinkedIn, Facebook, Google, and WordPress, as well as the potentially transformative effect of distributed labor on society as a whole.

Important themes covered at the conference include:

  • The ethics and evolution of cloud labor
  • Techniques for managing workers and enhancing output quality
  • The impact of the volunteer crowd
  • The intersection of crowdsourcing and artificial intelligence (AI)

Cloud Labor: Cloud laborers can now rate the folks who hire them (“Requesters” in Amazon Mechanical Turk terminology) with an application called Turkopticon. Just as Uber lets drivers rate passengers, crowd workers can tell each other which requesters pay most fairly and whose work is most rewarding. Sites such as Turker Nation provide a forum for crowd workers to exchange information as well. Also in the works are several “platform cooperatives” in which workers can earn a stake in the products they help make.

Time limits: Research presented at the conference suggests that, up to a point, quality increases as the time allowed to perform a decision-making task decreases. The sweet spot balances worker stress against boredom. Short “game” tasks and other methods can also help keep workers sharp on the job.

Argumentation: Asking workers to defend their answers improves results without significantly increasing the time it takes to process the tasks.

Orientation: Results from a simple image annotation task (drawing the outline of a shape) were more accurate when the images were upside down, which made them more abstract and less “familiar.” This research suggests workers with domain knowledge are not the best candidates for some types of tasks.

Payment variations: Standard fixed pricing, lottery-type incentives, and time donation models all affect participation, worker diversity, and data accuracy.

Worker pool size: Having just the right number of workers available for the tasks at hand is key to getting into the “Goldilocks Zone.” Having too few workers means tasks go unperformed, while having too many causes stressful, unnecessary competition and quality problems.
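As a back-of-the-envelope illustration (not a formula from the conference), a requester might size the pool from the task arrival rate and per-worker throughput, with a modest buffer to absorb bursts without leaving workers idle. All names and numbers here are hypothetical:

```python
import math

def workers_needed(tasks_per_hour, tasks_per_worker_hour, buffer=1.1):
    """Rough estimate of pool size for a steady task stream.

    Covers the task arrival rate with a small buffer (10% by default)
    so bursts don't leave tasks unperformed, while avoiding the large
    surplus that leads to idle workers competing for scraps.
    """
    return math.ceil(buffer * tasks_per_hour / tasks_per_worker_hour)

# e.g. 1,200 tasks/hour arriving, 40 tasks/hour per worker
print(workers_needed(1200, 40))  # 33
```

Tuning the buffer is exactly the Goldilocks trade-off described above: too small and tasks queue up, too large and workers sit idle.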

Crowd voting: Keynote speaker Ashish Goel of Stanford University presented research on collaborative budget-setting processes currently in testing by city governments, a type of “volunteer crowd” work.

Crowd sensors: Another project in development has citizens in areas affected by catastrophic events use their mobile phones (another species of volunteer crowd) to give reports and/or measurements, better informing emergency response teams.

AI: Cloud labor can improve artificial intelligence algorithms, so there is a lot of research and discussion on how long certain crowd tasks will continue before AI training data sets are no longer needed. Computer vision, for instance, is an active area where crowd workers sort images of curbs, traffic, etc. to train driverless vehicle vision and decision systems, but at some point the crowd will complete this work, essentially working themselves out of a job. What comes next is anyone’s guess, but the consensus seems to be that cloud labor will likely shift toward more complex and nuanced human judgments as robots become more and more common.
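The image-sorting work described above typically sends each image to several workers and aggregates their answers; majority voting is the simplest baseline for turning redundant crowd labels into a single training label. A minimal sketch (the function name and example labels are illustrative, not from any talk):

```python
from collections import Counter

def aggregate_labels(worker_labels):
    """Majority-vote aggregation of crowd labels for one item.

    worker_labels: labels (e.g. "curb", "traffic") submitted by
    independent workers for the same image.
    Returns the winning label and the fraction of workers who agreed,
    which can double as a rough confidence score for the training set.
    """
    counts = Counter(worker_labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(worker_labels)

label, agreement = aggregate_labels(["curb", "curb", "traffic"])
print(label)  # curb
```

Low-agreement items are often routed to more workers or to an expert, which is one way the "complex and nuanced decisions" mentioned above enter the pipeline.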

posted by Shyamali Ghosh on November 3, 2016
