CloudCrowd Offers Peer-Reviewed Crowdsourcing Through Facebook

CloudCrowd is an on-demand labor-as-a-service crowdsourcing company that uses a Facebook application to route work to laborers. It takes a complex project that a company would normally have to hire and train employees to do and breaks it down into thousands of short, simple tasks which are performed and verified by laborers.

By running its system on Facebook, CloudCrowd has access to plenty of labor from around the world, and is present where these workers already spend their time online. Its biggest challenge is convincing companies that crowdsourcing will work for their projects, so it can keep its Facebook app’s work queue filled.

CloudCrowd was founded in 2009 by Jordan Ritter, the co-founder of Napster and crowdsourced spam-fighting company Cloudmark, and former Cloudmark executive Alex Edelstein. The San Francisco-based team has 15 employees, half of whom are engineers, and is self-funded. It has a pool of over 25,000 registered labor-on-demand workers, including work-at-home moms, retired professionals, and college students, and 1.5 million tasks have been completed on the system.

Depending on the size and complexity, clients pay $5,000 to hundreds of thousands of dollars per project plus a $5,000 setup fee, but only pay for completed, approved work. Current clients range from startups to Fortune 500 companies in industries like e-commerce, education, web search, social networking, business data, and content publication. They include startup RentCycle, Lombardi Sports, and the University of Southern California, which used CloudCrowd to find potential donors by having laborers pull alumni names and locations from thousands of old documents.

Amazon’s Mechanical Turk was the first full-scale modern crowdsourcing marketplace, but those with projects to complete had to write instructions and verify the accuracy of work for each task, making it hard to scale. CrowdFlower improved this process by creating a system that verified work by comparing the output of multiple workers on Mechanical Turk and other worker pools. CloudCrowd seeks to further streamline the process by eliminating redundancy through peer reviews that verify work, and by creating what it calls a labor operating system that uses APIs to turn a client’s labor need into a function call.
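The “labor need as a function call” idea can be illustrated with a minimal sketch. Everything below (`do_task`, `peer_review`, `complete_task`, `run_project`) is hypothetical and assumed for illustration only, not CloudCrowd’s actual API; the `peer_review` branch stands in for the peer-review verification model described above, as opposed to redundantly redoing each task.

```python
# Hypothetical sketch of a "labor operating system" call, assumed for
# illustration; these names are NOT CloudCrowd's real interface.

def do_task(kind, payload):
    # Stub standing in for routing one short, simple task to a human
    # worker on the Facebook app and awaiting the result.
    return f"[{kind}] {payload}"

def peer_review(result):
    # A second worker verifies the first worker's output instead of
    # redundantly redoing the task from scratch (the peer-review model).
    return bool(result)

def complete_task(kind, payload):
    result = do_task(kind, payload)
    if peer_review(result):              # approved: no redundant redo
        return result
    return complete_task(kind, payload)  # rejected work is reassigned

def run_project(kind, items):
    # A client's complex labor need, expressed as one function call
    # that fans out into many small tasks.
    return [complete_task(kind, item) for item in items]
```

In this sketch, `run_project("translate", ["hola", "mundo"])` fans two paragraphs out to workers and collects the verified results, which is roughly the shape of the client-facing abstraction the company describes.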

From Complex Project to Completed Work

To initiate a project, a CloudCrowd project manager meets with a client to determine how a complex, multi-part task can be reframed as a series of small, simple tasks. CloudCrowd has developed streamlined “machines” for many common building blocks of complex tasks. These machines are actually sets of instructions and tools that when given to their laborers, help them quickly and accurately complete tasks like translation, copy editing, data reformatting, and data mining from web search. CloudCrowd restructures a client’s project to be done by these machines, and builds new ones when necessary, enlarging its repertoire of machines over time.

This process removes a large inefficiency of crowdsourcing to date: clients no longer have to write instructions themselves, a tricky process that CloudCrowd’s in-house employees are trained in. A batch of tasks is run as a trial to ensure the instructions and tools are accurate, and once approved by the client, it’s only a matter of funneling data in and out of the machines to generate a completed project.

Tasks are assigned a payment value, from a cent for simple tasks that take a few seconds, like creating directory entries for websites, to dollars for longer, more complex tasks like editing documents. They are then routed to laborers by listing them as available work on CloudCrowd’s Facebook application. Consistent, accurate work builds a worker’s credibility score slowly over time, while inaccurate work quickly decreases it. As a worker completes tasks, money is added to their CloudCrowd account, which is then paid out to a PayPal account the next day.
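The asymmetric credibility mechanic, slow gains for accurate work and fast losses for inaccurate work, can be sketched as follows. The 0–100 scale and the specific gain and penalty values are assumptions for illustration, not CloudCrowd’s published parameters.

```python
def update_credibility(score, accurate, gain=1, penalty=10):
    # Accurate work raises the score slowly; inaccurate work lowers it
    # quickly. The 0-100 scale and step sizes are illustrative guesses,
    # not CloudCrowd's actual numbers.
    if accurate:
        return min(100, score + gain)
    return max(0, score - penalty)
```

Under these assumed values, a worker needs ten accurate tasks to recover what a single rejected task costs, which is one simple way to make credibility build slowly while dropping fast.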