CloudCrowd is an on-demand labor-as-a-service crowdsourcing company that uses a Facebook application to route work to laborers. It takes a complex project that a company would normally have to hire and train employees to do and breaks it down into thousands of short, simple tasks which are performed and verified by laborers.
By running its system on Facebook, CloudCrowd has access to plenty of labor from around the world, and is present where these people already spend their time online. Its biggest challenge is convincing companies that crowdsourcing will work for their projects, so that it can fill its Facebook app’s available work queue.
CloudCrowd was founded in 2009 by Jordan Ritter, the co-founder of Napster and crowdsourced spam-fighting company Cloudmark, and former Cloudmark executive Alex Edelstein. The San Francisco-based team has 15 employees, half of whom are engineers, and is self-funded. It has a pool of over 25,000 registered labor-on-demand workers, including work-at-home moms, retired professionals, and college students, and 1.5 million tasks have been completed on the system.
Depending on the size and complexity, clients pay $5,000 to hundreds of thousands of dollars per project plus a $5,000 setup fee, but only pay for completed, approved work. Current clients range from startups to Fortune 500 companies in industries like e-commerce, education, web search, social networking, business data, and content publication. They include startup RentCycle, Lombardi Sports, and the University of Southern California, which used CloudCrowd to find potential donors by having laborers pull alumni names and locations from thousands of old documents.
Amazon’s Mechanical Turk was the first full-scale modern crowdsourcing marketplace, but those with projects to complete had to write instructions and verify accuracy of work for each task, making it hard to scale. CrowdFlower improved this process by creating a system that verified work by comparing the output of multiple workers on Mechanical Turk and other worker pools. CloudCrowd seeks to further streamline the process by eliminating redundancy through peer reviews that verify work, and by creating what it calls a labor operating system that uses APIs to turn a client’s labor need into a function call.
From Complex Project to Completed Work
To initiate a project, a CloudCrowd project manager meets with a client to determine how a complex, multi-part task can be reframed as a series of small, simple tasks. CloudCrowd has developed streamlined “machines” for many common building blocks of complex tasks. These machines are actually sets of instructions and tools that, when given to laborers, help them quickly and accurately complete tasks like translation, copy editing, data reformatting, and data mining from web search. CloudCrowd restructures a client’s project to be done by these machines, and builds new ones when necessary, enlarging its repertoire of machines over time.
This process removes a large inefficiency of crowdsourcing to date: clients previously had to write instructions themselves, a tricky process that CloudCrowd’s in-house employees are trained in. A batch of tasks is run as a trial to ensure the instructions and tools are accurate, and once approved by the client, it’s only a matter of funneling data in and out of the machines to generate a completed project.
Tasks are assigned a payment value, from a cent for simple tasks that take a few seconds, like creating directory entries for websites, to dollars for longer, more complex tasks like editing documents. They are then routed to laborers by listing them as available work on CloudCrowd’s Facebook application. Consistent, accurate work builds a worker’s credibility score slowly over time, while inaccurate work quickly decreases their score. As a worker completes tasks, money is added to their CloudCrowd account and paid out to a PayPal account the next day.
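The asymmetric scoring described above, slow to climb and quick to fall, can be sketched as follows. The exact increments and the 0–100 scale are assumptions for illustration only; CloudCrowd has not published its formula.

```python
# Hypothetical sketch of CloudCrowd-style credibility dynamics.
# The gain/penalty values and 0-100 range are invented for the demo.

def update_credibility(score, task_approved, gain=0.5, penalty=5.0):
    """Return a worker's new credibility score after one graded task."""
    if task_approved:
        score += gain      # consistent, accurate work climbs slowly
    else:
        score -= penalty   # a single inaccurate task costs much more
    return max(0.0, min(100.0, score))  # clamp to an assumed 0-100 scale

score = 31.0
score = update_credibility(score, task_approved=True)   # 31.5
score = update_credibility(score, task_approved=False)  # 26.5
```

The point of the asymmetry is that a worker cannot quickly grind back a reputation lost to sloppy work, which keeps the higher-paying tasks gated behind a sustained track record.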
Difficult tasks are only open to those with high credibility scores, and certain specialized tasks like translation require the worker to pass proficiency tests to earn credentials. CloudCrowd differs from other crowdsourcing companies, though, because quality assurance of work is based on credible peer review, instead of comparing the output of multiple workers performing the same task.
First a worker completes a task, then a worker with a higher credibility rating than the first verifies the accuracy of the work as a short task of its own. This continues up the credibility ladder as necessary, and once someone with a high enough rating verifies the work, CloudCrowd can be confident it is accurate without having multiple workers repeat the original, longer task. This creates a highly efficient system that can verify all types of tasks, not just those verifiable by two people giving identical answers.
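The escalation up the credibility ladder can be sketched as a simple loop. The threshold, field names, and review logic below are assumptions for illustration; CloudCrowd has not published its actual algorithm.

```python
# Illustrative sketch of a peer-review ladder: each review is a short,
# paid task; once a sufficiently credible reviewer approves, the work
# is accepted without anyone repeating the full original task.

FINAL_SAY_SCORE = 80  # assumed credibility at which a review is final

def review(reviewer, task_result):
    # Stand-in for the reviewer actually checking the work; the demo
    # just reads a ground-truth flag instead of doing real review.
    return task_result["accurate"]

def verify(task_result, workers):
    """Escalate a completed task up the credibility ladder until a
    sufficiently credible reviewer settles it."""
    reviewers = sorted(
        (w for w in workers
         if w["credibility"] > task_result["worker_credibility"]),
        key=lambda w: w["credibility"],
    )
    for reviewer in reviewers:
        if not review(reviewer, task_result):
            return "rejected"            # sent back; nobody repeats the full task
        if reviewer["credibility"] >= FINAL_SAY_SCORE:
            return "approved"            # credible enough to be the final word
    return "staff_review"                # no reviewer credible enough in the pool

pool = [{"credibility": c} for c in (40, 65, 90)]
result = {"worker_credibility": 31, "accurate": True}
verify(result, pool)  # climbs 40 -> 65 -> 90, then approves
```

Compared with running the same long task past several workers and comparing outputs, only the short review step is repeated here, which is what makes the scheme cheap for subjective work like editing.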
To kickstart the system as the best workers built their credibility ratings, CloudCrowd employees initially acted as the final say on accuracy. Soon, the worker community had some highly credible laborers and could handle verification autonomously. This leaves the company’s employees to focus on writing instructions, building tools, reassembling projects, and finding clients.
Facebook as a Gateway to Labor
CloudCrowd launched its Facebook app in October of last year, and was immediately besieged by people wanting to join its labor pool. Workers quickly outpaced the supply of tasks, and the company had to turn off viral features like posting to users’ feeds when they earned money. Potential workers could still sign up if they found their way to the application, or if they were invited through CloudCrowd’s referral program called Personal Crowd, where users get paid a percentage for the work done by those who sign up with their referral code.
When a user opens the CloudCrowd app, they see their summary page which lists their earnings, credibility score, and work history, showing the tasks they’ve attempted and how their performance on each was graded. The appeals section lets users contest erroneous evaluations, and My Details lets users manage their credentials earned from passing proficiency tests and the PayPal account to which their earnings are paid.
The Available Work tab lists all the open tasks on CloudCrowd. Tasks have project names, categories, credential and credibility score requirements, and today’s pay rate for completion. For instance, a user might see a project named “Review Business Categorizations”, which falls under the “Categorize, Tag and Label” machine, requires a credibility score of 31, and pays $0.02 per task completed.
Clicking the task leads to the instructions page explaining how the task is performed, the criteria for a task being verified as complete, and special notes to help workers avoid common pitfalls. In this case, a user must visit links and determine if a company is a retailer, manufacturer, wholesaler, or distributor. As the sites are categorized, higher credibility users are given tasks in which they check to see that the categorizations are accurate. Since users are paid by the task, not by time, they can complete as many or as few tasks as they like.
The only real issue with the Facebook app is that there often aren’t any tasks available that don’t require translation credentials. As companies are still acclimating to crowdsourcing their work, and CloudCrowd has an ample worker pool, the general tasks available during peak hours are quickly completed. To solve this, CloudCrowd lets users sign up for email notifications telling them when work becomes available. If the company sees there are not enough people with credentials to perform a specific task, like German to English translation, it simply raises the pay rate and watches as users send invitations to those with the right skills in hopes of getting a cut of the money.
The application is simple, but effective. Instructions are clear, and any questions can often be cleared up in each task’s dedicated forum. Since CloudCrowd is on Facebook, it makes it easy to pop in and complete a few tasks while chatting or waiting for your energy to replenish in a social game. The work history makes grading transparent, and payments happen promptly. Once CloudCrowd gets more clients signed up and fills up the available work queue, it could be a reliable way to earn money for users with a few minutes or a few hours, specialized skill sets or no training.
CloudCrowd becomes more efficient with each project it undertakes. By developing more machines and improving instructions, it is able to complete more complex tasks more cheaply. Its improvement on the translation process is a good example of how it is disrupting industries. Previously, a company would farm out translation to specialists who worked full time, translating by the full page and requiring separate editing. Projects would cost $80 a page and take three days, says CTO Jordan Ritter. With CloudCrowd translating line by line for a few cents each, it can complete the same project in one day for $10, and charge $20, a quarter of the cost of traditional translation companies.
Its efficiency on translation and editing tasks has led it to spin off dedicated companies in these verticals called TranslationZen and EditZen. Clients are only concerned with price, speed, and accuracy, and don’t mind CloudCrowd using crowdsourcing to accomplish tasks. Ritter says the company will continue to expand into new verticals.
The Facebook application operates as an iframe rather than outside of Facebook via Facebook Connect, because this keeps the burden of session verification on Facebook. However, Facebook often makes changes to the iframe environment, which CloudCrowd has to scramble to adapt to. The company says it would like to see Facebook relax restrictions on what can be done inside iframes, to provide a richer, more efficient interface for completing certain tasks.
CTO Jordan Ritter says that while CloudCrowd does compete for the same clients as fellow labor-on-demand provider CrowdFlower, they work on different platforms. CrowdFlower, built on top of existing labor pools like Amazon’s Mechanical Turk and Gambit, determines work accuracy by using algorithms to compare the same work repeated by multiple laborers. Ritter says this older, technical approach using statistical theory is less efficient and isn’t applicable to some subjective tasks. “We think humans are still smarter than machines,” he says of CloudCrowd’s peer review system of determining accuracy.
CloudCrowd is now improving its APIs, hoping to systemize common tasks for clients so that they can simply enter foreign language text and make a function call like get_translate to get an accurate English translation quickly and cheaply. The biggest challenge now is for CloudCrowd’s business development team to convince companies to break away from the traditional route of hiring employees or commissioning specialists to complete projects, and to view crowdsourcing as an option even for highly complex tasks. If it can fill its Facebook app’s task queue, the company can draw on its ample labor pool, which handles quality assurance itself, to make assigning tasks to CloudCrowd the new outsourcing.
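To make the “labor as a function call” idea concrete, here is a hedged sketch of what such a call might look like from the client’s side. Everything below is invented for illustration; the article only names a call called get_translate, and CloudCrowd has not published an API, so the job fields and the stubbed submission step are assumptions.

```python
# Hypothetical illustration of a labor-operating-system call: a
# client's labor need expressed as an ordinary function. The job
# schema and submit_and_wait helper are invented for this sketch.

def get_translate(text, source_lang, target_lang):
    """Submit text to the crowd and (eventually) return a verified
    translation. Stubbed here to show the calling convention only."""
    job = {
        "machine": "translation",   # one of CloudCrowd's task "machines"
        "input": text,
        "source": source_lang,
        "target": target_lang,
    }
    # A real system would queue the job, let workers translate line by
    # line, run peer review, then return the verified result.
    return submit_and_wait(job)

def submit_and_wait(job):
    # Stand-in for the crowd: echo a canned result for the demo.
    return f"[{job['source']}->{job['target']}] translated text"

print(get_translate("Guten Tag", "de", "en"))  # prints "[de->en] translated text"
```

The appeal for clients is that the crowd, the task decomposition, and the peer-review verification all hide behind one function signature, the same way outsourced computation hides behind a web API.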