On Friday, Facebook increased user feedback’s weight in its automated Platform enforcement system. This caused it to automatically disable a number of apps that had received a large volume of negative feedback. Now, some developers are upset that they weren’t given fair warning to correct their apps, and several claim they haven’t done anything wrong — and a few apps that were taken off have already been reinstated.
So far, Facebook has provided an appeal process for developers of apps that have been disabled and has said that improved feedback monitoring in Application Insights will launch soon. Still, the site’s effort to protect the user experience from spam apps has hurt its relationship with some in the developer community.
Reports in the Facebook Developer forums indicate the disabled apps were mostly small to mid-sized, with some in the 10,000 to 20,000 daily active user range, including one called Game of Truth, which you can see above has dropped to zero DAU. Many of the affected developers are threatening to leave the Facebook Platform, while others are calling for better benchmarks for assessing how much negative feedback is too much.
The company’s goal is to keep the user experience enjoyable for both application users and non-users so developers have a healthy Platform to work on long into the future. Facebook CTO Bret Taylor said that by improving its automatic enforcement systems, the site reduced spam by 95% in 2010 while also reducing enforcement actions.
In a statement, Facebook said “we started getting a lot of user feedback, spiking significantly over the past week, on the amount of application spam people are seeing in their feeds and on their walls. As a result, we turned on a new enforcement system [Friday] that took user feedback much more heavily into account. This resulted in a number of applications with high negative user feedback being disabled or having certain features disabled.”
This means that the apps which were disabled didn’t necessarily violate Platform policy, but somehow led users to mark their posts as spam or report them. The apps may not have made it clear to a user when they would post to their wall or send invites to friends. It’s also possible that users who received wall posts from apps their friends were using marked those posts as spam because they had never personally used the app.
Enforcement Warning Systems and Negative Feedback Benchmarks
Some developers in the forum and who have emailed us directly claim they’ve lost development and marketing investments as well as the trust of their core users because their apps were removed from the Platform. They believe they were treated unfairly because they weren’t given advance notice or the opportunity to fix their apps.
Facebook may need to reconsider the warning and communication protocols for its enforcement system. Leaving apps with negative feedback running for a few more days after warning developers may in fact be better for the Platform experience, as users may grow skeptical of spending time and money on games if they fear those games could be suddenly disabled.
Other developers are seeking a better way to gauge what level of negative feedback Facebook deems unacceptable. A Hong Kong developer posting on the forum under the alias ‘takwing’ explained [edited for clarity]:
“For example, I include a wall post feature, and it turns out that 99% of the users love it and 1% always mark the wall post as spam. Does this 99% good outweigh the 1% bad? How about the case of 90% good vs 10% bad? We need a certain guideline so that we can review and make a judgment on whether we should implement a feature or not. As I may think that the feature is good and most people love it… but it turns out Facebook thinks 10% bad feedback is unacceptable and bans my app.”
Facebook should consider releasing some sort of benchmark for an acceptable level of negative feedback. This would allow developers to test new features, but know to remove them if negative feedback exceeds the benchmark.
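The kind of guideline takwing is asking for amounts to a simple ratio check. Here is a minimal sketch of how a developer might self-monitor against a published benchmark; the 5% threshold and all counts are illustrative assumptions, not figures Facebook has released:

```python
# Hypothetical self-monitoring check against an assumed negative-feedback benchmark.
# The 5% ceiling and the counts below are made up for illustration.

def negative_feedback_rate(spam_marks: int, hides: int, total_impressions: int) -> float:
    """Fraction of a feature's impressions that drew negative feedback."""
    if total_impressions == 0:
        return 0.0
    return (spam_marks + hides) / total_impressions

BENCHMARK = 0.05  # assumed acceptable ceiling: 5% negative feedback

def should_pull_feature(spam_marks: int, hides: int, total_impressions: int) -> bool:
    """True if the feature's negative feedback exceeds the benchmark."""
    return negative_feedback_rate(spam_marks, hides, total_impressions) > BENCHMARK

# A wall-post feature with 10,000 impressions, 100 spam marks, and 200 hides
# sits at a 3% negative-feedback rate, under the assumed 5% ceiling:
print(should_pull_feature(100, 200, 10_000))
```

With a number like this published, a developer could ship a feature, watch the rate, and pull the feature before enforcement kicks in rather than after.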
A Facebook engineer named Eugene responded to the developers in the forums stating “Where we have failed is not providing enough feedback about negative engagement metrics to developers before needing to take this action. This is something we are working hard to fix with the new Application Insights that will be launching over the next few weeks – you will have detailed information about both positive and negative engagement of the content your application generates.”
The planned improvements, which include the ability to see the quantities of posts marked as spam and stream stories hidden, should make it easier to monitor feedback fluctuations. Developers will either need to confer to establish benchmarks, or Facebook will need to provide one, to make this Insights data as useful as possible.
Some developers have posted to the forum saying their apps have been re-enabled, and others have been allowed to recreate their apps with new app IDs and have users re-grant permissions. Several others, though, report they’ve received automatic denials when they tried to appeal the decision.
Even if Facebook is trying to help users and preserve trust in the Platform so it can continue to make money for developers, the site will need to learn from this incident. It should consider being more cautious when changing its automatic enforcement systems, and look at how it can improve data and communication regarding Platform enforcement.
Just a month ago, Facebook angered some in the developer community when several developers received notice that they had 48 hours to fix authentication data leaks, even though they weren’t leaking data. Communication needs to improve if Facebook wants to entice developers to build on its Platform.