Facebook announced that 99 percent of the ISIS and al-Qaida terror-related content it removes from the social network is detected before even being flagged and, in some cases, before it even goes live.
Head of global policy management Monika Bickert and head of counterterrorism policy Brian Fishman said in the latest installment of the social network's Hard Questions series that automated systems, including photo and video matching and text-based machine learning, enable Facebook to remove 83 percent of subsequently uploaded copies of known terror content within one hour of upload.