Earlier today, measurement provider DoubleVerify announced it has uncovered a new bot network that perpetrates fraud by circumventing ads.txt, an IAB-sanctioned protocol unveiled two years ago to reduce domain spoofing.
The botnet carries out a sophisticated, unique type of fraud that scrapes content from premium publishers’ websites and creates falsified copies of the original scraped pages on its own server.
DoubleVerify claimed the bot network then creates new ad slots on the resulting spoofed webpage, slots that never existed on the original web property, and uses falsified URLs to manipulate the automated trading ecosystem. It does so by passing the ‘ad-buying opportunity’ to authorized resellers whose systems have been unable to detect the deception.
Ads.txt, which stands for Authorized Digital Sellers, is an IAB-approved protocol that aims to prevent the sale of unauthorized ad inventory by letting publishers place a text file on their web servers that lists every company authorized to sell their inventory. Programmatic platforms, such as demand-side platforms, then process this information to verify the validity of the inventory they purchase.
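To make the mechanism concrete, here is a minimal sketch of how a buying platform might parse a publisher's ads.txt file. Per the IAB format, each data line lists an ad system domain, the publisher's seller account ID, the relationship (DIRECT or RESELLER) and an optional certification authority ID. The sample file below is invented for illustration, not taken from any real publisher.

```python
# Minimal ads.txt parser sketch. The sample text is hypothetical.
SAMPLE_ADS_TXT = """
# ads.txt for examplepublisher.com (hypothetical)
greenadexchange.com, 12345, DIRECT, d75815a79
blueadexchange.com, XF436, DIRECT
silverssp.com, 9675, RESELLER
"""

def parse_ads_txt(text):
    """Return a list of (ad_system, seller_id, relationship, cert_id) tuples."""
    records = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        fields = [f.strip() for f in line.split(",")]
        if len(fields) < 3:
            continue  # malformed line; a real parser might log these
        ad_system = fields[0].lower()
        seller_id = fields[1]
        relationship = fields[2].upper()
        cert_id = fields[3] if len(fields) > 3 else None
        records.append((ad_system, seller_id, relationship, cert_id))
    return records

def is_authorized(records, ad_system, seller_id):
    """Check whether a given ad system / seller account pair is declared."""
    return any(r[0] == ad_system.lower() and r[1] == seller_id for r in records)

records = parse_ads_txt(SAMPLE_ADS_TXT)
print(is_authorized(records, "greenadexchange.com", "12345"))  # True
print(is_authorized(records, "shadyexchange.com", "666"))      # False
```

The scheme DoubleVerify describes passes this check precisely because the spoofed inventory is routed through resellers that genuinely appear in the victim publisher's file.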
However, the bot network DoubleVerify publicized has been able to exploit loopholes in the system, as not all parties perform their due diligence post-campaign, according to sources.
Roy Rosenfeld, head of DoubleVerify’s fraud lab, said that while the 2017 launch of ads.txt was a significant step forward in helping to combat fraudsters, continued scrutiny is needed. “This scheme was specifically designed to take advantage of the industry-wide ads.txt initiative and commit fraud that would not trigger ads.txt violations with programmatic buyers,” he explained.
The discovery highlights similar vulnerabilities in the system to those uncovered in a July 2018 report by Forensiq, which demonstrated how thousands of apps were masquerading as premium publishers.
Sources approached by Adweek agreed with Rosenfeld’s sentiment but explained that while ads.txt is a step in the right direction, a much more thorough implementation of the standard is needed if it is to be effective.
Noted ad fraud researcher Dr. Augustine Fou said that while the sell-side of the sector has done its part by implementing ads.txt files, media buyers need to be more proactive in their approach, although he maintained that there’s little incentive for them to check at present.
“The publishers are putting the tags on their websites, so ads.txt is ready to go, and the only thing that needs to happen now is the second half of that equation, which is for the buyers to start checking,” he added.
Most post-campaign reports give buyers and sellers domain-based reporting, but at present, it’s all too easy for spoofed websites to be co-mingled with legitimate websites in said reports, according to Fou.
“Unless their clients—the marketer—insists that they do their homework, then there is no incentive for them to check the log files,” he added.
Marc Goldberg, CEO of TrustMetrics, said multiple steps are required, including creating a scalable whitelist with ads.txt inventory, engaging a security vendor and inspecting campaign reports post-mortem. “While ads.txt is best practice, it is not the only practice. A lot of buyers and sellers think they are fraud-free and [brand-]safe, but that’s just not the case,” he added.
Dr. Fou concluded, “The bad guys are pretending to be legitimate websites, and until you have a seller-based ID included in a domain-based report … what you need in a report is three things: the seller ID, the domain and the quantity [of traffic].”
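The report check Fou describes can be sketched in a few lines: each log row carries a seller ID, a domain and an impression quantity, and the buyer compares the seller ID against the authorizations declared in that domain's ads.txt. Every name and number below is invented for illustration.

```python
# Hypothetical post-campaign check: flag report rows whose seller ID
# is not authorized for the domain in that domain's ads.txt file.

# Authorized (ad system, seller ID) pairs per domain, as if already
# parsed from each publisher's ads.txt.
ADS_TXT_INDEX = {
    "premiumnews.example": {("bigexchange.com", "1001"), ("ssp.example", "A7")},
    "sportsdaily.example": {("bigexchange.com", "2002")},
}

# Log rows: (domain, ad system, seller ID, impressions).
REPORT = [
    ("premiumnews.example", "bigexchange.com", "1001", 120000),
    ("sportsdaily.example", "bigexchange.com", "9999", 45000),  # spoofed seller
]

def flag_unauthorized(report, index):
    """Return (domain, seller_id, impressions) for rows failing the check."""
    flagged = []
    for domain, ad_system, seller_id, impressions in report:
        if (ad_system, seller_id) not in index.get(domain, set()):
            flagged.append((domain, seller_id, impressions))
    return flagged

print(flag_unauthorized(REPORT, ADS_TXT_INDEX))
# [('sportsdaily.example', '9999', 45000)]
```

A purely domain-based report would show both rows as legitimate-looking premium traffic; only the seller ID column exposes the mismatch.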