Top websites struggle to guard against sophisticated bot attacks
A report released today shows that, while an average of 16 percent of websites across all industries can thwart simple bot attacks, only 5 percent can properly protect against sophisticated attacks.
The study from bot detection specialist Distil Networks, in conjunction with the Online Trust Alliance (OTA), evaluated the top 1,000 websites in retail, banking, consumer services, government, news media, internet service providers and OTA members.
The report divides bots into four categories:
- Sophisticated Bots -- coming in slowly from dozens of IP addresses, using browser automation tools that can hold cookies and maintain state
- Moderate Bots -- using normal browser user agents and headers, coming in slowly from one IP
- Simple Bots -- non-browser user agents and headers, coming in fast from one IP
- Crude Bots -- basic scripts that behave like a bot, coming fast from one IP address
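The four categories above can be sketched as a simple rule-based classifier. This is purely illustrative and not Distil's actual detection methodology; the function name, inputs, and the rate threshold are all hypothetical, and in practice the categories overlap in ways a few boolean checks cannot capture.

```python
# Illustrative sketch only -- NOT Distil Networks' real methodology.
# Maps the report's traffic characteristics onto its four bot categories.
# The 60-requests-per-minute cutoff for "coming in fast" is an assumption.

def classify_bot(browser_user_agent: bool,
                 requests_per_minute: float,
                 distinct_ips: int,
                 maintains_state: bool) -> str:
    """Return a bot category per the report's taxonomy (thresholds assumed)."""
    FAST = 60  # hypothetical requests-per-minute threshold

    # Sophisticated: slow traffic from dozens of IPs, browser automation
    # that can hold cookies and maintain state
    if distinct_ips > 1 and maintains_state and requests_per_minute < FAST:
        return "sophisticated"

    # Moderate: normal browser user agents/headers, slow, single IP
    if browser_user_agent and distinct_ips == 1 and requests_per_minute < FAST:
        return "moderate"

    # Simple: non-browser user agents/headers, fast, single IP
    if not browser_user_agent and requests_per_minute >= FAST:
        return "simple"

    # Crude: basic scripts that behave like a bot, fast, single IP
    return "crude"


# Slow, stateful traffic spread across 40 IPs -> hardest class to detect
print(classify_bot(True, 10, 40, True))  # → sophisticated
```

A real detector would weigh many more signals (mouse movement, TLS fingerprints, header ordering) probabilistically rather than with hard thresholds; the sketch only shows why each successive category is harder to distinguish from a human visitor.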
The findings show that while most industries tested can adequately protect against crude bots, they struggle to effectively block the simple, moderate, and sophisticated ones. For example, federal websites block 22 percent of simple bots but only one percent of sophisticated bots, the worst performance of any industry tested.
Despite poor performance, this year's findings reveal a marked improvement from Distil's 2016 study, which found that websites tested could protect against only 0.7 percent of sophisticated bots. Such improvement can be attributed to gradual movement toward greater awareness and adoption of more advanced bot detection and mitigation solutions.
Broken down by industry, banks and ISPs are the most effective at detecting sophisticated bots, followed by retailers. Against simple and moderate bots, though, banks' detection rates lag behind those of retailers.