‘Bad bots’ make up nearly two-fifths of all internet traffic

A new report from Barracuda Networks, a leading provider of cloud-enabled security solutions, has revealed that automated traffic accounts for nearly two-thirds (64 per cent) of all internet traffic. Good bots, such as search engine crawlers and social network bots, made up just 25 per cent of overall traffic, while bad bots accounted for nearly two-fifths (39 per cent).

These bad bots range from basic web scrapers and attack scripts to advanced persistent bots, which do their best to evade standard defences and carry out their malicious activity under the radar. The report found that the most common persistent bots were those targeting e-commerce applications and login portals.
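As a rough illustration of the kind of standard defence such bots try to slip past, the sketch below applies per-IP rate limiting to a login endpoint. It is a minimal example for context only; the thresholds and function names are hypothetical and are not drawn from the Barracuda report.

```python
import time
from collections import defaultdict, deque

# Hypothetical thresholds for illustration only -- not taken from the report.
MAX_ATTEMPTS = 5      # login attempts allowed per IP...
WINDOW_SECONDS = 60   # ...within this rolling time window

_recent_attempts = defaultdict(deque)  # client IP -> timestamps of recent attempts


def allow_login_attempt(client_ip: str) -> bool:
    """Return True if this IP may attempt a login, False if it is rate-limited."""
    now = time.time()
    window = _recent_attempts[client_ip]

    # Discard attempts that have aged out of the rolling window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()

    if len(window) >= MAX_ATTEMPTS:
        return False  # too many recent attempts -- likely automated

    window.append(now)
    return True
```

A simple counter like this is exactly the sort of control that basic scripts trip over and that advanced persistent bots attempt to stay under by spreading requests across many IPs and slowing their pace.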

The report, titled Bot attacks: Top Threats and Trends – Insights into the growing number of automated attacks, also included a breakdown of bad bot traffic by region. North America accounted for 67 per cent of bad bot traffic, followed by Europe (22 per cent) and Asia (7.5 per cent). Interestingly, European bot traffic was more likely to originate from hosting services (VPS) or residential IPs, whereas most North American bot traffic came from public data centres.


The research also revealed that most bot traffic originates from the two largest public cloud vendors, AWS and Microsoft Azure, in roughly equal measure. This is likely because it is easy to set up a free account with either provider and use it to deploy bad bots.
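Both providers publish their public IP ranges, so one way a defender might flag traffic that originates from a cloud data centre is to check client IPs against those lists. The minimal sketch below, offered as an assumption-laden illustration rather than anything from the report, uses AWS's published ip-ranges.json feed; the helper names are hypothetical, and Azure publishes a comparable, separately downloaded list.

```python
import ipaddress
import json
import urllib.request

# AWS publishes its current public IP ranges at this well-known URL.
AWS_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"


def load_aws_networks():
    """Download AWS's published IPv4 prefixes and parse them into network objects."""
    with urllib.request.urlopen(AWS_RANGES_URL) as response:
        data = json.load(response)
    return [ipaddress.ip_network(entry["ip_prefix"]) for entry in data["prefixes"]]


def is_from_aws(client_ip: str, networks) -> bool:
    """Return True if the client IP falls inside any published AWS prefix."""
    address = ipaddress.ip_address(client_ip)
    return any(address in network for network in networks)


# Example usage (requires network access):
# networks = load_aws_networks()
# print(is_from_aws("52.95.110.1", networks))
```

A match against a data-centre range is only a signal, not proof of malicious intent, since plenty of legitimate services also run from public cloud IPs.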

Finally, Barracuda researchers observed that bad bot traffic tends to follow the standard workday, allowing the bots to blend in with normal human traffic and avoid raising alarms.

Nitzan Miron, VP of Product Management, Application Security at Barracuda, said:

“While some bots like search engine crawlers are good, our research shows that over 60% of bots are dedicated to carrying out malicious activities at scale. When left unchecked, these bad bots can steal data, affect site performance, and even lead to a breach. That’s why it’s critically important to detect and effectively block bot traffic.”

