Press "Enter" to skip to content

Menace of bad bots: India third most-blocked country

Bengaluru: Is your content showing up on other websites? Are you seeing unexplained website slowdowns and downtime? An increase in failed logins? An abnormal rise in new account creation? If you answered yes to any of these questions, your company may have a bad bot problem, one that can devastate your business.
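For readers who want to put a number on those symptoms, one rough way to spot a spike in failed logins is to compare today's count against a recent baseline. The figures and the three-standard-deviation rule below are illustrative assumptions, not a method from the report:

```python
import statistics

# Hypothetical daily failed-login counts for the past two weeks (today last).
daily_failed_logins = [120, 135, 110, 128, 140, 125, 118,
                       130, 122, 138, 127, 133, 119, 940]

history, today = daily_failed_logins[:-1], daily_failed_logins[-1]
baseline = statistics.mean(history)
spread = statistics.stdev(history)

# Flag today's count if it sits more than three standard deviations above the
# baseline, a crude signal of credential-stuffing or brute-force login activity.
if today > baseline + 3 * spread:
    print(f"Unusual spike: {today} failed logins vs. a baseline of about {baseline:.0f}")
```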

According to the latest annual report from Distil Networks, titled ‘Bad Bot Report 2019: The Bot Arms Race Continues’, human traffic comprises only 62.1% of all internet traffic. In 2018, bad bots accounted for 20.4% of all website traffic, a 6.35% decrease from the prior year. Good bot traffic also fell, by 14.4%, to 17.5% of all traffic. The bad bot share has dipped slightly for the first time since 2015, but still accounts for one in five web requests.

While good bots ensure that online businesses and their products can be found by prospective customers, bad bots interact with applications the same way a legitimate user would, making them harder to detect and block. They are used for malicious activities including web scraping, competitive data mining, harvesting of personal and financial data, brute-force login attempts, digital ad fraud, spam, and transaction fraud.

Not surprisingly, financial services companies see the highest share of bad bot traffic, at 42.2%, and typically suffer from bots attempting to break into user accounts. Government sites, where bad bots make up 29.9% of traffic, are chiefly concerned with protecting business registration listings from scraping bots and with stopping election bots from interfering with voter registration accounts.

With most bad bot traffic originating from data centers, the US remains the ‘bad bot superpower’, with over half of all bad bot traffic coming from the country. Many companies use geofencing blacklists to choke off large swaths of unwanted traffic: geofencing draws a virtual geographic boundary, using signals such as radio frequency identification (RFID), WiFi, GPS or even cellular data, and traffic that originates inside a blocked region can then be flagged or turned away.
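In practice, for web traffic, such a blacklist usually comes down to checking where a request's source IP address resolves to. The sketch below is purely illustrative, assuming a toy geo-IP table and a made-up data-center range rather than any real database or anything described in the Distil report:

```python
import ipaddress

# Illustrative only: real deployments would consult a geo-IP database and
# published cloud-provider/data-center IP ranges, not hard-coded tables.
BLOCKED_COUNTRIES = {"RU", "UA"}                                     # countries a site chooses to block
DATA_CENTER_RANGES = [ipaddress.ip_network("203.0.113.0/24")]        # hypothetical data-center range
COUNTRY_BY_RANGE = {ipaddress.ip_network("198.51.100.0/24"): "RU"}   # toy geo-IP lookup table

def should_block(ip_str: str) -> bool:
    """Block requests whose source IP sits in a known data-center range
    or resolves to a country on the blacklist."""
    ip = ipaddress.ip_address(ip_str)
    if any(ip in net for net in DATA_CENTER_RANGES):
        return True
    return any(ip in net and country in BLOCKED_COUNTRIES
               for net, country in COUNTRY_BY_RANGE.items())

print(should_block("203.0.113.7"))   # True: falls in the data-center range
print(should_block("192.0.2.10"))    # False: matches nothing in the toy tables
```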

However, despite most bad bots coming from the US, India is now the third most-blocked country for bad bots, at 15.2%, a significant jump from the prior year, when it was the 10th most-blocked country at 2.1%, the report found.

A third of the companies tracked by Distil Networks blocked Russia the most, for the second year running, followed by Ukraine. Amazon, according to the report released on Wednesday, was the largest single source of bad bot traffic globally at 18.0%, while about 50% of bad bots report their user agent as Google Chrome.
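Because roughly half of bad bots claim to be Chrome, the User-Agent header on its own says very little about whether a request is human. The following minimal sketch shows what that means for detection, using made-up log data and an arbitrary rate threshold that are illustrative assumptions rather than anything prescribed by the report:

```python
from collections import Counter

# Hypothetical request log: (source_ip, claimed_user_agent) pairs.
requests = [
    ("198.51.100.5", "Mozilla/5.0 ... Chrome/72.0 Safari/537.36"),  # bot posing as Chrome
    ("198.51.100.5", "Mozilla/5.0 ... Chrome/72.0 Safari/537.36"),
    ("203.0.113.9",  "curl/7.64.0"),
]

def suspicious_ips(log, threshold):
    """Flag IPs by request volume instead of trusting the User-Agent,
    since a bot can claim to be Chrome as easily as curl admits to being curl."""
    counts = Counter(ip for ip, _ in log)
    return {ip for ip, n in counts.items() if n > threshold}

# With this toy log and a deliberately tiny threshold, the 'Chrome' client is flagged.
print(suspicious_ips(requests, threshold=1))   # {'198.51.100.5'}
```

Commercial bot-management products combine many such signals, including the behavioural mimicry described below, rather than relying on any single one.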

“Bot operators and bot defenders are playing an incessant game of cat and mouse, and techniques used today, such as mimicking mouse movements, are more human-like than ever before,” said Tiffany Olson Kleemann, CEO of Distil Networks.

While it’s well known that bots were used to exploit social media sites in an attempt to influence political dialogue and elections, the real motivation behind the majority of bad bots is simpler: money, according to the Distil Networks report. For instance, online travel agents, aggregators and competitors use bots to scrape content, including flight information, pricing and seat availability. E-commerce companies use bad bots to scrape pricing and inventory information. Criminals use bad bots to commit fraud by stealing gift card balances and to access user accounts and credit card information.

Even the financial investment sector deploys bots to scrape information such as inventory levels and pricing data. The Distil Networks report estimates that hedge funds will pay $2 billion in 2020 just to collect and store data scraped from websites, an indication of the business value of running bad bots.

Simple bots, which are the easiest to detect, accounted for 26.4% of bad bot traffic. The majority of non-human traffic (52.5%) came from bots classified as moderately sophisticated, while sophisticated bad bots, the most difficult to detect, made up 21.1% of automated traffic last year, the Distil Networks report noted.

It cautioned that protecting your website alone will not help if your mobile apps and application programming interfaces (APIs) remain vulnerable. Kleemann concluded, “As sophistication strengthens, so too does the breadth of industries impacted by bad bots. Now is the time to understand what bots are capable of and now is the time to act.”

Source: Livemint