...

Web scraping is increasingly used to extract a website's content and data, often via automated bots and crawlers such as price bots. Competitors, for instance, may target your site this way to retrieve your content. To discourage scraping of your Customer Self Service eCommerce Platform site, you can enable the Honeypot setting, which detects suspicious IP addresses and temporarily restricts them from accessing your site. Administrators can view the list of restricted IP addresses and remove entries if needed. Separately, a suspicious activity report can be configured and automatically emailed to specific recipients for ongoing monitoring.
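To make the Honeypot idea concrete, here is a minimal sketch of the general honeypot technique. This is illustrative only, not the platform's actual implementation; the trap path, block duration, and function names are all hypothetical. A link invisible to human visitors is embedded in each page; any client that requests it is presumed to be a crawler, and its IP is temporarily restricted.

```python
# Sketch of the general honeypot technique (assumptions: the trap URL,
# block duration, and helpers below are hypothetical, not the product's API).
from time import time

HIDDEN_TRAP_PATH = "/trap-do-not-follow"  # hypothetical hidden URL
BLOCK_SECONDS = 3600                      # hypothetical restriction window

blocked_until: dict[str, float] = {}      # IP -> timestamp when the block lifts

def hidden_link_html() -> str:
    # Rendered into every page; CSS hides it, so humans never click it.
    return f'<a href="{HIDDEN_TRAP_PATH}" style="display:none">.</a>'

def handle_request(ip: str, path: str) -> int:
    """Return an HTTP status code for a request from `ip` to `path`."""
    now = time()
    if blocked_until.get(ip, 0) > now:
        return 403                        # IP is still restricted
    if path == HIDDEN_TRAP_PATH:
        blocked_until[ip] = now + BLOCK_SECONDS
        return 403                        # trapped: restrict this IP
    return 200

def unblock(ip: str) -> None:
    # An administrator removing an IP from the restricted list.
    blocked_until.pop(ip, None)
```

Only bots that mechanically follow every link ever hit the trap, which is why this approach restricts crawlers without inconveniencing human shoppers.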


Info: What are bad bots?

Not all bots are bad, but you will want to be aware of the ones that could be bad for business:

  • Website scraper bot: This type of bot typically sends a series of HTTP GET requests and copies everything the web server sends in reply, working its way through the website's hierarchy until it has copied all the content. More sophisticated scraper bots can use JavaScript to, for instance, fill out every form on a website and download any gated content. "Browser automation" programs and APIs let bots interact with websites and APIs while tricking the server into thinking a human user is accessing the content. An individual can of course manually copy and paste an entire website, but bots can crawl and download all of a site's content in a matter of seconds, even for large e-commerce sites with hundreds or thousands of individual product pages.

  • Price scraping: This is when a company downloads all the pricing information from a competitor's website so that they can adjust their own pricing accordingly.
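The crawling behavior described above, issuing GET requests and descending the link hierarchy, can be sketched as follows. For illustration the bot walks an in-memory mapping of path to HTML instead of making real network requests; the three-page site is hypothetical.

```python
# Sketch of how a scraper bot copies a site by following its links.
# Assumption: `site` stands in for a web server; a real bot would do an
# HTTP GET wherever this code does a dictionary lookup.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag in a page."""
    def __init__(self):
        super().__init__()
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(site: dict[str, str], start: str) -> dict[str, str]:
    """Breadth-first copy of every page reachable from `start`."""
    copied: dict[str, str] = {}
    queue = [start]
    while queue:
        path = queue.pop(0)
        if path in copied or path not in site:
            continue
        html = site[path]           # a real bot: HTTP GET request here
        copied[path] = html         # save everything the server replied
        parser = LinkExtractor()
        parser.feed(html)
        queue.extend(parser.links)  # descend into the site hierarchy
    return copied

# Hypothetical three-page store: home -> product list -> product page
site = {
    "/": '<a href="/products">products</a>',
    "/products": '<a href="/products/1">item 1</a>',
    "/products/1": "<p>$19.99</p>",
}
```

Because each fetched page yields the links to the next level, a loop like this reaches every product page automatically, which is why even large catalogs can be copied in seconds.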

...