A bot (also known as a web bot or internet bot) is a software program that performs automated tasks on the internet. Typically, bots are programmed to perform tasks much more rapidly and accurately than a human could. Bots are widely employed across thousands of applications, including searching the internet, indexing web pages for search engines, making reservations, and much more.
A specific type of bot, called a “crawler” or a “spider”, is used mainly by search engines and web indexing services. A crawler bot systematically browses the internet, usually for the purpose of web indexing (web spidering).
Web indexing involves collecting information from web pages to create entries for search engine indexes. The crawlers follow links from page to page, gathering and processing data from each page, such as its URL, the links it contains, its key terms, and how those terms are distributed.
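The crawl-and-index loop described above can be sketched in a few lines of Python. This is a minimal illustration, not a production crawler: the `fetch` callback, the `index_page` helper, and the simple word-counting "index" are all assumptions made for the example (a real crawler would fetch over HTTP, respect robots.txt, and build a far richer index).

```python
from collections import Counter, deque
from html.parser import HTMLParser
from urllib.parse import urljoin


class PageIndexer(HTMLParser):
    """Extracts outgoing links and key-term counts from one HTML page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url  # used to resolve relative links
        self.links = []
        self.terms = Counter()

    def handle_starttag(self, tag, attrs):
        # Collect every <a href="..."> as an absolute URL to follow later.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

    def handle_data(self, data):
        # Count each word on the page as a crude "key term" signal.
        for word in data.split():
            term = word.strip(".,!?;:").lower()
            if term:
                self.terms[term] += 1


def index_page(base_url, html):
    """Return (outgoing links, term counts) for a single page."""
    parser = PageIndexer(base_url)
    parser.feed(html)
    return parser.links, parser.terms


def crawl(start_url, fetch, max_pages=100):
    """Breadth-first crawl: fetch a page, index it, then follow its links.

    `fetch` is a caller-supplied function mapping a URL to its HTML
    (or None if unavailable) -- in practice this would be an HTTP request.
    """
    seen = {start_url}
    queue = deque([start_url])
    index = {}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        html = fetch(url)
        if html is None:
            continue
        links, terms = index_page(url, html)
        index[url] = terms
        for link in links:
            if link not in seen:  # avoid revisiting pages
                seen.add(link)
                queue.append(link)
    return index


# Usage with a tiny in-memory "web" standing in for real HTTP fetches:
web = {
    "https://example.com/": '<a href="/about">About</a> hello world',
    "https://example.com/about": "about page",
}
index = crawl("https://example.com/", lambda url: web.get(url))
```

The injected `fetch` function keeps the example self-contained and testable; the same loop works unchanged if `fetch` is replaced with a real HTTP download.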