A spider is a program that visits websites and collects information for search engines to index. Also known as “crawlers” or “bots,” spiders are used by every major search engine to build the entries in its index. A spider follows links from page to page, gathering information about each site as it goes; much like its namesake, it is constantly crawling the web. Webmasters can see which spiders and bots have visited their site, and can ask bots not to crawl certain pages in the future.
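The standard way webmasters ask crawlers to skip certain pages is a plain-text `robots.txt` file placed at the root of the site. A minimal sketch is below; the directory paths and the bot name are illustrative examples, not real rules from any particular site.

```
# robots.txt — served from the site root, e.g. https://example.com/robots.txt
# Paths and bot names below are hypothetical examples.

User-agent: *            # rules that apply to all crawlers
Disallow: /private/      # ask bots not to crawl this directory
Disallow: /drafts/

User-agent: ExampleBot   # a hypothetical bot excluded from the whole site
Disallow: /
```

Note that `robots.txt` is a request, not an enforcement mechanism: reputable search-engine spiders honor it, but compliance is voluntary.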