Crawling is the process by which Google finds your website and adds it to its index. Crawler programs, known alternatively as “bots”, “scutters”, “web spiders”, or “indexers”, trawl the Internet picking up links in websites and adding them to Google’s index of sites. Google Bot is the program particular to Google, and it constantly scans the web. Originally it ran every 3-4 months, but Google realized that its content would be out of date much of the time it was searched, so the company put Google Bot on permanent active duty.
Google Bot has a list of websites that it must visit each time it scans, generated from its previous scan. It goes through these sites, searching for and exploring new links to sites it has not already found. When it finds a new link, the bot files it in Google’s index to be surveyed the next time it runs.
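The visit-and-queue loop described above is essentially a breadth-first traversal of the link graph. Here is a minimal sketch of that idea in Python; the toy “web” and the example URLs in it are made up for illustration, and a real crawler would fetch live pages instead of reading a dictionary:

```python
from collections import deque

# Hypothetical toy web for illustration: each URL maps to the
# links found on that page.
TOY_WEB = {
    "a.example": ["b.example", "c.example"],
    "b.example": ["a.example", "d.example"],
    "c.example": [],
    "d.example": ["c.example"],
}

def crawl(seed_urls, web):
    """Visit each known page and queue any new links it finds."""
    index = set()              # pages already filed in the index
    queue = deque(seed_urls)   # the list of sites to visit this scan
    while queue:
        url = queue.popleft()
        if url in index:
            continue           # already surveyed on an earlier pass
        index.add(url)
        # New links go on the list to be surveyed next time.
        for link in web.get(url, []):
            if link not in index:
                queue.append(link)
    return index

print(sorted(crawl(["a.example"], TOY_WEB)))
```

Starting from a single seed page, the loop eventually reaches every page linked from one it has already indexed, which is why a site only needs one inbound link from an indexed page to be discovered.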