Robin Smith

Crawling is the process by which Google finds your website and adds it to its index. Crawler programs, known variously as “bots”, “scutters”, “web spiders”, or “indexers”, trawl the Internet, picking up links on websites and adding them to Google’s index of sites. Google Bot is the program particular to Google, and it scans the web constantly. Originally it ran every 3–4 months, but Google realized its content would be out of date much of the time it was searched, so the company put Google Bot on permanent active duty.

Google Bot keeps a list of websites it must visit on each scan, generated from its previous scan. It goes through these sites, searching for new links to pages it has not already found. When it finds a new link, G-Bot files it in Google’s index to be surveyed the next time the bot runs.
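The crawl loop described above can be sketched roughly like this: start from a list of known pages, follow the links on each one, and file any newly discovered pages to be visited in turn. This is a minimal sketch for illustration only; the link graph here is a hypothetical in-memory stand-in rather than real HTTP fetching.

```python
from collections import deque

# Hypothetical link graph: page -> links found on that page.
# A real crawler would fetch each page over HTTP and parse its links.
LINKS = {
    "example.com": ["example.com/about", "other-site.org"],
    "example.com/about": ["example.com"],
    "other-site.org": ["new-site.net"],
    "new-site.net": [],
}

def crawl(seed_pages):
    """Visit every page reachable from the seeds, once each."""
    index = set()
    frontier = deque(seed_pages)   # pages queued from the previous scan
    while frontier:
        page = frontier.popleft()
        if page in index:
            continue               # already surveyed; skip it
        index.add(page)
        # New links are filed away to be surveyed on a later pass.
        frontier.extend(LINKS.get(page, []))
    return index

# Starting from one known site, the crawler discovers the rest.
found = crawl(["example.com"])
```

Starting from `example.com` alone, the loop eventually reaches `new-site.net`, which is only linked from a page discovered mid-crawl; this is how one seed list grows into a full index.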

Indexing is…

