Googlebots are small programs running on Google’s servers that continuously browse the Internet, indexing web sites into Google’s main database. A Googlebot fetches your web pages and sends the data back to the database, where a cached copy is kept. The easiest way to “attract” Googlebots to index your site is to have a site map. The site map is probably one of the most important pages a Googlebot can turn to, as the links it contains point back to your own web pages. This creates a trail that Google will follow to trace the rest of the links on your site. You can find out whether your site has been indexed with the help of the Google toolbar. If the ranking value is zero, you should probably revise your web site’s design so that the Googlebots can find it (eliminate some of the Flash and see if that helps).
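To make the site map idea concrete, here is a minimal sketch of generating a standard XML sitemap from a list of page URLs. The URLs are placeholders (example.com), and the output follows the basic sitemaps.org format with only the required `<loc>` element:

```python
def build_sitemap(urls):
    """Build a minimal XML sitemap listing the given page URLs."""
    entries = "\n".join(
        f"  <url>\n    <loc>{u}</loc>\n  </url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Placeholder pages -- substitute your own site's URLs.
print(build_sitemap(["https://www.example.com/",
                     "https://www.example.com/about.html"]))
```

Saving this output as sitemap.xml at your site’s root (and linking to your pages from a plain HTML site map page) gives the spider a single starting point from which every page is reachable.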

Another way to attract search engine bots is to keep your website small. Keeping each page under 101 KB is sure to attract Googlebots; if a page is much larger, there is a chance Googlebots will avoid it, as Google’s cache doesn’t go higher than 101 KB. This is, of course, a general guideline, but keeping your site small has several benefits beyond attracting Googlebots more easily: pages will load faster, making the site better for human visitors; it will be easier to manage and upload; and it will be a lot easier to promote.

Besides the initial Googlebot launched by the search engine a few years ago, it seems Google has begun using another spider to scan and index web sites. News of a second Googlebot came from a number of site owners who, while studying their site logs, noticed that two Google spiders, with different IP address ranges, had visited and scanned their respective sites.
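You can do the same check on your own logs. The sketch below pulls out the distinct IPs that identified themselves as Googlebot; it assumes the common Apache/Nginx “combined” log format, so adjust the pattern if your server logs differently:

```python
import re

# Captures the client IP and the user-agent string from a "combined"
# format log line (an assumption about your server's log format).
LOG_RE = re.compile(r'^(\S+) .*?"[^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

def googlebot_ips(log_lines):
    """Return the distinct IPs whose user agent contains 'Googlebot'."""
    ips = set()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group(2):
            ips.add(m.group(1))
    return ips
```

If the resulting set contains addresses from more than one range, you are seeing the same pattern those site owners reported.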


Filed Under: SEO
