
Today's Google Bots and What They Do

Google currently indexes over 8 billion web pages. Before any of those pages entered the index, however, each was crawled by a special spider known as the GoogleBot. Unfortunately, many webmasters know little about the inner workings of this virtual robot.

In fact, Google uses a number of spiders to crawl the Web, and you can spot them by examining your server log files.
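One way to do that is to scan your raw access log for the user-agent strings these bots announce themselves with. Here is a minimal Python sketch, assuming Apache-style log lines; the sample entries below are made up, and the bot-name list reflects the names discussed in this article:

```python
from collections import Counter

# Substrings that identify Google's crawlers in the User-Agent field.
# Most specific names first, so "Googlebot-Image" isn't counted as plain "Googlebot".
GOOGLE_BOTS = ["Mediapartners-Google", "Googlebot-Image", "AdsBot-Google", "Googlebot"]

def count_google_bots(log_lines):
    """Tally visits from each Google crawler across raw access-log lines."""
    counts = Counter()
    for line in log_lines:
        for bot in GOOGLE_BOTS:
            if bot in line:
                counts[bot] += 1
                break  # count each log line once
    return counts

# Two hypothetical sample log entries for illustration:
sample = [
    '66.249.66.1 - - [10/Mar/2006:12:00:00 +0000] "GET / HTTP/1.1" 200 5120 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.2 - - [10/Mar/2006:12:05:00 +0000] "GET /page.html HTTP/1.1" 200 2048 '
    '"-" "Mediapartners-Google/2.1"',
]
print(count_google_bots(sample))
```

To scan a real log, pass the open file (for example, `open("access.log")`, a hypothetical path) in place of `sample`.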

This article reveals some of the most important Google spiders, what they do, and how they affect you as a webmaster. We'll start with the well-known GoogleBot.

GoogleBot, as you probably know, is the search bot used by Google to scour the web for new pages. GoogleBot has two versions, Deepbot and Freshbot. Deepbot is a deep crawler that tries to follow every link on the web and download as many pages as it can for the Google index. It also examines the internal structure of a site, giving a complete picture for the index.

Freshbot, on the other hand, is a newer bot that crawls the web looking for fresh content. The Google Freshbot was implemented to take some of the pressure off the GoogleBot. The Freshbot revisits pages already in the index, crawling them for new, modified or updated content. In this way, Google is better equipped to keep up with the ever-changing web.

This means that the more you update your web site with new, quality content, the more the GoogleBot will come by to check you out.

If you'd like to see the GoogleBot crawling around your web property more often, you need to obtain quality inbound links. However, there is also one more step that you should take. If you haven't already done so, you should create a Google Sitemap for your site.

Creating a Google Sitemap lets you communicate with Google, telling it about your most important pages, new pages and updated pages. In return, Google provides some valuable information: Sitemaps reports pages it was unable to crawl and links it was unable to follow. This lets you pinpoint and fix problems so that you gain increased exposure in the search results.
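A minimal sitemap file looks something like the sketch below. The URL and date are placeholders; consult Google's Sitemaps documentation for the full protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-03-10</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

You add one `<url>` entry per page, then tell Google where the file lives so it can fetch it.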

The next Google bot in our lineup is the MediaBot, which analyzes pages that display AdSense ads and decides which ads should be shown on them.

Google recommends that webmasters add a rule to their robots.txt file that grants the MediaBot access to their entire site. To do this, simply enter the following lines into the robots.txt file:

User-agent: Mediapartners-Google*
Disallow:

This will ensure that the MediaBot is able to place relevant AdSense ads on your site.

Keep in mind that ads can still be shown on a page if the MediaBot has not yet visited. If that is the case, the ads chosen will be based on the overall theme of the other pages on the site. If no ads can be chosen, the dreaded public service announcements are displayed instead.

It is a matter of debate whether the MediaBot gives websites running AdSense an advantage in the search engines. However, having Google's copy of your pages refreshed more often is an advantage in itself.

Google Analytics has a similar effect: anyone who runs it on their site can expect a slightly higher degree of spider activity.

However, you certainly shouldn't depend on any of these tools for getting your site indexed. The key to frequent spidering is having quality inbound links, quality content and frequent updates.

Do you have images on your site? If so, you have more than likely been visited by our next Google spider, the ImageBot.

The ImageBot prowls the Web for images to place in Google's image search. Images are ranked based upon their filename, surrounding text, alt text and page title.

If you have a website that is primarily image based, then you definitely want to optimize your images to receive some extra Google traffic.
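All of those ranking factors are things you control in your markup. For example (the filename and alt text here are made up):

```html
<!-- A descriptive filename and alt text help Google's image search
     understand what the picture shows; the page title and the text
     surrounding the image matter as well. -->
<img src="/images/red-widget.jpg"
     alt="Red widget with chrome handle" />
```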

On the other hand, some websites may not benefit from Google image search. In most cases, traffic from the image search engine is very low quality and rarely converts into buyers; many people are just looking for images to swipe. So, if you want to save some bandwidth, use your robots.txt file to block the ImageBot from accessing your image directory.
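Such a robots.txt rule would look like the sketch below, assuming your images live in a directory called /images/ (Googlebot-Image is the user-agent name Google's image crawler identifies itself with):

```
User-agent: Googlebot-Image
Disallow: /images/
```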

One of the few exceptions I would make is if you have a site dedicated to downloadable images.

Our final bot is completely dedicated to the Google AdWords program.

AdsBot is one of Google's newest spiders. This crawler analyzes the content of advertising landing pages, which helps determine the Quality Score that Google assigns to your ads.

Google combines Quality Score with the amount you are willing to bid to determine the position of your ads. Therefore, ads with a higher Quality Score can rank above ads from advertisers who are paying more than you.

Can you still block AdsBot from spidering your pages? Of course, but doing so will lower your overall AdWords Quality Score, which could end up lowering the position of your ads. If possible, it is best to give AdsBot complete access to your site.
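For completeness, a sketch of the block: you must name AdsBot explicitly in robots.txt, since it is designed to ignore generic `User-agent: *` rules (again, think twice before using this):

```
User-agent: AdsBot-Google
Disallow: /
```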

Today's Google bots are becoming more advanced all the time. However, nothing beats relevant, quality, updated content. Deliver that and the search engines will eat it up.

About the Author

After building freeways as a civil engineer, I became a video producer. Over the past five years I have been involved with a number of internet businesses.

http://www.HomeBusinessPilot.com  Work at Home Ideas and Opportunities

http://www.PlugInProfitSite.com/main-14107  Plug-In Profit Site - 3 EASY STEPS to Make Money Online!


Copyright © MSECC.ORG 2006. All Rights Reserved.
MSECC.ORG is not related or affiliated with any organizations with similar names.