What is it?
It is a tool provided by Google to help make your site more accessible to search engines. Google, being a search engine itself, wants to help site owners make their sites easier to crawl so that it can return the best search results. Doing this can help your site rank higher in related search results, which can in turn bring more people to your page.
Adding your site
- Register for an account with Google if you have not already.
- Once logged in, go to the Dashboard and enter the URL of your site where it says “Click here to add a site”. Press Enter, and the site should be added.
Verifying your site
Google needs to verify that the site you entered is your own and not someone else’s. To do this, you need to add something to your server that the Google tool can find, proving that you own the site and have the power to change and add things. Google offers two ways of doing this. The first is adding a meta tag to the head of your homepage. The second is adding a new page with a specific name to a specific directory. Pick whichever way you find easier.
Adding a meta tag to the homepage
- Go to the Dashboard and click “Verify” in the rightmost column of the main table.
- In the drop-down menu, choose “Add a meta tag” as your method.
- Copy the meta tag provided and paste it into the head of your homepage.
- Once you have done that and saved the page, click “Verify”.
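As an illustration, the verification tag goes inside the head element of your homepage. The token value below is a made-up placeholder; always copy the exact tag that Google gives you on the Verify page.

```html
<html>
<head>
  <title>My Site</title>
  <!-- Hypothetical verification tag: use the exact one Google provides -->
  <meta name="verify-v1" content="abc123PLACEHOLDERtoken=" />
</head>
<body>
  ...
</body>
</html>
```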
Uploading an HTML file
- Go to the Dashboard and click “Verify” in the rightmost column of the main table.
- In the drop-down menu, choose “Upload an HTML file” as your method.
- Copy the name of the file Google asks for and create a new document with exactly that name (paste the name in to avoid typos). Be sure to put this file in the home directory, the same one your homepage is in.
- Make sure that the file is saved and uploaded by entering “http://www.yoursite/thenameofthefile” into a browser.
- Once you have done that and the page is viewable click “Verify”.
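A quick command-line sketch of this method; the file name below is hypothetical, so use the exact name shown on the Verify page.

```shell
# Hypothetical file name - use the exact one Google gives you.
FILENAME="google1234567890abcdef.html"

# Create the verification file in your web root, the same
# directory that holds your homepage. touch creates an empty
# file, which is enough here since Google looks it up by name.
touch "$FILENAME"

# Confirm it is there before clicking "Verify" in the dashboard.
ls -l "$FILENAME"
```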
Sitemaps
Sitemaps are XML files that list all of the pages on your site. Google uses them to find your pages much more easily. Once Google learns of all the pages through a Sitemap, it can go through them and index them correctly for the search results. This is especially useful for pages that aren’t easily crawled by Google’s bots, such as dynamic pages or pages with a lot of Flash and AJAX. Also, if not many other sites link to yours, it will be harder for the bots to reach your pages as they travel from link to link.
Submitting a Sitemap also gives Google more information. The Sitemap can record, for each page, the date it was last modified, how often it changes, and its relative importance. With this information Google can do an even better job of analyzing your site and bringing it up in results.
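A minimal Sitemap carrying that per-page information might look like the sketch below; the URL, dates, and values are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Placeholder URL: replace with a real page on your site -->
    <loc>http://www.example.com/</loc>
    <lastmod>2008-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```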
Generating a Sitemap
Google offers a script that will generate an XML Sitemap, and there are also third-party generators with different approaches: programs, plugins, downloadable tools, and online generators. Pick whichever is best for you, generate a Sitemap, and upload it to your server, making sure it is saved as an .xml document.
Adding a Sitemap
- Make sure your .xml document is uploaded to your site and is visible through a browser.
- Go to the Dashboard. The middle column of the main content table is for Sitemaps. Click the “Add” link in this column.
- Choose “Add General Web Sitemap” from the drop-down menu.
- Enter the name of your .xml Sitemap document into the form provided. You do not need to enter your site’s URL, as it is already there from when you added the site earlier.
- Click the “Add General Web Sitemap” button.
Once it is submitted, Google will process the Sitemap file to add it to its system. This processing can take a while. Once it has finished, “OK” will be displayed under Status on the Sitemap overview page.
Diagnostics
These are resources that help you discover errors and other problems with your site that may prevent Google from bringing your site up in its results as well as it could. Finding and fixing these errors is crucial if you want your site to work optimally with search engines.
Web crawl errors
This diagnostic tells you if there are any errors on your site that inhibit crawling. Such errors may prevent Google from ranking your site where it otherwise would in results, which can mean lost potential visitors.
Content analysis
If there are any errors in your meta tags or title, this will tell you. That can be helpful because when your site comes up in results, the title is displayed. If there is an error with the title, someone may skip over your result and go to the next one, which does have a title.
Mobile crawl errors
This is basically the same as Web crawl errors, but for the bots that crawl for mobile devices instead of desktop browsers.
Links
The crawlers Google uses to gather information for relevant search results travel from place to place through links. In this process, Google gathers information about the links on your site. The Links section shares some of the information the bots have found in their travels through your pages.
- Pages with external links: Shows the pages on other sites that link to yours. These links bring in visitors from other sites to yours.
- Sitelinks: These are links that are displayed with the results for your site in a search. They are put there so that visitors can navigate your site more easily directly from the search.
- Pages with internal links: This shows which pages on your site link to another page on your site.
Tools
This page offers a set of tools that may be useful in syncing your site with Google to help with search engine optimization.
- Analyze robots.txt: Runs a diagnostic check of your robots.txt file. It can tell you what the file is doing to the bots, whether it is blocking robots from crawling certain URLs on your site or blocking all robots entirely. It will not work if you do not have a robots.txt file on your server.
- Generate robots.txt: This is a generator for a robots.txt file. With it you can control, to a certain degree, the bots that visit your site. You can block them altogether, block certain URLs, or block only bots of a certain category, such as the bots for mobile devices. You put in what you want, and the tool creates the rules for you to put in your robots.txt file.
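As an example, rules like the following (the /private/ path is a placeholder) would block every crawler from one directory and shut the mobile crawler out of the whole site:

```
# Block every crawler from a hypothetical /private/ directory
User-agent: *
Disallow: /private/

# Block Google's mobile crawler from the entire site
User-agent: Googlebot-Mobile
Disallow: /
```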
- Enhance 404 pages: This tool provides a script for a 404 page that includes a Google search box. It can help if your current 404 page doesn’t do much for the visitor who lands on it; the page this tool provides lets the visitor search for what they were looking for in the first place.
- Set geographic target: This tool associates a geographic location with your site, so Google will favor your site in results for searchers within the region you set. This can be useful for the sites of local businesses, since the visitors it brings in are more likely to be potential customers.
- Enhanced image search: This tool allows Google to better utilize the images on your site as results in the images search.
- Manage site verification: This shows everyone who has added this site as their own and had it verified. You can also see the verification meta tag on this page in case you have deleted or lost it.
- Set crawl rate: With this tool you can set the rate at which Google’s bots crawl your site. You can choose from Faster, Normal, and Slower. At Faster, the bots will crawl your site more often, but they will use more bandwidth and put more strain on your server. Normal is the usual rate, and at Slower the bots crawl less often, putting less strain on your server.
- Set preferred domain: This lets you choose whether your site is displayed with the www before the name or not, so http://www.site.com would show up as either site.com or www.site.com, whichever you prefer.
- Remove URLs: You can remove certain URLs from Google’s crawl list. If you want to remove your whole site or an entire directory, it is best to use a robots.txt file instead of this tool.
Statistics
This section shares some of the information Google has gathered while crawling your site. By looking through these resources you can find out what Google sees, and if it is not what you want, you can take measures to fix it.
- Top search queries: This gives statistics about the queries people entered into Google that returned your site as a result. The table shows each query, the position your site held in the results, and what percentage of the total each query accounted for. The first query listed was entered most often, the last the least.
- Subscriber stats: This gives information about subscribers to your site who use Google-related programs to subscribe. Only these show up, so users not on iGoogle, Google Reader, or Orkut will not be counted.
- What Googlebot sees: This shows what the bots pick up while crawling your site. The keywords are words the bots judged important while on your site; these words help Google bring your site up in appropriate searches. If the keywords displayed have nothing to do with your site, there may be a problem. There are two sections of keywords: one for your own site’s content and one for the content surrounding the links to your site on external sites.
- Index stats: Here you can see which of your pages are in Google’s index, which external pages link to your home page, the latest cache of your site, the information Google has about your site, and pages similar to yours.