Google updates Sitemaps
Search engine Google has expanded the scope of its Sitemaps tools to give webmasters a richer set of data for optimising their presence and ranking in search results and for analysing their traffic.
Google Sitemaps is a set of tools provided by the search engine to make it easier for webmasters to (legitimately!) get all their pages listed and ranked in the company’s search index. Normally, the googlebot spiders will find a site, follow its links and index the pages along the way. Sitemaps lets webmasters submit a ‘sitemap’ listing the pages they consider most important and indicating how often those pages change. It can be thought of as the webmasters’ alternative to Google Analytics, which is geared more towards marketeers.
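For context, a submitted sitemap is a small XML file following the Sitemap protocol. A minimal sketch might look like this (the URL, date and other values are placeholders for illustration, not taken from the article):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's full address -->
    <loc>http://www.example.com/</loc>
    <!-- When the page last changed -->
    <lastmod>2006-06-01</lastmod>
    <!-- A hint at how often it changes -->
    <changefreq>daily</changefreq>
    <!-- Relative importance within this site, 0.0 to 1.0 -->
    <priority>1.0</priority>
  </url>
</urlset>
```

Each additional page gets its own `<url>` entry; only `<loc>` is required, the other fields are optional hints to the crawler.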
Sitemaps also feeds information back to the webmaster: any problems the spiders have had indexing particular pages, other crawling issues, what Google knows about the site (including its most highly trafficked content) and the key phrases for which the site ranks most highly.
Sitemaps now provides a complete list of 404 errors, where the earlier version listed only ten. It also offers finer-grained information about search results: previously, data about search queries was aggregated, but it is now possible to drill down and tease out referrals from individual Google properties, such as Images or Web results, and by country of origin.
In addition, Google’s robots.txt analysis helps to ensure that the site’s robots.txt file blocks pages intended to remain private while leaving public pages open to crawling. Google says that it has also added the ability to test against the new Adsbot-Google user agent, which crawls AdWords landing pages for quality evaluation.
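As a rough illustration (the directory path is hypothetical), a robots.txt file of the kind the analysis tool checks might look like this:

```text
# Keep all crawlers out of a private area; everything else stays open
User-agent: *
Disallow: /private/

# Let the AdWords landing-page checker crawl freely
# (an empty Disallow rule blocks nothing)
User-agent: Adsbot-Google
Disallow:
```

The testing tool lets webmasters confirm how a given user agent, including the new Adsbot-Google, would interpret these rules before the file goes live.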
Finally, for those who have a compulsion to add sites, Google has increased the maximum number of sites per account from 200 to 500.