Sitemap Submission: Why Do I Need a Sitemap?

HTML Sitemap = a simple-to-follow route for search engine spiders to crawl your site.

It's a web page that displays the structure of (part of) your website, with links to all relevant pages. This is useful for visitors who want to quickly find any page within your site. It is especially convenient for larger websites where not every page is easily accessible through a menu, but smaller sites can also benefit from an HTML sitemap, as it makes for better search engine optimization.
With an HTML sitemap you enable search engines to easily and quickly crawl and index your entire site. A search engine robot uses hyperlinks to surf the Internet from one page to another. When every page on your site contains a link to a sitemap, a search engine bot will quickly land on the sitemap page, from which it can follow the links to all the other pages on your site. An HTML sitemap is also a clever method to disperse PageRank and link juice evenly across your site.

With a link to your sitemap from your homepage, you let that link juice flow to your sitemap. Your sitemap, in turn, contains links to all the other pages on your site, so it can nicely disperse all that link juice internally within the site. This makes a sitemap a useful tool for making sure that every webpage within your site receives some link juice.
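To make this concrete, an HTML sitemap is just an ordinary web page of categorized links that both visitors and spiders can follow. A minimal sketch might look like this (the section names and URLs below are hypothetical, not from any particular site):

```html
<!-- Minimal HTML sitemap page; section names and URLs are hypothetical examples -->
<h1>Site Map</h1>

<h2>Products</h2>
<ul>
  <li><a href="/products/widgets">Widgets</a></li>
  <li><a href="/products/gadgets">Gadgets</a></li>
</ul>

<h2>Support</h2>
<ul>
  <li><a href="/support/faq">FAQ</a></li>
  <li><a href="/support/contact">Contact Us</a></li>
</ul>
```

A link to this page from the homepage (often in the footer) is what lets both link juice and crawlers reach every URL listed on it.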

When you submit your sitemap to the various search engines, you are telling them to visit your website and giving them the route of travel YOU want them to take. This makes sure they find all the material you want them to find, and it earns your website credit for ALL of your content.

In this instance, we've submitted an XML sitemap and received an error that URLs in the sitemap are also featured in the robots.txt file (the file that tells the robot spider what to index and when, as well as what material NOT to include).
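For reference, an XML sitemap is simply a list of `<loc>` entries. The error described above arises when robots.txt disallows a path that the sitemap still lists; in this hypothetical fragment (the domain and paths are made up for illustration), the second URL would conflict with a `Disallow: /private/` rule:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <!-- This URL conflicts with a robots.txt rule such as "Disallow: /private/" -->
  <url>
    <loc>https://example.com/private/report</loc>
  </url>
</urlset>
```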

It's important to pay attention to this type of error and warning information. The search engines may not even be able to read the XML sitemap. We can also glean information about which important URLs we are withholding from crawls in the robots.txt file.

As a follow-up to the point above, on the negative aspect of dynamically generated sitemaps: these can often include many URLs that are intentionally excluded from search engine view in the robots.txt file. The last thing we want to do is tell a search engine to both crawl and not crawl the same page at the same time. That confusion could certainly hurt our rankings and also deny us credit for the quality material linked from those pages.
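One way to catch this conflict before submitting is to check every sitemap URL against your robots.txt rules. Here is a minimal sketch in Python using only the standard library; the sitemap contents, robots.txt rules, and domain are hypothetical examples, not taken from the site discussed above:

```python
import urllib.robotparser
import xml.etree.ElementTree as ET

def blocked_sitemap_urls(sitemap_xml, robots_txt, user_agent="*"):
    """Return the sitemap URLs that robots.txt disallows for the given agent."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    # <loc> elements live in the standard sitemap namespace
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    urls = [loc.text.strip() for loc in root.findall("sm:url/sm:loc", ns)]
    return [u for u in urls if not rp.can_fetch(user_agent, u)]

# Hypothetical example data: one allowed URL, one blocked by robots.txt
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/private/report</loc></url>
</urlset>"""
robots = "User-agent: *\nDisallow: /private/"

print(blocked_sitemap_urls(sitemap, robots))
# → ['https://example.com/private/report']
```

Any URL this check reports should either be removed from the sitemap or unblocked in robots.txt, so the two files never send the crawler contradictory signals.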