Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data, allowing crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs from the associated metadata. Using the Sitemap protocol does not guarantee that web pages are included in search engines, but it provides hints that help web crawlers do a better job of crawling your site.
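The "simplest form" described above can be sketched as a minimal Sitemap file following the sitemaps.org schema; the URL and metadata values below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's full URL (required). -->
    <loc>http://domain.com/index.php?main_page=index</loc>
    <!-- Optional metadata the crawler may use as hints. -->
    <lastmod>2010-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>`, `<changefreq>`, and `<priority>` are the optional metadata mentioned above.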
Common features:
- Supports multilingual categories and products.
- Supports Search-Engine Safe URLs.
- Can be run via HTTP or from the command line.
- Auto-generates multiple sitemaps for sites with over 50,000 URLs.
- Auto-generates multiple sitemaps if the file size exceeds 10 MB.
- Writes files compressed or uncompressed.
- Uses the index.php wrapper: http://domain.com/index.php?main_page=sitemapxml
- Uses the standard language files.
- Auto-notifies Google, Yahoo!, Ask.com and Microsoft (both live.com and msn.com).
- Sitemap files can be written gzip-compressed, or you can compress them yourself with gzip.
- Note that an uncompressed Sitemap file may not be larger than 10 MB.
- Generation of a sitemap index file.
- Generation of XML sitemaps (as separate files) for:
- Products (supports hideCategories).
- Categories (supports hideCategories).
- EZ-pages: multi-language support, 'EZ pages rel=nofollow attribute' support (http://www.zen-cart.com/index.php?ma...roducts_id=944), 'date_added'/'last_modified' support, checks internal links ('alt_url') against the "noindex" rule (main_page in ROBOTS_PAGES_TO_SKIP), toc_chapter processing.
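The sitemap index file mentioned in the features above simply lists the separate sitemap files so that search engines can find them all from one URL. A sketch following the sitemaps.org schema; the domain and filenames are placeholders, not the plugin's actual output names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <!-- Each <sitemap> entry points at one generated file;
         gzip-compressed files are allowed. -->
    <loc>http://domain.com/sitemap_products.xml.gz</loc>
    <lastmod>2010-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://domain.com/sitemap_categories.xml.gz</loc>
  </sitemap>
</sitemapindex>
```

This is what lets the plugin split large sites into per-type files (products, categories, EZ-pages) while still submitting a single index URL to search engines.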
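The gzip option and the 10 MB limit above can also be handled manually from the command line; a sketch, where `sitemap_products.xml` is a placeholder standing in for a file the plugin would generate:

```shell
# Placeholder sitemap file standing in for one generated by the plugin.
printf '%s\n' '<?xml version="1.0" encoding="UTF-8"?>' \
  '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"></urlset>' \
  > sitemap_products.xml

# The 10 MB limit applies to the uncompressed file, so check it first.
[ "$(wc -c < sitemap_products.xml)" -lt 10485760 ] || echo "sitemap too large"

# gzip -c writes to stdout, leaving the original file in place.
gzip -c sitemap_products.xml > sitemap_products.xml.gz
```

Search engines accept the `.xml.gz` file directly, so the compressed copy is the one to reference from the sitemap index.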