SEO
A sitemap was found for your website.
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling.
A sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is relative to other URLs on the site) so that search engines can crawl the site more intelligently.
Web crawlers usually discover pages from links within the site and from other sites.
Sitemaps supplement this data, allowing crawlers that support them to pick up all URLs in the sitemap and learn about those URLs from the associated metadata.
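For illustration, a minimal sitemap describing a single URL looks like this (the URL and metadata values are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-01-15</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

Only the <loc> element is required; <lastmod>, <changefreq>, and <priority> are the optional metadata fields mentioned above.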
Why it is important
Sitemaps help search engine crawlers decide which pages and media are important to crawl. If your site is very large (1,000+ pages), a crawler might miss your most recently updated pages.
In some cases, you may want to expose pages that are poorly linked or not linked at all. Crawlers cannot find such pages unless another page links to them.
Another reason to have a sitemap is when your site is brand new and crawlers have not yet found any links to it.
How the audit works
Sitemaps should be placed in the root directory of a website, and that is where Sitefig starts its search: it looks for sitemap references in the robots.txt file and then tries several other common locations.
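The discovery process can be sketched in a few lines of Python. This is an illustration of the general approach, not Sitefig's actual implementation, and the list of fallback locations is an assumption:

    # Illustration only: a simplified sketch of sitemap discovery,
    # not Sitefig's actual implementation. The fallback paths below
    # are an assumption about commonly used locations.
    from urllib.error import HTTPError, URLError
    from urllib.parse import urljoin
    from urllib.request import urlopen

    COMMON_PATHS = ["/sitemap.xml", "/sitemap_index.xml"]

    def discover_sitemaps(site):
        """Return sitemap URLs declared in robots.txt, else guessed ones."""
        found = []
        # Step 1: robots.txt may contain one or more "Sitemap:" directives.
        try:
            with urlopen(urljoin(site, "/robots.txt"), timeout=10) as resp:
                for line in resp.read().decode("utf-8", "replace").splitlines():
                    if line.lower().startswith("sitemap:"):
                        found.append(line.split(":", 1)[1].strip())
        except (HTTPError, URLError):
            pass  # no readable robots.txt; fall back to guessing
        if found:
            return found
        # Step 2: probe common locations in the site root.
        for path in COMMON_PATHS:
            url = urljoin(site, path)
            try:
                with urlopen(url, timeout=10):
                    found.append(url)  # reachable without an HTTP error
            except (HTTPError, URLError):
                continue
        return found

    print(discover_sitemaps("https://www.example.com"))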
Fixing the problem
If Sitefig cannot find your sitemap, you can assume other crawlers will not find it either.
The easiest way to make your sitemap discoverable is to reference it in your robots.txt file or to serve it at yourdomain.com/sitemap.xml
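For example, a single line anywhere in robots.txt is enough to point crawlers at your sitemap (replace the URL with your own):

    Sitemap: https://yourdomain.com/sitemap.xml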