Sitemap Generator FAQ

Common questions about XML sitemaps, our free generator, and how to get the most from this SEO sitemap tool.

What is an XML sitemap?

An XML sitemap is a file that lists all the pages on your website in a structured format that search engines can read. It follows the sitemaps.org protocol and includes details like when each page was last modified, how often it changes, and its relative importance. This helps Google, Bing, and other search engines discover and index your content more efficiently.
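
Here is a minimal sitemap.xml in the sitemaps.org format (example.com stands in for your domain; only the loc tag is required, the others are optional):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2024-05-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://example.com/about</loc>
        <lastmod>2024-03-15</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.5</priority>
      </url>
    </urlset>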

How often should I update my sitemap?

You should update your sitemap whenever you add, remove, or significantly change pages on your website. For active blogs or e-commerce sites with daily changes, consider regenerating your sitemap daily or using a dynamic sitemap generator. For relatively static sites, monthly or quarterly updates are usually sufficient. Our free sitemap generator makes it easy to regenerate on demand — just enter your URL and download.
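
If you prefer to script regeneration yourself, here is a minimal sketch in Python — the page inventory is a hypothetical stand-in for however you track your URLs and modification dates:

    from datetime import date
    from xml.sax.saxutils import escape

    # Hypothetical page inventory: (URL, last-modified date).
    pages = [
        ("https://example.com/", date(2024, 5, 1)),
        ("https://example.com/blog/latest-post", date(2024, 5, 2)),
    ]

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, modified in pages:
        lines += ["  <url>",
                  f"    <loc>{escape(url)}</loc>",        # escape &, <, > in URLs
                  f"    <lastmod>{modified.isoformat()}</lastmod>",
                  "  </url>"]
    lines.append("</urlset>")

    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")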

Does my website really need a sitemap?

Google recommends sitemaps for:

  • Websites with more than a few hundred pages
  • Newly launched sites with few external backlinks
  • Sites with rich media content (video, images)
  • Sites with archived or isolated content not well-linked internally

Even small sites benefit from the improved crawl efficiency and faster indexation of new content. If Google can discover all your pages easily through navigation alone, a sitemap is less critical — but it still provides useful signals like last modification dates.

What is the difference between XML and HTML sitemaps?

An XML sitemap is designed for search engine bots — it's a structured file (sitemap.xml) that machines parse. An HTML sitemap is a regular web page visible to human visitors that lists links to your pages for navigation purposes. Both are useful, but XML sitemaps are what you submit to Google Search Console for SEO benefits. This tool generates XML sitemaps.

Which pages should I exclude from my sitemap?

Exclude the following from your sitemap:

  • Pages with noindex directives
  • Non-canonical URLs (pages with rel=canonical pointing elsewhere)
  • Redirect URLs (only include final destinations)
  • Pages behind authentication
  • Internal search results
  • Paginated pages beyond the main set

Including pages you don't want indexed wastes crawl budget and sends conflicting signals.
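
As a rough sketch of how such filtering might look in code (the page dictionary shape and helper below are hypothetical, not this tool's internals):

    # Illustrative only: each crawled page is a dict with hypothetical keys.
    crawled_pages = [
        {"url": "https://example.com/", "status": 200},
        {"url": "https://example.com/old", "status": 301},          # redirect
        {"url": "https://example.com/draft", "status": 200, "noindex": True},
    ]

    def include_in_sitemap(page):
        if page.get("noindex"):                        # excluded: noindex directive
            return False
        canonical = page.get("canonical")
        if canonical and canonical != page["url"]:     # excluded: canonical points elsewhere
            return False
        if not (200 <= page["status"] < 300):          # excluded: redirects and errors
            return False
        if page.get("requires_auth"):                  # excluded: behind authentication
            return False
        return True

    urls = [p["url"] for p in crawled_pages if include_in_sitemap(p)]
    print(urls)  # ['https://example.com/']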

How do I submit my sitemap to Google?

There are two ways to submit your sitemap to Google:

  1. Google Search Console — Verify your site, go to Sitemaps, and enter your sitemap URL.
  2. robots.txt — Add Sitemap: https://yoursite.com/sitemap.xml to your robots.txt file.

Using both methods is recommended. Search Console gives you reporting on indexation status and any errors Google encounters.
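
For method 2, the Sitemap directive can appear anywhere in robots.txt, and the sitemap URL must be absolute. For example:

    User-agent: *
    Allow: /

    Sitemap: https://yoursite.com/sitemap.xml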

What are the AI features and how does BYOK work?

Our AI sitemap generator offers optional smart suggestions powered by your own API key (Bring Your Own Key). When configured, AI can suggest optimal priority values for each URL, recommend changefreq settings based on page type, identify content gaps compared to competitors, and provide URL structure optimization tips. It supports OpenAI-compatible providers (including DeepSeek, OpenRouter, and others), Anthropic Claude, and Google Gemini. Your API key is encrypted and stored locally — we never see or store it on our servers. All core features work perfectly without any AI key.
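
To make the BYOK idea concrete, here is a sketch of the general pattern of calling an OpenAI-compatible endpoint with your own key — this illustrates the concept, not this tool's internal code, and the base_url and model shown are just examples:

    from openai import OpenAI

    # BYOK: the key and endpoint are yours; swap base_url for any
    # OpenAI-compatible provider (e.g. DeepSeek, OpenRouter).
    client = OpenAI(
        api_key="sk-...",                      # your own key
        base_url="https://api.deepseek.com",   # example provider endpoint
    )

    resp = client.chat.completions.create(
        model="deepseek-chat",                 # example model name
        messages=[{
            "role": "user",
            "content": "Suggest a sitemap <priority> (0.0-1.0) for a blog archive page.",
        }],
    )
    print(resp.choices[0].message.content)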

Is this tool really free forever?

Yes. All core features — website crawling, sitemap.xml generation, SEO issue detection, visual site tree, statistics dashboard, and export — are completely free with no sign-up required. The tool is open source and runs on Cloudflare's generous free tier. The optional AI features use your own API key, so we have zero AI costs to pass on. There are no hidden fees, no usage limits on the free tier, and no plans to change this.

What crawl depth should I use?

Depth determines how many link-following hops the crawler makes from the starting URL. Depth 1 only discovers pages directly linked from the homepage. Depth 3 is the default and works well for most small-to-medium sites. Use depth 4-5 for large sites with deep content hierarchies. Higher depth takes longer but discovers more pages. Start with 3 and increase if the results seem incomplete.
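
To make "hops" concrete, here is a minimal depth-limited crawl sketch in Python. It is a deliberately naive illustration (regex link extraction, no politeness or robots.txt handling), not this tool's actual crawler:

    import re
    import urllib.request
    from urllib.parse import urljoin, urlparse

    def crawl(start_url, max_depth=3):
        seen = {start_url}
        frontier = [start_url]                 # pages at the current depth
        for _ in range(max_depth):
            next_frontier = []
            for url in frontier:
                try:
                    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
                except Exception:
                    continue                   # skip unreachable pages
                # Naive link extraction; a real crawler uses an HTML parser.
                for href in re.findall(r'href="([^"#]+)"', html):
                    link = urljoin(url, href)
                    same_site = urlparse(link).netloc == urlparse(start_url).netloc
                    if same_site and link not in seen:
                        seen.add(link)
                        next_frontier.append(link)
            frontier = next_frontier           # move one hop deeper
        return seen

    # Depth 1 = pages linked from the start URL; depth 3 is this tool's default.
    # print(sorted(crawl("https://example.com/", max_depth=3)))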

What SEO issues does the tool detect?

The SEO analyzer checks for:

  • Broken links (4xx and 5xx status codes)
  • Missing or empty title tags
  • Missing meta descriptions
  • Duplicate page titles
  • Orphan pages (no internal links pointing to them)
  • Pages blocked by noindex directives
  • Redirect chains
  • Missing H1 headings

Each issue is categorized by severity (error, warning, or info) so you can prioritize fixes.
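
As a rough illustration of two of these checks (missing title tag and missing H1), using Python's standard-library HTML parser — a simplified sketch, not the analyzer's actual code:

    from html.parser import HTMLParser

    class SeoCheck(HTMLParser):
        def __init__(self):
            super().__init__()
            self.has_title = False
            self.has_h1 = False

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.has_title = True
            elif tag == "h1":
                self.has_h1 = True

    def check(html):
        parser = SeoCheck()
        parser.feed(html)
        issues = []                            # (severity, description) pairs
        if not parser.has_title:
            issues.append(("error", "missing title tag"))
        if not parser.has_h1:
            issues.append(("warning", "missing H1 heading"))
        return issues

    print(check("<html><head></head><body><p>hi</p></body></html>"))
    # [('error', 'missing title tag'), ('warning', 'missing H1 heading')]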

Still Have Questions?

Try the tool and see for yourself — it's free and takes under a minute.

Generate Your Sitemap