Sitemap Generator
Crawl a website and generate an XML sitemap. Discover pages, set priorities, and download a ready-to-use sitemap.xml file.
How to Use Sitemap Generator
1. Enter the URL of the website you want to generate a sitemap for.
2. Choose how many pages to crawl (100, 200, or 500).
3. Click "Generate" to crawl the site and discover all linked pages.
4. Review the discovered URLs and their estimated priorities.
5. Download the generated XML sitemap file to upload to your server.
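The crawl-and-generate flow above can be sketched in Python. This is a minimal illustration, not the tool's actual code: the function names, the 200-page default, and the depth-based priority values are assumptions taken from the descriptions on this page.

```python
# Illustrative sketch of the crawl-and-generate flow described above.
# Names and defaults are assumptions, not the tool's real implementation.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from xml.sax.saxutils import escape

PRIORITY_BY_DEPTH = {0: "1.0", 1: "0.8", 2: "0.5"}

class LinkParser(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=200, max_depth=2):
    """Breadth-first crawl of internal links; returns {url: depth}."""
    host = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([(start_url, 0)])
    while queue and len(depths) < max_pages:
        url, depth = queue.popleft()
        if depth >= max_depth:
            continue  # stop following links past the depth limit
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip unreachable pages
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            # only follow links on the same host
            if urlparse(absolute).netloc == host and absolute not in depths:
                depths[absolute] = depth + 1
                queue.append((absolute, depth + 1))
                if len(depths) >= max_pages:
                    break
    return depths

def build_sitemap(url_depths):
    """Render {url: depth} as sitemap.xml text, shallowest pages first."""
    entries = []
    for url, depth in sorted(url_depths.items(), key=lambda kv: kv[1]):
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <priority>{PRIORITY_BY_DEPTH.get(depth, '0.5')}</priority>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )
```

`build_sitemap(crawl("https://example.com/"))` would then return the XML text to save as sitemap.xml.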
Related Tools
Meta Tag Analyzer: Analyze meta tags of any webpage. Check title, description, Open Graph, Twitter cards, and get SEO recommendations.
Open Graph Checker: Preview how your page looks when shared on Facebook, Twitter, and LinkedIn. Check all OG and Twitter Card tags.
HTTP Header Checker: Inspect HTTP response headers of any URL. Check security headers, caching, content type, and more.
Robots.txt Validator: Validate and analyze your robots.txt file. Check rules, sitemaps, and common crawling issues.
Frequently Asked Questions
How does the sitemap generator work?
The tool crawls your website starting from the URL you provide, following internal links to discover pages. It then generates a valid XML sitemap with all discovered URLs, priorities, and last modified dates.
How many pages can it crawl?
The generator can discover up to 500 URLs. The default is 200 pages, which works well for most small to medium sites. For very large websites, you may want to use a server-side crawler.
What is the crawl depth?
The tool crawls up to 2 levels deep: your homepage (depth 0), pages linked from the homepage (depth 1), and pages linked from those pages (depth 2).
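The depth limit can be illustrated with a breadth-first walk over a toy link graph. The pages and links below are hypothetical, used only to show which pages a 2-level crawl reaches:

```python
from collections import deque

# Toy link graph standing in for a site (hypothetical pages),
# illustrating the 2-level depth limit described above.
SITE = {
    "/": ["/about", "/blog"],
    "/about": ["/team"],
    "/blog": ["/blog/post-1"],
    "/team": ["/careers"],  # /careers sits at depth 3, beyond the limit
}

def discover(start="/", max_depth=2):
    """Return {page: depth} for pages reachable within max_depth levels."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        if depths[page] >= max_depth:
            continue  # do not follow links from pages at the depth limit
        for link in SITE.get(page, []):
            if link not in depths:
                depths[link] = depths[page] + 1
                queue.append(link)
    return depths
```

Here `discover()` finds the homepage at depth 0, `/about` and `/blog` at depth 1, and `/team` and `/blog/post-1` at depth 2, while `/careers` is never visited.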
How are priorities assigned?
Priorities are assigned based on crawl depth: the homepage gets 1.0, depth-1 pages get 0.8, and depth-2 pages get 0.5. You can adjust these in the downloaded file.
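The stated scheme amounts to a simple depth-to-priority mapping. A hypothetical helper (not part of the tool itself) would look like:

```python
def priority_for_depth(depth: int) -> str:
    """Map crawl depth to the sitemap <priority> value described above:
    homepage 1.0, depth-1 pages 0.8, everything deeper 0.5."""
    return {0: "1.0", 1: "0.8"}.get(depth, "0.5")
```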
Where should I upload the sitemap?
Upload the sitemap.xml file to the root of your website (e.g., https://example.com/sitemap.xml). Then reference it in your robots.txt with: Sitemap: https://example.com/sitemap.xml
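Put together, a minimal sitemap.xml for example.com and the matching robots.txt line would look like this (URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

And in robots.txt at the site root:

```
Sitemap: https://example.com/sitemap.xml
```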
Why are some pages missing?
Pages not linked from any crawled page will not be discovered. JavaScript-rendered links are also not detected since the crawler does not execute JavaScript.