Sitemap support gets more complex once you encounter indexes, compressed files, and edge-case payloads from the public web.
Public websites do not all ship a neat `/sitemap.xml` file. Some provide sitemap indexes, some use compressed `.xml.gz` payloads, and some mix useful URLs with things you should not store at all.
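To handle that variety, a fetcher has to detect gzip payloads and tell a sitemap index apart from a plain `urlset` before deciding what to do with the entries. A minimal sketch of that dispatch, assuming a hypothetical `parse_sitemap` helper and standard-library tools only:

```python
import gzip
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org 0.9 schema.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(raw: bytes) -> tuple[str, list[str]]:
    """Return ("index" | "urlset", locs) for a fetched sitemap payload.

    Hypothetical helper: transparently decompresses gzip payloads
    (detected by magic bytes, so it also covers .xml.gz files served
    without a content-encoding header) and distinguishes a sitemap
    index from a plain urlset by the root element.
    """
    if raw[:2] == b"\x1f\x8b":  # gzip magic number
        raw = gzip.decompress(raw)
    root = ET.fromstring(raw)  # raises ParseError on malformed XML
    kind = "index" if root.tag == f"{SITEMAP_NS}sitemapindex" else "urlset"
    locs = [el.text.strip() for el in root.iter(f"{SITEMAP_NS}loc") if el.text]
    return kind, locs
```

A caller would recurse into the returned `loc` entries when `kind` is `"index"` and treat them as page URLs otherwise; the names and the tuple shape here are illustrative, not a fixed API.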
Processing sitemap content should be conservative. The system should enforce size limits, decompress carefully, ignore off-domain URLs, and reject malformed or obviously abusive payloads.
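Two of those safeguards can be sketched concretely: capped decompression, which stops a zip-bomb style payload from exhausting memory, and a same-domain filter for discovered URLs. The size limit and helper names below are assumptions for illustration, not values from the source:

```python
import gzip
import io
from urllib.parse import urlparse

# Assumed cap on decompressed sitemap size; tune per deployment.
MAX_DECOMPRESSED = 10 * 1024 * 1024  # 10 MiB

def safe_decompress(raw: bytes, limit: int = MAX_DECOMPRESSED) -> bytes:
    """Decompress gzip data in chunks, aborting as soon as the output
    exceeds `limit` instead of materializing the whole payload."""
    out = bytearray()
    with gzip.GzipFile(fileobj=io.BytesIO(raw)) as f:
        while True:
            chunk = f.read(64 * 1024)
            if not chunk:
                return bytes(out)
            out.extend(chunk)
            if len(out) > limit:
                raise ValueError("decompressed sitemap exceeds size limit")

def same_domain(url: str, domain: str) -> bool:
    """Keep only URLs whose host is the monitored domain or a subdomain."""
    host = urlparse(url).hostname or ""
    return host == domain or host.endswith("." + domain)
```

Checking the limit while streaming matters because the gzip header does not reliably declare the decompressed size; the only safe measure is what has actually been produced so far.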
When sitemap import works reliably across these different site setups, domain monitoring feels noticeably more mature. Users quickly notice whether import handles only simple sites or real-world infrastructure as well.