Website change detection
Detect website and SPA changes without screenshot noise
Monitor critical pages with deterministic HTML and rendered-DOM diffs, low-noise rules, incident workflows, and run diagnostics.
- Deterministic HTML/DOM diffs show exactly what changed, not screenshot pixels
- Incidents, baseline acceptance, and request-linked run logs keep triage fast
How it works
What you get
Deterministic evidence: snapshots and diffs per URL (raw, extracted, final)
Incident closure loop: acknowledge, resolve, accept baseline with audit trail
Operational diagnostics: engine mode, wait strategy, blocked resources, timings, request ID
Flexible notifications: destination routing, templates, and retry visibility
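To make the diagnostics concrete, here is a minimal sketch of what a per-run diagnostics record could look like. The field names (`engine_mode`, `wait_strategy`, and so on) mirror the list above but are illustrative, not DiffMon's actual schema.

```python
from dataclasses import asdict, dataclass, field

@dataclass
class RunDiagnostics:
    # Hypothetical record shape; field names are illustrative only.
    request_id: str                     # correlates the run with logs and deliveries
    engine_mode: str                    # e.g. "fetch" or "browser-render" (Pro)
    wait_strategy: str                  # e.g. "network-idle" or a fixed delay
    blocked_resources: list[str] = field(default_factory=list)
    timings_ms: dict[str, int] = field(default_factory=dict)

run = RunDiagnostics(
    request_id="req_123",
    engine_mode="fetch",
    wait_strategy="network-idle",
    blocked_resources=["analytics.js"],
    timings_ms={"dns": 12, "ttfb": 180, "total": 420},
)
print(asdict(run)["request_id"])  # the ID you would quote in an incident
```

A record like this is what lets a diff, a run log, and a notification delivery all be joined on one request ID.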
What we monitor
DOM contract surfaces and key selector regions, including rendered output for JS/SPA pages in Pro
SEO-critical signals: title, meta description, canonical, H1, robots, sitemap endpoints
HTTP details: status, redirects, latency, bytes captured, and failure categories
Noise controls: select/extract plus ignore selectors/regex with deterministic ordering
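The value of deterministic ordering is that two runs over identical input always normalize identically. A minimal sketch, assuming regex-only ignore rules applied in a fixed rule-ID order before hashing (selector-based rules would need an HTML parser, omitted here; rule names are hypothetical):

```python
import hashlib
import re

# Assumed rule shape: (rule_id, compiled pattern). Applying rules in
# sorted rule_id order keeps normalization reproducible across runs.
IGNORE_RULES = [
    ("10-csrf-token", re.compile(r'name="csrf_token" value="[^"]*"')),
    ("20-timestamp", re.compile(r"Generated at [\d:T\-\.Z]+")),
]

def normalize(html: str) -> str:
    for _rule_id, pattern in sorted(IGNORE_RULES):
        html = pattern.sub("", html)           # strip volatile sections
    return re.sub(r"\s+", " ", html).strip()   # collapse whitespace noise

def fingerprint(html: str) -> str:
    return hashlib.sha256(normalize(html).encode()).hexdigest()

a = '<p>Hi</p> <span>Generated at 2024-01-01T00:00:00Z</span>'
b = '<p>Hi</p>  <span>Generated at 2024-06-30T12:34:56Z</span>'
print(fingerprint(a) == fingerprint(b))  # True: only volatile content differed
```

Because the timestamp matches an ignore rule, the two captures hash identically and no change is reported.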
Use cases
SEO guardrails: detect title/meta/canonical/H1/robots/sitemap drift before ranking impact
Release verification: confirm critical routes after deploys and rollbacks
Vendor/widget drift: catch third-party script changes on rendered pages
Compliance/content changes: track legal or pricing copy edits with evidence
Incident response: correlate diffs, run logs, and delivery history via request ID
Deterministic HTML/DOM diffs vs screenshot monitoring
Why this matters
Website drift can silently break SEO, onboarding flows, and compliance messaging. DiffMon turns each change into actionable evidence with closure and audit history.
FAQ
Do you crawl whole sites?
No. You explicitly add the URLs you want monitored.
What counts as a change?
A change in the hash of the normalized HTML for that URL.
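That definition can be sketched as comparing the current hash against a stored baseline per URL. This is a simplification for illustration (normalization here is just whitespace collapsing, and the function names are hypothetical), but it shows the shape of the check and the baseline-acceptance step:

```python
import hashlib
import re

baselines: dict[str, str] = {}  # url -> accepted baseline hash

def html_hash(html: str) -> str:
    normalized = re.sub(r"\s+", " ", html).strip()  # simplistic normalization
    return hashlib.sha256(normalized.encode()).hexdigest()

def check(url: str, html: str) -> str:
    """Return 'new', 'changed', or 'unchanged' for this URL's content."""
    current = html_hash(html)
    previous = baselines.get(url)
    if previous is None:
        baselines[url] = current  # first capture becomes the baseline
        return "new"
    return "changed" if current != previous else "unchanged"

def accept_baseline(url: str, html: str) -> None:
    """Closure loop: accept the current content as the new baseline."""
    baselines[url] = html_hash(html)

print(check("https://example.com/pricing", "<h1>Pricing</h1>"))      # new
print(check("https://example.com/pricing", "<h1>Pricing</h1>\n"))    # unchanged
print(check("https://example.com/pricing", "<h1>New pricing</h1>"))  # changed
```

A "changed" result is what would open an incident; accepting the baseline is what closes the loop.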
How noisy is it?
Hashing reduces baseline noise, and ignore rules (Hobby+) remove volatile sections.
What happens on errors?
Errors are stored as snapshots with request IDs; some error categories do not trigger alerts.
Can I monitor robots or sitemaps?
Yes. Add those URLs directly.
Can you monitor JS/SPA pages?
Yes. Pro adds Browser Render for routes where a simple fetch misses client-side content.
What happens over limits?
Existing monitors keep running; new monitors are blocked until you upgrade.
How is data stored?
Snapshots are stored for history; payload storage is configurable.
Resources
Monitor JS-rendered pages with deterministic waits and rendered-DOM diffs when a simple fetch is not enough.
Read docs