HTML Payload Analyzer - Check Raw HTML Size & Compression

The HTML Payload Analyzer helps you measure how heavy a webpage’s HTML response is before the browser loads images, CSS, and JavaScript. It fetches the page, calculates raw HTML size, checks if compression (gzip/brotli) is enabled, and surfaces common markup signals like inline scripts and inline styles. Use it to improve page speed, reduce Time to First Byte (TTFB) impact from oversized templates, and keep your pages lean for better user experience and technical SEO.

What is an HTML payload?

An HTML payload is the raw HTML response your server returns when someone requests a page. It includes your markup structure, headings, content blocks, menus, inline styles, inline scripts, and sometimes embedded data. Even before the browser downloads images or executes JavaScript, it must receive and parse this HTML. If the payload is too large, users experience slower load times, especially on mobile connections, and search engines may crawl less efficiently.

What does the HTML Payload Analyzer do?

This tool fetches the URL you provide and measures the raw HTML response size. It also detects compression by checking for a Content-Encoding header (such as gzip or br) and estimates transfer size when the server provides a Content-Length header. Finally, it counts a few common “markup weight” signals, such as how many script blocks, inline CSS blocks, JSON-LD blocks, and HTML comments are present, so you can spot pages bloated by templates, heavy inline tracking, or embedded data.
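
For illustration, here is a minimal sketch of this kind of check in TypeScript (assuming Node 18+ for the global fetch). The function name, output shape, and regex-based signals are assumptions for the example, not the tool's actual implementation:

```ts
// Hedged sketch of a payload check; not the tool's actual code.
async function analyzeHtmlPayload(url: string) {
  const res = await fetch(url, { redirect: "follow" });
  const html = await res.text(); // fetch decompresses transparently, so this is raw markup

  const rawBytes = Buffer.byteLength(html, "utf8");
  // Some runtimes strip these headers after decoding; treat them as clues.
  const encoding = res.headers.get("content-encoding"); // "gzip", "br", or null
  const transfer = res.headers.get("content-length");   // may be absent (chunked)

  // "Markup weight" signals: simple counts, not file sizes.
  const count = (re: RegExp) => (html.match(re) ?? []).length;

  return {
    finalUrl: res.url, // URL after redirects
    status: res.status,
    contentType: res.headers.get("content-type"),
    rawBytes,
    compression: encoding ?? "none detected",
    transferBytes: transfer ? Number(transfer) : "unknown",
    signals: {
      scripts: count(/<script\b/gi),
      inlineStyles: count(/<style\b/gi),
      jsonLd: count(/application\/ld\+json/gi),
      comments: count(/<!--/g),
    },
  };
}
```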

Why raw HTML size matters for performance

Large HTML increases the amount of data that must be transferred and parsed before a page becomes usable. The bigger the HTML, the more time the browser spends parsing it and building the DOM. For mobile users, a few hundred extra kilobytes can be noticeable. Oversized HTML can also delay first render, increase memory usage, and slow down interaction readiness. Keeping HTML lean is one of the simplest ways to improve perceived speed, especially for content-heavy pages like category listings, search results, and long landing pages.

Compression: gzip and brotli

Compression reduces transfer size. Markup is highly repetitive, so a 200 KB raw HTML response often shrinks to a quarter of its size or less with gzip, and brotli typically compresses a little further. This is why many performance audits recommend enabling compression at the web server level (Apache/Nginx) or via a CDN. The tool checks for a Content-Encoding header to detect compression. Even if your raw HTML is not tiny, proper compression can greatly improve real-world load times, especially for first-time visitors on slower networks.
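
As a rough, in-memory illustration of those savings, the sketch below compresses a repetitive HTML string with Node's built-in node:zlib module; real pages will vary, but template-heavy markup compresses extremely well:

```ts
import { gzipSync, brotliCompressSync } from "node:zlib";

// Illustrative only: shows the scale of savings compression delivers on markup.
function compressionReport(html: string) {
  const raw = Buffer.from(html, "utf8");
  const gz = gzipSync(raw);
  const br = brotliCompressSync(raw);

  const saved = (n: number) => ((1 - n / raw.length) * 100).toFixed(1);
  console.log(`raw:    ${raw.length} bytes`);
  console.log(`gzip:   ${gz.length} bytes (${saved(gz.length)}% smaller)`);
  console.log(`brotli: ${br.length} bytes (${saved(br.length)}% smaller)`);
}

// Repetitive markup, like real template output, compresses very well.
compressionReport('<li class="item">Example product</li>\n'.repeat(5000));
```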

How to use the tool

Paste a full URL (or just a domain) and run the analyzer. You’ll get a score and label based on practical raw HTML size thresholds. The tool then shows: the final URL after redirects, HTTP status code, content type, raw HTML size, transfer size (if available), compression status, and markup signals. Use the recommendations section to prioritize fixes if the payload is heavy.

What is a good HTML size?

There isn’t a single perfect number, but as a practical guideline: under ~100 KB raw HTML is typically lean, 100–200 KB is acceptable for complex pages, 200–350 KB often needs improvement, and beyond that you should investigate what is adding weight. Pages like homepages, category pages, and landing pages should be kept especially lean because they are common entry points and are crawled frequently.
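
If you want to apply the same guideline in your own checks, a mapping like this sketch mirrors the thresholds above (the analyzer's actual scoring may differ):

```ts
// Size-to-label mapping based on the practical guideline in this section.
type SizeLabel = "lean" | "acceptable" | "needs improvement" | "investigate";

function labelHtmlSize(rawBytes: number): SizeLabel {
  const kb = rawBytes / 1024;
  if (kb < 100) return "lean";
  if (kb <= 200) return "acceptable";
  if (kb <= 350) return "needs improvement";
  return "investigate";
}
```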

Common causes of heavy HTML payloads

Heavy HTML often comes from repeated layout blocks, oversized navigation, massive footer link lists, too many inline scripts, or large embedded JSON data. Some CMS themes output huge chunks of hidden markup for modals, menus, or widgets that are not needed on every page. Another common cause is server-side rendering of long lists (hundreds of items) instead of using pagination or lazy loading.

How to reduce HTML payload size

Start by removing duplicate wrappers and unnecessary markup from templates. Reduce repeated UI components, render only what the initial view needs, and paginate long lists. Avoid dumping large datasets into HTML as inline JSON; instead, fetch additional data via an API after first render. Keep tracking scripts minimal and defer non-critical scripts. If you must include structured data, keep it compact and relevant.
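
As an example of the inline-JSON advice, this hedged sketch defers a large dataset to an API call after first render; the /api/products endpoint and renderProducts helper are hypothetical:

```ts
// Instead of embedding hundreds of items as inline JSON in the HTML,
// let the page paint first with lean markup, then fetch the data.
declare function renderProducts(products: unknown[]): void; // hypothetical renderer

window.addEventListener("DOMContentLoaded", () => {
  fetch("/api/products?page=1") // hypothetical endpoint
    .then((res) => res.json())
    .then((products) => renderProducts(products))
    .catch((err) => console.error("Deferred data load failed", err));
});
```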

Technical SEO benefits

Lean HTML helps both users and crawlers. Faster pages improve user experience and can support better engagement. For crawling, smaller payloads can reduce bandwidth and processing time, which is useful when search engines crawl large sites frequently. Clean markup also reduces the chance of rendering delays or DOM complexity issues that can affect page interpretation.

Interpreting the results

The tool separates raw HTML size (uncompressed) from transfer size (what is actually sent over the network, when a Content-Length header is present). A large raw HTML size paired with compression means the transfer may be reasonable, but parsing cost can still be high. A large raw HTML size with no compression is a strong sign that enabling gzip/brotli will deliver a quick win. The markup signals can reveal whether inline scripts, inline styles, or excessive comments are contributing to the weight.
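
The interpretation boils down to a decision like this sketch (the 200 KB threshold and wording are illustrative):

```ts
// Illustrative reading of the size/compression combination.
function interpret(rawBytes: number, encoding: string | null): string {
  const heavy = rawBytes > 200 * 1024;
  if (heavy && !encoding) {
    return "Large HTML, no compression: enabling gzip/brotli is a quick win.";
  }
  if (heavy) {
    return "Transfer may be fine, but parsing cost is still high: trim the markup.";
  }
  return "Raw HTML is within a reasonable range.";
}
```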

Final recommendations

If your HTML payload is heavy, focus on trimming templates, reducing inline data, and enabling compression. If your HTML is already lean, your next gains usually come from optimizing JavaScript, CSS delivery, images, and caching. Use this tool regularly after design updates, CMS changes, or adding new widgets—HTML size tends to grow slowly over time if you don’t measure it.

FAQ

What does this tool measure?
It measures raw HTML size (uncompressed) and checks headers for compression and transfer clues.
Does it analyze images and JavaScript bundles?
No. It focuses on the HTML response only; it counts script blocks as a signal but does not measure their file sizes.
Why is transfer size sometimes shown as unknown?
Some servers do not send a Content-Length header (especially with chunked transfer encoding), so exact transfer size can’t be read reliably.
What is a good target for HTML payload size?
As a practical guideline, aim for under ~100 KB raw HTML for most pages, and keep complex pages as lean as possible.
Does gzip or brotli replace the need to reduce HTML?
Compression helps transfer size, but large HTML still increases parsing cost. Ideally you want both: lean HTML plus compression.
Why do script blocks matter?
Many inline scripts add weight, and too much inline scripting can slow parsing and delay rendering.
Can the tool detect if my page is not actually HTML?
Yes. It checks the Content-Type header and basic HTML signals, and it will warn you if the response looks like an API payload rather than a page.
Does this require any third-party API?
No. The tool fetches the page directly from your server using cURL.
How can I enable compression?
Enable gzip/brotli in Apache/Nginx or via your CDN. Many CDNs enable brotli by default.
Why can a homepage HTML become huge over time?
New widgets, tracking scripts, expanded menus, and rendering too many items server-side can slowly inflate HTML.
Will reducing HTML payload improve Core Web Vitals?
It can improve perceived speed and parsing time, and can support better LCP/INP in some cases—especially on mobile.

Related tools

Pro tip: pair this tool with YouTube Keywords Extractor and AI Keyword Cluster Ideas for a faster SEO workflow.