Technical SEO
Technical SEO covers the infrastructure checks that allow search engines to find, crawl, and index your page correctly. Weight: 15%.
Details
SSL / HTTPS
A valid SSL certificate ensures the page is served over HTTPS, which has been a confirmed Google ranking signal since 2014. An invalid or expired certificate triggers browser warnings and may cause Google to deindex the page entirely.
The audit checks certificate validity, expiry date, and issuer. An expired or invalid certificate that prevents the page from loading over HTTPS is a critical issue (−20 pts). If the certificate check fails but the page still loads over HTTPS, the issue is downgraded to a warning (−10 pts). A certificate expiring within 30 days also triggers a warning.
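The expiry rules above can be sketched as a small classifier (the function name and return values are hypothetical, not the audit's actual API; in practice the notAfter date would come from the TLS handshake, e.g. via Python's ssl module):

```python
from datetime import datetime, timedelta

def classify_cert_expiry(not_after: datetime, now: datetime) -> str:
    """Map a certificate's notAfter date to an audit severity.

    Hypothetical sketch of the rules described above: expired is
    critical (-20 pts), expiring within 30 days is a warning.
    """
    if not_after <= now:
        return "critical"                      # expired certificate
    if not_after - now <= timedelta(days=30):
        return "warning"                       # expiring within 30 days
    return "ok"
```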
robots.txt
The robots.txt file tells crawlers which URLs they may access. A missing file means every URL is treated as crawlable by default, giving Google no guidance. A misconfigured file (e.g. 'Disallow: /') blocks all crawlers from the entire site.
The audit checks that the file exists and returns HTTP 200. A missing robots.txt is a warning. The audit also checks that sensitive paths (/api/, /admin/, /auth/) are blocked — missing Disallow rules for these paths are info-level issues (−2 pts each).
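The sensitive-path check can be sketched as a scan of the file's Disallow rules (a simplified parser for illustration: real robots.txt matching is prefix-based and scoped per user-agent group):

```python
SENSITIVE_PATHS = ("/api/", "/admin/", "/auth/")

def missing_sensitive_disallows(robots_txt: str) -> list:
    """Return the sensitive paths that have no Disallow rule.

    Simplified sketch: ignores user-agent groups and wildcards, and
    matches rule values exactly rather than by prefix.
    """
    disallowed = set()
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()      # drop comments
        if line.lower().startswith("disallow:"):
            disallowed.add(line.split(":", 1)[1].strip())
    return [p for p in SENSITIVE_PATHS if p not in disallowed]
```

Each path returned here would map to one info-level issue (−2 pts) in the scoring described above.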
XML Sitemap
A sitemap tells Google which URLs exist on your site. It should be reachable at /sitemap.xml or declared in robots.txt via a Sitemap: directive. The audit checks both the file and the robots.txt reference.
A missing sitemap, or one that exists but is not declared in robots.txt, is a warning. The sitemap consistency check verifies that all URLs in the sitemap use the same domain as the site itself — mismatched domains trigger a warning.
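The consistency check boils down to comparing each <loc> hostname against the site's own. A stdlib sketch (the namespace URI is the one defined by the sitemaps.org protocol):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Namespaced <loc> tag from the sitemaps.org 0.9 schema.
LOC = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

def foreign_sitemap_urls(sitemap_xml: str, site_host: str) -> list:
    """Return sitemap URLs whose hostname differs from site_host."""
    root = ET.fromstring(sitemap_xml)
    urls = [el.text.strip() for el in root.iter(LOC) if el.text]
    return [u for u in urls if urlparse(u).hostname != site_host]
```

A non-empty result would correspond to the cross-domain warning described above.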
Mobile / Viewport meta
The viewport meta tag (<meta name='viewport' content='width=device-width, initial-scale=1'>) tells mobile browsers how to scale the page. Missing it causes poor mobile rendering and can trigger a mobile-usability penalty in Google Search Console.
A missing viewport meta tag is a critical issue (−20 pts) because mobile-first indexing is now the default for all sites.
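Detection can be as simple as searching the fetched HTML for the tag (a regex sketch for illustration; a production audit would use a real HTML parser to handle comments and edge cases):

```python
import re

# Matches <meta ... name="viewport" ...> regardless of attribute order.
VIEWPORT_RE = re.compile(r'<meta\b[^>]*name=["\']viewport["\']', re.IGNORECASE)

def has_viewport_meta(html: str) -> bool:
    """True if the document declares a mobile viewport meta tag."""
    return bool(VIEWPORT_RE.search(html))
```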
HTTP status code
The page must return HTTP 200 OK. A 4xx (client error) or 5xx (server error) means the page cannot be indexed. Redirects (3xx) are noted but not penalised — Google follows redirect chains up to a limit.
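A redirect-chain walker can be sketched with an injected fetch function (a hypothetical signature: fetch(url) returns the status code and the Location header, or None), which keeps the logic testable without network access:

```python
def follow_redirects(url, fetch, max_hops=10):
    """Follow 3xx responses up to max_hops, mirroring a chain limit.

    Returns (chain, final_status), where chain lists each
    (status, url) redirect hop that was followed.
    """
    chain = []
    status, location = fetch(url)
    while 300 <= status < 400 and location and len(chain) < max_hops:
        chain.append((status, url))
        url = location
        status, location = fetch(url)
    return chain, status
```

In a real audit, fetch would wrap an HTTP client with automatic redirect handling disabled, so each hop is observed individually.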
Favicon
A favicon (favourite icon) is the small icon shown in browser tabs, bookmarks, and Google search results. When none is declared, browsers still request /favicon.ico automatically, so every page load triggers a wasted 404 request.
Best practice: provide both an ICO/PNG fallback and a high-resolution PNG (192×192 or 512×512) for PWA manifests. An SVG-only favicon without ICO/PNG fallback triggers a warning. A favicon present but without a large PNG (192px+) is an info-level issue.
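The severity rules above can be summarised in one hypothetical classifier (the severity for a completely missing favicon is an assumption; the text above only states the 404 cost):

```python
def favicon_issue(formats, has_large_png):
    """Classify favicon findings per the rules above.

    formats: set drawn from {"ico", "png", "svg"} of declared icons;
    has_large_png: a 192px+ PNG is present (e.g. for a PWA manifest).
    """
    if not formats:
        return "warning"        # no favicon at all (assumed severity)
    if formats == {"svg"}:
        return "warning"        # SVG-only, no ICO/PNG fallback
    if not has_large_png:
        return "info"           # fallback present but no large PNG
    return None                 # pass
```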
Site infrastructure
Infrastructure checks cover cross-cutting issues that affect the entire site:
• www / non-www redirect — both variants should not serve content simultaneously. One must redirect (301) to the other to avoid duplicate content. If both return HTTP 200, it is a warning. If the alternate variant returns 404 or a DNS error, it means only one variant exists (pass).
• Service pages noindex — /privacy, /contact, /terms should carry a noindex directive (via a meta robots tag or the X-Robots-Tag header) to keep them out of search results and avoid wasting crawl budget. A missing noindex is a warning.
• robots.txt sensitive paths — /api/, /admin/, /auth/ should be blocked with Disallow rules. Missing rules are info-level (−2 pts, not −10).
• Sitemap consistency — all URLs in /sitemap.xml must use the same domain as the site. Cross-domain URLs trigger a warning.
Note: if a resource is unreachable (e.g. sitemap returns 404), the corresponding check returns no result rather than a failure. This prevents false positives for sites that legitimately don't have a sitemap.
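The www / non-www rule reduces to comparing the two variants' responses. A sketch, where None stands in for a DNS failure:

```python
def www_redirect_issue(bare_status, www_status):
    """Audit the bare-domain and www-variant status codes.

    Returns "warning" when both variants serve content (duplicate
    content risk), and None when the pair passes: one variant
    redirects away, returns 404, or does not resolve at all.
    """
    if bare_status == 200 and www_status == 200:
        return "warning"
    return None
```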
Metrics
| Metric | Description |
|---|---|
| SSL valid | Whether the page's TLS certificate is valid and not expired. |
| SSL expiry | Certificate expiry date. A warning is raised if expiring within 30 days. |
| SSL issuer | Certificate issuing authority (e.g. Let's Encrypt, Cloudflare). |
| robots.txt found | Whether /robots.txt exists and returns HTTP 200. |
| robots.txt issues | Specific problems found in the robots.txt file. |
| Sitemap found | Whether an XML sitemap is found and contains URLs. |
| Sitemap URL count | Number of URLs listed in the sitemap. |
| Viewport meta | Whether a mobile viewport meta tag is present. Missing = critical. |
| Favicon | Whether a favicon is found and in what formats (ICO, PNG, SVG). |
| HTTP status | The HTTP response code returned by the page. |
| Redirect chain | List of redirects followed to reach the final URL. |
| Load time | Time in seconds for the page to respond (server + initial HTML). |
| www redirect | Whether www and non-www variants redirect properly to each other. |
| Service pages noindex | Whether /privacy, /contact, /terms have noindex. |
| Sensitive paths blocked | Whether robots.txt blocks /api/, /admin/, /auth/ paths. |
| Sitemap consistency | Whether all sitemap URLs use the same domain as the site. |
Related Topics
Indexability checks whether search engines are allowed and able to index the pag…
Performance measures how fast a page loads for real users, based on Google PageS…
Architecture analyses the URL structure, HTTP response headers (caching, securit…
Run a free SEO audit to see how your site performs in this category.