Technical SEO

Technical SEO covers the infrastructure checks that allow search engines to find, crawl, and index your page correctly.

Details

SSL / HTTPS

A valid SSL/TLS certificate allows the page to be served over HTTPS, which is a confirmed Google ranking signal. An invalid or expired certificate triggers browser security warnings, drives visitors away, and may eventually cause Google to drop the page from its index.
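As a sketch of how such a check might be implemented (function names are illustrative, not the audit's actual code), Python's standard library can fetch a host's certificate and compute its remaining lifetime:

```python
import socket
import ssl
from datetime import datetime, timezone

def fetch_not_after(host: str, port: int = 443) -> str:
    """Return the peer certificate's 'notAfter' field for a live host."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()["notAfter"]

def cert_days_remaining(not_after: str) -> float:
    """Days until expiry, parsing the OpenSSL-style date string
    returned by getpeercert(), e.g. 'Jun  1 12:00:00 2030 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).total_seconds() / 86400
```

A negative result means the certificate has already expired; a small positive value (say, under 14 days) is worth flagging as a warning before it becomes an error.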

robots.txt

The robots.txt file tells crawlers which URLs they may access. A missing file is treated as permission to crawl everything. A misconfigured file (e.g., 'Disallow: /' under 'User-agent: *') blocks all compliant crawlers and is a critical issue. The audit checks that the file exists, returns HTTP 200, and does not block all user-agents.
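The "blocks all user-agents" condition can be checked with the standard library's robots.txt parser; a minimal sketch (the helper name is illustrative) operating on the file's text:

```python
from urllib.robotparser import RobotFileParser

def blocks_everyone(robots_txt: str) -> bool:
    """True if this robots.txt body disallows the site root
    for all user-agents (i.e. 'Disallow: /' under 'User-agent: *')."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch("*", "/")
```

For example, `blocks_everyone("User-agent: *\nDisallow: /")` is true, while an empty `Disallow:` line (which allows everything) is not flagged.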

XML Sitemap

A sitemap tells Google which URLs exist on your site. It should be reachable at /sitemap.xml or declared in robots.txt. The audit checks that the sitemap is found and counts the number of URLs listed.
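Counting the listed URLs amounts to extracting every `<loc>` element from the sitemap's `urlset`. A minimal sketch using the standard XML parser (the function name is an assumption, not the audit's API):

```python
import xml.etree.ElementTree as ET

# Standard namespace from the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list[str]:
    """Return all <loc> values from an XML sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]
```

A real audit would also follow sitemap index files (`<sitemapindex>`), which nest further sitemaps rather than page URLs.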

Mobile / Viewport meta

The viewport meta tag (<meta name='viewport' content='width=device-width, initial-scale=1'>) tells mobile browsers how to scale the page. Missing it causes poor mobile rendering and can trigger a mobile-usability penalty in Google Search Console.
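Detecting the tag does not require a full browser; a tolerant HTML parse is enough. A sketch with the standard library (class and function names are illustrative):

```python
from html.parser import HTMLParser

class ViewportFinder(HTMLParser):
    """Record the content of the first <meta name="viewport"> tag."""
    def __init__(self):
        super().__init__()
        self.viewport = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "viewport":
            self.viewport = a.get("content", "")

def find_viewport(html: str):
    """Return the viewport meta content, or None if the tag is missing."""
    finder = ViewportFinder()
    finder.feed(html)
    return finder.viewport
```

Returning `None` (tag missing) is what the audit would flag; the content string itself can then be checked for `width=device-width`.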

HTTP status code

The page must return HTTP 200 OK. A 4xx (client error) or 5xx (server error) means the page cannot be indexed. Redirects (3xx) are noted but not penalised — Google follows redirect chains up to a limit.
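The three-way outcome described above can be expressed as a small classifier (the bucket names and the hop limit are assumptions for illustration; Google does not publish a contractual redirect limit):

```python
# Assumed cap on redirect chains a crawler will follow before giving up.
MAX_REDIRECT_HOPS = 10

def classify_status(code: int) -> str:
    """Bucket an HTTP status code the way the audit reports it."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"  # noted, not penalised
    return "error"         # 4xx/5xx: the page cannot be indexed
```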

Favicon

A favicon (favourite icon) is the small icon shown in browser tabs, bookmarks, and search results. If no favicon is provided, browsers still request /favicon.ico automatically, so every page load produces a wasted 404 response.

Best practice: provide both an ICO/PNG fallback and a high-resolution PNG (192×192 or 512×512) for PWA manifests. SVG-only favicons are not supported by all browsers.
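The format logic behind this recommendation can be sketched as follows (the helper names and report shape are illustrative, not the audit's actual output):

```python
def icon_format(href: str) -> str:
    """Guess a favicon's format from its file extension."""
    ext = href.rsplit(".", 1)[-1].lower()
    return {"ico": "ICO", "png": "PNG", "svg": "SVG"}.get(ext, "unknown")

def favicon_report(hrefs: list[str]) -> dict:
    """Summarise the declared favicon links on a page."""
    formats = {icon_format(h) for h in hrefs}
    return {
        "found": bool(hrefs),
        "formats": sorted(formats),
        # SVG-only favicons are not supported by all browsers,
        # so an ICO/PNG fallback should also be declared.
        "needs_fallback": formats == {"SVG"},
    }
```

An empty list here does not necessarily mean no icon: browsers fall back to requesting /favicon.ico, so a real check would also probe that path.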

Site infrastructure

Infrastructure checks cover cross-cutting issues that affect the entire site:

• www / non-www redirect — both variants (www.example.com and example.com) should not serve content simultaneously. One must redirect (301) to the other to avoid duplicate content.

• Service pages noindex — pages like /privacy, /contact, /terms should have a noindex tag to keep them out of search results and avoid diluting crawl budget.

• robots.txt coverage — sensitive paths (/api/, /admin/, /auth/) should be blocked with Disallow rules.

• Sitemap consistency — all URLs in the sitemap must use the same domain as the site itself.
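Two of these checks, the www variant and sitemap consistency, reduce to simple hostname logic. A sketch under assumed names (neither function is part of the audit's public API):

```python
from urllib.parse import urlsplit

def www_variant(host: str) -> str:
    """The sibling hostname that should 301-redirect to the canonical one."""
    return host[4:] if host.startswith("www.") else "www." + host

def hosts_consistent(site_url: str, sitemap_url_list: list[str]) -> bool:
    """True if every sitemap URL uses the same host as the site itself."""
    site_host = urlsplit(site_url).hostname
    return all(urlsplit(u).hostname == site_host for u in sitemap_url_list)
```

For the redirect check, the audit would fetch `www_variant(host)` and verify that it answers with a 301 pointing at the canonical host rather than serving its own 200.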

Metrics

• SSL valid — whether the page's TLS certificate is valid and not expired.

• robots.txt found — whether /robots.txt exists and returns HTTP 200.

• Sitemap found — whether an XML sitemap is found and contains URLs.

• Viewport meta — whether a mobile viewport meta tag is present.

• Favicon — whether a favicon is found, and in which formats (ICO, PNG, SVG).

• HTTP status — the HTTP response code returned by the page.

• Load time — time in seconds for the page to respond (server + initial HTML).

• www redirect — whether one of the www/non-www variants 301-redirects to the other.

• Service pages noindex — whether service pages (/privacy, /contact, /terms) have noindex.

• Sensitive paths blocked — whether robots.txt blocks the /api/, /admin/, /auth/ paths.

• Sitemap consistency — whether all sitemap URLs use the same domain as the site.

Related Topics

Indexability

Indexability checks whether search engines are allowed and able to index the page.

Performance

Performance measures how fast a page loads for real users, based on Google PageSpeed Insights.

Architecture

Architecture analyses the URL structure, HTTP response headers (caching, securit…