A practical, tool-by-tool guide for marketers, SEOs, and site owners who want to diagnose what is limiting their organic search performance — and fix it in priority order.
Most websites have more SEO
problems than their owners realise — and most SEO audits surface more issues
than the team has time to fix. That combination produces a familiar outcome: an
audit report with hundreds of flagged items, no clear prioritisation, and a
team that does not know where to start. Three months later, the same problems
persist.
An SEO audit is only useful if
it leads to action. And action requires knowing which issues actually affect
organic performance, which tools surface those issues most reliably, and in
what order the fixes should be applied. A crawl error on an orphaned page
matters less than a site speed problem affecting every URL. A missing meta
description matters less than a canonical tag conflict fragmenting your ranking
signals.
This guide covers how to perform
an SEO audit with tools — not as a checklist exercise, but as a diagnostic
process that produces a prioritised fix list. You will learn which tools to use
for each audit category, what to look for inside them, and how to translate
audit findings into decisions that move organic performance in a measurable
direction.
📌 Key Takeaways
• An SEO audit covers five domains: technical health, on-page optimisation, content quality, backlink profile, and Core Web Vitals.
• Google Search Console and Google Analytics are non-negotiable starting points — no paid tool replaces them for first-party performance data.
• Audit findings should be triaged by impact on organic performance, not by the volume of issues a tool reports.
• Screaming Frog, Ahrefs, and SEMrush cover the majority of technical and competitive audit requirements for most sites.
• AI is changing SEO auditing — surfacing patterns and recommendations that manual analysis would take hours to identify — but human interpretation of findings remains essential.
What an SEO Audit Actually Covers
An SEO audit is a systematic
evaluation of a website's ability to rank in organic search — identifying the
technical, on-page, content, and authority factors that are limiting
performance or creating risk. It is not a single scan. It is a multi-layered
diagnostic process that spans five distinct domains, each requiring specific
tools and specific expertise to interpret correctly.
| Audit Domain | What It Examines | Primary Tool(s) | Impact Level |
| --- | --- | --- | --- |
| Technical SEO | Crawlability, indexation, site structure, redirects, canonicals | Screaming Frog, GSC, Ahrefs | High — foundational |
| On-Page Optimisation | Title tags, meta descriptions, heading structure, keyword targeting | Screaming Frog, SurferSEO, SEMrush | Medium — page-level |
| Content Quality | Thin content, duplication, topical coverage gaps, keyword cannibalisation | Ahrefs, SEMrush, Google Search Console | High — traffic-determining |
| Backlink Profile | Link quality, toxic links, anchor text distribution, competitor gap | Ahrefs, Majestic, Google Search Console | Medium-High — authority |
| Core Web Vitals | LCP, INP, CLS — loading, interactivity, visual stability | PageSpeed Insights, Google Search Console, GTmetrix | High — ranking factor |
Each domain requires a different
toolset and produces different types of findings. The mistake most teams make
is treating an SEO audit as a single-tool exercise — running Screaming Frog and
calling it done, or pulling a site audit report from SEMrush and treating its
error count as a performance score. A complete audit requires intentional
coverage of all five domains, with findings from each synthesised into a
prioritised action plan.
Step 1 — Start with Google Search Console and Google Analytics
Before running any third-party
tool, extract first-party data from Google's own platforms. No crawler or SEO
suite has access to the performance data that Google Search Console provides —
actual impressions, clicks, average position, and click-through rate at the URL
and query level. This data tells you what Google already thinks about your site
before you start diagnosing why.
What to Pull from Google Search Console
• Performance report: Filter by page to identify which URLs are generating impressions but not clicks — a signal of title tag and meta description problems, or ranking positions too low to attract traffic (a scripted way to pull this data follows this list).
• Coverage report: Identify URLs that are excluded from the index (not the same as crawl errors), URLs with 'Discovered — currently not indexed' status (a crawl budget signal), and any 'Valid with warning' pages.
• Core Web Vitals report: Check the proportion of URLs in 'Poor' status for both mobile and desktop — this is your CWV remediation priority list.
• Manual actions and security issues: Check both. A manual action is a direct ranking penalty. Security issues (malware, hacked content) cause immediate deindexation risk.
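If you would rather pull this data programmatically than through the interface, the Search Console API exposes the same Performance report. The following is a minimal sketch, assuming a Google Cloud service account that has been added as a user on the property and the google-api-python-client library; the property URL, date range, and thresholds are placeholders to adapt.

```python
# Minimal sketch: page-level impressions and clicks from the Search Console API.
# Assumes a service-account JSON key with access to the GSC property (placeholder paths/URLs).
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
gsc = build("searchconsole", "v1", credentials=creds)

response = gsc.searchanalytics().query(
    siteUrl="https://www.example.com/",  # your verified property
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-03-31",
        "dimensions": ["page"],
        "rowLimit": 5000,
    },
).execute()

# Flag pages with plenty of impressions but a weak click-through rate
# (the 1,000-impression / 1% CTR thresholds are illustrative, not rules).
for row in response.get("rows", []):
    page = row["keys"][0]
    impressions, clicks, ctr = row["impressions"], row["clicks"], row["ctr"]
    if impressions > 1000 and ctr < 0.01:
        print(f"{page}: {impressions} impressions, {clicks} clicks ({ctr:.2%} CTR)")
```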
What to Pull from Google Analytics 4
• Organic traffic trend over the past 12 months: Identify whether traffic is growing, flat, or declining — and whether any drops correlate with known Google algorithm update dates.
• Landing page performance by organic channel: Which pages receive organic traffic, and what is their engagement rate and conversion performance? High-traffic pages with poor engagement are content quality candidates (see the sketch after this list).
• Geographic and device breakdown: Mobile performance issues often appear in GA4 before they surface in Search Console.
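The landing-page report can also be pulled without manual exports via the GA4 Data API. A minimal sketch, assuming the google-analytics-data client library and application default credentials; the property ID and the engagement thresholds are illustrative.

```python
# Minimal sketch: organic landing-page engagement via the GA4 Data API
# (placeholder property ID; assumes application default credentials are configured).
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Filter, FilterExpression, Metric, RunReportRequest
)

client = BetaAnalyticsDataClient()
request = RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    dimensions=[Dimension(name="landingPage")],
    metrics=[Metric(name="sessions"), Metric(name="engagementRate")],
    date_ranges=[DateRange(start_date="365daysAgo", end_date="today")],
    dimension_filter=FilterExpression(
        filter=Filter(
            field_name="sessionDefaultChannelGroup",
            string_filter=Filter.StringFilter(value="Organic Search"),
        )
    ),
)

# High-traffic landing pages with weak engagement are content-quality candidates.
for row in client.run_report(request).rows:
    page = row.dimension_values[0].value
    sessions = int(row.metric_values[0].value)
    engagement = float(row.metric_values[1].value)
    if sessions > 500 and engagement < 0.4:
        print(f"{page}: {sessions} organic sessions, {engagement:.0%} engagement rate")
```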
Step 2 — Run a Technical Crawl with Screaming Frog
Screaming Frog SEO Spider is the
industry standard for technical crawling — not because it is the only option,
but because it provides the most granular, configurable, and reliable crawl
data available outside of enterprise platforms. The free version crawls up to
500 URLs; the paid version (approximately £250 per year) is effectively
mandatory for any site above that threshold.
Configure the crawl before
running it. Default settings will crawl everything, including URLs you do not
need audited. Set the following before crawling:
1. Under Configuration → Spider, disable crawling of URLs with parameters if your site has significant parameterised URLs that should not be indexed.
2. Enable 'Crawl linked XML Sitemaps' to ensure all sitemap-listed URLs are included even if they are not internally linked.
3. Connect Screaming Frog to Google Search Console and Google Analytics under Configuration → API Access. This overlays GSC and GA data directly onto the crawl results.
Critical Issues to Identify in the Crawl
• Broken internal links (4xx errors): Internal links pointing to 404 or 410 pages waste crawl budget and create poor user experience. Export and fix (a sketch for triaging these and the issues below from a crawl export follows this list).
• Redirect chains and loops: A URL that redirects through three hops before reaching its destination loses link equity at each hop. Redirect chains of more than two hops should be collapsed to direct redirects.
• Duplicate title tags and meta descriptions: Export from the Page Titles and Meta Description tabs filtered by 'Duplicate.' Duplicates either signal content duplication or inadequate metadata management.
• Missing canonical tags or conflicting canonicals: Pages that are accessible via multiple URL variants (with and without trailing slash, www vs non-www, HTTP vs HTTPS) without a canonical tag are consolidating zero ranking signals. Every indexable page needs a self-referencing canonical.
• Orphaned pages: URLs that appear in the sitemap but have no internal links pointing to them cannot be discovered by crawlers through the link graph. They rely entirely on being in the sitemap — a single point of failure.
• Pages blocked by robots.txt that should be indexed: Check the Directives tab for URLs with 'Noindex' or 'Blocked by robots.txt' that should be ranking. This is one of the most common causes of sudden traffic drops.
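Each of these checks can be scripted against Screaming Frog's bulk exports, which matters once a crawl runs into thousands of URLs. A minimal sketch using pandas, assuming a CSV export of the Internal tab; the column names follow Screaming Frog's default headers and may differ between versions.

```python
# Minimal sketch: triage a Screaming Frog Internal tab export (CSV).
# Column names assume Screaming Frog's default export headers and may vary by version.
import pandas as pd

crawl = pd.read_csv("internal_all.csv")

# 4xx responses reached through internal links
broken = crawl[crawl["Status Code"].between(400, 499)]

# Duplicate title tags across indexable URLs
indexable = crawl[crawl["Indexability"] == "Indexable"]
dupe_titles = indexable[indexable.duplicated(subset="Title 1", keep=False)]

# Indexable pages whose canonical does not reference the page itself
missing_canonical = indexable[
    indexable["Canonical Link Element 1"].fillna("") != indexable["Address"]
]

print(f"Broken URLs: {len(broken)}")
print(f"URLs sharing a title tag: {len(dupe_titles)}")
print(f"Indexable URLs without a self-referencing canonical: {len(missing_canonical)}")
```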
Step 3 — Audit Content Quality and Keyword Cannibalisation
Technical SEO establishes
whether Google can access and index your content. Content quality determines
whether Google considers that content worth ranking. The two most common
content issues that suppress organic performance — and are frequently
overlooked in tool-led audits — are thin content and keyword cannibalisation.
Identifying Thin Content
In Screaming Frog, filter the crawl
by word count (available in the Content tab) and export all pages under 300
words that are set to index. Each of these is a candidate for either expansion,
consolidation into a more comprehensive page, or noindexing. The presence of
low-quality, thin content across a domain is a site-wide quality signal to
Google — not just a page-level issue. Sites with a high proportion of thin
indexed pages consistently underperform their technically equivalent
competitors.
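The same crawl export supports the thin-content filter directly. A minimal sketch, again assuming Screaming Frog's default column names; the 300-word cutoff is the heuristic described above, not a fixed rule.

```python
# Minimal sketch: list indexable pages under 300 words from a Screaming Frog export.
import pandas as pd

crawl = pd.read_csv("internal_all.csv")
thin = crawl[
    (crawl["Indexability"] == "Indexable") & (crawl["Word Count"] < 300)
].sort_values("Word Count")

# Each of these URLs is a candidate for expansion, consolidation, or noindexing.
print(thin[["Address", "Word Count"]].to_string(index=False))
```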
Identifying Keyword Cannibalisation
Keyword cannibalisation occurs
when multiple pages on your site target the same query — causing Google to
split ranking signals between them rather than concentrating authority on one.
The result is that neither page ranks as well as one consolidated, authoritative
page would.
In Google Search Console, export
the Performance report filtered to a specific keyword. If two or more URLs
appear in the 'Pages' breakdown for the same query, you have a cannibalisation
candidate. In Ahrefs or SEMrush, use the Site Explorer to identify URLs
competing for the same organic keywords. Both platforms have cannibalisation
detection features that automate this analysis at scale.
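To run this check across every query rather than one keyword at a time, a query-plus-page export (for example, pulled via the Search Console API) reduces the task to a simple aggregation. A minimal sketch; the column names are assumptions about how the export is structured.

```python
# Minimal sketch: flag queries where multiple URLs earn impressions.
# Assumes a query+page level export with the columns: query, page, clicks, impressions.
import pandas as pd

rows = pd.read_csv("gsc_query_page.csv")

per_query = rows.groupby("query").agg(
    pages=("page", "nunique"),
    impressions=("impressions", "sum"),
)

# Queries served by two or more URLs are cannibalisation candidates;
# review the highest-impression ones first.
candidates = per_query[per_query["pages"] > 1].sort_values("impressions", ascending=False)
print(candidates.head(20))
```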
Step 4 — Analyse the Backlink Profile
Backlink analysis in an SEO
audit has two goals: understanding your current authority position relative to
competitors, and identifying link-related risks that could be suppressing
performance or creating future penalty exposure.
Authority Benchmarking with Ahrefs or SEMrush
Pull your domain's Domain Rating (Ahrefs), Authority Score (SEMrush), or Domain Authority (Moz) and compare it to the three to five competitors ranking above you for your primary target keywords. If your content and technical health are comparable and you are still outranked, the authority gap is often the explanation — and closing it requires a link acquisition strategy, not more on-page optimisation.
Identifying Toxic or Low-Quality Links
In Ahrefs Site Explorer, filter
your backlink profile by 'DR 0-10' (very low authority referring domains) and
review the anchor text distribution. An unnatural concentration of exact-match
anchor text from low-authority domains is a pattern Google's SpamBrain
algorithm is specifically trained to identify. Use Google Search Console's
Links report to cross-reference — GSC shows which linking domains Google has
actually crawled and is aware of.
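To quantify that pattern rather than eyeball it, the anchor-text distribution of low-DR referring links can be summarised from an Ahrefs backlink export. A minimal sketch; the column names ('Domain rating', 'Anchor') are assumptions based on a typical export and may differ.

```python
# Minimal sketch: anchor-text concentration among low-authority referring links.
# Column names are assumptions based on a typical Ahrefs backlink export.
import pandas as pd

links = pd.read_csv("ahrefs_backlinks.csv")
low_dr = links[links["Domain rating"] <= 10]

# Share of low-DR links carried by each anchor text; a heavy concentration of
# exact-match commercial anchors here is the pattern worth investigating.
anchor_share = low_dr["Anchor"].value_counts(normalize=True).head(10)
print(anchor_share.round(3))
```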
The disavow file is not a
routine audit output. It is a last resort for sites with documented, actionable
link spam that has survived manual outreach attempts. Most sites with healthy
backlink profiles from legitimate sources have no need to disavow. Do not
disavow links speculatively — incorrect disavowal removes legitimate link
equity.
Step 5 — Measure and Diagnose Core Web Vitals
Core Web Vitals became a
confirmed Google ranking factor in 2021 and were updated with the replacement
of FID by INP (Interaction to Next Paint) in March 2024. The three metrics —
Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative
Layout Shift (CLS) — measure loading speed, interactivity, and visual stability
respectively.
| Metric | What It Measures | Good Threshold | Common Causes of Failure |
| --- | --- | --- | --- |
| LCP (Largest Contentful Paint) | Time for the largest visible element to load | Under 2.5 seconds | Unoptimised hero images, slow server response, render-blocking resources |
| INP (Interaction to Next Paint) | Response time to user interactions (replaced FID) | Under 200ms | Heavy JavaScript execution, third-party scripts, inefficient event handlers |
| CLS (Cumulative Layout Shift) | Visual stability — elements shifting during load | Under 0.1 | Images without dimensions, late-loading ads, dynamic content insertion |
Use Google Search Console's Core
Web Vitals report for field data — real user measurements from Chrome users on
your site. Use PageSpeed Insights for lab data — a controlled test environment
that shows you exactly what is causing failures and which resources to address
first. GTmetrix provides additional waterfall analysis that helps identify
which specific assets are causing LCP delays.
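PageSpeed Insights also exposes its data through a free API, which makes it practical to check a batch of priority URLs rather than testing them one at a time. A minimal sketch using the v5 endpoint; the field-data metric keys shown are assumptions to verify against a live response, and an API key is only needed beyond light usage.

```python
# Minimal sketch: pull CrUX field data for a batch of URLs from the PageSpeed Insights API.
# Metric keys are assumptions based on the current v5 response shape; verify against a live response.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
URLS = [
    "https://www.example.com/",          # placeholders: your top organic landing pages
    "https://www.example.com/pricing/",
]

for url in URLS:
    data = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": "mobile"}).json()
    field = data.get("loadingExperience", {}).get("metrics", {})
    for key in (
        "LARGEST_CONTENTFUL_PAINT_MS",
        "INTERACTION_TO_NEXT_PAINT",
        "CUMULATIVE_LAYOUT_SHIFT_SCORE",
    ):
        metric = field.get(key, {})
        print(url, key, metric.get("percentile"), metric.get("category"))
```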
The critical distinction: fix
CWV issues on your highest-traffic pages first. A CWV failure on a page with 50
monthly visitors has negligible ranking impact. The same failure on your top 20
landing pages directly affects the organic performance of your most
commercially important URLs.
Expert Insight: How AI Is Changing SEO Auditing
The traditional SEO audit
process is labour-intensive: crawl the site, export data, filter in
spreadsheets, cross-reference across tools, manually identify patterns, and
produce a report. That process is being materially accelerated by AI features
now embedded in the major SEO platforms.
SEMrush's Copilot feature uses
AI to analyse your site's performance data and surface prioritised
recommendations — identifying which issues from a site audit are most likely to
be affecting organic performance rather than simply listing everything the
crawler found. Ahrefs has introduced AI-generated content gap analysis and is
integrating AI into its keyword research workflow. Google itself has begun
incorporating AI-driven search features (AI Overviews) that are reshaping which
queries return traditional blue-link results and which return AI-generated
summaries — a structural change with significant implications for how SEO
audits should interpret organic traffic patterns.
The most consequential AI
application in SEO auditing right now is automated pattern recognition across
large crawl datasets. A 50,000-URL site generates crawl data that is genuinely
difficult to analyse manually. AI tools embedded in platforms like Lumar (formerly
DeepCrawl) and ContentKing can identify systemic patterns — a category of page
where canonical implementation is consistently incorrect, a template-level
title tag problem affecting hundreds of URLs — that manual review would take
days to find.
What AI does not replace is the
interpretation of findings in the context of your specific site, audience, and
competitive landscape. An AI audit tool can tell you that 340 pages have
duplicate title tags. It cannot tell you whether those pages should be consolidated,
rewritten, or redirected — that decision requires understanding the content
strategy, the user intent behind each page, and the competitive dynamics of the
keywords being targeted. The audit tools get faster and smarter. The judgment
required to act on what they surface remains human.
Frequently Asked Questions
What is an SEO audit?
An SEO audit is a systematic
evaluation of a website's technical health, on-page optimisation, content
quality, backlink profile, and Core Web Vitals performance — identifying the
factors that are limiting organic search visibility and ranking potential. The
goal of an SEO audit is not to produce a list of issues but to produce a
prioritised set of actions that, when implemented, measurably improve organic
search performance.
How long does an SEO audit take?
A basic technical SEO audit for
a site under 1,000 URLs can be completed in three to five hours using Screaming
Frog and Google Search Console. A comprehensive audit covering all five domains
— technical, on-page, content, backlinks, and Core Web Vitals — for a site of
5,000 to 10,000 URLs typically takes two to three days for an experienced SEO.
Enterprise sites with hundreds of thousands of URLs require dedicated tooling
(Lumar, ContentKing, Botify) and significantly more time.
What tools do I need to perform an SEO audit?
The minimum viable SEO audit
toolkit consists of: Google Search Console (indexation, performance, CWV field
data — free), Google Analytics 4 (traffic trends and engagement — free),
Screaming Frog SEO Spider (technical crawl — free up to 500 URLs, approximately
£250/year for the full version), and PageSpeed Insights (Core Web Vitals lab
data — free). For backlink analysis and competitive benchmarking, Ahrefs or
SEMrush are the industry standards. Most professional SEO audits use all of the
above.
What is the difference between Screaming Frog and SEMrush for SEO audits?
Screaming Frog is a desktop
crawler that gives you granular, configurable technical crawl data — it is the
most precise tool for identifying redirect chains, canonical issues, broken
links, and crawl directives at the URL level. SEMrush's Site Audit is a
cloud-based crawler that provides a higher-level health score and categorised
issue list, integrates with its keyword and backlink data, and is better for
ongoing monitoring across multiple projects. Most professional SEOs use both:
Screaming Frog for deep technical investigation and SEMrush for ongoing site
health tracking and competitive context.
How often should I perform an SEO audit?
A full SEO audit — covering all
five domains — should be conducted at minimum annually, and ideally every six
months for sites in competitive verticals or undergoing active development.
Technical crawls should run monthly on any site that publishes new content
regularly, since new pages introduce new potential issues with every
deployment. Google Search Console should be reviewed weekly — not as a formal
audit, but as an operational monitoring practice that catches indexation
problems, coverage errors, and Core Web Vitals regressions before they
compound.
What should I fix first after an SEO audit?
Prioritise by impact on organic
performance, not by issue volume. The hierarchy for most sites: first, fix
anything blocking Google from crawling or indexing key pages (robots.txt
exclusions, noindex tags on indexable pages, canonical conflicts on high-value
URLs); second, resolve Core Web Vitals failures on high-traffic pages; third,
address keyword cannibalisation on pages competing for commercially important queries;
fourth, fix broken internal links and redirect chains; and fifth, address
on-page issues like missing or duplicate title tags. Low word count pages and
backlink toxicity are typically lower priority unless you have specific signals
(manual actions, traffic drops correlating with link pattern changes) that
suggest otherwise.
Conclusion: Auditing as a Continuous Practice, Not an Annual Event
The value of an SEO audit is not
in the document it produces. It is in the changes that document drives. A thorough
audit that surfaces 200 issues and results in three fixes has less impact than
a focused audit that surfaces twelve prioritised problems and drives all twelve
to resolution within a sprint cycle.
The most effective SEO teams
treat auditing not as an annual project but as a continuous operational
practice. They use Google Search Console as a weekly monitoring tool. They run
Screaming Frog after every significant site deployment. They track Core Web
Vitals in the field, not just in lab tests. They review content performance
quarterly and consolidate or update underperforming pages before they
accumulate into a larger structural problem.
The direction of the industry
adds urgency to this discipline. Google's AI Overviews are changing the organic
search landscape in ways that are not yet fully understood — some query
categories are seeing significant CTR changes as AI-generated summaries absorb
clicks that previously went to organic results. Understanding which of your
pages serve queries where AI Overviews appear, and whether your content is
being cited within them, is becoming a new dimension of SEO auditing that did
not exist two years ago.
Audit regularly, prioritise ruthlessly, fix systematically, and measure the outcome. That cycle — more than any single tool or technique — is what compounds into durable organic search performance.