How to do a complete SEO audit?

Declining clicks on your website are traumatizing, and a flat line is gloomier still. They give us sleepless nights, and when we do sleep, the SEO monster chews on our minds.

So let's fix your website's health and your sleep at the same time. To fix it, we need to understand it. 

What is an SEO audit?

You’re publishing blog posts and telling all your subscribers to visit your website, but nobody can read them because your pages show up nowhere in the search results.

It means something (or a lot of things) is terribly wrong with your website’s health, making it far from SEO-friendly. 

Through an SEO audit, we will understand these issues and start busting the bugs. By the end of the audit, you’ll be able to analyse the following:

  1. Crawlability

  2. Indexing

  3. Loading Speed

  4. Domain Authority

  5. Page Authority

  6. Quality backlinks and much more! 

 

Audits help you to decide the action items and identify the underlying issues that degrade your ranking over time. 

P.S. You don’t necessarily need paid tools to conduct your SEO audit; however, it's always a cherry on top to have a paid tool by your side. (Not to mention that I am not fond of cherries). 

How to do an SEO audit step by step?

To stay neutral, we will refer to major free tools that you can use to perform this audit. Let's start with the first step: 

Step 1: Create an SEO audit checklist

This checklist covers the common issues that website managers usually need to address. Here is the checklist: 

  1. Study Organic Traffic Chart 

  2. Crawlability test through SEMrush

  3. Website Experience on Mobile and Desktop through Google Search Console

  4. Rectify Indexing issues through the Google Search Console

  5. User Activity through Google Analytics

  6. Keyword Gaps in Your Content

  7. Interlinking Gaps

  8. Loading speed issues and Image optimization

  9. Increase Domain Authority

  10. URL Structures 

  11. Header Tags and Meta Tags

Step 2: Set up the essential tools for the audit

It is neither essential nor advisable to use 10 different tools to analyse your website’s health. SEO is generally monitored through these tools: 

  1. Google Search Console

  2. Google Analytics

  3. SEMrush SEO dashboard

  4. Ahrefs Webmasters Tool

Most issues can be identified and fixed when you methodically map out and use these tools. If you want to explore more options, we have an insightful blog on the 10 Best Free SEO Audit Tools for 2025. 

Going back to the checklist, let's start ticking off each item one by one. 

Step 3: Study Organic Traffic Chart 

If your website is connected to Google Search Console, start by pinpointing the date when your website's traffic dropped off. Under the Performance chart, you can find the top queries that have been sending people to your website and see which pages have been declining lately. 

 

Step 4: Crawlability Test through SEMrush

Crawlability ensures search engines can navigate through your website to index its pages correctly. Performing a crawlability test helps you spot issues that may prevent your pages from appearing in search results.

  1. Set up SEMrush Site Audit. Enter your domain and configure the crawl settings to include all subdomains and HTTPS.

  2. Review crawl errors. Pay attention to broken links, server errors (5xx), or pages with redirects (3xx). Fix these issues promptly, as they directly impact your website’s user experience and search rankings.

  3. Check robots.txt and the sitemap. Ensure your robots.txt file isn’t blocking important pages (a sample robots.txt follows this list), and verify that your XML sitemap is up to date and submitted to Google Search Console.

  4. Spot duplicate content and missing metadata. SEMrush highlights duplicate content issues and missing title tags or meta descriptions, which can harm SEO. Rewrite or optimize duplicate content to avoid penalties.
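For reference, here is a minimal robots.txt sketch. The blocked paths and sitemap URL are hypothetical placeholders, so adapt them to your own site:

    # Allow all crawlers by default
    User-agent: *
    # Block non-essential sections (example paths)
    Disallow: /admin/
    Disallow: /cart/
    # Point crawlers at your XML sitemap (example URL)
    Sitemap: https://www.example.com/sitemap.xml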

Running a crawlability test is like performing a health checkup for your website’s architecture, ensuring search engines can navigate and index your site effectively.

Step 5: Website Experience on Mobile and Desktop through Google Search Console

By returning to Google Search Console, we can examine another important set of metrics that may be impacting your website’s overall performance: the Core Web Vitals report. 

This report surfaces several insights that Search Console auto-generates after crawling your website, and it also lists the common issues that website owners encounter. 

Here are the explanations for the issues flagged in Search Console that may be hampering your website’s visibility across search engines. 

5.1. CLS Issue: More than 0.25 (Mobile)

CLS (Cumulative Layout Shift) measures the visual stability of a webpage. A score higher than 0.25 is considered poor for mobile. This means that elements on the page (e.g., images, text, buttons) are moving unexpectedly as the page loads, leading to a frustrating user experience.

  • Error/Poor: CLS exceeds 0.25, indicating severe layout instability.

  • N/A/0: No CLS issues are detected.

  • Cause: unoptimized images, dynamically injected content, or poorly implemented fonts.

How to fix this issue:

  • Set explicit dimensions (width and height) on media elements, as shown below.

  • Avoid inserting content above existing elements.
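As an illustration, reserving the image’s space up front keeps the layout from jumping when it loads; the file name and dimensions below are hypothetical:

    <!-- Width and height let the browser reserve space before the image loads -->
    <img src="hero.jpg" width="1200" height="630" alt="Product hero image">

    <!-- CSS alternative: reserve the slot with aspect-ratio -->
    <style>
      .hero-slot { width: 100%; aspect-ratio: 1200 / 630; }
    </style>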

5.2. LCP Issue: Longer Than 4 Seconds (Mobile)

LCP (Largest Contentful Paint) measures the time it takes for the largest visible content element (e.g., hero image, main heading) to render. An LCP longer than 4 seconds is flagged as poor on mobile, indicating slow loading.

  • Warning: LCP is between 2.5 and 4 seconds, suggesting improvements are needed.

  • Error/Poor: LCP exceeds 4 seconds, significantly affecting user experience.

  • N/A/0: No LCP issues detected.

How to fix this issue:

  • Optimize images by compressing them and using modern formats (e.g., WebP); see the example below.

  • Minimize server response times and use a Content Delivery Network (CDN).
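For instance, serving the hero image in WebP and telling the browser to fetch it first can cut LCP noticeably; the file name here is a hypothetical placeholder:

    <!-- Preload the LCP image so the browser requests it early -->
    <link rel="preload" as="image" href="hero.webp">

    <!-- fetchpriority="high" marks this image as the most important fetch -->
    <img src="hero.webp" fetchpriority="high" width="1200" height="630" alt="Hero image">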

5.3. INP Issue: Longer Than 200 ms (Mobile)

INP (Interaction to Next Paint) measures the responsiveness of your webpage when users interact with it (e.g., clicking buttons or links). A delay longer than 200 milliseconds is flagged, indicating a sluggish response time on mobile.

  • Warning: INP falls between 200 and 500 milliseconds, signalling the need for improvement.

  • Error/Poor: INP exceeds 500 milliseconds, causing noticeable lags in user interactions.

  • N/A/0: No INP issues detected.

How to fix this issue:

  • Optimize JavaScript execution and remove unused scripts (see the snippet below).

  • Reduce third-party scripts that delay interactivity.
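Long JavaScript tasks are the usual INP culprit: the browser cannot respond to a tap while a task is running. One common mitigation is to yield to the event loop between chunks of work; a minimal sketch (the function and parameter names are made up):

    // Break a long task into chunks so the page can respond between them
    async function processInChunks(items, handle) {
      for (const item of items) {
        handle(item);
        // Yield to the event loop so user input can be handled promptly
        await new Promise((resolve) => setTimeout(resolve, 0));
      }
    }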

5.4. LCP Issue: Longer Than 2.5 Seconds (Mobile)

This is a less severe threshold for LCP. Pages with an LCP between 2.5 and 4 seconds are flagged as needing improvement. While not as critical as exceeding 4 seconds, it still indicates potential loading speed issues that can impact the user experience.

  • Warning: LCP is slightly delayed, between 2.5 and 4 seconds.

  • N/A/0: No issues detected.

How to fix this issue:

  • Implement lazy loading for images.

  • Preload key assets like fonts and large images.
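To spot-check these Core Web Vitals on any page, you can paste a small script into the browser’s DevTools console. This sketch uses the standard PerformanceObserver API to log LCP and CLS while you interact with the page:

    // Log the Largest Contentful Paint time in milliseconds
    new PerformanceObserver((list) => {
      const entries = list.getEntries();
      const latest = entries[entries.length - 1];
      console.log('LCP:', Math.round(latest.startTime), 'ms');
    }).observe({ type: 'largest-contentful-paint', buffered: true });

    // Accumulate Cumulative Layout Shift, ignoring shifts caused by user input
    let cls = 0;
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        if (!entry.hadRecentInput) cls += entry.value;
      }
      console.log('CLS so far:', cls.toFixed(3));
    }).observe({ type: 'layout-shift', buffered: true });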

Step 6: Rectify Indexing issues

If Google doesn’t even know that your web pages exist, how will people discover them? They’ll stay hidden under a rock until the end of time. 

To check your indexing status, go to Google Search Console and open Pages under the Indexing section. You’ll be presented with a chart of your website’s indexed and non-indexed pages. 

This report shows not only how many pages are dragging down your website’s visibility but also the reason each page is not indexed. Using the figures from an example report, let’s understand the common reasons behind these issues and the correct approach to fixing them.

1. Blocked by robots.txt

  • Issue: 80 pages are blocked by robots.txt. This file restricts search engines from crawling certain pages.

  • Impact: Search engines cannot crawl these pages, which might prevent important content from appearing in search results.

  • Fix: Review the robots.txt file to ensure only non-essential pages (e.g., admin pages, duplicate content) are blocked. Remove blocks for important pages that you want indexed.

2. Excluded by ‘noindex’ tag

  • Issue: 51 pages have a no-index tag, which instructs search engines not to index them.

  • Impact: These pages will not appear in search engine results.

  • Fix: Audit these pages to ensure they should remain excluded. Remove the noindex tag from any valuable content that should be discoverable.
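For reference, the noindex directive typically appears in one of two places:

    <!-- In the page’s <head> as a meta tag -->
    <meta name="robots" content="noindex">

    # Or sent by the server as an HTTP response header
    X-Robots-Tag: noindex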

3. Page with Redirect

  • Issue: 28 pages redirect elsewhere.

  • Impact: Excessive redirects can confuse search engines and users, leading to lower rankings.

  • Fix: Ensure the redirects are intentional and valid (e.g., for merging duplicate pages or site restructuring). Avoid redirect chains or loops.
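As a sketch, assuming an Nginx server and hypothetical paths, a clean single-hop 301 redirect looks like this:

    # Nginx: permanent (301) redirect from a retired URL to its replacement
    location = /old-page/ {
        return 301 /new-page/;
    }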

4. Not Found (404)

  • Issue: 12 pages return a 404 error, meaning they don’t exist.

  • Impact: Broken links harm the user experience and can negatively affect SEO.

  • Fix: Identify and fix or redirect broken links. If the page is no longer relevant, implement a 301 redirect to the most appropriate existing page.

5. Crawled - Currently Not Indexed

  • Issue: 177 pages have been crawled by Google but are not indexed.

  • Impact: These pages are not visible in search results despite being accessible to Google.

  • Fix: Check if the content on these pages is unique, valuable, and relevant. Optimize these pages with proper metadata, internal linking, and content improvements. Resubmit them for indexing via Google Search Console.

6. Server Error (5xx)

  • Issue: 8 pages encounter server errors.

  • Impact: Google cannot access these pages due to server-side issues, which can lower site authority.

  • Fix: Investigate server logs to identify the root cause (e.g., hosting issues, timeouts, configuration errors). Resolve the errors and test the pages to confirm accessibility.

7. Soft 404

  • Issue: 2 pages are flagged as soft 404s, meaning they return a normal (200 OK) status instead of a 404 but contain little or no meaningful content.

  • Impact: Search engines may consider these pages low-quality and exclude them from indexing.

  • Fix: Add relevant, unique content to these pages. If they’re no longer needed, redirect to a more appropriate page.

8. Duplicate, Google Chose a Different Canonical than the User

  • Issue: 5 pages have a canonical conflict where Google prefers a different canonical tag than specified.

  • Impact: This can dilute SEO efforts as Google may prioritize the wrong page.

  • Fix: Ensure canonical tags are correctly implemented and point to the preferred version. Align the canonical tags with internal linking structures and sitemap entries.
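For reference, a canonical tag lives in the page’s <head> and should point to the version you want indexed (the URL below is hypothetical):

    <link rel="canonical" href="https://www.example.com/blog/seo-audit/">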

9. Alternative Page with Proper Canonical Tag

  • Status: 34 pages passed with proper canonical tags.

  • Action: No issues here. Maintain this as a best practice.

 

Step 7: User Activity through Google Analytics

Make sure your website is connected to your Google Analytics account. After that, you will be able to monitor real-time user activity, which gives insight into major anomalies and gradually declining traffic. 
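If your site isn’t connected yet, the standard Google Analytics 4 tag goes into the <head> of every page; replace G-XXXXXXX with your own measurement ID:

    <!-- Google tag (gtag.js) -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'G-XXXXXXX');
    </script>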

Without this data, you would just be shooting in the dark, because the underlying reasons for your website’s diminishing health surface in these reports. 

Once you identify which pages and sections receive the most traffic, interlink from them to other relevant pages that need more traffic and discoverability. 

Step 8: Loading speed issues and Image optimization

Sometimes, almost everything is perfect in terms of content and on-page SEO, yet traffic keeps diminishing because of one underrated element: loading speed. Here are 15 major reasons why your website might be taking a lifetime to load (fixes for two of the most common culprits are sketched after the list). 

  1. Large image file sizes

  2. Too many HTTP requests

  3. Unoptimized code

  4. No browser caching

  5. Excessive redirects

  6. Poor server performance

  7. Lack of Content Delivery Network (CDN)

  8. Unoptimized JavaScript

  9. Too many plugins

  10. Unoptimized fonts

  11. Inefficient database queries

  12. Absence of Gzip compression

  13. Heavy use of external scripts

  14. Unresponsive design

  15. Excessive animations or multimedia
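Two of the cheapest wins on this list are items 4 (browser caching) and 12 (Gzip compression). A minimal sketch, assuming an Nginx server:

    # Nginx: compress text-based assets before sending them
    gzip on;
    gzip_types text/css application/javascript application/json image/svg+xml;

    # Nginx: let browsers cache static files for 30 days
    location ~* \.(css|js|png|jpg|webp|woff2)$ {
        expires 30d;
        add_header Cache-Control "public";
    }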

We will cover the remaining fixes for slow loading times in another blog, so keep an eye on our home page. Honestly, no user wants to wait more than three seconds for a website to load; they’ll just smash the back button and visit your competitor’s website instead. Fix this issue as soon as possible. 

Step 9: Increase Domain Authority (DA)

Domain authority is a metric that predicts how well your website will rank on search engines. It's influenced by factors like the quality and quantity of backlinks, overall site health, and your content's relevance.

A higher DA means your site is more likely to rank higher in search results, making it easier for users to find your content organically.

Issues to Work On:

  • A lack of backlinks from reputable websites can lower your DA.

  • Poor site performance (e.g., slow load times, errors) negatively impacts credibility.

  • Irrelevant or low-quality content reduces user engagement and backlink opportunities.

Step 10: Optimize URL Structures

URL structures refer to how your web page addresses are formatted and presented. Clean, logical URLs make it easier for users and search engines to understand your content.

Search engines prioritize user-friendly URLs because they provide clarity about a page's topic. A well-structured URL can also improve click-through rates.

Issues to Work On:

  • Long, complicated URLs with numbers or parameters can confuse users and search engines.

  • URLs lacking keywords miss opportunities to boost page relevance.

  • Inconsistent formats (e.g., mixing underscores and hyphens) can negatively impact SEO.
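To make the contrast concrete, compare a parameter-heavy URL with a clean, keyword-rich one (both hypothetical):

    Poor:   https://www.example.com/index.php?id=4821&cat=7_a
    Better: https://www.example.com/blog/seo-audit-checklist/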

Step 11: Optimize Header Tags and Meta Tags

Header tags (H1, H2, H3) are used to structure your content, while meta tags (meta titles, descriptions) describe your page to search engines and users.

Header tags improve readability and help search engines understand your content hierarchy. Meta tags, on the other hand, directly impact how your page appears in search engine results and can affect click-through rates.

Issues to Work On:

  • Missing or duplicate header tags confuse search engines about the content's structure.

  • Poorly written or missing meta titles and descriptions reduce your page's visibility and attractiveness in search results.

  • The lack of alt text for images misses an opportunity to rank for image searches and improve accessibility.
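Pulling these together, a minimal, well-tagged page head and heading structure might look like this (all names and text are placeholders):

    <head>
      <!-- Meta title: the clickable headline in search results -->
      <title>How to Do a Complete SEO Audit | Example Blog</title>
      <!-- Meta description: the snippet below the headline; influences click-through rate -->
      <meta name="description" content="A step-by-step SEO audit checklist covering crawlability, indexing, speed, and more.">
    </head>
    <body>
      <!-- Exactly one H1 per page, stating the main topic -->
      <h1>How to Do a Complete SEO Audit</h1>
      <!-- H2s break the content into scannable sections -->
      <h2>What Is an SEO Audit?</h2>
      <!-- Alt text helps accessibility and image search -->
      <img src="audit-chart.png" alt="Organic traffic chart showing a decline">
    </body>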

By focusing on these areas, you'll improve both your website's technical and content SEO, leading to better rankings and user engagement.

If you want to learn more about improving SEO and get hands-on marketing tactics, you can explore more blogs on humann. Leave any remaining questions in the comment box, and we will get back to you.