8 Steps to Perform a Technical SEO Audit for a Client’s Blog

Anna Kochegura

Nov 22, 2021 · 7 min read


In our most recent SEO Reality Show episodes, we took readers through the agency’s process for setting up and launching Edelweiss Bakery’s blog:

  1. Finding potential topics
  2. Preparing a content plan
  3. Choosing an impactful blog theme
  4. Creating technical instructions for third-party content writers tasked with bringing the content to life

In the meantime, the bakery’s web developers installed a blog theme on the bakery’s test domain to ensure that there were no technical errors before officially launching the blog on the bakery’s website. 

You may remember that earlier in the SEO Reality Show, the agency conducted a technical site audit of the bakery’s homepage. They’ll use a similar process here to ensure the blog performs well in Google search. 

The Challenge 

Technical audits almost always turn up issues that need to be resolved, and this process was no exception. The agency received a great deal of feedback from the client on how the bakery wanted their business to be represented on the blog and site. This is a crucial part of the process where the agency can build trust with their client by compromising where appropriate, and by explaining the importance of certain technical aspects that they feel will strongly benefit the bakery. The agency concluded that moving forward, they’d need to come up with meaningful solutions to client concerns. This would involve the following steps: 

  1. Coordinate with the client in stages to avoid lengthy delays in the site design process. This allows the client to give feedback before changes go live (when further changes would be difficult and time-consuming).
  2. If necessary, postpone non-critical edits until after the release of the live site. Businesses get busy and can't always give in-depth feedback to the agency on their timeline. Flexibility is key on both the part of the client and the agency.

The Process

Technical audits involve checking the bakery’s site for potential issues that could hurt Google’s ability to index the site and, in turn, prevent it from showing up in organic search results. 

During the audit, SEO experts check a number of things: 

  • The website’s main mirror setup
  • The robots.txt file
  • The presence of a correct sitemap.xml file
  • Mobile rendering
  • Page response codes
  • Canonical pages and their alternate versions
  • Page loading speed
  • Duplicate pages and broken links

Let’s take a look at how the agency conducted a technical blog audit for Edelweiss Bakery’s website.

Step 1 — Setting up the site’s main mirror

A site’s main mirror is the primary version of the site’s address that search engines should index (for example, the HTTPS version without www); every other version should 301-redirect to it. If the site is still on a test subdomain, there is no need to check the main mirror's configuration yet, because no SSL certificate is installed and no redirects are configured. 

When you’re setting up a site’s main mirror, remember to use https://httpstatus.io for verification. You must check the following combinations:

  • site.com with www and without www
  • site.com with HTTPS and HTTP (if you have an SSL or TLS certificate installed)
  • site.com/index.html
  • site.com/index.php
  • site.com/index.htm
  • site.com/home.html
  • site.com/home.php
  • site.com/home.htm

There may be additional options depending on your CMS. If the main mirror is not configured, ask the developer to set up 301 redirects from every other version to the main one.
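
If you want a rough idea of what those redirects look like, here is a minimal sketch for an Apache server using the hypothetical domain site.com (the actual rules depend on the hosting setup, so treat this as an illustration for the developer to adapt):

    # Send HTTP and www requests to the single https://site.com mirror
    RewriteEngine On
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} ^www\. [NC]
    RewriteRule ^(.*)$ https://site.com/$1 [R=301,L]

    # Send direct requests for /index.html, /index.php, or /index.htm to the root
    RewriteCond %{THE_REQUEST} \s/index\.(html?|php)[\s?] [NC]
    RewriteRule ^index\.(html?|php)$ / [R=301,L]

After adding the rules, rerun the URL combinations above through httpstatus.io to confirm that each one returns a single 301 redirect to the main mirror.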

Step 2 — Robots.txt compilation

Earlier, the agency worked on a new robots.txt file for the bakery's website. Because the blog page had been created on the main domain before a theme was chosen, the agency advised the client to keep the blog section closed to indexing. Now, after transferring the blog section to the main website, the agency opened it up for indexing.

When you’re creating your own blog, don’t forget to open it up to indexing after you transfer it to your main website so that you can take advantage of all the potential SEO benefits when you’re ready for it to go live. To do so, simply remove the Disallow: /blog* line, which you can see in the example below:
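
Here is a generic sketch of such a file (not the bakery’s actual robots.txt; the paths are placeholders), with the line to delete at launch marked by a comment:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
    Disallow: /blog*        # remove this line when the blog goes live

    Sitemap: https://site.com/sitemap.xml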

It is important not to forget to open the blog section for indexing!

You must prevent search engines from indexing test domains. In WordPress, you can do this in the “Reading” section of the CMS settings.


Step 3 — Checking the sitemap.xml file

An XML sitemap is not required for sites that have 1,000 pages or fewer and an excellent internal linking structure. Blogs on WordPress will have an XML sitemap created automatically. Check that your sitemap is free of errors and contains only the pages that need indexing. When our partner agency ran the technical audit for Edelweiss, the bakery had published three articles. No issues were found during the testing of the sitemap, so WordPress did its job well. 

The first level of the /sitemap.xml

When checking for errors, keep in mind that the sitemap can be indexed at different levels, so be sure to check all levels.

All published articles are in place.
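
For reference, a multi-level sitemap starts with a sitemap index file that points to second-level sitemaps, which in turn list the actual page URLs. A minimal sketch (file names are hypothetical; WordPress chooses its own) looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Each entry points to a second-level sitemap that lists the actual page URLs -->
      <sitemap>
        <loc>https://site.com/post-sitemap.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://site.com/page-sitemap.xml</loc>
      </sitemap>
    </sitemapindex>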

The agency used this validator to make sure that the sitemap was in good technical shape.

No errors detected

They also used the Semrush Site Audit tool, which checks for broken links, oversized XML sitemap files, and orphaned pages, each of which can cause its own problems. Fortunately, no issues were found, as you can see in the example below.

Validation of sitemap in Semrush Site Audit Tool

Step 4 — Mobile Rendering

Before launching the site, you always want to make sure the search robot perceives the site the same way visitors will, especially visitors on mobile devices.

The agency used Google's Mobile-Friendly Test tool for this purpose. When you’re assessing how your site renders, the agency recommends you pay close attention to the screenshot showing the Google Search Robot's view of your website. If it differs from your mobile device's view, you’ll need to identify and correct errors that prevent normal page rendering.

This happened in the test version of the Edelweiss blog. Although Google’s own tools indicated that the page was mobile-friendly, the rendered screenshot didn’t match the actual user experience. 

Make sure to check the rendered screenshot.

They determined that this was caused by a lack of Allow directives for CSS, JS, and image files. A robots.txt file was already in place, so it simply needed to be expanded with these directives. 

There are no Allow directives for CSS, JS files, and images.
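
A sketch of what the expanded rules might look like for a typical WordPress setup (the exact paths depend on the theme, so treat these as placeholders for the developer to adapt):

    User-agent: *
    # Let search robots fetch the assets they need to render pages correctly
    Allow: /wp-content/uploads/
    Allow: /wp-content/themes/
    Allow: /*.css$
    Allow: /*.js$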

Step 5 — Page Response Codes

Page response codes (also known as HTTP status codes) play an important role in the functionality of your site. When a user tries to access a page, the server receives and processes that request and returns a status code along with the response. Most of the time users won’t see the code, just a working page; sometimes, though, they’ll see a 404 error. 

This step of the audit is simple. You need to ensure that existing pages return a 200 status code and non-existent pages return a 404 status code. 

To check, use the Bulk URL HTTP Status Code, Header & Redirect Checker Tool, or a Chrome plugin like Link Redirect Trace.

After moving from the test domain to the live one, check the response codes of the site pages because the server settings may be different!
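
If you prefer to script this spot check, here is a minimal Python sketch using the requests library (the URLs are placeholders; swap in your own pages, including one that deliberately does not exist):

    import requests

    # URL -> expected status code; the last entry should intentionally not exist
    checks = {
        "https://site.com/": 200,
        "https://site.com/blog/": 200,
        "https://site.com/this-page-does-not-exist/": 404,
    }

    for url, expected in checks.items():
        response = requests.get(url, allow_redirects=False, timeout=10)
        label = "OK" if response.status_code == expected else "CHECK"
        print(f"{label}: {url} returned {response.status_code} (expected {expected})")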

Everything is configured correctly on the test subdomain.

Step 6 — Canonical tags and their alternate versions

Canonical tags point search robots to the preferred version of a specific page, which mitigates issues around duplicate content. While checking them, you should also look through the source code for alternate versions (rel="alternate" links) that point to non-existent pages.

You can check the source code directly or use any convenient Chrome plugin (the agency uses META SEO inspector).

During their audit, the agency discovered:

  • There were no canonical tags
  • The alternate tags contained incorrect feed links

This was something that the web developer would need to tackle. 
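
For reference, a correctly configured canonical tag sits in the page’s <head> and points to the preferred URL of that page. A minimal example with a placeholder address:

    <head>
      <!-- Tells search robots which URL is the preferred version of this page -->
      <link rel="canonical" href="https://site.com/blog/sample-post/" />
    </head>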

Checking the correct tag settings using the Chrome META SEO Inspector plugin

Step 7 — Page Loading Speed

Page loading speed is known to be important to the user experience, and it’s also a site ranking factor. The faster your site loads, the better. 

Use Google PageSpeed Insights to check loading speed for both the mobile and desktop versions of your site. Check the key landing pages, the category pages, and the blog's main page.

Checking PageSpeed Insights for the test version of the Edelweiss site

The agency discovered that the speed indicators for the page’s mobile and desktop versions were performing at an acceptable level, but that they could be better.

To improve them, they looked at the tips that Google recommended. 

The FCP (First Contentful Paint), LCP (Largest Contentful Paint), and TTI (Time to Interactive) metrics needed to be improved on the bakery's test website.

To improve these parameters on your own site, follow Google's recommendations, which include:

  • Eliminate render-blocking resources
  • Ensure text remains visible while web fonts load
  • Serve images in WebP format
Google’s advice is based on data and is often a great place to start. 
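
As a concrete illustration of the second and third tips in the list above (file names are placeholders), font-display: swap keeps text visible while a web font loads, and the picture element serves WebP with a fallback for older browsers:

    <style>
      /* Show fallback text immediately instead of waiting for the web font */
      @font-face {
        font-family: "Brand Sans";
        src: url("/fonts/brand-sans.woff2") format("woff2");
        font-display: swap;
      }
    </style>

    <!-- Serve WebP where supported, fall back to JPEG everywhere else -->
    <picture>
      <source srcset="/images/croissants.webp" type="image/webp">
      <img src="/images/croissants.jpg" alt="Freshly baked croissants" loading="lazy">
    </picture>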

Step 8 — Duplicate pages and broken links

Since the test site is closed to indexing, no online tool can check it for broken links at this stage. Normally, the best way to do this is via Google Search Console, which shows exactly the same errors that Google’s search robots will find.

But that would require opening the site for indexing, and it’s critical to identify and fix any errors before the site is officially launched.

To get around this, you can use a crawler that can bypass indexing restrictions. The agency used Semrush's Site Audit tool for this purpose during their audit, which showed that there were no linking issues on the test version of the Edelweiss blog.

Checking the site for duplicate pages
Checking the site for broken links

The agency strongly advises using Google Search Console to check the index for duplicate pages. If duplicates are found, they must be closed to indexing with a meta robots tag, or the source of the duplication must be identified and fixed.
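
Closing a duplicate via meta robots simply means adding a tag like this to the duplicate page’s <head> (a sketch; keep the original page indexable):

    <!-- Keeps this duplicate out of the index while still letting robots follow its links -->
    <meta name="robots" content="noindex, follow">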

Next Up

At this point, the agency has completed its technical SEO audit of the bakery’s new blog. They compiled all of their findings into a report for the bakery’s site developers to tackle. After the developers make those changes, the agency will do a final recheck before making the site live.

So what happens next? The agency is going to check the settings of the “Shop” section, the commercial part of the site with categorized product cards. It’s a vital part of the site for the business because it’s where the largest share of conversions happens. Stay tuned so you don’t miss out! 

Infographic: Blog Technical Audit in 8 Steps

Bringing over 5 years of marketing experience to Semrush’s awesome products in France. Now exploring new horizons of the digital world and shaping inspiring data-driven projects.