A Guide to Technical SEO for eCommerce Stores
eCommerce is a cutthroat world, and subpar website performance isn’t going to cut it. It’s time to get serious about technical SEO, the unsung hero of any successful eCommerce SEO strategy. When research shows that 76% of consumers will go to a competitor’s site if they encounter issues with a website’s performance, technical SEO can put you a cut above the rest. In this comprehensive guide, you will learn what it entails, how to audit the technical aspects of SEO, and every facet of this unglamorous underbelly of website maintenance.
Technical SEO is the backbone of your website’s performance, ensuring that search engines can easily crawl, index and understand your content. It involves optimising various aspects of your site, from site speed and mobile-friendliness to connection security and structured data. For more information, consult our technical SEO services.
A technical SEO audit assesses a website’s technical health to ensure it can be crawled, indexed and ranked. This involves numerous smaller subsets of checks, including reviewing robots.txt files, XML sitemaps and broken links. This may sound like a foreign language now, but we expand on each of these and more further down the guide.
Regular audits help to maintain usability, performance and search rankings. For more information, read our guide to website audits to uncover your site’s true performance.
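To illustrate the kind of checks an audit automates, here is a minimal sketch (Python standard library only, with a placeholder domain) that requests a site’s homepage, robots.txt and XML sitemap and reports their HTTP status codes — nothing more than a starting point.

```python
# Minimal technical-audit sketch: checks that the basics respond correctly.
# "example-store.com" is a placeholder; swap in your own domain.
import urllib.error
import urllib.request

SITE = "https://www.example-store.com"

def status_of(url: str) -> int:
    """Return the HTTP status code for a URL (or the error code if it fails)."""
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code

for path in ("/", "/robots.txt", "/sitemap.xml"):
    print(f"{SITE}{path} -> {status_of(SITE + path)}")
```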
Discoverability is a search engine’s ability to find your content in the first place. Search engines discover new content by following links on pages they already know about, both internal links and backlinks. For Google, you can jumpstart this process yourself by submitting an index request through Google Search Console or its Indexing API, or via IndexNow for Bing. Key components such as a sitemap and internal linking play significant roles in this process.
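As a rough illustration of the IndexNow route, the sketch below submits a changed URL to the shared api.indexnow.org endpoint. The domain, key and key file location are placeholders; a real key has to be generated and hosted on your own domain as described in the IndexNow documentation.

```python
# Sketch: notify IndexNow-compatible engines (e.g. Bing) that a URL has changed.
# The key and URLs are placeholders; a real key must also be hosted as a .txt
# file on your domain, per the IndexNow documentation.
import json
import urllib.request

payload = {
    "host": "www.example-store.com",
    "key": "your-indexnow-key",
    "keyLocation": "https://www.example-store.com/your-indexnow-key.txt",
    "urlList": ["https://www.example-store.com/new-product-page"],
}

request = urllib.request.Request(
    "https://api.indexnow.org/indexnow",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json; charset=utf-8"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.status)  # a 2xx response indicates the submission was accepted
```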
Crawlability is the ease with which search engines can navigate your website once it has found content on it. Aspects like site architecture, robots.txt files, canonical tags, and addressing broken links and duplicate content all help to improve crawlability.
The final stage of improving your rankings from a technical standpoint, indexability is about getting your pages to show in search results. If a page isn’t indexed, it cannot appear in search engine results pages (SERPs), regardless of its relevance or quality. Focussing on structured data, optimising page load speed and addressing thin content will all aid your site’s indexability.
Google’s Core Web Vitals are the yardstick that determines the quality of a user experience from a technical standpoint. By regularly monitoring your website’s Core Web Vitals through Google’s PageSpeed Insights and responding to what they tell you, your site will float up Google’s search rankings, and likely those of other search engines too.
Core Web Vitals is split into three categories:
LCP is a stopwatch measuring how well your site loads. But here’s the kicker: it’s not about which element loads quickest, but how quickly the largest one appears. As the name suggests, LCP times how long it takes for the largest content element, usually a text or image block, to become visible. As a rule of thumb, aim for an LCP of 2.5 seconds or less; ways to speed it up include compressing and properly sizing images, reducing server response times and serving assets through a CDN.
INP times how responsive your page is when a user interacts with it, such as clicking a button or tapping a menu item. Your page needs razor-sharp reflexes here, as Google views 200 milliseconds or less as a good INP score. You can achieve this by minimising and deferring heavy JavaScript, breaking long tasks into smaller chunks and keeping event handlers lightweight.
Have you ever noticed the stuttering that can take place as you enter a site? These unexpected layout shifts are the mark of a poorly optimised website, worse still if they occur later in the lifespan of the page. CLS measures this with a score, and you should aim for 0.1 or less. Strategies for getting there include setting explicit width and height attributes on images and videos, reserving space for ads and embeds, and avoiding inserting new content above what has already loaded.
Read more about Google’s Core Web Vitals from the source here.
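If you prefer to pull these numbers programmatically rather than through the PageSpeed Insights web page, the same field data is exposed via an API. The sketch below assumes the v5 runPagespeed endpoint and a placeholder URL, and simply prints whatever Core Web Vitals field metrics Google returns rather than hard-coding their names.

```python
# Sketch: fetch field Core Web Vitals data from the PageSpeed Insights API (v5).
# The page URL is a placeholder; an API key is recommended for regular monitoring.
import json
import urllib.parse
import urllib.request

page = "https://www.example-store.com/"
endpoint = (
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
    + urllib.parse.urlencode({"url": page, "strategy": "mobile"})
)

with urllib.request.urlopen(endpoint, timeout=60) as response:
    report = json.load(response)

# "loadingExperience" holds real-user field data when Google has enough of it.
for name, metric in report.get("loadingExperience", {}).get("metrics", {}).items():
    print(name, metric.get("percentile"), metric.get("category"))
```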
58% of searches take place on mobile devices today, so it’s imperative that your site performs smoothly on mobile.
Since 2020, Google has been indexing webpages mobile-first, meaning the mobile version of your website is considered the primary version. As mobile devices are generally less powerful than desktops, this presents further challenges when it comes to load speeds. Nonetheless, it is essential, and there are plenty of strategies to get you up to (mobile) speed: use a responsive design, compress images, minimise render-blocking scripts and test regularly on real devices.
Find out more about improving the mobile experience of your website from a design innovation perspective in our blog on shaking up your website design.
If you’re going to build a grand piece of web architecture that customers will flock to, you need a solid blueprint. Sitemaps act as a roadmap for search engines and users, laying out the structure and hierarchy of your website in one directory. They are instrumental in ensuring that all significant pages are discovered, crawled and indexed.
There are two types of sitemap, both of which are equally important: XML sitemaps, which are written for search engine crawlers, and HTML sitemaps, which help human visitors navigate the site.
Like the Sagrada Familia, which has been under construction for over 140 years, websites are evolving structures, so ensure that your sitemaps are consistently updated and audited for broken links and errors to maintain accuracy. Be sure to assign priority levels to pages based on their importance, guiding search engines to focus on the most critical content.
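As a simple illustration, the sketch below assembles a minimal XML sitemap with lastmod and priority hints using Python’s standard library; the URLs, dates and priority values are placeholders.

```python
# Sketch: build a minimal XML sitemap with <lastmod> and <priority> hints.
# The URLs, dates and priorities are placeholders for illustration.
import xml.etree.ElementTree as ET

pages = [
    ("https://www.example-store.com/", "2024-05-01", "1.0"),
    ("https://www.example-store.com/men/footwear/", "2024-04-28", "0.8"),
    ("https://www.example-store.com/men/footwear/running-shoes/", "2024-04-28", "0.6"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod
    ET.SubElement(url, "priority").text = priority

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```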
Navigation is an integral part of the user experience of your website, and no matter how complex it is, the structure should be like an ordered corridor system in a building, not a hedge maze designed by M.C. Escher. To help users follow where they are on your website, include main menus that centralise key sections and use breadcrumb navigation to display where the user sits in the hierarchy. For example, on a clothing website, it could display “Home > Men > Footwear > Running Shoes”.
Additionally, an organised sitemap needs clear and descriptive URLs that give users insight into a page’s content before they even load it. Keep them concise and incorporate relevant keywords to improve search engine visibility.
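As a small illustration of keeping slugs concise and keyword-focused, the helper below strips punctuation and filler words from a product name; the stop-word list is an arbitrary assumption for the example, not a standard.

```python
# Sketch: turn a product name into a concise, keyword-focused URL slug.
# The stop-word list is a small illustrative assumption, not a standard.
import re

STOP_WORDS = {"the", "a", "an", "and", "of", "for", "with"}

def slugify(title: str) -> str:
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(w for w in words if w not in STOP_WORDS)

print(slugify("The Best Running Shoes for Men"))  # best-running-shoes-men
```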
For years, website addresses began with “http://”. Today they carry an extra “S”, which stands for “Secure”. This secure protocol encrypts the data exchanged when accessing a webpage, ensuring that sensitive information remains protected. Obtaining an SSL/TLS certificate and installing it on your web server will enable HTTPS, which will aid your search rankings.
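Two quick checks worth automating are whether plain-HTTP requests end up on HTTPS and when the certificate expires. The sketch below does both with Python’s standard library, using a placeholder domain.

```python
# Sketch: confirm HTTP redirects to HTTPS and report the certificate expiry date.
# "www.example-store.com" is a placeholder domain.
import socket
import ssl
import urllib.request

DOMAIN = "www.example-store.com"

# 1) Does the plain-HTTP version end up on HTTPS after redirects?
with urllib.request.urlopen(f"http://{DOMAIN}/", timeout=10) as response:
    print("Final URL:", response.geturl())

# 2) When does the TLS certificate expire?
context = ssl.create_default_context()
with socket.create_connection((DOMAIN, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=DOMAIN) as tls:
        print("Certificate expires:", tls.getpeercert()["notAfter"])
```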
Structured data is a form of code that provides explicit information about a webpage’s content, making it easier for search engines to understand the page and present it in richer formats. It is vital for rich snippets, which surface additional relevant information that can entice users. Use structured data markup on your website to better define certain types of content, from product markup to article and event markup, all of which can be tested with Google’s Rich Results Test. Avoid overusing markup, though: reserve it for the most relevant content, lest you suffer potential SERP penalties.
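As an illustration, the sketch below assembles a minimal schema.org Product block as JSON-LD and wraps it in a script tag ready for a product page’s head; every product detail shown is a placeholder.

```python
# Sketch: emit a minimal schema.org Product markup block as JSON-LD.
# All product details are placeholders.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Running Shoes",
    "image": "https://www.example-store.com/images/trail-running-shoes.jpg",
    "description": "Lightweight trail running shoes with a cushioned sole.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "GBP",
        "price": "89.99",
        "availability": "https://schema.org/InStock",
    },
}

snippet = f'<script type="application/ld+json">{json.dumps(product, indent=2)}</script>'
print(snippet)  # paste (or template) this into the product page's <head>
```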
Say you have multiple pages that display similar or identical content, which can easily happen on an eCommerce site. It’s important to tell search engine crawlers which of these similar pages is the preferred version so you aren’t seen as spammy, which is exactly what canonical tags are designed to do. Adding these tags avoids confusion and cannibalisation, and using self-referencing canonical tags (which confirm to the search engine that it is already on the right page) further consolidates ranking signals. Canonical signals can also be applied to non-HTML content, such as PDF files.
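A quick way to sanity-check this is to fetch a page and see what its canonical tag actually says. The sketch below uses Python’s standard library and a placeholder URL, and reports whether the tag is self-referencing.

```python
# Sketch: fetch a page and report its canonical tag, so you can confirm that
# it points at the page itself (a self-referencing canonical).
from html.parser import HTMLParser
import urllib.request

PAGE = "https://www.example-store.com/men/footwear/running-shoes/"

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

with urllib.request.urlopen(PAGE, timeout=10) as response:
    html = response.read().decode("utf-8", errors="replace")

finder = CanonicalFinder()
finder.feed(html)
print("Canonical:", finder.canonical)
print("Self-referencing:", finder.canonical == PAGE)
```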
Like the halls of a centuries-old library, some of the links on a website are bound to wind up broken, whether through deleted or moved pages. The result is the dreaded 404 (Not Found) error page, which hinders the user experience. The solution is to regularly audit the site for broken links using a crawler such as Screaming Frog and to set up redirects that guide users to the right page.
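A crawler such as Screaming Frog does this across the whole site, but the idea can be sketched in a few lines: collect the links on a page and flag any that respond with an error. The page below is a placeholder, and only the links on that single page are checked.

```python
# Sketch: list the links on one page and flag any that return an error status.
from html.parser import HTMLParser
import urllib.error
import urllib.parse
import urllib.request

PAGE = "https://www.example-store.com/"

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                absolute = urllib.parse.urljoin(PAGE, href)
                if absolute.startswith("http"):
                    self.links.append(absolute)

with urllib.request.urlopen(PAGE, timeout=10) as response:
    collector = LinkCollector()
    collector.feed(response.read().decode("utf-8", errors="replace"))

for link in collector.links:
    try:
        with urllib.request.urlopen(link, timeout=10) as res:
            code = res.status
    except urllib.error.HTTPError as err:
        code = err.code
    except urllib.error.URLError:
        code = None  # unreachable host
    if code is None or code >= 400:
        print("Broken:", link, code)
```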
A fundamental piece of configuration where technical SEO is concerned, robots.txt specifies which parts of your site should or should not be accessed by search engine crawlers. Like canonicalisation, it is useful for keeping crawlers away from duplicate content or sensitive areas, though it is a voluntary directive which search engines can choose to ignore. Just make sure the robots.txt file sits in the root directory of your website (i.e., just after “.com/”).
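You can confirm what your robots.txt actually permits with Python’s built-in robot parser, as in the sketch below; the domain, paths and user agent are placeholders.

```python
# Sketch: read a site's robots.txt and check what a crawler is allowed to fetch.
# The domain and paths are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example-store.com/robots.txt")
parser.read()

for path in ("/", "/checkout/", "/men/footwear/running-shoes/"):
    allowed = parser.can_fetch("Googlebot", "https://www.example-store.com" + path)
    print(path, "->", "allowed" if allowed else "disallowed")
```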
Orphan pages are web pages that sit isolated in your website’s structure, with no internal links pointing towards them. Heartbreaking, right? This creates crawlability issues, as crawlers will have a hard time discovering and indexing these pages, but the simple solution is to ensure every page is linked from somewhere else on the site. When internally linking, descriptive anchor text clearly signposts the page for crawlers and users alike.
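Conceptually, finding orphans is a set difference: pages you know should exist (for example, from the XML sitemap) minus pages that internal links actually reach. The sketch below uses two hard-coded placeholder sets; in practice they would come from parsing the sitemap and crawling the site.

```python
# Sketch: orphan pages are those you know about (e.g. from the XML sitemap)
# that no internal link points to. The two sets here are placeholders.
sitemap_urls = {
    "https://www.example-store.com/",
    "https://www.example-store.com/men/footwear/",
    "https://www.example-store.com/clearance/old-season-boots/",
}
internally_linked_urls = {
    "https://www.example-store.com/",
    "https://www.example-store.com/men/footwear/",
}

orphans = sitemap_urls - internally_linked_urls
for url in sorted(orphans):
    print("Orphan page, needs an internal link:", url)
```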
Soft 404 errors occur when a webpage returns a 200 status code (indicating the page has loaded successfully) but behaves like a 404 (Not Found) error. This is caused by missing pages, incorrect redirects or thin, low-quality content, and it can mislead search engines into treating non-existent pages as valid, hurting SEO performance. To curtail this, find which issue is causing the soft 404 and respond accordingly: restore or redirect genuinely missing pages, point redirects at relevant destinations, beef up thin content, and return a proper 404 or 410 status for pages that really are gone.
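One quick probe for soft-404 behaviour is to request a URL that certainly shouldn’t exist and check what status code comes back, as in this sketch with a placeholder domain.

```python
# Sketch: request a URL that should not exist and check the status code.
# A healthy server answers 404 (or 410); a 200 here suggests soft-404 behaviour.
import urllib.error
import urllib.request

probe = "https://www.example-store.com/this-page-should-not-exist-12345"
try:
    with urllib.request.urlopen(probe, timeout=10) as response:
        print("Soft 404 suspected: got", response.status, "for a missing page")
except urllib.error.HTTPError as err:
    print("Server correctly returned", err.code)
```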
Unless you’re using complex animation or multimedia content, the chances are that images and videos will take the longest to load on your website, dragging down your Largest Contentful Paint (LCP) rating. When adding images, be sure to use a web-friendly format such as JPEG and compress them using an online tool. Finally, enable lazy loading to deprioritise offscreen images and speed up initial loading times.
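As an example of the image side, the sketch below resizes and re-encodes a product photo with Pillow, a widely used third-party imaging library; the file names, dimensions and quality setting are placeholders, and the lazy-loading attribute is noted in a comment.

```python
# Sketch: resize and compress a product image for the web using Pillow
# (a third-party library: pip install Pillow). Paths and sizes are placeholders.
from PIL import Image

with Image.open("product-photo-original.png") as img:
    img = img.convert("RGB")     # JPEG has no alpha channel
    img.thumbnail((1200, 1200))  # cap the longest side at 1200px
    img.save("product-photo.jpg", "JPEG", quality=80, optimize=True)

# In the page template, defer offscreen images with the loading attribute:
# <img src="/images/product-photo.jpg" alt="Trail running shoes" loading="lazy">
```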
For videos, host the video on a reliable platform such as YouTube or Vimeo and embed it on your page. Self-hosting via a content delivery network (CDN) is possible, but ensure that the video uses an HTML5 video player, which has the best compatibility with mobile devices.
Duplicate and thin content can lead search engines to view your website negatively: the former offers no clear distinction between pages, while the latter offers too little value to users. Use canonical tags and robots.txt, or even prune low-quality pages if necessary, but the best solution is to enhance the content that is already there.
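As a rough sketch of how such pages might be flagged once you have each page’s main text, the example below marks pages under an arbitrary word-count threshold as thin and spots exact duplicates by hashing; the page data and the 250-word threshold are placeholders, not official cut-offs.

```python
# Sketch: flag thin pages (very little text) and exact duplicates.
# The page texts are placeholders; the 250-word threshold is illustrative only.
import hashlib

pages = {
    "/men/footwear/running-shoes/": "Lightweight trail running shoes ...",
    "/clearance/running-shoes/": "Lightweight trail running shoes ...",
    "/gift-cards/": "Gift cards available.",
}

seen: dict[str, str] = {}
for url, text in pages.items():
    if len(text.split()) < 250:
        print("Thin content:", url)
    digest = hashlib.sha256(text.strip().lower().encode("utf-8")).hexdigest()
    if digest in seen:
        print("Duplicate of", seen[digest], "->", url)
    else:
        seen[digest] = url
```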
With technical SEO, there’s no Swiss Army knife that will give you everything you need to succeed. Rather, there’s a neat collection of tools that are instrumental in achieving quality optimisation, each with its own uses and advantages. These include Google Search Console, Google PageSpeed Insights, crawlers such as Screaming Frog, and structured data testing tools, all of which appear throughout this guide.
At Brave, we specialise in technical SEO for ambitious eCommerce businesses. With over two decades of experience, we know that if your site isn’t built to dominate search, it’s built to fail. As an award-winning eCommerce SEO agency, we make sure that doesn’t happen.
We engineer eCommerce platforms that drive visibility, speed and conversions. No fluff, no filler – just razor-sharp strategies that deliver results. Ready to get ahead of the competition? Contact Brave today and let’s supercharge your website for maximum impact.