When it comes to SEO, you can learn a lot from what the big brand names are doing, and not just those who do it well.
John Lewis may be one of the biggest names on the high street, but our SEO analysis has shown that they still have a long way to go before they are just as visible and popular on the web.
URLs
The site’s problems start with its URL structure. Many URLs are long and buried deep within the site structure – not a great way to get indexed by the search spiders. Simplifying the URLs is highly recommended, as it would make them far easier to crawl and index.
To compound the problem, many product URLs are given generic names, like ‘product’ preceded by a string of numbers, rather than something descriptive with relevant keywords. Not only does this fail to describe the page adequately to the search engines, but it also makes it harder for web users to figure out what the page is about. To rectify the matter, “http://www.johnlewis.com/231459215/Product.aspx”, for example, could be altered to something more relevant, such as “http://www.johnlewis.com/Organic-Surge-Gentle-Cream.aspx”.
When used as the anchor text for referring links, the relevant keywords in the URL would also further boost the page’s SEO value and help web users identify its topic simply from the URL.
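To illustrate, a referring link using the suggested product URL and descriptive anchor text might look something like this (a hypothetical example built from the URL above, not an existing page):

<!-- Descriptive anchor text reinforces the keywords already present in the URL -->
<a href="http://www.johnlewis.com/Organic-Surge-Gentle-Cream.aspx">Organic Surge Gentle Cream</a>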
Metadata
Having the metadata buried far down in the site’s code does it no favours; moving this crucial information towards the top of the page, where it is more easily interpreted by the search engines, would be highly recommended.
On the subject of meta title tags, for some strange reason there are two on the John Lewis homepage: one contains a perfectly reasonable title, while the other is left blank. The blank tag should be removed – logic dictates that a page can only have a single title, and the search engines are particularly keen on logic.
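As a simple sketch, the homepage’s head only needs a single, descriptive title tag placed near the top, alongside the meta description (the wording below is purely illustrative, not John Lewis’ actual copy):

<head>
  <!-- One title tag per page, kept near the top of the head -->
  <title>John Lewis | Homeware, Fashion and Electricals</title>
  <meta name="description" content="Shop furniture, clothing, electricals and more at John Lewis.">
</head>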
Site structure
XML sitemaps are lacking completely. These should be added to make it easier for the search engines to index the site’s many pages – making it easy for the search engines to understand and explore your site should be a priority for any webmaster. As well as creating a sitemap index XML file for the entire site, we would recommend adding a number of separate category and product sitemaps for a site the size of John Lewis’, as sketched below.
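A sitemap index file is simply an XML list of the individual sitemaps. The file names below are hypothetical placeholders rather than files that currently exist on johnlewis.com:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One entry per sitemap, for example one per product category -->
  <sitemap>
    <loc>http://www.johnlewis.com/sitemap-furniture.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.johnlewis.com/sitemap-clothing.xml</loc>
  </sitemap>
</sitemapindex>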
A variety of empty and broken pages also does little to endear the site to search engines – or to consumers, for that matter. People perceive sites with even just a few broken links as being low quality, and are more likely to become frustrated and click away. Search engines may downgrade the ranking of a site with many broken links for the same reasons.
We also found a total of 513 internal links on the site. This is quite an excessive number, and though having so many may not directly damage the SEO value of web pages, it can dilute the concentration of link juice that passes through them.
Content
Search engines are above all else looking for high-quality content that will be relevant and useful to their users. Good quality content must, of course, be well written, but there are other factors to consider too.
The most glaring omission from the John Lewis site is the lack of H1 tags. Search engines judge these headings to carry at least a little more significance than regular text, so they should include keywords for extra benefit. Using a relevant H1 tag to describe a page’s content is also held as best practice in terms of web usability. Avoid cramming keywords into the headers, though, as the search engines can treat this as keyword stuffing and penalise your content in the rankings. Our recommendation is usually to include no more than two keyword instances in the H1 tags on any given page.
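As a quick illustration, a category page might carry a single descriptive heading along these lines (the wording is hypothetical):

<!-- One descriptive H1 per page, with the primary keyword used naturally -->
<h1>Living Room Furniture</h1>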
Keywords
It’s important to consider how the search engines will view your site. In John Lewis’ case, Google would see things like ‘skip to main content’, ‘help’, and ‘register’ before it sees anything else – words that are not keywords and bear no relevance to the site’s actual content.
This should be rectified by moving these irrelevant words further down the DIV structure. The perceived importance of these words could also be reduced by converting some of the links into images – for example, using an image of a shopping cart to symbolise the words.
On the subject of images, relevant alt tags should be added to all images, to ensure that they are indexed properly. Search engine spiders can’t interpret images themselves, but they do read the information provided in the alt tags, which are used to specify the text that should be displayed if the image or other element cannot be loaded.
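Bringing those two points together, a navigation link rendered as an image might look something like the following sketch (the file name and link target are hypothetical):

<!-- The boilerplate link text is replaced by an image; the alt text still tells
     search engines and screen readers what the image represents -->
<a href="/basket">
  <img src="/images/shopping-cart.png" alt="View your shopping cart">
</a>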
The general aim with content should be to emphasise the importance of keywords which are relevant to the site. In John Lewis’ case, this would be things like ‘furniture’ or ‘clothing’.
Beware of duplicate content
John Lewis’ creation of a mobile version of its site is to be commended, but to ensure that this is not viewed as duplicate content, we would recommend using the rel=”canonical” tag. At present, both versions of the site contain the same content and URLs. As well as using the canonical tag, it would be advisable to rewrite some of the content, particularly the main information and category pages.
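In practice, each mobile page would carry a canonical link element in its head, pointing back to the equivalent page on the main site – here reusing the illustrative product URL from earlier:

<!-- Tells search engines which URL is the definitive version of this content -->
<link rel="canonical" href="http://www.johnlewis.com/Organic-Surge-Gentle-Cream.aspx">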
Keep it fresh
Finally, a site like John Lewis’ might improve its search engine rankings by adding a regularly updated news section or blog. Search engines value web pages that add new content regularly, as they are deemed more current and relevant to users.
Taking these factors into account, and with the right advice from an SEO expert, your website can avoid the mistakes that even the big boys often make, creating an effective, productive site that appeals both to your human visitors and, just as importantly, to the search engines.
Posted by Ed Shutenko, SEO Manager at Bullseye Media