As the search landscape grows increasingly competitive, ensuring site visibility should be one of the main considerations when building a new website.
Peter Richards, a Senior SEO Manager at Jellyfish, highlights how developers can build sites that perform well in search from the outset, and how to manage migrations effectively, in September's print edition of net magazine.
Here is a summary of the full article:
1. Have protected staging environments
Staging versions of sites need to be hidden from search crawlers whilst in development and beyond, to prevent the crawlers from reading them as duplicate content. Three ways of doing this are:
- Include a ‘robots.txt’ file at the root of the development server to disallow spiders
- Restrict access to a specified IP range at the firewall, which blocks spiders
- Create a password-protected login page, which also prevents spiders from accessing content
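As a sketch of the first option, a robots.txt file placed at the web root of the staging server can block all well-behaved crawlers. Bear in mind that robots.txt is advisory only, so the firewall or password options above are the stronger guarantees:

```
# robots.txt — place at the root of the staging server
# Blocks all well-behaved crawlers from the entire site
User-agent: *
Disallow: /
```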
2. Be vigilant in managing redirects
Without correct redirects, search engine visibility drops. No doubt you know how to manage redirects, so here are the main pitfalls to avoid that will impede your SEO:
- Don’t use 302 redirects for permanently removed pages
302 temporary redirects pass no link value to the target page, whereas 301 redirects sacrifice only 1-10% of it.
- Redirect hops
Multiple redirects demand extra server requests, slowing page delivery time. A page is unlikely to be indexed at all if the chain exceeds around five hops, for example. Note that each hop also loses a small percentage of link value – it adds up!
- Lots of redirects
Simply put, overloading a site with redirect rules will have a negative impact on server performance – wildcards and regular expressions can be used to consolidate rules and optimise speed.
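As an illustrative sketch, assuming an Apache server with mod_alias enabled (substitute the equivalent directives for nginx or IIS), a clean 301 and a consolidating regex rule might look like this:

```apache
# Permanent (301) redirect for a single moved page.
# Always point straight at the final URL to avoid redirect hops.
Redirect 301 /old-page/ https://www.example.com/new-page/

# One regex rule can replace hundreds of individual redirects,
# e.g. /products/widget -> /shop/widget
RedirectMatch 301 ^/products/(.*)$ /shop/$1
```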
3. Structure your site content with clear authority pages
Having the same content on multiple pages makes it unclear which page is authoritative, and splits the power of inbound links (link equity). To prevent this, you ‘canonicalise’ your site.
Meaning? You can either:
- Implement a permanent 301 redirect rule to point each URL variant at one canonical or preferred URL.
(When configuring URLs, settle on either a trailing slash (/) or none, and create a 301 redirect rule to direct users away from the discarded version.)
- Or you can use the rel=canonical tag on each variant – this indicates to search engines which is the authoritative source.
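As a minimal example of the second option (the URL is a placeholder), each variant of the page declares the preferred version in its head:

```html
<!-- On every URL variant, point at the one preferred URL -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```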
4. Be mobile first
The number of global users accessing the internet via mobile has now overtaken those on desktop (comScore). Google’s mobile-friendly algorithm favours responsive web design and ‘progressive enhancement*’ over feature-rich sites that cater to desktop and mobile separately.
*Progressive enhancement can be designing for the smallest screen first and progressively enhancing it towards the desktop experience.
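As a minimal sketch of that idea (class names and the breakpoint are illustrative), start with the small-screen layout and enhance upwards with a media query:

```html
<!-- In the <head>: render at device width on mobile -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Smallest screen first: single column, no sidebar */
  .sidebar { display: none; }
  /* Progressively enhance for wider viewports */
  @media (min-width: 768px) {
    .sidebar { display: block; }
  }
</style>
```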
Check whether you are suffering a Google penalty following the mobile algorithm update using the Jellyfish Google Penalty Checker.
5. Page speed rules
Page speed dramatically affects abandonment which in turn affects SEO performance.
- Minify the source code to remove unnecessary ‘white space’
- Configure the server to enable gzip compression
- Use CSS sprites where possible to reduce the number of server requests
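For the second point, assuming an Apache server with mod_deflate available (other servers have equivalent settings), gzip compression for text-based assets can be enabled along these lines:

```apache
# Compress text-based responses before sending them (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```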
6. Use the right HTML elements
Signpost your site using HTML elements such as heading tags and image alt attributes, for both search engines and users.
Most SEO-friendly sites use a heading tag hierarchy like this:
<h1>Main Heading</h1>
<h2>Secondary Heading 1</h2>
<h3>Sub-section of the secondary heading 1</h3>
<h2>Secondary Heading 2</h2>
The H1 tag is the most important and should always be included on a page. Search engines still pay attention to the words used within the H1 to check the relevancy of the content associated with it, and also to help rank the page for keywords in search queries.
Although HTML5 allows for multiple H1 elements, only one is necessary on each page to avoid confusing both users and search engines.
At best, search engines may simply ignore multiple H1 tags; at worst, they may view them suspiciously as a method of promoting more keywords.
7. Manage errors effectively
When pages within a site are deleted or moved and the original URL is no longer providing content to users, that page should return a ‘404: Not Found’ response code and a branded error page. By default, Google will not index these pages.
A common misuse of the error page occurs when it is returned with a ‘200: OK’ response code, usually on URLs that have been deleted or moved. This is problematic if widespread, as these URLs are still likely to be indexed.
If the abandonment rate on such pages is high, the site may be penalised by a quality assessment algorithm such as Google Panda.
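On Apache, for instance, a branded error page can be served while still returning the correct status code (the local path is an assumption for illustration):

```apache
# Serve a branded error page while keeping the 404 status code.
# Use a local path: a full http:// URL here would make Apache
# issue a redirect, and the error page would then return 200 OK.
ErrorDocument 404 /errors/404.html
```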
Find out more about how Jellyfish approach SEO here.