Use an SSL certificate on your website
A Secure Sockets Layer (SSL) certificate lets a website serve traffic over an encrypted connection. It's easy to tell whether a website has an SSL certificate because it's displayed within the URL: secure sites start with 'https' and show a padlock symbol at the left-hand side of the address bar. Websites without this certificate begin with 'http' and will be flagged as 'not secure' where the padlock should be. Google has now incorporated SSL into its algorithms when crawling a site. This means sites that have an SSL certificate from a trusted authority are likely to rank higher.
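As a quick first check, the scheme of a URL already tells you whether a page is being requested over an encrypted connection. A minimal sketch in Python (the `is_secure` helper is my own illustration, not part of any SEO tool):

```python
from urllib.parse import urlparse

def is_secure(url: str) -> bool:
    """Return True if the URL uses HTTPS, i.e. is served over SSL/TLS."""
    return urlparse(url).scheme == "https"

print(is_secure("https://example.com"))  # True: padlock shown in the browser
print(is_secure("http://example.com"))   # False: flagged as 'not secure'
```

This only checks the scheme; whether the certificate itself is valid and issued by a trusted authority is verified by the browser (or a TLS library) when the connection is made.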
Speed up your site
It’s well known that search engines favour websites that have a quicker load time. A website that loads quickly is likely to rank higher in search results. This will, in turn, bring more organic traffic to your site. Three ways I would suggest to improve your site’s page speed are:
- Use a fast hosting provider for your site. Google’s server response time benchmark states that 200ms is an optimal response time for a server. If a hosting service is slower than this, it’s likely that the site will feel ‘visibly sluggish’.
- Minimise the number of HTTP requests. Keep the number of files, scripts and plug-ins a page loads to a minimum, since each one is a separate request.
- Optimise images. Large image files are likely to slow down your page because they take longer to download from the server, which delays the time the page takes to be displayed.
Keeping a site’s load time down does matter in relation to technical SEO. If a website loads more slowly than a competitor’s, it’s likely that the competitor’s site will rank higher in Google.
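To gauge the second point above, you can count the external resources a page references, since each one typically costs an extra HTTP request. A rough sketch using Python's standard library (the tag list is a simplification; real pages may also load resources from within CSS and scripts):

```python
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    """Counts tags that typically trigger an extra HTTP request."""
    def __init__(self):
        super().__init__()
        self.requests = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and "src" in attrs:
            self.requests += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.requests += 1

page = """
<html><head>
<link rel="stylesheet" href="style.css">
<script src="app.js"></script>
</head><body>
<img src="hero.jpg"><img src="logo.png">
</body></html>
"""

counter = RequestCounter()
counter.feed(page)
print(counter.requests)  # 4 extra requests on top of the page itself
```

Every request the counter finds is a round trip to the server, so combining files (or dropping unused scripts and plug-ins) directly reduces load time.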
Ensure your website is mobile-friendly
A responsive website will adjust itself to the size of the device the user is viewing it on, making it readable and usable on all devices. The latest statistics show that most web browsing now happens on mobile. If your website isn’t responsive, users won’t stay on it; they’ll go to a competitor’s site instead. This shows how important responsive design is. Google states that ‘having a responsive website is considered to be a very significant ranking signal in its algorithms’, which also fits with the ‘mobile-first’ approach it is trying to promote.
The results of this showed in the ‘Mobilegeddon’ update, where certain sites were penalised for not having a mobile-friendly approach. A good way to test how mobile-friendly a page is, is to use a tool such as Google’s Mobile-Friendly Test. This will not only tell you if the page is mobile-friendly, but it will also suggest improvements that could be made to the page.
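Before reaching for an online tool, a very rough first check is whether a page even declares a responsive viewport, since a missing viewport meta tag is a common reason a page renders as a shrunken desktop layout on mobile. A minimal sketch (a heuristic only, not a substitute for Google's tool):

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Looks for <meta name="viewport" ...> in a page's markup."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and dict(attrs).get("name") == "viewport":
            self.has_viewport = True

page = ('<html><head><meta name="viewport" '
        'content="width=device-width, initial-scale=1"></head></html>')

checker = ViewportChecker()
checker.feed(page)
print(checker.has_viewport)  # True: a basic prerequisite for mobile-friendliness
```

A page can declare a viewport and still be unusable on mobile, which is why a full test with Google's tool is still worthwhile.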
Fix outstanding errors on the site
Errors within a website should be fixed before it goes live. However, websites can still develop technical errors after launch. If a website has outstanding technical errors, Google’s crawler is likely to have trouble crawling it. In the worst case, it won’t crawl the website at all, meaning it won’t rank. To help mitigate the chances of errors affecting a site’s rankings in the future, an SEO audit should be performed every 2-4 weeks. The more frequently this is done, the less likely errors are to affect the ranking of the website.
Tools such as SEMrush will list how many errors your site has, as well as warnings. Errors are the highest severity and must be fixed straight away. Warnings should still be fixed in a timely manner, but it won’t be detrimental if they aren’t addressed immediately. SEMrush also has ‘notices’, which are recommendations that can be implemented to improve your SEO score.
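To illustrate the error/warning split, here is a sketch that triages hypothetical crawl results by HTTP status code. The buckets and thresholds are my own simplification for illustration, not SEMrush's actual rules:

```python
def triage(status: int) -> str:
    """Rough severity bucket for a crawled URL's HTTP status code."""
    if status == 404 or status >= 500:
        return "error"    # broken or failing pages: fix straight away
    if 300 <= status < 400:
        return "warning"  # e.g. redirect chains: fix in a timely manner
    return "ok"

# Hypothetical crawl results: URL -> status code
crawl = {"/": 200, "/old-page": 301, "/missing": 404, "/api": 500}
for url, status in crawl.items():
    print(url, triage(status))
```

A real audit tool inspects far more than status codes (broken internal links, missing meta tags, crawlability directives), but the principle of ranking issues by severity is the same.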
Fix duplicate content on the site
Duplicate content on a website can not only be confusing for users, but it can also affect rankings, due to the algorithms that search engines run. It doesn’t actually incur a Google penalty, but it should still be rectified on your website. Crawlers won’t know which page holds the original content, so they could link to the wrong page in the search results. If this happens, a website is likely to rank lower than it should. In some cases, it might not rank at all. Removing duplicate content from a website can help, even if it doesn’t incur a penalty. The other option would be to add a rel="canonical" tag pointing to the original content, so the crawlers know which page to rank.
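The canonical tag itself is a single `<link>` element in the page's `<head>`. A short sketch that extracts it, so you can audit which URL a duplicate page points crawlers at (the example URL is hypothetical):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extracts the href of <link rel="canonical"> if the page declares one."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# A duplicate page declaring where the original content lives:
page = '<head><link rel="canonical" href="https://example.com/original-article"></head>'

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # https://example.com/original-article
```

Every duplicate should point at the same original URL; if `canonical` comes back `None` on a duplicated page, the crawler is left to guess which version to rank.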