What makes a site technically sound? Here are seven factors that are likely to play a key role in technical SEO in 2017.
Although many people consider this kind of work ineffective, technical SEO is a necessary foundation for any promotion effort. There are plenty of examples of sites with large amounts of high-quality content that still cannot achieve the desired rankings.
In this article, we will focus on seven key areas of technical SEO in 2017. Some of them have always been relevant; others are fairly new and reflect the latest changes in search engines.
1. Check indexing.
Let's start with the number of pages on your website that are indexed by search engines. You can check this by typing "site:domain.com" into the search engine, or by using SEO services.
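As a point of comparison, it helps to know how many URLs your own site exposes. Here is a minimal Python sketch that counts the entries in a standard sitemap.xml; the domain is a placeholder:

    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP = "https://example.com/sitemap.xml"  # placeholder location

    with urllib.request.urlopen(SITEMAP) as resp:
        tree = ET.parse(resp)

    # <loc> entries live in the standard sitemap namespace
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    print("URLs in sitemap:", len(tree.findall(".//sm:loc", ns)))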
Ideally, the indexed figure should be roughly equal to the total number of pages on your site, minus the ones you do not want indexed. If there is a big difference, review which pages are blocked from indexing. Which brings us to the next point.
2. Make sure that important resources are crawlable.
To test whether your site can be crawled, you could simply check robots.txt. But that check is as easy as it is inaccurate. Robots.txt is only one way to restrict indexing, so use an SEO crawler (e.g. Screaming Frog SEO Spider) to get a complete list of blocked pages, regardless of whether the instruction comes from robots.txt, a NOINDEX meta tag, or an X-Robots-Tag header.
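For a single URL, you can run the same check by hand. A minimal sketch using only the Python standard library, covering all three mechanisms; the URLs are placeholders:

    import re
    import urllib.request
    import urllib.robotparser

    URL = "https://example.com/some-page/"  # placeholder

    # 1. robots.txt
    rp = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
    rp.read()
    print("Allowed by robots.txt:", rp.can_fetch("Googlebot", URL))

    # 2. X-Robots-Tag header and 3. robots meta tag
    with urllib.request.urlopen(URL) as resp:
        print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "not set"))
        html = resp.read().decode("utf-8", errors="ignore")

    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.IGNORECASE)
    print("Robots meta tag:", meta.group(0) if meta else "not set")

A dedicated crawler is still the right tool for a full audit; this only shows what the three directives look like to a client.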
Keep in mind that Google now renders pages like a modern browser. That is why in 2017 it is important that not only your pages but also their resources (such as CSS and JavaScript files) can be crawled. If your CSS files are blocked from crawling, Google cannot fully render and evaluate the site. Likewise, if your JS is not crawlable, Google will not index any dynamically generated content on your site.
If your website is built with AJAX or relies heavily on JavaScript, audit it with a crawler that can both crawl and render JavaScript.
3. Optimize your crawl budget.
Crawl budget is the number of pages that search engines crawl on your site within a given period of time. You can see how many pages are crawled per day in Google Search Console.
Unfortunately, Google Search Console will not give you full information about how individual pages are crawled. For more detail, you need to analyze your server logs.
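As an illustration, here is a minimal sketch that counts Googlebot requests per URL in an access log in the common combined format. The log path is an assumption, and matching on the user-agent string is only a crude filter; genuine Googlebot traffic should be verified via reverse DNS:

    import re
    from collections import Counter

    request_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+"')
    hits = Counter()

    with open("access.log") as log:          # path is an assumption
        for line in log:
            if "Googlebot" not in line:      # crude user-agent filter
                continue
            match = request_re.search(line)
            if match:
                hits[match.group(1)] += 1

    # the 20 URLs Googlebot requests most often
    for path, count in hits.most_common(20):
        print(f"{count:6d}  {path}")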
Once you know your crawl budget figure, you may want to increase it. SEOs do not know exactly how Google's algorithm decides how many pages it needs to crawl on a site, but there are two main theories:
- the number of internal links pointing to a page matters;
- the number of backlinks from other sites matters.
You can increase your crawl budget using the following methods:
- Get rid of duplicate pages. Every duplicate page that can be blocked should be blocked. From a crawl-budget perspective, canonical URLs do not help much: search engines will keep crawling the duplicate pages anyway.
- Prevent indexing of pages that have no SEO value. Privacy policies and the terms of expired promotions are good candidates for Disallow rules in robots.txt (see the example after this list). In addition, you can configure URL parameters in Google Search Console so that Google does not crawl the same page with different parameters separately.
- Fix broken links. Every time a search engine crawler follows a link to a page that returns a 5xx or 4xx response, part of your crawl budget is spent for nothing.
- Keep your sitemap up to date and make sure the latest Sitemap.xml has been uploaded to Google Search Console.
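To illustrate the robots.txt point above, here is a hypothetical set of Disallow rules; the paths are examples, not recommendations for every site:

    User-agent: *
    Disallow: /privacy-policy/
    Disallow: /promotions/expired/
    Disallow: /*?sessionid=

    Sitemap: https://example.com/sitemap.xml

Keep in mind that Disallow only stops crawling: pages that are already indexed may also need a noindex directive before being blocked.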
4. Audit your internal links.
A logical site structure is the foundation of good user experience and crawlability. Internal links also help pages rank more effectively.
Here are some tips for auditing internal links:
- Measure click depth. Organize your site structure so that important pages are no more than three clicks from the home page.
- Find broken links. They confuse visitors and hurt the rankings of both the page and the site. Most SEO crawlers report broken links, but they can still be hard to locate and fix. Besides HTML elements, remember to check <link> tags, HTTP headers, and sitemaps.
- Untangle redirect chains. Even if a visitor ultimately reaches the right page, traveling through a series of redirects hurts load time and wastes crawl budget. Look for chains of three or more redirects and update the links that point at redirected pages (a sketch that automates part of this check follows this list).
- Find orphan pages. These pages are not linked from any other page of your site, which makes them hard for search engines and visitors alike to discover.
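Here is a minimal sketch that takes a plain-text list of internal URLs (for example, exported from a crawler; the file name is an assumption) and flags broken targets and redirect chains:

    import urllib.error
    import urllib.request
    from urllib.parse import urljoin

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # stop, so each hop surfaces as an HTTPError below

    opener = urllib.request.build_opener(NoRedirect)

    def check(url, max_hops=3):
        hops = 0
        while True:
            try:
                opener.open(urllib.request.Request(url, method="HEAD"))
                return f"OK after {hops} redirect(s)"
            except urllib.error.HTTPError as e:
                if e.code in (301, 302, 303, 307, 308) and e.headers.get("Location"):
                    hops += 1
                    if hops >= max_hops:
                        return f"REDIRECT CHAIN of {max_hops}+ hops"
                    url = urljoin(url, e.headers["Location"])
                else:
                    return f"BROKEN ({e.code})"

    with open("internal_urls.txt") as urls:  # file name is an assumption
        for url in (line.strip() for line in urls if line.strip()):
            print(check(url).ljust(30), url)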
5. Review your sitemap.
You already know how important the sitemap is. It tells search engines about your site structure and lets them discover new content faster. There are several criteria against which to check your sitemap:
- Freshness. Your XML sitemap should be updated every time new content is added to your site (ideally automatically).
- Purity. Your sitemap should not contain garbage (4xx pages, non-canonical pages, redirected URLs, and pages blocked from indexing); otherwise you risk search engines ignoring the whole sitemap. Also check your sitemap for errors regularly in Google Search Console (a sketch that automates part of the check follows this list).
- Size. Google limits a sitemap to 50,000 URLs. Ideally, you should keep it much shorter so that the most important pages are crawled more often. Many SEOs report that cutting the number of URLs in a sitemap leads to more effective crawling.
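A minimal sketch of the purity check above: it requests every URL listed in the sitemap and reports broken pages, redirected URLs, and noindex headers; the sitemap location is a placeholder:

    import urllib.error
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP = "https://example.com/sitemap.xml"  # placeholder
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    with urllib.request.urlopen(SITEMAP) as resp:
        locs = [el.text.strip() for el in ET.parse(resp).findall(".//sm:loc", ns)]

    for url in locs:
        try:
            head = urllib.request.Request(url, method="HEAD")
            with urllib.request.urlopen(head) as r:
                if r.geturl() != url:
                    print("REDIRECT", url, "->", r.geturl())
                elif "noindex" in (r.headers.get("X-Robots-Tag") or ""):
                    print("NOINDEX ", url)
        except urllib.error.HTTPError as e:
            print(e.code, "    ", url)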
6. Check and improve page load speed.
Page load speed is not only one of Google's main priorities for 2017; it is also a ranking signal in the search results. You can check the load time of your pages with the Google PageSpeed Insights tool. Checking the different types of pages may take some time, but you can use other services as well.
If your page fails some of the test items, Google gives you detailed information and instructions for fixing the problems. You even get a link to download compressed versions of your images if they are too big. Doesn't that show how important load speed is to Google?
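PageSpeed Insights evaluates rendering and asset optimization too, but raw server response and download times are easy to spot-check yourself. A crude sketch; the URLs are placeholders for your main page types:

    import time
    import urllib.request

    pages = [
        "https://example.com/",          # placeholders for your page types
        "https://example.com/category/",
        "https://example.com/product/",
    ]

    for url in pages:
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            first_byte = time.perf_counter() - start  # headers received
            body = resp.read()
        total = time.perf_counter() - start
        print(f"{url}: response in {first_byte:.2f}s, "
              f"HTML ({len(body)} bytes) in {total:.2f}s")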
7. Make your site mobile-friendly.
News recently broke that Google has started indexing the mobile versions of sites separately. The idea is that search results on mobile devices and on desktop PCs will come to differ significantly.
Here are the most important things to do to prepare your site for this change:
- Check your pages' mobile usability with Google's Mobile-Friendly Test (a sketch using the corresponding API follows this list).
- Carry out a comprehensive audit of your website for mobile devices, just as you do for the desktop version.
- Track mobile rankings. Keep an eye on where your site appears in mobile search results, and remember that work in this area will be rewarded in the near future with top positions.
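The Mobile-Friendly Test mentioned above is also exposed as an API. A minimal sketch, assuming you have created an API key in the Google API Console; the key and the tested URL are placeholders:

    import json
    import urllib.request

    API_KEY = "YOUR_API_KEY"  # placeholder
    ENDPOINT = ("https://searchconsole.googleapis.com/v1/"
                "urlTestingTools/mobileFriendlyTest:run?key=" + API_KEY)

    payload = json.dumps({"url": "https://example.com/"}).encode("utf-8")
    request = urllib.request.Request(
        ENDPOINT, data=payload, headers={"Content-Type": "application/json"})

    with urllib.request.urlopen(request) as resp:
        result = json.load(resp)

    print(result.get("mobileFriendliness"))  # e.g. "MOBILE_FRIENDLY"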
These are our seven technical SEO tips for 2017.