2015 was an incredible year for SEO experts around the world, but what does this year hold for us? Read on to see just what Google might have planned for us in 2016.
With changes to Google Search Console (formerly Google Webmaster Tools) in 2015 and improvements to tools such as the mobile-friendliness checker and the structured data testing tool, Google has provided us with more information than ever to make us better at marketing.
The industry came together and aligned like never before; historic tactics that manipulated search results began to disappear, largely thanks to a huge effort from Google, and the quality of optimisation as a whole improved. This is evident simply by looking at the quality of the results that are achieving visibility. We have all looked at other people’s work for their clients and felt a little envious, which in turn drives us to develop bigger, better and more effective strategies for our own customers.
2016, however, is upon us. What surprises does Google have in store for us? We’ll likely never really know (apart from the Penguin update that it kindly moved due to the holidays! #win). I wanted to have a go at predicting the SEO landscape in 2016 – here are the changes I feel are going to make the most impact over the next 12 months.
We have all heard of marketing intelligence, and that stuff is great, but what about intelligent marketing?
With technology becoming such an integral part of SEO experts’ day-to-day lives, we should be using it to make us better, faster and stronger than ever before. Having specialists is still highly important – a machine can’t (yet) articulate why a specific problem exists on your site and how to fix it – but the information should be readily available to them.
Currently, marketers use a number of different tools to deliver results to clients. The future is a single tool that collates all the information and data from those other tools, and that is also capable of machine learning and algorithmic analysis.
This wealth of information could be seen as a distraction or cause ‘data blindness’, but with focused analysis and data siloing, it can provide marketers with powerful data that would otherwise have been difficult or impossible to obtain by normal means.
Google has put a great deal of effort into providing a great mobile experience across its many platforms. Adding mobile friendly rich snippets to the search results and providing unique results for mobile device searches ensures that the user experience is maintained.
Google expanded its use of mobile friendliness as a visibility factor on 21 April 2015 – this change applies to all languages worldwide and significantly affects mobile search results. Google’s own documentation on mobile-friendly web design sets out three suitable solutions:

Responsive web design – serve the same HTML on the same URL to every device, using CSS media queries to adapt the layout.

Dynamic serving – serve different HTML on the same URL depending on the device’s user agent.

Separate URLs – serve mobile visitors a distinct mobile site, typically on a subdomain such as m.example.com.
Google does not favour any particular configuration as long as the page(s) and all page assets (CSS, JavaScript, images and so on) are accessible to every one of Google’s bots.
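To make that concrete, here is a minimal sketch of the responsive approach – one URL and one set of HTML for every device, with the viewport meta tag and a media query doing the adaptation. The class name and breakpoint below are illustrative choices, not Google requirements:

    <!DOCTYPE html>
    <html>
    <head>
      <!-- The viewport declaration is the heart of a responsive, mobile-friendly page -->
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <title>Example page</title>
      <style>
        .content { width: 60%; margin: 0 auto; }
        /* Collapse to full width on narrow screens */
        @media (max-width: 640px) {
          .content { width: 100%; }
        }
      </style>
    </head>
    <body>
      <div class="content">Page content here.</div>
    </body>
    </html>

Because the same URL serves every device, there is nothing extra for Google’s bots to discover and nothing for you to keep in sync.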
With desktop and mobile searches for the same query resulting in different results, it is clear that this area of search will continue to grow throughout 2016.
You can check your website by testing your pages with Google’s Mobile-Friendly Test tool. This simple test shows how Google Search sees your pages and suggests specific fixes where possible. You can also see whether your pages are mobile friendly within Google Search Console.
Most SEO experts know the value of content optimisation, content marketing and big campaign ideas, but core issues can often lie undetected in the HTML of a website, overlooked time and time again ‘because it will be ok’ or ‘we didn’t see that one’. At Vertical Leap this isn’t the case, as we treat technical SEO and visual optimisation with equal importance. Technical SEO deals with everything from servers to structured data, and helps organic search in ways that could otherwise go undetected or ignored.
In 2016 structured data will continue to grow. Google’s own newsletter (Dec 10) had a whole feature about it, highlighting its importance. With Knowledge Graph becoming more powerful, providing specific layers of data to Google to utilise can only be a good thing, even if Google doesn’t display the information.
With the advent of JSON-LD, adding structured data is becoming easier, and Google provides a large library of documentation on how to put the individual pieces together. Google also continues to add more and more schema types to its JSON-LD support, and we should take advantage of that effort and reap the subtle rewards.
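As a simple illustration, here is a minimal JSON-LD block describing a fictional organisation – the name, URL and logo are placeholders, and Organization is just one of the many schema.org types Google supports:

    <!-- Placeholder organisation details, for illustration only -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Company",
      "url": "https://www.example.com",
      "logo": "https://www.example.com/logo.png",
      "sameAs": [
        "https://twitter.com/example",
        "https://www.facebook.com/example"
      ]
    }
    </script>

A block like this sits in the page’s <head> and can be checked with Google’s structured data testing tool before it goes live.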
Ensuring your website is crawled cleanly removes any barriers the search engines may encounter while visiting. Although considered a small part of any technical report, crawl issues have long been reported in Google Search Console, which now includes a section specifically for smartphones. Fix issues as they appear and keep the crawl clean and effective.
Duplicate content is a big deal for Google; a specific part of the algorithm, called Panda, deals with duplicate and low-quality content. If your site resolves in a browser over both HTTPS and HTTP, on www and non-www, and on a variety of default-document URLs such as /index.php or /index.aspx, then there may well be duplicate content that Google has to sort out – via canonical URLs or its own logic – before displaying your site in the search results. Ensuring that this decision process isn’t required in the first place removes the risk of the search engines choosing the wrong URL.
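For example, whichever variant a visitor (or bot) lands on, a canonical link in the <head> tells the search engines which URL is the master copy – the domain below is a placeholder:

    <!-- Consolidates http://example.com/, https://example.com/ and
         /index.php variants onto one preferred URL (placeholder domain) -->
    <link rel="canonical" href="https://www.example.com/">

Server-side 301 redirects to a single preferred version remain the stronger fix; the canonical tag then acts as a safety net rather than the only signal.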
Page structure goes beyond heading tags within content or banners. Consider how a page is coded and what additional signals are sent to the search engines during a crawl. Article tags, footer tags, header tags and many more all define specific areas of a web page. Using <article> for every single content-managed section of a webpage may well not be the most effective approach; some of that content could be a sidebar, a footer or an address, each of which holds its own value within the wider webpage. These semantic HTML signals can be used to identify where the most important content on a page sits, increasing crawl effectiveness and value to the search results overall.
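A rough sketch of what that can look like – the elements below are standard HTML5 semantic tags, and deciding which content belongs in which element is the part that carries the value:

    <body>
      <header>
        <nav><!-- site-wide navigation --></nav>
      </header>
      <main>
        <article>
          <h1>The main story</h1>
          <p>The primary content that should carry the most weight.</p>
        </article>
        <aside><!-- related links and secondary content --></aside>
      </main>
      <footer>
        <address><!-- contact details --></address>
      </footer>
    </body>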
Another key area for 2016 is Google’s growing ability to understand the context of content. Patents based around improvements to entity salience analysis claim a 34% better understanding of natural language within web pages against a strong baseline, meaning Google will be able to separate great content from other great content at a far finer level than previously seen.
Although one of my predictions for 2015 was that Google would identify and surface content intent (commercial, research, brand, directional, informational and more), this prediction goes further, suggesting that Google will be able to understand the content itself, not just recognise that specific words and phrases are present. Combined with established analysis methods such as TF-IDF, co-occurrence and semantic distance, entity salience will pioneer a new breed of content.
RankBrain is Google’s new machine-learning artificial intelligence, used to help improve the search results. It learns by itself, teaching itself how to do something and how to analyse the results to identify what is ‘better’.
Although RankBrain is a unique identifier for an intelligent process management system, it is still part of the wider algorithm, called Hummingbird, that Google employs to sort through the billions of web pages in its index. As such, RankBrain takes a seat alongside ‘Panda’, ‘Penguin’ and ‘Payday’, which fight spam; ‘Pigeon’, which is growing and improving local search results; ‘Top Heavy’, designed to demote ad-heavy pages; ‘Mobile Friendly’, which aims to reward mobile-friendly pages; and ‘Pirate’, created to fight copyright infringement.
The importance of RankBrain lies in how it adds value to the whole algorithm. Consider an excellent piece of content that achieves visibility for 1,000 search queries but only gets clicks on 100 of them: RankBrain would quickly identify the pattern and remove it from the results for the other 900 searches. Although a basic example, it shows a key area where page-level optimisation can be improved, whether through semantic structure, sentence construction or simply ensuring content achieves visibility for the right phrases.
These changes to the search results mean that how we optimise content – alongside entity salience and other advanced content optimisation methods – will greatly improve, and intelligent marketing will grow to become a staple requirement for all SEO experts.
The Knowledge Graph is another key area of search that is growing quickly. Because of it, the search results are so far removed from those of two years ago that we don’t think about its impact as often as we should.
Originally launched by Google back in May 2012, the Knowledge Graph aims to deliver informational-intent search results directly to the user. With the introduction of Google Now, it becomes much more powerful and involved, because the history of a user’s searches is taken into account.
In 2012, Google commented that it had 3.5 billion facts about 500 million entities. This has grown in a big way over the past three years with further information being found and connections between entities being made. A search for someone famous now gives direct information about them.
As mentioned previously, voice search and Google Now are becoming increasingly popular. Google’s Android OS for mobile devices is powerful and connected, leveraging voice commands to take actions. RankBrain and the Knowledge Graph add to this by connecting questions together, remembering the first question in a string of queries. This enables search data to be linked to other search data, further increasing entity salience and producing much cleaner search results.
Beyond Google using Google Now and voice search to improve its own results, voice is a method of search that is on the rise in its own right. With this in mind, providing answers within your content to generate additional Knowledge Graph signals will help get your words in front of voice search users.
For example, an article about a football team could include the line-up on a specific day, quotes from notable publishers and broadcasters and much more, even if the article is only about a player being transferred away. These additional connected signals are highly useful to the search engines and assist visibility for related search queries.
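To sketch how those connections might also be declared in structured data, here is an invented example using the standard schema.org about and mentions properties – the article, club and player below are all fictional:

    <!-- Fictional article, for illustration only -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "NewsArticle",
      "headline": "Example FC confirm midfielder's departure",
      "datePublished": "2016-01-04",
      "about": { "@type": "SportsTeam", "name": "Example FC" },
      "mentions": [
        { "@type": "Person", "name": "Joe Bloggs" },
        { "@type": "SportsOrganization", "name": "The Example League" }
      ]
    }
    </script>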
The backbone of the SEO industry has evolved away from ‘more, more, more’ towards quality over quantity, but that doesn’t mean your site won’t pick up potentially harmful links. Getting prepared early and monitoring your link portfolio is always a good idea, and with a Penguin update on the horizon, now is the perfect time to get your links in order.
Throughout 2016, Penguin will be updated continuously rather than having specific launches at intervals through the year. This means that instead of waiting months for a Penguin-based link penalty to be lifted, it can be lifted at any time – and, similarly, a penalty could be received at any point. This shift makes it more important than ever to monitor, measure and remove links that could lead to a Penguin-based penalty.
These micro-adjustments mean that your site could be affected by Penguin without any notification, simply because the algorithm now runs constantly behind the scenes.
Sam Osborne has worked in digital marketing for more than ten years and, prior to joining Vertical Leap as an SEO specialist, worked as head of digital for a digital media agency.