18 January 2013
The SEO Landscape is a term we use to describe the factors, internal and external, that affect the performance of any given website in the search engines. Because search engines continually update their algorithms, shift focus, and strive to provide their users with timely, relevant, accurate and quality results, the SEO Landscape is similarly in constant flux.
2012 saw a whole series of algorithm updates, refreshes, changes to the search engine indexes, and new ways of optimising websites; in fact, over the last 12 months we have seen more documented shifts in the SEO Landscape than in the previous five years or so. With this in mind, it is becoming increasingly difficult to determine all of the individual factors that add up to a well-optimised site, and more difficult still to predict the future of the SEO Landscape.
However, recent updates have shown a clear intention from the search engines: to enable users to find the most accurate, high-quality results in an efficient manner. Over the past 12 months we have encountered a series of factors and issues with our clients' sites, and have had to formulate strategies either to counteract previously undertaken SEO work or to improve campaign performance based on new indicators.
This article aims to highlight what we have seen over the past 12 months in terms of the shift in the SEO Landscape, and how we believe these will impact on search marketing into 2013 and beyond.
Back in early 2011, Google rolled out "an algorithmic improvement designed to help people find more high-quality sites in search". This update became known as the Panda update, and has since seen approximately 23 iterations, updates and refreshes.
Panda was designed to reward what Google deemed to be ‘high-quality’ sites, and reduce the rankings of those sites considered low-value for users in terms of their on-page content. We have seen this impact many sites when it comes to duplicate content (whether duplicated within a site or across multiple sites), thin content, low-quality content and more.
In our experience it has had a negative impact particularly on e-commerce sites with duplicated content across category pages, sub-category pages and product pages.
Other sites primarily targeted by the Panda update have included scraped websites (content automatically copied and re-published by a website) as well as low-quality affiliate websites.
We envisage further iterations of Panda throughout 2013 as Google continue to improve their algorithms to detect deeper levels of unintentional duplicate content.
On-page, or on-site, optimisation is still an integral part of the changing SEO Landscape. In light of the Panda update, Google values high quality, unique content now more than ever, and that includes the on-page information that has been the cornerstone of SEO over the years.
However, there is also a shift towards reducing duplication and unnecessary pages within websites. With this in mind, it is imperative to fully evaluate your on-site optimisation strategy with regards to each page’s URL, Page Title, H1 Heading, Meta Keywords, Meta Description and Content, as well as the website’s structure, information architecture and more, to help remove duplicate content throughout a website.
This process has become ever more technical, requiring that a website meets a clean, optimised set of standards, and has seen the introduction of methods to improve canonicalisation, parameter handling and other technical aspects which can otherwise hinder a website.
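As a brief illustration of one such method: a rel="canonical" link element tells search engines which version of a duplicated or parameter-laden page should be treated as the primary one (the URLs below are hypothetical).

```html
<!-- Placed in the <head> of a duplicate or parameter-laden page, e.g.
     http://www.example.com/shoes?sort=price&page=2 -->
<!-- Points search engines at the preferred (canonical) version: -->
<link rel="canonical" href="http://www.example.com/shoes" />
```

This is particularly useful for e-commerce category pages, where sorting and filtering parameters can generate many near-identical URLs for the same underlying content.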
Comprehensive keyword research around the theme and content of your site should be undertaken regularly, both to stay ahead of shifting trends and the competitive landscape of your field or sector, and to refresh and update the on-page optimisation accordingly.
Despite various algorithmic updates to do with content, links and more, the value of on-site optimisation should not be underestimated in 2013.
Linked to Google’s Panda algorithm is the notion of producing quality content and devising an effective content marketing strategy for your site. The content of your website should not only be relevant, thematic and unique, but should be outstanding, share-worthy and above-all add value to users.
The mantra of ‘content is king’ has been heard for years now, and it is perhaps more accurate a statement now than ever. Whilst the value of a natural, bespoke and effective link-building campaign should not be underestimated, Google has recently placed far more weight on quality content when it comes to ranking sites than it ever has before.
Not only does this refer to the ‘words on a page’ of a website, but to the overarching content strategy and similarly, content marketing strategy.
Each page of a website should serve a purpose, not to a search engine, but to a user. Each page should be well considered and thought-out, and add value to the site as a whole as well as the user, making it a reliable source of authority and credence within its sector or market.
A content marketing strategy refers to the ability to not only create outstanding content for one’s own site, but to create content that others will want to read, share, write about and link to. Content Marketing falls somewhere between the traditional pillars of SEO of Content and Linkbuilding; it is the process of creating high quality content for your site that will naturally acquire links via its very nature of being outstanding, thematic and relevant.
In April 2012, and after weeks of speculation, Google finally confirmed they were rolling out a ‘webspam update’ that many SEOs had referred to as an ‘over-optimisation penalty’; this algorithm update was soon given the Penguin moniker.
Penguin looked to address various ‘spam’ factors, including keyword stuffing, the utilisation of link schemes and other ‘black-hat’ practices used to artificially manipulate the search engines in pursuit of higher rankings. These strategies looked to exploit loopholes and find shortcuts rather than ranking naturally via quality content and quality links. The Penguin update clamped down on these practices and aimed to reward those sites that are valuable and ‘good’ in the eyes of users, not just algorithms.
There have been approximately three updates to the Penguin algorithm as Google looks to “decrease rankings for sites that we believe are violating our quality guidelines”.
We have seen new clients come to us having been adversely impacted by Penguin after previously undertaking black-hat link-building practices, including the use of link networks, low-quality links and other examples of webspam such as link wheels, link packages (e.g. 1,000 links for $5) and directory packages (1,000 directories for $5).
In light of the Penguin algorithm update, the process of backlink analysis, with the intention of pursuing backlink removal, has come to the fore as something every website owner should be au fait with. Due to poor and low-quality link-building practices, many link profiles now appear unnatural in the eyes of the major search engines, resulting in penalties and a decrease in rankings, ultimately impacting adversely on traffic and revenue.
With the help of an experienced online marketing agency, website owners can undertake backlink analysis and removal in order to highlight those links contravening Google’s quality guidelines and pursue their removal by contacting site owners. In some cases, site owners are not contactable, are impossible to trace, or request a fee to remove the offending links, which is where Google’s relatively new Disavow tool comes into its own.
When a harmful link cannot be removed via the normal channels, you can use the Disavow tool to request that Google discounts any effect that the said link/domain has on your site, thus negating any possible penalty attached to the link/domain.
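For reference, the Disavow tool accepts a plain text file listing individual URLs or whole domains to be discounted, one per line, with comment lines beginning with # (the domains below are hypothetical):

```
# Requested removal from these sites; owners did not respond
domain:spammy-link-network.example
# A single harmful page rather than the whole domain
http://low-quality-directory.example/our-client-listing.html
```

Disavowing a whole domain via the domain: prefix is generally preferable where every link from that site is harmful, as it covers any pages you may have missed.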
We have seen many new clients come to us having been affected by Penguin, seeking backlink analysis and backlink removal ahead of a Reconsideration Request, or in order to prevent such measures from Google. We predict that 2013 will see more site owners appreciate the impact of Penguin and seek such services.
Google first confirmed that they were using social signals in determining ranking in late 2010, by including data from Twitter and Facebook to help shape the SERPs (search engine results pages).
With the proliferation in popularity of social networks over recent years, the major search engines have been prudent in adapting their algorithms in an attempt to reflect the popularity of websites, content, products, sentiment etc. in the SERPs; thus rewarding sociable, ‘liked’ and ‘shared’ content with higher rankings, as these are signals of quality, trust and popularity – key factors in a search engine’s algorithmic reward scheme.
Social signals tie together social media marketing and search engine optimisation, and their importance will continue to grow over the coming 12 months and beyond as Google heralds in a new era of social search.
Through the use of structured data and schema mark-up, Google currently supports rich snippets for a variety of elements, including people, reviews, products, recipes and more. Essentially, what this means is that if your data is marked up effectively, you can alter the appearance of your listing on the SERPs in order to help improve click through rates, prominence, and perhaps even ranking.
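As a sketch of what such mark-up can look like, the following uses schema.org microdata for a review rich snippet (the product, rating and author are hypothetical):

```html
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="itemReviewed">Example Widget</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rated <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
  by <span itemprop="author">A. Reviewer</span>
</div>
```

With mark-up like this in place, Google can display the star rating alongside the listing in the SERPs, which is where the click-through-rate benefit comes from.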
Currently seen as more of a user experience benefit, it is possible that schema mark-up could be used as a ranking factor in the near future, particularly through Author Rank, whereby the author of a piece of content is shown on the SERPs.
The use of structured data, and the way in which Google is using this data to alter the SERPs, shows how various pieces of information and the multiple factors affecting the SEO Landscape have a knock-on effect on each other, and are all inextricably linked in a quest to provide the ultimate in relevant, quality search results.
With the proliferation in the use of mobile devices and tablet computers over recent months and years, the importance of Local search and International search has never been more evident.
With the creation of Google Plus and Google Places for Business, Google now allows businesses and site owners to tap into a local market whereby users are searching with geo-location enabled, allowing location-sensitive results to be displayed.
Similarly, through the effective use of geo-location and geo-targeting, site owners can look to target global markets with relevant content and/or currency, depending on how the data is marked up.
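One common way of signalling such geo-targeting to search engines (an assumption here, as the mechanism is not named above) is the hreflang annotation, which indicates which regional version of a page should be served to which audience (the URLs below are hypothetical):

```html
<!-- In the <head> of each regional version of the page -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
<link rel="alternate" hreflang="de" href="http://www.example.de/" />
```

Each regional version should carry the full set of annotations, so that the alternates reference one another reciprocally.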
As mentioned in the introduction of this document, and alluded to throughout, the SEO Landscape continues to shift, with various factors and signals coming and going each year. With this in mind, it is impossible to predict future algorithm updates with any degree of certainty.
However, as a leading Online Marketing Agency, Mediaworks keeps its finger on the pulse of the SEO Landscape and so is in a position to predict future trends in online marketing, with a view to future-proofing our clients’ websites so that the only impact their sites will feel from future updates will be a positive one.
One such potential update revolves around the use of anchor text phrases both internally and externally, with less relevance being put on the anchor text itself, and more relevance being put on the context in which the link appears. Thinking back to the Penguin and Panda updates, it is easy to see the logical step with this potential update – targeting over-optimisation and providing users with quality, thematic and relevant content.
This is just one such example of future algorithm updates that we feel we are in an informed and privileged position to speculate on with some degree of accuracy, and that we use to inform our 2013 search marketing strategy for all of our current clients.
Although not strictly part of the SEO Landscape, we predict a seismic shift in how our clients, and how the online retail market in general, approaches online marketing and online spend in 2013 and beyond.
Conversion Rate Optimisation (CRO) is set to increase in visibility and importance as brands and retailers come to the realisation that they operate in a saturated marketplace, and look to make the most of the traffic that an effective SEO campaign brings them.
By looking at ‘best practice’ in terms of design and structure for brochure and ecommerce sites as well as the usability, user experience and user journey of the site, site owners can seek to reduce exit rates, abandonment rates and bounce rates and increase conversions, whether these are micro conversions or macro conversions.
Mediaworks has worked on successful CRO campaigns for a variety of brands and retailers and has a track record in improving the conversion rates of email sign-ups, contact form submissions, and checkout completions; leading to an increase in leads generated, transactions and revenue.