
Published on July 24th, 2013 | by Derek Devlin


Learn from a Page Layout Victim

I recently attended a local business event and got chatting to one of the attendees about business on the web.  Turns out he has a successful real estate agency and has had an online presence for a number of years.  He was keen to know what I thought about his website so I had a look…

He had not even heard of Google Analytics, let alone used it (this in itself set alarm bells ringing), so instead I jumped over to SEMrush and ran an organic search visibility check to take a look at his traffic trend. That’s when I made a startling discovery…

Organic Search Visibility for his site tanked at the tail end of 2012!

You don’t need to be an experienced SEO to see that we have an issue here!

The crazy thing is that this was news to my new friend. He hadn’t even realised that his site’s traffic had plummeted and was trickling along at a quarter of what it had enjoyed previously, and he was astonished to find out that the drop had been in place for more than six months!

What had caused such a drastic drop in Search Engine traffic?

There were a number of algorithm implementations and changes that may have impacted the site, but first I visited it to see if any glaring problems would jump out at me.

Sure enough, a quick look at his website and my fears were realised…

The site itself was very dated; it was built with tables and primarily used images instead of properly marked-up text.  To the layman it looked like there was a decent amount of text on the site because you could quite clearly read an introduction and some nice testimonials.

The critical flaw was that all of the text was actually images, rather than properly marked-up HTML, which meant there was zero SEO value to be had from the text on the site. Remember your SEO 101: Google can’t spider and read the text in images!
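
To see why this matters, here’s a minimal Python sketch using BeautifulSoup that compares what a text extractor pulls out of image-based markup versus real HTML text. The snippets are made up for illustration, not taken from his site:

    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    # Markup in the style of his site: the "text" is baked into images,
    # so there is nothing for a crawler to read.
    image_based = """
    <table><tr><td>
      <img src="intro-paragraph.png">
      <img src="testimonial-1.png">
    </td></tr></table>
    """

    # The same idea as properly marked-up HTML text (illustrative copy).
    real_text = """
    <h1>City Centre Estate Agents</h1>
    <p>We have been selling homes across the city for twenty years.</p>
    """

    for label, html in (("image-based", image_based), ("real text", real_text)):
        text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)
        print(label, "->", len(text.split()), "indexable words")

    # image-based -> 0 indexable words
    # real text -> 15 indexable words

The image-based version hands a crawler precisely nothing, which is exactly the trap this homepage had fallen into.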

The second issue was that the images were all top-loaded: they were the first thing a user saw “above the fold”, whereas the ‘real’ text content was hidden right at the bottom of the page, nestled in among yet more images.

Although I can’t share the site for confidentiality reasons, here’s a wireframe of what I saw:

What I was looking at was a classic example of a site that had been zapped by Google’s Page Layout Filter, also known as the “Top Heavy” update.

In actual fact, I managed to count just 73 words of text on the whole homepage; the rest were images pretending to be text.

More specifically, it was clear that he had been hit by the second refresh of the algorithm in October 2012.

What is Google’s Top Heavy Algorithm?

Google first announced the Page Layout Algorithm in January of 2012 with the intention of penalising websites that purposefully stuffed too many ads “above the fold”.

The first Page Layout algorithm was implemented on January 19th, 2012, with the Google Webmaster Central Blog reporting that the update impacted less than 1% of queries. Google then confirmed a refresh on October 9th, 2012, which was less far-reaching, affecting 0.7% of all queries.

The algorithm was designed to analyse the layout of the page in question and decipher how much ‘content’ the user sees on the page after clicking on a search result. If your site trips the threshold by packing the top half of the page too densely with ads, that will likely result in a loss of search visibility, since Google deems this a bad user experience.

In the official statement released on the Google Webmaster Central Blog, Matt Cutts said:

“…we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience…If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.”

No doubt in an attempt to quell the inevitable mass panic from publishers and webmasters who run any form of ads on their site, Matt was quick to point out that this algorithmic change:

“does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page.”

Matt continually references ads as the culprit, although he also covers himself by suggesting that what Google is really looking for is “a lot of visible content above the fold”.

Ads or Images – Google Can’t Tell the Difference & Probably Doesn’t Care!

My friend’s website is clear evidence that this algorithm can’t distinguish between images and actual commercial advertising, since his real estate site wasn’t running ads of any kind.

What are ads anyway? Essentially they’re images, and most (but not all) are linked, so it makes sense that Google would struggle to write an algorithm that can detect when an ad is actually an ad and when an image is just an image with a link.

It’s my opinion that the algorithm most likely scans the page and makes a decision based on the amount of valuable user ‘content’, which to you and me means text: actual, properly marked-up HTML text. The algorithm is therefore most likely based on the ratio of images to text in the top half of the page.

When Google hits your site, they expect to see a number of elements, namely a decent body of text and some semantically appropriate headings that describe the subject of the page.  Fail to deliver these simple elements and you could be looking at a loss of search visibility, just like my friend.
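
Nobody outside Google knows the real scoring, but to make the idea concrete, here’s a toy Python sketch of the kind of ratio check I’m describing. The threshold is pure guesswork on my part, and a real layout filter would need to render the page to know where the fold actually sits:

    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    # Purely illustrative threshold -- Google's real one is not public.
    MIN_WORDS_PER_IMAGE = 25

    def looks_top_heavy(html: str) -> bool:
        """Crude guess at a 'top heavy' signal: the word-to-image ratio.

        A static parse can't see rendered geometry, so this treats the
        whole page as a stand-in for what sits above the fold.
        """
        soup = BeautifulSoup(html, "html.parser")
        words = len(soup.get_text(separator=" ", strip=True).split())
        images = len(soup.find_all("img"))
        return images > 0 and words / images < MIN_WORDS_PER_IMAGE

    # His homepage in miniature: 73 real words drowned out by images.
    page = "<img src='banner.png'>" * 15 + "<p>" + "word " * 73 + "</p>"
    print(looks_top_heavy(page))  # True -- roughly five words per image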

Breaking down the layout of the page, here’s what the search engine spider could actually crawl:


Seen through the eyes of a search engine, this page doesn’t look that appealing, does it? What gives this page the right to rank?

Ads themselves aren’t ‘evil’. However, excessive use of ads and images to the detriment of text and valuable user content will get you into trouble.

It was very interesting to see a case study of this nature in the flesh, because it’s a penalty I don’t often come across. It refocuses the mind on just how far the Google algorithms have come, and it shows that going back to basics is still one of the most beneficial tasks you can perform for your clients.

So what to do? How do you recover from the Page Layout Algorithm?

This one’s not rocket science: the pages of the site simply need to be reworked to be more user-friendly. The images need to go, replaced by a decent structure of text.

This means semantically correct headings introducing the content on the page, alongside a nice body of descriptive text above the fold. Google can’t read text that is displayed as an image, so putting valuable text into images defeats the purpose. Write it into your HTML markup and feed the spider what it needs… loads of lovely keywords.
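
If you want to sanity-check your own pages along these lines, here’s a rough audit sketch. The URL is a placeholder, and the 200-word minimum is just my assumption of what counts as a ‘decent body’ of text:

    import requests  # pip install requests beautifulsoup4
    from bs4 import BeautifulSoup

    url = "https://example.com/"  # placeholder: the page you want to audit
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    text = soup.get_text(separator=" ", strip=True)
    checks = {
        "exactly one h1 describing the page": len(soup.find_all("h1")) == 1,
        "h2 subheadings structuring the copy": bool(soup.find_all("h2")),
        "a decent body of real text (200+ words)": len(text.split()) >= 200,
        "alt text on every image": all(img.get("alt") for img in soup.find_all("img")),
    }
    for check, passed in checks.items():
        print("PASS" if passed else "FAIL", "-", check)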

If you’re new to SEO, check out this useful guide from MOZ on how to optimise your web pages properly.


About the Author

Derek Devlin is a distinguished Marketing Graduate and Entrepreneur who learned most of what he knows in the trenches as an Affiliate Marketer. A digital marketing enthusiast for 10 years, he is an expert in SEM, Conversion Optimization and Web Analytics and now consults as the Head of Digital Marketing Strategy at one of the UK’s best up-and-coming digital agencies, Made By Crunch.


