Getting Found Post-Panda: Understanding the New Rules for Getting to the Top in Google
By Julie King | July 4, 2011
This is a critical time in search. The rules used to rank websites have changed and many companies — including some that sell search optimization and social media marketing services — have not kept up.
Google's release of its Panda algorithm, which was rolled out in the US in February 2011 and then to the rest of the English-speaking world in April 2011, was a response to pressure to filter spam and poor content out of its top search results.
This update, along with many other algorithm tweaks, has changed some important fundamentals. If you want to do well in Google's search results, you need to understand the new rules of the game.
Here is what the expert panel from the Search Engine Strategies (SES) Toronto conference had to say about how things have changed.
Panda-pocalypse Survival Guide at SES Toronto
This session, moderated by Jonathan Allen of Search Engine Watch, saw four industry experts debate the impact of Panda as well as what small and medium-sized businesses should do to optimize their sites going forward.
The panellists were:
- Terry Van Horne, partner of Toronto-based Reliable SEO and SEO Training Dojo;
- Dave Davies, the CEO of Victoria-based Beanstalk SEO;
- Garry Przyklenk, TD Bank Financial Group; and
- Thom Craver, a web and database specialist with Rochester-based Saunders College (RIT).
One thing that all panellists agreed on is that the Panda algorithm is only one of many tweaks that Google has rolled out over the past six months. Here is a summary of their thoughts on search engine optimization (SEO) going forward from Panda.
What does Google think quality content is?
A key aspect of the Panda update is that Google is now using different metrics to try to separate quality content from chaff.
To do this, they are now looking at user interactions, or 'stick time', as Davies said.
This means that factors like the amount of time a person spends looking at a piece of content will be used to measure content quality. Davies noted that this is particularly true of entry points on a website. In the future he expects the search engine to look at actual user statistics as they come in and use that data as a metric as well.
"Another aspect of Panda is that Google finally admitted that they use user data in their algorithm," said Allen. "So click-through rates on search listings, presumably the amount of time the site's visited, which they are tracking through Chrome ... these kind of signals they've now admitted are part of the algorithm."
This means that having most of your users arrive at a page, stay for only a few seconds and then leave without looking at other pages could hurt your search rankings.
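As a rough illustration of how you might watch these signals yourself, the sketch below computes bounce rate and average stick time per landing page from per-session records. The data structure and numbers are hypothetical; assume you can export something similar (landing page, pages viewed, time on site) from your analytics package.

```python
from collections import defaultdict

# Hypothetical session records: (landing_page, pages_viewed, seconds_on_site)
sessions = [
    ("/products", 1, 4),
    ("/products", 3, 95),
    ("/blog/panda", 1, 6),
    ("/blog/panda", 1, 8),
    ("/about", 2, 40),
]

stats = defaultdict(lambda: {"visits": 0, "bounces": 0, "seconds": 0})
for page, views, seconds in sessions:
    s = stats[page]
    s["visits"] += 1
    s["seconds"] += seconds
    if views == 1:  # a single-page visit counts as a bounce
        s["bounces"] += 1

for page, s in sorted(stats.items()):
    bounce_rate = s["bounces"] / s["visits"]
    avg_time = s["seconds"] / s["visits"]
    print(f"{page}: bounce rate {bounce_rate:.0%}, avg stick time {avg_time:.0f}s")
```

Pages that combine a high bounce rate with only a few seconds of stick time are the ones to review first.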
Google will also be looking to measure content quality using direct input from searchers through three new signals:
- the new "+1" button, which is similar to Facebook's Like button;
- site blocking in Chrome; and
- a new "block this site" option in Google's search results that appears if someone uses the back button after clicking a link (this option currently appears only on google.com).
While these three signals may not make up a major part of the algorithm at this time, they will be important factors to monitor.
Probing the importance of user design & conversions metrics
Allen pointed out that if a panel of human raters is helping determine how Google's algorithm punishes and rewards sites, one thing site owners can do to improve their search performance is to improve their user interface.
The importance of building your site for users and focusing on site usability was a point that was repeated over and over by panellists.
When asked about the top things site owners should prioritize if they are trying to recover from Panda, both Davies and Craver said to start by looking at your site as a user.
"Make sure your visitor experience is solid," said Davies. "Review your content and don't view it as you. Try and view it as your target audience. Try and view it as potential link sources - so try and view it as an editor or blogger."
Craver pointed out that every site hit by Panda had duplicate content, whether its owners thought so or not. He suggested that site owners go through every link on their site, removing any duplicate content they find, along with extraneous code and other things that don't belong.
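One low-tech way to act on Craver's advice is to fingerprint the extracted text of each page and flag exact matches. The sketch below assumes you already have a crawl of your site as a URL-to-text mapping (the URLs and text here are made up, and real duplicate detection would also need to catch near-duplicates):

```python
import hashlib

# Hypothetical crawl output: URL -> extracted page text
pages = {
    "/widgets": "Our widgets are the best widgets in town.",
    "/widgets?sort=price": "Our widgets are the best widgets in town.",
    "/gadgets": "Gadgets for every budget.",
}

def fingerprint(text: str) -> str:
    # Normalize whitespace and case so trivial variations still match
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode()).hexdigest()

seen = {}
duplicates = []
for url, text in pages.items():
    fp = fingerprint(text)
    if fp in seen:
        duplicates.append((url, seen[fp]))  # this URL repeats an earlier page
    else:
        seen[fp] = url

for dup, original in duplicates:
    print(f"{dup} duplicates {original}")
```

In this example the sorted product listing would be flagged as a duplicate of the main widgets page, the kind of URL-parameter duplication many site owners don't realize they have.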
As you improve your site's performance for users (making it load faster, look cleaner and navigate more easily) and remove shallow content (pages with more keywords than actual substance), you will improve in the eyes of both Google and your end users.
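To spot the kind of thin, keyword-stuffed pages described above, one crude heuristic is to check what share of a page's words is taken up by its few most repeated terms. This is only an illustrative sketch, not a metric Google has published:

```python
import re
from collections import Counter

def keyword_density(text: str, top_n: int = 3) -> float:
    """Share of the page's words taken up by its most repeated words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    top = Counter(words).most_common(top_n)
    return sum(count for _, count in top) / len(words)

# Hypothetical page texts
thin = "cheap widgets cheap widgets buy cheap widgets now cheap widgets"
rich = ("Our widgets are machined from recycled aluminium and ship "
        "within two days, with a one-year warranty on every order.")

print(f"thin page density: {keyword_density(thin):.0%}")
print(f"rich page density: {keyword_density(rich):.0%}")
```

A page where three words account for most of the text is a candidate for rewriting or removal; a page with varied vocabulary generally has something to say.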
Google has moved Webmaster Tools data into Google Analytics, giving webmasters another tool to clean up their sites and, in turn, hopefully their rankings.
Craver also pointed out that owners of sites hit by Panda have focused on the traffic impact, but traffic is just one metric.
More important in many cases is whether there was an impact on conversions relative to your goals, something very few audience members indicated they were tracking.
"Build sites for users, not search engines," said Van Horne. "I want to build my site for users because users actually buy things. Google's never bought nothing from me."