

Top Seven SEO Concerns for Online Marketing in Google's March 2012 Update

J Clarkson

Google’s March update lists an interesting 50 changes, mostly focused on content quality and better search results. If you believe the scuttlebutt going around, there is a lot of confusion about the updates, as they are claimed to be essentially vague. I am not convinced they are so vague; realistically, they provide a good overview of what Google is trying to achieve (don’t get me wrong, I am not advocating for Google, but I do admire some of the changes they are able to make).  

If you look carefully at the March updates, you should be able to see them falling into groups: SEO-affecting changes, improved search results, a little for sports nuts, a bit for mobile users and some social media updates.  

I’m most interested in the SEO-affecting changes: there are probably 14 that may impact your website ranking and have online marketing implications. Even though keywords are important, some changes reveal exclusions that could break a keyword string into more than one category. If so, it could also mean there is an added dimension to choosing long-tail keywords so that you can appear in two places at once. Now that would be exciting.  

The remaining SEO-like changes reveal more of Google’s goals than ever, that is, to improve the search experience for as many searchers as possible. This is important for all SEO managers: those who have started to use good-quality content and are preparing to update pages regularly (where necessary – now there’s another story) will remain in good stead with Google searches. Let’s now look at what I believe to be the top seven groups affecting SEO decisions. I have shown each Google-reported update in italics.

For a complete list of updates, check out the Google blog.


1.       Anchor Text 

Tweaks to handling of anchor text. [launch codename "PC"] This month we turned off a classifier related to anchor text (the visible text appearing in links). Our experimental data suggested that other methods of anchor processing had greater success, so turning off this component made our scoring cleaner and more robust. 

Better interpretation and use of anchor text. We’ve improved systems we use to interpret and use anchor text, and determine how relevant a given anchor might be for a given query and website. 

Combining these, I expect Google has improved its efficiency in analysing anchor text to streamline the analysis of a page, thus making room for more refined analysis. With that refinement, Google can look at the surrounding information to determine the relevance of content (this seems to have been their goal for some time). It is tinkering that will benefit those with relevant content both on their own pages and on the pages they link to.


2.       Detecting Site Quality 

Improvements to freshness. [launch codename "Abacus", project codename "Freshness"] We launched an improvement to freshness late last year that was very helpful, but it cost significant machine resources. At the time we decided to roll out the change only for news-related traffic. This month we rolled it out for all queries. 

Improvements to freshness in Video Universal. [launch codename "graphite", project codename "Freshness"] We’ve improved the freshness of video results to better detect stale videos and return fresh content. 

More precise detection of old pages. [launch codename "oldn23", project codename “Freshness"] This change improves detection of stale pages in our index by relying on more relevant signals. As a result, fewer stale pages are shown to users. 

Improvements to processing for detection of site quality. [launch codename "Curlup"] We’ve made some improvements to a longstanding system we have to detect site quality. This improvement allows us to get greater confidence in our classifications. 

High-quality sites algorithm data update and freshness improvements. [launch codename “mm”, project codename "Panda"] Like many of the changes we make, aspects of our high-quality sites algorithm depend on processing that’s done offline and pushed on a periodic cycle. In the past month, we’ve pushed updated data for “Panda,” as we mentioned in a recent tweet. We’ve also made improvements to keep our database fresher overall. 

The focus on “freshness” should be taken seriously, considering the resources and efficiency improvements Google has invested to make queries return the most timely results. This is a site quality issue that has proven to give Google what it wants. The detection of stale pages is separate from relevant content, so you need to consider both, along with the possibility of designing your site like a blog to maintain or improve its ranking. At best, just having a blog page on your site could maintain authority. Videos need to be “fresh” as well: you will need to ensure they show signs of user engagement, and I imagine YouTube will carry heavy Google weighting. 

Also, another Google Panda update. This is very important for marketers trying to maintain high rankings. It will weigh more heavily against sites that are going “stale” (i.e. not regularly updated) and are not of much value to searchers. This means the work put into maintaining visibility on authoritative sites only remains of value while the searcher finds it valuable. So if your historical information is no longer of value to the searcher, it must be updated to remain high in the search results.


3.       Link Relevancy 

Sitelinks data refresh. [launch codename "Saralee-76"] Sitelinks (the links that appear beneath some search results and link deeper into the respective site) are generated in part by an offline process that analyzes site structure and other data to determine the most relevant links to show users. We’ve recently updated the data through our offline process. These updates happen frequently (on the order of weeks). 

Sitelinks are the links Google selects to appear under the search result for a page. They are determined by Google and reflect the vote of authority that Google bestows on a site. If you want users of your site to get the good experience Google looks for, take a look at the internal linking that directs users to different parts of the site. If any parts of your site would not make for a good experience, you should block them.  

This is of particular relevance to website owners who maintain a high ranking. If they are showing sitelinks, they will want to keep them, especially if they have close competitors, as sites ranked below theirs are pushed further down the search results (a sweet spot for marketers).
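If you decide a page should be blocked from the index (and so kept out of sitelinks contention), one common approach is a robots meta tag in the page head. This is just an illustrative sketch; the page and title here are invented examples:

```html
<!-- Hypothetical low-value page (e.g. a print-friendly duplicate):
     ask search engines not to index it or follow its links -->
<head>
  <meta name="robots" content="noindex, nofollow">
  <title>Print-friendly version</title>
</head>
```

Alternatively, site owners can block crawling via robots.txt, though a meta tag is the more direct way to keep an already-crawlable page out of the results.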


4.       Image Search 

More relevant image search results. [launch codename "Lice"] This change tunes signals we use related to landing page quality for images. This makes it more likely that you’ll find highly relevant images, even if those images are on pages that are lower quality. 

Improvements to Image Search relevance. [launch codename "sib"] We’ve updated signals to better promote reasonably sized images on high-quality landing pages. 

If your images are important to attracting business, then make sure they are optimized (eg descriptive file names, ALT tags etc.) to gain the brownie points you are after. It’s a great opportunity to display your image high on the results page, even when the page the image appears on is of low quality. 

Also, I think that “reasonably sized” images will add value to the quality of a web page. Either way, if you want your images to be of value, you need to optimize them regardless of the value of the web page they sit on.
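As a rough sketch of what “optimized” can mean in practice (the file name and text below are invented examples, not anything Google prescribes):

```html
<!-- A descriptive file name, meaningful ALT text and explicit dimensions
     all help Google judge an image's relevance and "reasonable" size -->
<img src="/images/red-leather-office-chair.jpg"
     alt="Red leather office chair with chrome base"
     width="400" height="300">
```

Compare that with `img123.jpg` and an empty ALT attribute: the search engine has almost nothing to work with.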


5.       Blog/Forum Page “Freshness” 

Improvements in date detection for blog/forum pages. [launch codename "fibyen", project codename "Dates"] This change improves the algorithm that determines dates for blog and forum pages. 

A natural part of the marketing mix – promotion. Backdating your posts will perhaps no longer be possible. This will put more pressure on site owners to perform by updating their sites with press releases, company updates, product updates, pricing updates, news and industry updates. 


6. Indexing Symbols 

Improvements to handling of symbols for indexing. [launch codename "Deep Maroon"] We generally ignore punctuation symbols in queries. Based on analysis of our query stream, we’ve now started to index the following heavily used symbols: “%”, “$”, “\”, “.”, “@”, “#”, and “+”. We’ll continue to index more symbols as usage warrants. 

Now here’s another winner. The “@” symbol is important to email addresses as well as Twitter usernames. Google is getting better at social media searches – “+”s, anyone?


7. Navigational Queries

Improvements to results for navigational queries. [launch codename "IceMan5"] A “navigational query” is a search where it looks like the user is looking to navigate to a particular website, such as [New York Times] or []. While these searches may seem straightforward, there are still challenges to serving the best results. For example, what if the user doesn’t actually know the right URL? What if the URL they’re searching for seems to be a parked domain (with no content)? This change improves results for this kind of search. 

Better handling of queries with both navigational and local intent. [launch codename "ShieldsUp"] Some queries have both local intent and are very navigational (directed towards a particular website). This change improves the balance of results we show, and helps ensure you’ll find highly relevant navigational results or local results towards the top of the page as appropriate for your query. 

This is great. It can provide more relevant results. “Relevant” may not always be interpreted well by Google, but in this case some improvement is being made by showing results that make more sense to the searcher. If you deliberately change your search from, say, one town to another, the results can be greatly improved by reflecting the larger number of “relevant” searches in that region. This is something local marketers will need to consider, as it may mean high-ranking sites don’t receive the brownie points on some searches.