Top 12 SEO Tips for 2011 (Q2-Q3) - Post Panda

There are many basic-to-advanced SEO tactics covered in the earlier editions of Top 12 SEO Tips from 2006-2007 that are still spot-on in terms of the current state of SEO. This year, though, my timing was right: the Panda Update was released just a week or two ago.

After a major update I would normally get together with a bunch of other SEO freaks and do some testing to get real data to back my conclusions. This time, however, the research was already done. The same things work now that worked 4 or 5 years ago, just in different proportions.

Most of the tips below relate directly to the Panda Update in some way, because to my mind the basis of this update is saturating the amount of information used to establish the most relevant pages in Google. They are doing this with LSI, social media and information from third-party bookmarking companies like Furl, Facebook and Reddit. This makes it much harder for SEOs to manipulate the results, and easier for Google to identify sites attempting to engineer the results in their favor.

I’ve spoken many times in the past on Latent Semantic Indexing, or LSI as it’s known. Its use by all three major search engines started years ago for various reasons, and now it is being injected into Google’s algorithm more than ever before in the newest release of the “Panda” or “Farmer” update. Unlike the Jagger or Caffeine updates, this one directly targets sites by using LSI along with other triggers.

For those of you who don’t live and breathe all things SEO, LSI is best defined as follows: "Latent semantic indexing allows a search engine to determine what a page is about outside of specifically matching search query text. By placing additional weight on related words in content, LSI has a net effect of lowering the value of pages which only match the specific term and do not back it up with related terms." That was SEO guru Aaron Wall’s description, and probably the best one in layman’s terms.

Over the last four years LSI has come up in nearly half of the conferences at which I have had the opportunity to speak, though unfortunately only I seemed to be talking about it. I started discussing it back when the ‘Minus 950’ penalty hit in one month and Universal Search was released the next.

At that time I pointed out that Google had bought a US-based company named Applied Semantics in April 2003 to further their research on algorithms. Google even talked publicly about the eventual integration of LSI. So did MSN and Yahoo; Yahoo even applied for a patent on their own version. The odd thing was that although these conferences were considered advanced (gaming, affiliate and SEO), no one else was talking about it.

Google has been testing LSI in speech recognition software, cross-language document retrieval and search engine integration, among many other areas, for a long time. It’s a bit scary when you think about it. The military used this technology to identify terrorist chatter amidst billions of spoken words in a short time frame.

When you apply this powerful tool to search, you reduce the potential for engineered results from guys like me - SEOs. Because it looks at related terms, it makes it a thousand times harder to “game” the algorithm, given the sheer number of terms that could be considered related. Apply this to environments like social networks, add in sloppy linguistics, slang, native language and a little local salt, and you get a lot of variables. That’s the data Google has at its disposal to analyse. It’s even scarier when you consider that Google gets information from everything you do. Ever think about the fact that Gmail serves up AdSense ads in your email that have some relevance to the actual content of your email?

Techniques like buying links with related anchor text take on a whole new meaning when you add LSI into the mix.

I personally think that with all the data Google has now collected, they can finally utilise the findings of Applied Semantics, build their own sort of “semantic synonym” dictionary, and use it as an algorithmic plug-in at will.

It makes sense that they would use this ability, and they may even have a knob to turn it up or down. The easiest way to reduce the ability of SEOs to manipulate the results is to widen the playing field (or add more information to it) and water down that ability. At the same time, they may genuinely have started on the way toward delivering a top-notch set of results.

Universal Search was meant to do this back in 2007, and although it helped to shuffle the type of results, there were too many blogs, articles and, undoubtedly, a few spammy sites too. They have been tweaking it ever since, averaging one update per day.

So LSI, or long-tail keyword targeting, is now hugely significant and simply a part of life, and although it seems daunting, successful SEO can still be broken down into manageable tasks. It may require that you outsource much of the grinding work like content writing, but many of the tasks I paid tens of thousands to get accomplished over the past 10+ years have now been automated.

There are a few new elements like Social Media, Web 2.0 and Social Bookmarking, but even these can be managed.

One other thing that I would like to say before jumping into the Top 12 SEO Tips for 2011, because it really needs to be said: before April 12th the playing field was dominated by established players, and even though it still is to a point, the long-tail variable in Google has opened the gates for people who have struggled in the past.

Here are the facts: 25% of all searches are first-time searches. That is a stunning statistic.

When you combine that fact with being measured on conversions and conversion rates in whatever you do, this next statistic will really open your eyes: 60% of all conversions come from that 25%. That’s a conversion rate of 15%. When you consider that the average conversion rate across all sectors is between 3-5%, and in gaming it’s 1.5 to 2%, 15% sounds pretty good, doesn’t it? Well, you would think.

When I get ambitious SMEs or top FTSE/Fortune 500 companies questioning whether or not an investment in an SEO, PPC and social media mix for 12 months is worth the risk, it “twists me melon” (I live in Yorkshire, England, and have been trying to fit that in somewhere, but coming from an American it just doesn’t sound right). With what other investment can you be out of the red in 6-12 months? What business school did these people go to?

Whether it’s gaming, forex, binary options, memory foam mattresses or pet supplies, you can make money very quickly when SEO, PPC, social media and conversion optimisation are implemented correctly. It has always been a “sure bet” scenario, and now the Farmer Update has increased those odds.

If you are an affiliate this is huge, because we have always targeted the long tail, or rather had to target it, because of the big investment and/or brand protection in PPC.

So where some feel this will hurt them, look at it this way: affiliates have always delivered 40% of the income of any given company regardless of sector since time (Internet time, that is) began, and this will not change. Plenty of the top super affiliate websites have had good content and have been neck-deep in social media and everything I have suggested to this point for some time now. Have a look at Bingoport; they are both top-notch SEOs and social media strategists.

So just to finish on this thought and move on: if you haven’t financially prepared yourself for a 12-month building process, then you shouldn’t be investing in what you see as such a high-risk endeavor. Some will reach the top 10 very quickly depending on the niche, and most will start to offset the expenses almost immediately using PPC (pay per click), but it’s now going to take a bit longer and require more effort.

Here is a mix of on-page and off-page tips that I feel are currently the most significant. I do Top 12 SEO Tips every year, and the previous versions can be found in my earlier posts.




1. Navigation (on-page SEO elements)
Use your canonical tag (see the example below)
Keep your URLs clean
Don’t use relative URLs when you can use absolute URLs
Check for infinite loops
Identify duplicate content issues with the cross-domain rel="canonical" link element
Use the canonical link element at page level
Remember the First Link Rule
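
A minimal sketch of the first three items in practice (www.yoursite.com and the paths are placeholders, not a real site):

<!-- In the <head>: declare the preferred version of this page -->
<link rel="canonical" href="http://www.yoursite.com/seo-tips/">
<!-- In the body: absolute internal links rather than relative paths -->
<a href="http://www.yoursite.com/seo-tips/panda-update/">Panda Update tips</a>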


2. Check your site for errors

I regularly run Xenu or Screaming Frog SEO Spider to identify orphan files, broken links and other on-page issues. I’ve used Xenu for years, but since it is not an SEO tool, the elements that it checks have grown outdated. Screaming Frog checks the following;
Errors – Client & server errors (4XX, 5XX)
Redirects – (3XX, permanent or temporary)
External Links – All followed links and their subsequent status codes
URI Issues – Non ASCII characters, underscores, uppercase characters, dynamic uris, over 115 characters
Duplicate Pages – Hash value / MD5 checksum lookup for pages with duplicate content
Page Title – Missing, duplicate, over 70 characters, same as h1, multiple
Meta Description – Missing, duplicate, over 156 characters, multiple
Meta Keyword – Mainly for reference as it’s only (barely) used by Yahoo the last time I checked. Missing, duplicate, multiple
H1 – Missing, duplicate, over 70 characters, multiple
H2 – Missing, duplicate, over 70 characters, multiple
Meta Robots – Index, noindex, follow, nofollow, noarchive, nosnippet, noodp, noydir etc
Meta Refresh – Including target page and time delay
Canonical link element
File Size - SPEED COUNTS
Page depth level
Inlinks – All pages linking to a URI
Outlinks – All pages a URI links out to
Anchor Text – All link text. Alt text from images with links
Follow & Nofollow – At link level (true/false)
Images – All URIs with the image link & all images from a given page. Images over 100kb, missing alt text, alt text over 100 characters
Custom Source Code Search – The spider allows you to find anything you want in the source code of a website! Whether that’s analytics code, specific text, or code etc.



3. Write Your Content with LSA (Latent Semantic Analysis) in Mind

Semantic technology is about signaling what kind of content you are publishing; on an item-by-item or field-by-field basis, publishers can help make the meaning of their text readable by machines. If machines are able to determine the meaning of the content on a page, then our human brains don’t have to waste time determining, for example, which search results go beyond merely containing our keywords and actually mean what we are looking for.

Because of this move, we will now be able to clearly designate content on a page as related to other particular content. More sites than ever are now utilising CMSs (Content Management Systems), and these provide a fantastic springboard for this type of tool to flourish. Where this will play into both flat, HTML-based websites and CMS-populated sites is that, much like the meta tags and descriptions of the past, content owners will now be able to provide structured data to Yahoo for potential display in enhanced listings in search results.

One example would be to use a WordPress-style application that interlinks similar content pages using keyword sets identified through your log files or a program like Hittail. This is still in its early stages, but Google has shown signs of adopting this technology as well, so forward thinking here will definitely pay off. Yahoo wishes to incorporate microformats such as FOAF, geoRSS, hReview and hAtom in order to initially fuel this engine with website elements such as forums, reviews and feedback. This means that social networking and viral content will play an increasingly significant role in the way that websites rank organically. There is a ton of other information available on this, but not enough room or time right here. Just mark my words: do your homework on this and you could find yourself riding the top ten with nothing but bluer skies to come.

Hot Tip #3 – Use Hittail to find the actual search queries people are using to find your website and inject these terms (3-, 4-, 5- and 6-keyword phrases) into your content. Even if there is a different word that has basically the same meaning (e.g. lorry and truck), use those as well. This is what LSI is designed to detect. Spending the time and money to build a CMS that uses all of these variations will put your site well ahead of the competition.
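
As a hedged illustration of what that looks like in the markup (the pages and URLs here are hypothetical), a paragraph carrying both synonym variants and long-tail phrases, interlinked:

<p>Compare <a href="/lorry-insurance-quotes/">lorry insurance quotes</a> before you buy;
<a href="/cheap-truck-insurance-uk/">cheap truck insurance in the UK</a> reaches the same
searchers under a different term.</p>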


4. On-Page Advertising - Ditch it

Remove advertisements from your index page; unless they’re making you rich, take them all off. I have proven this to be a negative factor, and even one that can draw a marginal penalty of 10 or so spots. Your own banners are okay; it’s the three AdSense blocks, four AdBrites and the rest of the lot that make it look just like an affiliate portal.

5. Use Subdomains Effectively

Let me say this loud and clear: Google treats subdomains as individual websites. So blog.yoursite.com is a completely different website from yoursite.com. The key to this strategy is to link the main site to the subdomain. This passes the trust and authority of the main site to the subdomain website.

In one example I tracked, bingoplayeronline.com ranked second and third for a long-tail search phrase using this approach. This is one terrific reason to use subdomains rather than subdirectories.
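
The linking step itself is trivial; a minimal sketch (blog.yoursite.com is a placeholder), placed somewhere crawlable on the main site’s pages:

<!-- On www.yoursite.com, a followed link passing trust and authority to the subdomain -->
<a href="http://blog.yoursite.com/">Read our blog</a>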


6. Geotargeting for Language or Regional Targeting

The various ways that people search and the results the search engines are delivering are evolving rapidly. Smarter queries and more complex algorithms mean that you need to use various techniques to be sure you are showing up in the results. Local search, advanced search, regional search and language-based searches are some of the filters an end-user or a search engine can use in determining who shows up, when they show up and where they show up.
Geotargeting is one tool Google has refined and one that you can manipulate to a point in order to increase saturation in any market. Beyond the obvious on-page considerations, different searches will deliver (in most cases) a different set of results.

The results can differ greatly depending on several considerations;
1. The IP of the end-user
2. The server location of the website
3. Any geographically targeted settings in Webmaster Central
4. The relationship between the search filters and the resulting web pages (i.e. did they search for pages from [region] or pages in [language]?)
5. Whether the end-user is searching a different extension than the default engine (they manually enter Google.com searching for US or English results in a non-US region)


The other elements that will affect rankings relate to back links;
1. Are the links from a TLD that matches the destination URL (i.e. a .nl site linking to a .nl website)?
2. Is the linking website’s IP located in the same region as the linked URL?
3. PageRank, linking anchor text, and additional outbound links on the page linking to you
4. On-page relevancy
5. Language-based meta tags
6. Everything in the above five items as it relates to the linking website/page

Any one of these elements can give you an edge over your competition. Searching any of Google's (non-US) datasets will generally return a variety of websites when no language or location filter is selected. These can include internal pages of a website, subdirectories (www.yoursite.com/french), subdomains (french.yoursite.com), and various TLDs (top-level domains like .com and .nl). All 11 of the above factors are present (but not exclusive) in the automatic algorithm.

The problem is that no one really knows which approach is best, or which algorithmic attribute is the most effective, so what can we do with this? What we want to do is look at the existing results using the available search filters and at the existing websites that rank high, and determine the best strategy for your website. This takes deep page analysis of your competitors.

The important thing to note is that there is no fixed hierarchy between one solution and another. Every website has its own individual solution based on its demographics, site mechanics and available resources.

What you need to consider are;
1. What is your target market?
2. Do you need geographical targeting?
3. Do you need language-based subdomains or subdirectories?
4. Should you move hosting?
5. Can you afford to do it all?

How & When to Use Geographical Targeting
Here's what to do if you wish to;

Geographically target a region
1. Create a subdomain or a subdirectory in the native language and use Webmaster Central to geographically target it
2. Host the subdomain on a server in the native region and use geographical targeting
3. Build back links from similar TLDs

Target a specific language
1. Create a subdirectory in the native language (i.e. www.yoursite.com/nl/)
2. Build back links from same-language websites
3. Do not use geographical targeting

The reason you do not want to use geographical targeting along with a language-based strategy is that if the end-user searches in the native language on Google.com, a site using content in that language will be stronger than the same site with geographical targeting in place. (This isn't dependent on whether you use subdirectories or subdomains, unless you host the subdomain in the target region.)
The answer for me is that I want it all...and NOW!

I've recently had subdomains with geographical targeting turned on and content in the native language rank top 10 in six weeks. I've had brand new websites with the appropriate TLDs (i.e. .nl, .de & .es) show up in eight weeks. I've even had a .com hosted in the US without geographical targeting show up in the top 10 UK results for “Hollywood” terms when it had never appeared in UK results before.

You can start with subdomains. Look at your log files to determine where your current traffic is coming from and what to do first. Bounce rates can also tell you a lot.
For example, if your secondary traffic source is Germany and you have a high bounce rate, start with a language-based subdirectory, then maybe move on to creating a subdomain, hosting it in Germany, and setting the geographical targeting to Germany in Webmaster Central. Then go back and start all over again with the region that has the next-highest contribution.

Important Things to Remember!
• To target a language using only subdirectories, do not use geographic targeting
• You can target a language with both subdomains and subdirectories, but if you have a generic TLD (.com), use subdirectories rather than subdomains
• You can use Google geographical targeting on subdomains and subdirectories
• Your title should be in the native language and/or use regional slang terms where they apply
• Use language-based meta tags whenever targeting language-based searches (see the sketch after this list)
• Host subdomains that are for geographical targeting in the target region
• When you implement the subdomain strategy, link to it from the original website
• Create new sitemaps for each subdomain
• When creating meta tags and content, be sure to use native terms (if you sold pants in the US and the UK, remember that in the UK pants are called trousers and sweaters are called jumpers)
• Get back links from same TLDs (get a .nl link to your .nl site in the native language)
• If you have a ccTLD (like .nl or .de), do not use geographical targeting; these domains are already associated with their designated region
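
A minimal sketch of language-based meta tags for a Dutch-language subdirectory page (the Dutch wording is illustrative only):

<head>
<meta http-equiv="content-language" content="nl">
<meta name="description" content="Speel online bingo in Nederland.">
<title>Online bingo spelen in Nederland</title>
</head>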


TIP – In the past, when you moved domains to a new host (or, in this case, subdomains), it could take up to a week. Google WMC now has a tool that makes this almost instantaneous. Just get your ‘A’ record pointed at the new host, move your content, and set up any redirects from the parent site. (Remember, linking to the new subdomain from your parent site will pass nearly 100% of the PR, trust and authority, even though it’s seen and treated as a stand-alone website.)

In a nutshell, I recommend that if you already have an existing website with a TLD like .com or .co.uk, and that is your target market, do not use the geographical targeting option. Start building subdirectories using the top native language, determined by looking at Google Analytics or your log files. Identify your top referrer language. If the languages are close, as is the case between the US, UK, New Zealand and Australia, use native slang in the title, meta tags and content. Build a new XML sitemap and manually submit it through all the main search engines.
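
For reference, a minimal sitemap sketch using the standard sitemaps.org schema (the URL is a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://www.yoursite.com/nl/</loc>
</url>
</urlset>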

The next step is to create a subdomain and get it hosted in the region that you are targeting. Build content in the native language, submit it, and set up the geographical target in Webmaster Central.

By implementing this strategy, you will have a significant advantage over most of your competition (or a little less after this article is released).

Whether the search is initiated in the region or outside the region, whether your site is located in the region or just hosted there, or even if they search in the native language or manually enter a specific Google engine like Google.com.mx or Google.es, you will have improved saturation.

UPDATE - If you own a country-specific ccTLD like .nl, .be or .ca, build this site as soon as possible. These are finally starting to outrank TLDs like .com. In past issues of Top 12 Tips I recommended using TLDs and subdomains. I still recommend subdomains, but Google’s past prejudice against ccTLDs is finally fading, and I said to be on the lookout for this. Well, now it’s happening, so get your ccTLDs in order with some fresh content.

The easiest short-term solution, until you can do it right, is to buy a Drupal or Joomla (or even WordPress, for that matter) template from Template Monster (under $100) and put an RSS feed, a phpBB forum and a blog in place to get it crawled regularly. Work on your permanent site, and when you launch you can either use redirects or bolt it onto the site and benefit from any rankings/traffic, plus get a head start as well. Also remember to link to it from your existing domain. This will give it a little push as well.

7. Canonicalisation

Because of development issues, server settings, programming platforms and even natural site progression, a site may have multiple versions of a homepage, or even of an entire site. Let’s say you upgraded your site from an HTML site to a PHP site. Many times in the crossover, pages are left on the server and remain crawlable to the robots. This presents several issues, to mention just two of the most serious: you may have the same content and the new page never gets indexed, and links back to you may point to different versions.

I’ve seen a website that had eight individual, crawlable versions of its homepage live at the same time. This can reduce the strength of your page. (Ever see a site that has a lower homepage PR than an internal page?) The simplest and least painful way to fix this is with 301 redirects.
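
Where you can’t touch the server configuration to set up 301s, the page-level canonical element from tip #1 is a fallback; a sketch (the URLs are placeholders), placed in the head of each leftover duplicate:

<!-- In the head of the old index.html, pointing at the preferred PHP version -->
<link rel="canonical" href="http://www.yoursite.com/index.php">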

8. Links or Websites?

I have a lot to say about links, and since at least half of the questions I get in the forums, at conferences and from clients are about links, I’m guessing it’s what you are most interested in. Google has said over and over not to buy links, and if you do buy them, then you have probably seen some damage, depending on your techniques.

Establishing what links to buy and where to buy them from has always been a bit speculative, because everything from page size, site size, the number of other links, the quality of those links, and the relevancy of the site has affected what a link is actually worth. Then you had to deal with things like checking that the link was still intact, or even still there. There are dozens of tools out there that were built just for this purpose.

So what do you do? I had a look at all the different methods available, and several are still effective (i.e. blog posting, blog rolls, establishing forum links, Yahoo Answers and so on), but in the end, what I’ve found is that the clear winner for my money is going out there, doing the homework and the negotiation, and actually buying sites. If I own the site then I also own the content, and more importantly the links I’ll put in the existing content. I’m looking for related content, or a website with a silo of related content, where I can take existing keyword phrases and change them into hyperlinks.

I usually look at the overall spend on "sponsorship" (wink, wink) x 18 in order to establish a maximum offer. So if the total amount the client is spending on links is £1,000 per month, I look to buy a site for up to £18k. Anyone who has looked around at brokerages that sell sites knows that £5k can buy you a lot, let alone £18k. Take into account the money saved on maintaining purchased "sponsorship" dues versus your campaign expectations: outside of hosting and domain renewal, you will own the links forever - they will never expire.

There are a few other factors to look at before buying a site;
• Domain Age – 2 years minimum unless other factors outweigh this factor
• Page Rank – yes it does matter when buying a site because it represents G-Trust
• Number of existing back links
• Quality of those back links and are they to any deep pages?
• How many pages are indexed with Google?
• Install McAfee Site Inspector on your computer. It will flag any issues to do with bad history, malware or any one of many other problems a site may have
• Other pluses are host location vs. target location, the number of registrants of the domain, DMOZ listing and rankings. If it’s a forum, the number of active members and recent posts are important.


9. Create a Mobile version of your website.

Mobile is the “next big thing”, and creating a mobile version is free and easy on sites like Zinadoo and Mobeezo.
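
Once it exists, one way to tie the two versions together (a sketch; m.yoursite.com is a placeholder, and this assumes the mobile version lives on its own subdomain) is a link element on each desktop page:

<!-- On the desktop page, pointing handsets and crawlers at the mobile version -->
<link rel="alternate" media="handheld" href="http://m.yoursite.com/">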

You can also get mobile apps built for a few hundred quid. These can be used as viral and social media tools, and to get numerous links from great-ranking sites, like a link from Apple if you build an iPhone app.

Next, use sites like Foursquare, Gowalla, Twitter Places, Loopt, MyTown, BrightKite and Facebook. You could use tools like TwitResponse to set up auto-tweets a month in advance using a basic spreadsheet.


There are also many affiliate offerings for cross-selling gaming mobile apps. I saw one at the EGR conference and the Internet World conference in the last two weeks that was really impressive. It’s a universal smartphone application that acts as a betting interface, works in real time with any gaming platform (e.g. William Hill, Ladbrokes, etc.), and lets you manage and place bets; it even calculates probabilities.

The best part is that you can white label it and sell it to your punters.

This is just one example of the many mobile applications that are emerging in 'm-commerce'.

There are also incredibly 'soft' markets in mobile search, both paid and organic.

10. Optimise for Local Search

If you have a brick-and-mortar shop or an office where you trade and don't already know this, you must be hiding: local search is the easiest way to smash the top spots, and it will soon take on even more power.

Top tips here are;
Put your address and phone number on the site and in the meta tags. Do not use 800, 0800, 0845 etc. numbers.
Add yourself to Google Maps
Add yourself to Bing Local and Yahoo Local
Don’t use URL redirects on your homepage
Be sure your whois information shows the correct address and phone number. It does not matter whether or not you paid extra to ‘hide’ your information; Google is a registrar and can get to it.
Add geo meta tags to your code.

<META NAME="geo.position" CONTENT="latitude; longitude">
<META NAME="geo.placename" CONTENT="Place Name">
<META NAME="geo.region" CONTENT="Country Subdivision Code">
e.g.
<META NAME="geo.position" CONTENT="49.2;-123.4">
or
<META NAME="geo.position" CONTENT="49.2;-123.4">
<META NAME="geo.placename" CONTENT="London, Ontario">
<META NAME="geo.region" CONTENT="CA-ON">
Spend a couple of hours submitting yourself to business directories. A few quid is worth the link alone, and Google uses many of these directories. Good ones are;
GetListed.org
Yelp
Hotfrog
Superpages
InsiderPages
Citysquares
Mojopages
Google Places
Business-directory-uk.co.uk
Thomson Local
Yahoo Local Merchant
Ask Business Search

If you have more than one location, add a landing page or even a website for each. If you have an established website, use subdomains as explained in tip #5.

Set geographic targeting in Google Webmaster Tools
Get local links. Sponsor a local kids' sports team, give incentives like a free pizza night for Facebook links, or offer an auction item for a web-based charity event
Upload a dozen pics with keywords and descriptions that target your area to Picasa and Flickr and geo-tag them. Link to these from your site.
Google Local Business offers the option of including videos in your business listing. Since Google bought YouTube, you can now include links to your YouTube videos and tag them with your target keywords/phrases/location, which not only improves your rankings on Google Local Business but will also help visitors searching on your keywords come across you in the organic search results, as Google typically displays video results on the first page.


11. RSS Feeds for Building Free Links

You create RSS feeds to supply information on site updates, increase traffic to your site and place links in your content that will be republished, providing free links back to your site. Using scraping techniques, black hat SEOs will search out RSS feeds and build a database of the content for future use, spin the content and reuse it, or just republish it outright. In all cases they insert their own links. They build programs or plug-ins (such as the WordPress All-In-One SEO plug-in) that take any chosen word or phrase, search through the content, and, when it's found, change the code to make it a link to their chosen site. There are many other ways this can be used, but you get the basic idea.

The neat thing here is that they need to do this automatically for speed, churning out rolling URLs and websites quickly enough to beat the spam-bots for the few weeks this technique takes to pay off. Because of this, most don't have the time to clean everything up properly, so your links remain.

Publishing the URL of your RSS feed on the many different sites that list these (i.e. FeedBurner) will ensure a nice number of scrapers posting your links to their sites.
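
A minimal sketch of a feed item built with this in mind (the URLs are placeholders): keep a branded, absolute link inside the item body so any republished copy carries the backlink with it.

<item>
<title>Top 12 SEO Tips for 2011</title>
<link>http://www.yoursite.com/seo-tips-2011/</link>
<description><![CDATA[Read the full <a href="http://www.yoursite.com/seo-tips-2011/">Top 12 SEO Tips</a> on our site.]]></description>
</item>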

I guess the first criticism of this would be that they will put you on Viagra sites. Well, even if they do, I haven't seen this particular method draw penalties when I messed around with it. But remember - I don't condone or endorse...blah, blah, blah.

Here are some simple steps to get started with RSS:
• Set up a FeedBurner account to allow your feed to be consumed by others.
• Share your feed on your various social networking profiles. Many social networking sites have the ability to consume and display your RSS feed to your friends, followers, and fans.
• Submit it to multiple RSS submission directories (do a search for RSS submission for some options).


12. Use Social Bookmarking & Social Media

I first talked about social bookmarking and how it could be used as a tool five years ago, when I was at a gaming conference and someone asked me what could be done to improve their rankings just a spot or two. In the gaming industry, top positions for ‘Hollywood’ keywords like poker and bingo within the top 10 results can be significant. One position in the top 3 results can mean the difference between a 6-figure ROI and a 7-figure ROI.

Based on click-through rates in relation to the top 10 ranking positions, the difference between the 3rd and 4th positions is 2.44%. The number of global searches for poker is 24,900,000. Based on that 2.44% click-through difference between the 3rd and 4th positions, this ONE SPOT could equal over £600,000 per month, or over 7 million per annum.

The difference between the next two spots is over 20% less, because if you are not in the top 3 you are not in Google’s syndication network, which includes everyone from AOL, iWon, Ask and Netscape to dozens of other top search engines. And this isn’t PPC or paid results; these are their organic results.

So I understood why someone might be concerned over one position...

What I recommended at this point was scrutinised (especially at the SMX and SES conferences, with primarily SEOs in attendance and other SEOs on the panels). So I proved myself by taking the top 10 results for poker (this was on Google.nl, because I was in Amsterdam) and using what I call a ‘kicker’ listing:

I targeted a site that was below the target site and wasn’t a competitor; in this case it was Pokerlistings.nl. I used Onlywire, a mass-submission social bookmarking service, to submit it. There are now 100 times as many sites and tools like this available that submit a chosen page or site through multiple venues such as Facebook, Twitter, Gmail, Blogger, etc., but back then I used Onlywire. The entire process took 10 minutes, and that was all I did.
By the next day the target site (the guy in the audience) was ranking a position higher.

I remember practically getting chastised at SMX Advanced for “teaching blackhat techniques”, and I even took criticism on a couple of forums I moderate, like GPWA and SEO Chat.

At the end of the day, Google themselves have created a toolbar plugin that allows simultaneous submission to GTalk, Twitter, Facebook and various other social bookmarking sites. At the time, though, the critics said it was ‘grey’.


Social bookmarking can improve your rankings. Five years ago the effect lasted 30-45 days; now it’s a week at best and involves LSI and a few other factors. It’s just one of the things any good SEO will be including in his/her arsenal.

There are lots of companies and software that do this. Some will even ping the page or automatically submit it through article directories and post it to WordPress blogs and phpBB forums. I use these to save time. Some think it’s grey hat SEO, but if I were a normal person who wanted to put something out to the public without the hours or days of work required, I would use tools like this.

Google, Yahoo and Bing all have toolbar plugins which allow you to submit to multiple venues. Use these tools under strict control, as they are easily abused and can cause more harm than good.

Facebook and Twitter are examples of social media, and Google has stated that they now use these signals as a significant ranking element. I always assumed that they did, because what better set of un-engineered data on a mass level could you possibly find?

Take advantage of the automated tools for your daily tasks.
On-site Website Checks - Xenu and Screaming Frog
Article Creation or unique spins - The Best Spinner or Spinner Wizard
Link Analysis (yours and your competitors) - Majestic SEO
Blog Building & Population - Scrapebox & Market Samurai
Automated article submission, social bookmarking, pinging, RSS & press releases - SENukeX
Third Party Outsourced Agencies for Facebook, MySpace, Bebo, etc.
Twitter Professional Tools - TwitResponse & Tweet Adder
 