It drives me nuts when I come across an article, or even worse spend good money on going to a conference, and someone spends an hour of my time telling me about their company or what their company can do to make me #1, when what I came for was some in-depth knowledge or recommendations, or even just an idea or two that I hadn't thought of to improve my SEO knowledge. I'll bet you've been there, eh?
Even worse is spending time in the wrong forum and getting bad advice or propaganda-led guesses.
SEO is a marathon, not a sprint, but these tips will give you a definite advantage over your competition.
1. Get your page titles 100% optimised for search
Page titles are the single best element of overall on-site optimisation that you can control, and one of the top things a search engine looks for in its almighty power that decides the destiny of the page. Will it rank your page or send it to the depths of the supplemental results, where even a bloodhound would find it difficult to sniff out? Theories vary on how best to format the title for 100% optimisation. Long tail titles, keyword stuffing, commas, density, bars and dashes have been tested and debated for many years. I have found the best performance using the following method: First, I sit down and pick my top keyword, then I run it through one of the many keyword tools out there that will show me the number of queries, demographic and geographic data, annual search trends, competition stats and so on. I take these results and start classifying them by this information to establish the pages that I will build. I continue to do this until I get down to 4 or even 5-word phrases. From here I'll start diagramming the navigation of the new section using theme-based threads from the top to the bottom. In some cases I also use buffer words to control keyword weighting. So if I had a "blue widgets" page, the next might be "Find Blue Widgets", below that "Find Blue Widgets in Akron", and maybe even one more, "Where Can I Find Blue Widgets in Akron Ohio?", if it's been searched to some degree. This is called going after the longtail, and some friends of mine over at a company called Hittail made a cool little tool that I put on all my clients' websites. It gives you real-time results for the keywords that people are typing to find your site, as well as which search engine they used. Traditional logfile analysis can be expensive and difficult for the inexperienced, and many times if you have a site getting 10,000 unique visitors a day or even a week, you lose many of the prime longtail keyword strings in the piles of data. It's also a good way to monitor related buzz on your product/service/offering. It is perfectly okay to have the following title: Buy Widgets | Blue Widgets in Akron. I try not to stuff the title tags at all, but I always make sure I use the keyword twice, and I don't duplicate page titles. Then when you are building content in the next steps you will have unique but relevant text to use in the link.
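To make that concrete, here is a rough sketch of how the title tags for that themed section might look, one per page; the exact wording beyond the "Buy Widgets | Blue Widgets in Akron" example above is just an illustration of the keyword-plus-buffer-word idea:

  <title>Buy Widgets | Blue Widgets in Akron</title>             <!-- top "blue widgets" page -->
  <title>Find Blue Widgets | Blue Widget Suppliers</title>        <!-- next page down the theme -->
  <title>Find Blue Widgets in Akron | Blue Widget Stores</title>  <!-- deeper, longer-tail page -->
  <title>Where Can I Find Blue Widgets in Akron Ohio?</title>     <!-- longest-tail page -->

Each page gets its own unique title, with the keyword appearing twice and no two pages sharing the same title.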
2. Optimise your content
There are many on-page elements that can enhance the way the search engines rank your website. Assuming you have completed #1 above, the next step is to optimise the content based on the title that you have used for the page. In some cases where dynamic insertion is used, or an application like Wordpress is installed, you can optimise the way these elements are pulled into the on and off-page fields, such as alt tags for images. If you don't use anything like this, or you built the site with Dreamweaver, there are other plugins and ways to make this process less painful, but I assure you it's worth the TLC. Here is a general list of the on-page and off-page SEO elements that I concentrate on - the places I make sure the keyword appears (there's a short example page after the list):
In the URL - keywords become highlighted in results and increase click-through rate.
In the meta keywords (2-3 max.) - it doesn't hurt to use them, so why not.
In the first and last sentence of the body content, and in bold as well.
In several places throughout the content, but in a different form (i.e. plural).
In header tags. If it makes sense to use h1, h2, h3 and then h4 in the hierarchy of a page, then use them.
In alt tags for the images.
In title attributes for the images.
In HTML comment tags.
In the meta description - see #5 for more details.
In an external link on the page - see #4 for more details.
In a variation of the keyword (i.e. plural) pointed to another relevant internal page (e.g. a Concert Tickets page with "Concert Tickets in Akron" as the linking text, linked to a page optimised for concert tickets in Akron).
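Here is a minimal sketch of a page that hits most of those elements; the file names, image and link targets are made up for illustration, not taken from any real site:

  <!-- URL: http://www.example.com/blue-widgets-akron.html -->
  <html>
  <head>
    <title>Buy Widgets | Blue Widgets in Akron</title>
    <meta name="keywords" content="blue widgets, blue widgets akron">
    <meta name="description" content="Find blue widgets in Akron - compare prices and buy blue widgets online.">
  </head>
  <body>
    <!-- blue widgets in akron -->
    <h1>Blue Widgets in Akron</h1>
    <p><b>Blue widgets in Akron</b> are easier to find than you might think.</p>
    <img src="blue-widget.jpg" alt="blue widgets in Akron" title="blue widgets in Akron">
    <p>New to widgets? See the <a href="http://en.wikipedia.org/wiki/Widget">widget</a> entry on Wikipedia,
       or browse our <a href="blue-widget-deals-akron.html">blue widget deals in Akron</a>.</p>
    <p>However you shop, you can buy blue widgets in Akron from us today.</p>
  </body>
  </html>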
Generally speaking I like to keep pages at a minimum of 250 words of relevant and themed content. This is a very important element to invest your time in, because search engine robots parse, or remember, your template if it's static, which most are. Any optimisation that you have within the template won't have a significant effect on the SERPs (search engine ranking positions), so optimising your content is the best way to be sure your pages are not dropped into what is in essence the dreaded "sandbox", or the supplemental results on Google.
It's also worth mentioning that even if you have an existing site, you can still bolt new sections onto it in many useful ways. Add a community section, a "widget" news area, or anything similar that you can drop these new pages into.
Hot Tip #1 - Do a search for the name of your website, copy the URL string of the results page and use it as the link for your logo. It is a popular belief that the number of searches for your brand and the number of end-users that navigate through to your website influences Google results, and I have to believe that it is a ranking factor that all the search engines use. I can't say exactly how much, but I think long-term this is a good strategy, and I have seen it work with no reduction in any of my traffic stats other than reducing the bounce rates throughout.
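A quick sketch of what that looks like in the template, assuming a made-up brand name; the idea is simply that the logo points at the search results page for your brand rather than at your root URL:

  <a href="http://www.google.com/search?q=ABC+Widgets">
    <img src="logo.png" alt="ABC Widgets" title="ABC Widgets">
  </a>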
3. Optimise internal linking
Internal link structure is a lot about getting the end-user to the conversion point. It can also be used for search engine optimisation in several ways. If you are using a content management system (CMS) that has a keyword tagging feature, you can have it search for keywords within the content and link to other pages. This will increase conversions and increase the time the end-user spends on your website. Robots also like internal links within content that point to other unique, relevant content, and they follow these links. Wordpress and other applications like VBulletin with the SEO upgrade can also accomplish this. This is where the use of a "nofollow" attribute comes in handy. According to Wikipedia, nofollow was intended to reduce the effectiveness of certain types of search engine spam, thereby improving the quality of search engine results and preventing spamdexing from occurring in the first place. Matt Cutts of Google and Jason Shellen from Blogger created it around 2005. What it does is tell the search engines that you do not endorse the page you have linked to. Using this on internal links like your About, Contact or other pages will increase the "linkjuice" that is passed on to the important pages. A good example of this: if your template navigation is always the same, add nofollow attributes to all of the links beyond the front page, so that the keyword links I talk about in #1 and #2 that you place in your content will get all of the benefit of the "linkjuice". Just be sure not to confuse the search engines by using the same keyword anchor text as the keyword you are optimising the page for (don't use "buy blue widgets" as anchor text on your "buy blue widgets" page and link out to the "blue widgets" page).
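For example, a template navigation block following this advice might look something like the sketch below (the page names are hypothetical); the utility pages carry the nofollow, while the keyword link sitting in the content stays followed:

  <!-- Template navigation: low-value pages carry rel="nofollow" -->
  <a href="about.html" rel="nofollow">About</a>
  <a href="contact.html" rel="nofollow">Contact</a>
  <a href="privacy.html" rel="nofollow">Privacy</a>

  <!-- In-content keyword link: left followed so it gets the linkjuice -->
  <p>Read our guide to <a href="find-blue-widgets-akron.html">finding blue widgets in Akron</a>.</p>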
Hot Tip #2 - Another great way to use this tip is when you are creating new pages based on the keyword selection I mentioned above: you can link to them from the front page or from an internal doorway page built for 'closer-to-the-root-file' navigation. Put nofollows on everything except your anchor text that points to these new pages. If your homepage carries a good PageRank (PR) it will pass it down to the new page and give you a boost in the SERPs. The goal is that you find a niche or longtail keyword phrase, build an optimised page for it, add a link to it from a well-ranking page, and suddenly you're ranking at the top for the term.
4. Use external linking wisely
While who links to you can't hurt your website's credibility or SERPs, who you link to can. For good measure I try to add one outbound link using relevant anchor text on each page. I don't use the nofollow attribute on this, as I want to be associated with the page I'm linking to. I do a Google search using the allinurl operator and my keyword to look for .edu or .gov websites related to my page, and start there. You would be surprised what you'll find. If that fails I'll do one of two things: I do a search for the keyword I am optimising for and find a non-competitor that ranks well and link to them, or I'll link to the definition page on Wikipedia.
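As a small illustration, the one followed outbound link on a "blue widgets" page might look like this; the university domain is purely hypothetical, standing in for a relevant .edu or .gov page, and note there is deliberately no nofollow on it:

  <!-- One relevant, followed outbound link with keyword anchor text -->
  <p>The engineering department at
  <a href="http://www.stateuniversity.example/blue-widget-research.html">Blue Widget Research</a>
  has a good primer on how widgets are made.</p>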
5. Write your meta descriptions
Meta descriptions are part of the code you don't see on the page itself; go to a page, look at the source code, and usually near the top you'll see '<META NAME="Description" CONTENT='. This is another element the search engines look at to determine the theme of the site. More importantly, almost all of them use it to describe the page in the search results. So if you are searching for 'blue widget' and that keyword is in the description, it shows up in bold in the results snippet. This makes your listing stand out more and increases conversions. The engines will also bold partial matches (i.e. buy blue widgets in Akron) in the title and the URL as well.
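A sketch of what that might look like for the "blue widgets in Akron" example; keep it readable and keep the keyword in it so it gets bolded in the snippet (most engines only display roughly the first 150-160 characters, so front-load it):

  <meta name="description" content="Buy blue widgets in Akron - compare local prices, read reviews and order blue widgets online today.">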
6. Check your internal canonicalisation
Websites can have more than one URL (e.g. http://yoursite.com and http://www.yoursite.com). If you have been around for a while and people are linking to you, they could be linking to either URL. By designating a primary version, it gets 100% of the above benefits. Go to Google's Webmaster Central and in the tools section designate one as your primary. Do a 301 redirect on the non-primary version to pass on any backlink juice, PR and authority that it has to the primary page. Internal navigation needs to be checked to be sure all links go to the correct version as well. Especially with websites constantly being populated or worked on by many individuals, bad navigation is common. Pick one version and use it throughout.
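As a small sketch of the "pick one and use it throughout" point, assuming the www version has been chosen as the primary (the 301 from the non-www version is set up on the server and isn't shown here):

  <!-- Internal links always point at the primary www version -->
  <a href="http://www.yoursite.com/">Home</a>
  <a href="http://www.yoursite.com/blue-widgets-akron.html">Blue Widgets in Akron</a>
  <!-- Avoid mixing in http://yoursite.com/... or .../index.html variants of the same pages -->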
7. Finding What Terms Are Converting Into Sales/Tracking Keywords to Conversion With Weighting
Having 100,000 unique visitors a day really doesn't matter in the end if you aren't getting any conversions (new members, info requests, sales).
Measuring successes and failures for landing pages, on-page content like CTAs, and especially keyword-to-sale performance are some of the most important pieces of information that you can gather and use to improve and optimise your overall website.
Here are two scenarios to better illustrate this point;
Paid Advertising
A car insurance company starts a paid advertising campaign on Google, and after a week or so they see that the name of their company, or their 'brand', seems to be converting the majority of their sales. Because of this discovery, they shift the majority of their budget to their brand terms, like ABC Insurance and ABC Insurance Company.
A week later they see that their CPA (cost per acquisition) has sky-rocketed almost two-fold and they can't figure out why. When they look at Google Analytics and other third-party tracking software, both say the same thing.
So why is this?
Let's take a look at the buying process (also called funnel tracking) to see where they went wrong. Mrs. INeedInsurance hopped online while enjoying her morning java to look for insurance, because last night when Mr. INeedInsurance opened his renewal notice he got a significant premium hike. At dinner they decided to start shopping around for insurance. Mrs. INeedInsurance searched 'car insurance' between 6-8am that day, going in and out of different companies' websites, learning what she was up against: tens of thousands of results. So at work (11am-2pm is the #1 time people shop online, not necessarily making purchases), Mrs. INeedInsurance has learned a bit about search and decides to add her city to the query. This time she searches 'car insurance London', and still gets several thousand results, but at least they are localised, and there are a few that she recognises from this morning, so she goes in and fills out a few of the forms to get quotes. Throughout the rest of the day she gets the quotes, either immediately from the website or via email. Now she's getting somewhere. Jump forward to after dinner that evening. Mr. INeedInsurance looks through the notes his wife brought home and decides that ABC Insurance offers the best deal for the money, then goes to Google, searches for ABC Insurance and makes the purchase.
See what happened here? I use this as an example because this is exactly what I identified for a client a few years back that inevitably led to changes that doubled their conversions.
The problem is that all the data pointed to ABC Insurance's brand name as being the top converting term, so that's where they concentrated the bulk of their budget. In actuality, 'car insurance' and then 'car insurance London' were the terms that actually led up to the sale.
The reason that this is important for PPC campaigns, or any paid advertising, is that many platforms will allow you to do keyword weighting. This is where you increase or decrease your bids by a percentage according to day parting. Day parting is turning your ads up or down according to the timetable that you put in place.
In this instance I would turn my bids up to 125% on 'car insurance' and 'car insurance London' in the morning and afternoon, then down at night. On 'ABC Insurance' I would turn the bids down in the morning to 50%, and then back up to 125% in the evening.
Keyword weighting also allows you to weight your keywords and track them to conversion. It places a cookie on the end-user's computer to track which keyword brought them to the site, which keyword resulted in a quote, and which keyword resulted in a sale.
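Most people would rely on their bid-management or analytics platform for this, and modern engines don't always pass the query through in the referrer, but as a rough sketch of the mechanism, a page template could record the first keyword that brought the visitor in with a snippet like this (the cookie name and 30-day window are arbitrary choices of mine):

  <script type="text/javascript">
  // Store the first search keyword that brought this visitor in, so a later
  // quote or sale can be credited back to it.
  (function () {
    if (document.cookie.indexOf("first_keyword=") !== -1) return;  // already recorded
    var match = /[?&]q=([^&]+)/.exec(document.referrer);           // only works when the engine passes the query
    if (match) {
      var keyword = decodeURIComponent(match[1].replace(/\+/g, " "));
      document.cookie = "first_keyword=" + encodeURIComponent(keyword) +
                        "; path=/; max-age=" + (60 * 60 * 24 * 30); // keep it for 30 days
    }
  })();
  </script>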
This is beneficial because I can further adjust my bidding strategies according to demographics and geographical metrics.
With these cookies I can also successfully measure and establish LTV (Lifetime Values) of the average customer. This allows me to adjust the conversion value, which allows me to go back to my company/client and potentially get a higher advertising budget.
Using this same insurance company as an example; initially they gave me a conversion value of $25. Now, since we were able to identify other sales made by this customer, the conversion value is $40.
Offline this company spends $100,000 on advertising through different venues, acquiring customers at an average cost of $56. Guess what happened the next month? They increased the budget by $100,000.
Organic Advertising
Same scenario as above, except ABC Insurance Company identifies through log files or Google Analytics that their top converting keyword, the one bringing in sales, is 'car insurance'.
In light of this, the decision maker decides to create a fully optimised landing page, so that the relevancy grade the major search engines use will improve their organic positions, which it will.
The problem here is that the term that was actually bringing them to the website to buy was 'cheap car insurance'. If they had identified this they could have built the page around the term, 'cheap car insurance' rather than just 'car insurance'. This would have served double-duty and acted as a great landing page for both keyword phrases.
This is why tracking your keywords to conversion is so important. It can save thousands on paid advertising and identify the actual keyword phrases that need pages built around them to improve organic rankings.
If you are experiencing a high bounce rate, or what you feel is high cart abandonment, you might be surprised to find that many visitors didn't buy elsewhere; they actually came back to you and bought. This is also helpful in refining your stats. Rather than show this customer as three separate visitors, it identifies (through the cookies) that they were actually just one visitor, and the bounce rate or cart abandonment figure is significantly reduced. This information can be invaluable as well.
For instance, maybe I was seeing cart abandonment from unique users that rose significantly once they reached checkout. I know that happens when I add shipping costs into the total. So I might do some A/B testing with shipping costs listed separately, added into the price initially, or added during checkout, and see which converts better. Or I may set the website up to recognise the cookie and display an offer of free shipping today with any purchase over $XX.XX.
There are endless ways to use this information.
8. Bump Your Competitors Multiple Listings Out of Google and Pick up a Position or Two
Ever wonder why during a search you find a competitor that has two pages listed above you? I call them kicker listings. The home page is always the second listing, and the first is an internal page that actually has relevant content.
Here is why this happens. When you submit a query, Google looks at the rank of each page, and if two pages from the same site are close to each other in the results, it groups them together. If you are showing up in the first couple of pages of the SERPs, then it is most likely that you are listed again much deeper in the results. But when two pages are close, like top ten or top 20, Google shows them together: the second, usually the index page, is listed below the first and indented.
By going into 'advanced search' the default number of results can be changed, or you can add a parameter to the URL that shows after a search for your keyword, just after 'search?'. Add 'num=8&' there (e.g. http://www.google.com/search?num=8&q=your+keyword) and the results will be refined. This number may change the results; if it doesn't, reduce it further. This will show you where your competitor's second page should actually be.
Okay, so now go back to the original search that showed the double listing. Within the search results, look where your competitor is showing up, then look below their listings for a non-competitor. It could be anything: a video, a news story, or a Wikipedia or eBay listing. Use the guide in Tip #10 to do some social bookmarking on that page, or even link to the page from your website (preferably from a second-level subdirectory).
What this will do is add a little boost to the non-competitive website and bump the 'kicker' listing that your competitor has, back to where he belongs, below your listing.
This is surprisingly easy and quick using a combination of bookmarks and back links. It may even boost your trust rating with Google by having an outbound link to a high ranking website.
Using this method on eBay sometimes provides a double-boost because if it is an auction rather than a store item it may drop off the SERP's once the auction is over.
9. Avoid common penalties
HTML Validation/W3C - validation errors may cause crawl issues with the spiders. If they can't get past an error, the pages beyond it may not get crawled, and therefore they won't get indexed. There are free W3C validation tools on the web.
302 redirects - Black hatters use these for various reasons other than what they were intended for, and you may end up getting a penalty if you keep one in place over time. Do a site crawl with Xenu and verify that any 302 (temporary) redirects are identified and changed to 301 (permanent) redirects.
Duplication - As mentioned above in #6, be sure you only have one primary URL, but if you already have more than one in the search engines, do a page strength comparison (you can use the Firefox browser with the SEOQuake extension for this) and use a 301, with nofollows everywhere else, to pass on the maximum amount of linkjuice. Some websites, especially older ones, may have several versions of the same page, and some search engines may have cached versions of both pages as well. You may have a .com, a .com/index and a .com/index.asp all in your root and wide open to be crawled. If these are duplicates, I believe that all of the pages will suffer a penalty to some extent, so do a 301 redirect to one primary page to pass on any backlinks, PR and authority that the others have. Internal navigation needs to be checked to be sure all links go to the correct version as well. Especially with websites constantly being populated or worked on by many individuals, bad navigation is common. Pick one and use it throughout. The other problem with onsite duplication is that many sites use product feeds to populate their stores. Others may populate their content with RSS feeds. The problem with these and other dynamic websites is that other websites may be using the same content. If this is the case you run the risk of being penalised for duplicate content. Use Copyscape to search for duplicates of your content on the web. If you are not ranking well for the term or terms, you'll need to change the content on the page, especially if you are not the originator of the content.
XML Sitemaps and Robots.txt - Providing an XML sitemap is one of the easiest things you can do to help search engines traverse your site. Google, Yahoo and MSN have all adopted this "standardised" tool. Having a sitemap and then submitting it through Webmaster Central will tell you not only when the crawl (usually 1-2 days) is complete, but also whether the bots found any errors. Robots.txt file - by defining rules in this file, you can instruct robots not to crawl and index certain files or directories within your site, or the site at all. For example, you may not want Google to crawl the /images directory of your site, as it's both meaningless to you and a waste of your site's bandwidth.
Spiderability - There are many tools available (Google's Webmaster Central for one) that will crawl your site and identify any problems. Things like JavaScript, Java applets and Flash navigation all create spiderability issues, as do many others. Be sure to check these.
10. Use social bookmarking websites for short-term ranking boost and blogs/forums to establish long term trust and authority
Social Bookmarking - Wikipedia defines it: In a social bookmarking system, users store lists of Internet resources that they find useful. These lists are either accessible to the public or a specific network, and other people with similar interests can view the links by category, tags, or even randomly. Most social bookmarking services allow users to search for bookmarks which are associated with given "tags", and rank the resources by the number of users which have bookmarked them. Many social bookmarking services also have implemented algorithms to draw inferences from the tag keywords that are assigned to resources by examining the clustering of particular keywords, and the relation of keywords to one another.
GaryTheScubaGuy defines it this way:
One of the best free ways to get increased ranking, back links and traffic, for very little time commitment other than setup.
Right now most search engine algorithms place a ton of weight on end-user bookmarking, tagging and other types of user-generated highlighting.
Before doing any of this, run a rank report so you can track your progress. I have tested this on terms showing on page one, on terms ranked 11th through 12th, and on others buried around pages 5-10. It works on them all, in different time frames, and the effects last for different periods of time. This you will need to test yourself. Be careful, because you don't want to be identified as a spammer. Be sure to use genuine content that provides a benefit to the user.
Here is how I recommend doing this;
1. Download Roboform. (It says it will limit you, but I've had as many as 30+ passwords created and stored in the trial version.) This will allow you to quickly fill out signup forms and store passwords for the 10 bookmark sites that I am going to be sending you to.
2. Within Roboform, go to the custom area and put in a username and password, as well as the other information that sites usually ask for when you register. This way, when you are using these different bookmark sites, it's a 1-click login and becomes a relatively quick and painless procedure.
3. Establish accounts with these Social Bookmark Sites;
a. Digg
b. Technorati
c. Del.icio.us
d. NowPublic
e. StumbleUpon
f. BlinkList
g. Spurl
h. Furl
i. Slashdot
j. Simpy
k. Google Toolbar (w/Google Bookmarking)
4. Internet Explorer, Firefox and most other browsers have an "add a tab" option, but I use Firefox because I can bookmark the login pages in one folder, then "open all in tabs" in one click. From here I click on each tab and in most cases, if you set it up right, Roboform will have already logged you in. Otherwise you're on the login page, and by clicking on the Roboform button everything is prefilled; all you need to do is click submit. (Some of the bookmark sites will allow you to add their button to your browser bar, or you can get a Firefox extension like the Digg add-on to make things quicker.)
5. Lastly, install the Google Toolbar. It has a bookmark function as well, and you can import all your bookmarks from Firefox directly into it. Google looks at many different things when assigning rank and trust. For instance, when you search for something and go into a website, Google will remember how long you stayed, how deep you went, and whether you came back out to the search to select another site, which means you didn't find what you were looking for. This is all part of the privacy issues that have been in the news.
Here's what Google actually says: "The Google Toolbar automatically sends only standard, limited information to Google, which may be retained in Google's server logs. It does not send any information about the web pages you visit (e.g., the URL), unless you use Toolbar's advanced features."
They practically spell it out for you. Use their bookmark feature just like you were doing the social bookmarking I outlined above. This is just one more click.
Some of the elements that Google looks at when grading a website are;
How much time did the average visitor spend on the site?
What is the bounce rate on the landing page?
How many end-users bookmarked the page?
How many users returned to the search query and then on to a different site?
Each time you publish an article, put a Google Alert on a unique phrase from it. Each time Google sends you an alert, bookmark the page on every bookmark site. This will take some getting used to, but it will eventually become second nature. Remember what I said in the beginning: "One of the best free ways to get links and traffic, for very little time commitment other than setup".
When you start seeing traffic coming in and your SERPs getting better, you will use the heck out of this. I'm waiting for someone to come out with software that will automate this process completely, but by the time that hits, nofollows may come into play. For the time being it works, and it works well.
(Update: found some.) Bookmark Demon and Blog Comment Demon automate the process.
Blogs - When looking for blogs to post on I use Comment Sniper. It takes your chosen keyword, searches through MSN and Google blogs (Wordpress too!) and comes back with a long list of blogs that it found your keyword on. You can then sign up to monitor them, and every 15 minutes or so it will update you on any posts that have been made on the blogs that you selected. You can then find out if it is a post that you can reply to and maybe get a link to one of your sites. This generally works great because these are aged pages that have earned PageRank, so you immediately reap the benefits of a backlink on a relevant page. Many have pointed out that some blogs use nofollow, but that doesn't take away the fact that you have a link to your site, just that the sender (blog/page) doesn't endorse your site.
One more thing regarding posting to blogs and forums. When I create an account I sign up with a very unique member name (e.g. GaryTheScubaGuy). This is because many blogs and forums use nofollow, which means the link in your signature or on your member name won't pass anything on. So I also sign, or add my member name to, the bottom of my post, then put a Google Alert on my signature so that when Google finds the post it will alert me, and I will then start bookmarking the forum page.
Hot Tip #3 - Comment Sniper can be used to find aged blogs that have been abandoned but have great PR. Posting on these is a quick way to get a boost in the SERPs. Many of these older blogs were created before the addition of the nofollow attribute and were not updated when that (Blogger) update was released, so you get 100% of the benefit rather than just the link. You can also filter the results it returns by PR, so you spend your time posting on pages that actually have PR.
Forums - I use the search engines 'allinurl' feature using forum and "my keywords" to locate these forums. This will take a bit more time to establish as you need to sign up and become a member in good standing before posting any links. Some will allow you to have a screen name that you can link back to your website from, so I'll use my keywords or just a unique name (Like GaryTheScubaGuy) so that I can identify the links as they become indexed.
I recommend spending the time to find a few good forums and regularly post good recommendations or advice. I've done this on SEO forums (like SEO Chat) for a long time and I have thousands of links back to my websites and forum.
11. Optimise Your 404 Page
The search engines look at traffic in their algorithms to grade a page. If you have a complicated URL, one that is commonly misspelled, or anything else that puts existing links published out on the web at risk, this is the page the visitor will land on. If it has your template and navigation from the rest of the site, it will get indexed like a normal page. Change its title and meta description to one of your keyword strings, and add an image and relevant content that reflects your keywords as well. I avoid placing the actual term 404 on the page.
Some of the ways 404 pages are reached are:
Bookmarked sites that have since been moved
The end-user made an error when typing in a url
A moved page is still indexed in the SERPS
There are broken links in your link structure
What are some tips when customizing your 404 error pages?
1. Put a link to your FAQ page
2. Put a link to your top level categories
3. Put a link to your sitemap
4. Create a template 404 page that blends with your site
5. Add a search box
6. Make your 404 pages look as close to your site theme as possible
7. Add true navigation to it.
8. Optimise this page with the same elements as your other pages (see Tips #1 and #2)
A simple statement like "You have found this page in error, please select from the menu on the left side of this page" will do here, and you will retain more traffic.
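Pulling those points together, a custom error page might look something like the sketch below; the page names and search form are hypothetical, and the server still has to be configured to serve this page for missing URLs (that part isn't shown):

  <html>
  <head>
    <title>Blue Widgets in Akron | Guides, Deals and Categories</title>
    <meta name="description" content="Browse our blue widget guides, deals and categories in Akron.">
  </head>
  <body>
    <!-- Same template and navigation as the rest of the site; no "404" wording on the page -->
    <p>You have found this page in error, please select from the menu below.</p>
    <a href="/faq.html">FAQ</a>
    <a href="/widgets/">All Widget Categories</a>
    <a href="/sitemap.html">Sitemap</a>
    <form action="/search" method="get">
      <input type="text" name="q">
      <input type="submit" value="Search">
    </form>
  </body>
  </html>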
12. Get Your Pages Out of Supplemental Results - What They Are, How to Find Them and How to Get Out of Them
"Supplemental sites are part of Google's auxiliary index. Google is able to place fewer restraints on sites that we crawl for this supplemental index than they do on sites that are crawled for the main index. For example, the number of parameters in a URL might exclude a site from being crawled for inclusion in the main index; however, it could still be crawled and added to Google's supplemental index.
The index in which a site is included is completely automated; there's no way for you to select or change the index in which your site appears. Please be assured that the index in which a site is included does not affect its PageRank."
Nonsense!
At the time of this article, Google was already starting to remove the supplemental label from their search results. Until recently, all you had to do was go to the last few pages of your query and locate the pages that had ' - Supplemental Result' just after the page size. They aren't showing these any more. Here's what they had to say:
"Since 2006, we've completely overhauled the system that crawls and indexes supplemental results. The current system provides deeper and more continuous indexing. Additionally, we are indexing URLs with more parameters and are continuing to place fewer restrictions on the sites we crawl. As a result, Supplemental Results are fresher and more comprehensive than ever. We're also working towards showing more Supplemental Results by ensuring that every query is able to search the supplemental index, and expect to roll this out over the course of the summer.
The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we've been able to make so far, and thinking ahead to future improvements, we've decided to stop labeling these URLs as "Supplemental Results." Of course, you will continue to benefit from Google's supplemental index being deeper and fresher."
Google then said that the easiest way to identify these pages is like this; "First, get a list of all of your pages. Next, go to the webmaster console [Google Webmaster Central] and export a list of all of your links. Make sure that you get both external and internal links, and concatenate the files.
Now, compare your list of all your pages with your list of internal and external backlinks. If you know a page exists, but you don't see that page in the list of site with backlinks, that deserves investigation. Pages with very few backlinks (either from other sites or internally) are also worth checking out."
Nonsense!
The easiest way to identify your supplemental pages is by comparing a normal 'site:www.yoursite.com' query with 'site:www.yoursite.com/&' - the second version shows the pages in the main index, so anything that appears in the first query but not the second is sitting in the supplemental results.
Okay so now you have identified the pages that are in supplemental results and not showing up in the results anywhere.
Now we need to identify why they are there. The main reasons that a page goes to supplemental results are;
1. Duplicate Content
2. 301s - redirected pages that have a cache date prior to the 301 being put in place
3. A 404 was returned when Google attempted to crawl it
4. New Page
5. Bad Coding
6. Page Hasn't Been Updated in a While
7. Pages That Have Lost Their Back Links
8. And according to Matt Cutts of Google, "PageRank is the primary factor determining whether a URL is in the main web index vs. supplemental results"
Now this isn't the end-all, but it covers about 95% of the reasons that you may be in the supplementals.
So now we know what they are, how to find them and why they are most likely in the supplemental results. Now let's get them out of there.
Here are the different methods that I use when I find that a page has gone supplemental;
1. Add fresh content to the page
2. Add navigation to the page from the main page
3. Move the pages to the first subdirectory if it is not already there
4. Get a back link to the page and/or create a link from an existing internal page with the anchor text containing the keywords for that page
5. Do some social bookmarking on the page
6. Make sure the page is included in my xml sitemap and then resubmit it to Webmaster Central.
7. Lastly, if none of the above seem to be working after 90 days, and I have another page that is relevant and does have PageRank and isn't listed in the supplemental, I do a 301 (permanent redirect) to it from the supplemental page.
This is the first of many posts I'll be doing here on Bryan's site. We met in Amsterdam at CAC and he's a good guy, so since I resigned from CRAP, sorry, CAP (sorry, it's just too easy), as a Moderator I've found some time to post in other places and I'll be popping back and forth between here and GPWA.
Please feel free to PM me with any questions or just post them here.
Anything on SEO, PPC, Viral or so on. If I don't have the answer I'll research it and get it.
One last point... if I post anything here, you can count on the fact that I spent either the time, money or resources to validate it. No regurgitated propaganda here.
Cheers,
GaryTheScubaGuy
Even worse is spending time in the wrong forum and getting bad advice or propaganda-led guesses.
SEO is a marathon, not a sprint, but these tips will give you a definite advantage over your competition.
1. Get your page titles 100% optimised for searchPage titles are the single best element of overall on-site optimisation that you can control, and one of the top things that a search engine looks for in its almighty power that decides the destiny of the page. Will it rank your page or send it to the depths of the supplemental results where a bloodhound would it find difficult to sniff out? Theories vary on how best to format the title for 100% optimisation. Long tail titles, key word stuffing, commas, density, bars and dashes have been tested and debated for many years. I have found the best performance using the following method; First, I sit down and pick my top keyword, then I run it through one of many keyword tools out there that will show me the the number of queries, demographic and geographic data, annual search trends, competiton stats and so on. I take these results and start classifying them by this information to establish the pages that I will build. I continue to do this until I get down to 4 or even 5-word phrases. From here I'll start diagramming the navigation of the new section using themed-based threads from the top to the bottom. In some cases I also use buffer words to control keyword weighting. So if I had a "blue widgets" page the next might be "Find Blue Widgets" and below that "Find Blue Wigets in Akron" and maybe even one more "Where Can I find Blue Widgets in Akron Ohio?" if it's been searched to some degree. This is called going after the longtail and some friends of mine over at a company called Hittail made a cool little tool that I put on all my clients websites. It gives you real time results for the keywords that people are typing to find your keyword, as well as what search engine. Traditional logfile analysis can be expensive and difficult for the inexperienced, and many times if you have a site getting 10,000 unique visitors a day or even a week you lose many of the prime longtail key word strings in the piles of data. Its also a good way to monitor related buzz on your product/service/offering. It is perfectly okay to have the following title: Buy Widgets | Blue Widgets in Akron. I try not to stuff the title tags at all, but I always make sure I use it twice and I don't duplicate the page titles. Then when you are building content in the next steps you will have unique but relevant text to use in the link.
2. Optimise your content
There are many on-page elements that can enhance the way the search engines rank your website. Assuming you have completed #1 above, the next step is to optimise the content based on the title that you have used for the page. In some cases where dynamic insertion is used, or an application like Wordpress is installed, you can optimise the way these elements are pulled into the on and off-page fields such as alt ags for images. If you don't use anything like this or you wrote it with Dreamweaver, there are other plugins and ways to make this process less painful, but I assure you its worth the TLC. Here is a general list of the on-page and off-page SEO elements that I concentrate on;
In the url - they become highlighted in results and increase click through rate.
In meta keywords (2-3 max.) - doesn't hurt to use so why not.
In the first and last sentence of the body content, and in bold as well.
In several places throughout the content but in a different form(I.e. plural )
In header tags. If it makes sense using h1, h2, h3 and then h4 in the hierarchy of a page then use them.
In alt tags for the images.
In the title tags for the images.
In html comment tags.
In meta description - see #5 for more more details
In an external link on the page - see #4 for more details
In a variation of the key word (I.e. plural) pointed to another internal relevant page (I.e. Concert Tickets page with Concert Tickets in Akron as the linking text linked to a page optimised for concert tickets in Akron)
Generally speaking I like to try and keep the pages with at least 250 words of relevant and themed content. This is a very important element to invest your time in because search engine robots parse, or remember your template is its static, which most are. Any optimisation that you have within the template won't have a significant factor on the SERP's (Search Engine Ranking Positions)so optimising your content is the best way to be sure your pages are not dropped into what in essance is the dreaded "sandbox" or supplemental results on Google.
It's also worth mentioning that even if you have an existing site, you can still bolt-on to your existing site in many useful ways. Add a community section, a "widget" news area, or anything similar you can drop these new pages into.
Hot Tip#1 - Do a search for the name of your website and copy the URL string and use it as the link for your logo. It is a popular belief that the number of searches for your brand and the number of end-users that navigate through to your website influences Google results, and I have to believe that it is a ranking factor that all the search engines use. I can't say exactly how much, but I think lobg-term this is a good strategy and have seen it work with no reduction in any of my traffic stats other than reducing the bounce rates throughout.
3. Optimise internal linking
Internal link structure is alot about getting the end-user to the conversion point. It can also be used for search engine optimisation in several ways. If you are using a content management system (CMS) that has a key word tagging feature you can have it search for key words within the content and link to other pages. This will increase conversions and increase the time the end-user spends on your website. Robots also like internal links within content that point to other, unique relevant content and they follow these links. Wordpress and other applications like VBulletin with the SEO upgrade can also accomplish this. This is where the use of a "nofollow" attribute comes in handy. According to Wikipedia, the nofollow was intended to reduce the effectiveness of certain types of search engine spam, thereby improving the quality of search engine results and preventing spamdexing from occurring in the first place. Matt Cutts of Google and Jason Shellen from Blogger created it around 2005. What it does is tell the search engines that you do not endorse the page you have linked to. Using this on internal links like your About, Contact or other pages will increase the "linkjuice" that is passed on to the important pages. A good example of this is if your template navigation is always the same, add nofollow attributes to all of the links beyond the front page so that the key word links I talk about in #1 and #2 that you place in your content will get all of the benefit of the "linkjuice". Just be sure not to confuse the search engines by using the same key word anchor text as the key word you are optimising the page for (don't link "buy blue widgets" in anchor text on your "buy blue widgets" page and link out to the "blue widgets" page.
Hot Tip#2 - Another great way to use this tip is when you are creating new pages based on the keyword selection I mentioned above, you can link to them from the front page or an internal doorway page built for 'closer-to-the-root-file' navigation. Put nofollows on everything except your anchor text that points to these new pages. If your homepage carries a good Page Rank (PR) it will pass it down to the new page and will give you a boost in the SERP's. The goal is that you want to find a niche or longtail keyword phrase, build an optimised page for it, add a link to it from a well ranking page and suddenly your ranking at the top for the term.
4. Use external linking wisely
While who links to you can't affect your websites credibility or SERP's, who you link to does. For good measure I try to add one outbound link using relevant anchor text on each page. I don't use the nofollow attribute on this as I want to be associated with it. I do a Google search and use the allinurl function and my keyword to search for .edu or .gov websites related to my page and start there. You would be suprised what you'll find. If that fails I'll do one of two things; I do a search for the keyword I am optimising for and find a non-competitor that ranks well and link to them; or I'll link to the definition page in Wikipedia.
5. Write your meta descriptions
Meta descriptions are part of the off-page code you find when you go to a page and look at your source code and usually near the top you'll see '<META NAME="Description" CONTENT=' This is another element the search engines look at to determine the theme of the site. More importantly they almost all use it to describe the page in your search results. So if you are searching for 'blue widget', the results you will get have that keyword in bold. This will make it stand out more and increase conversions. It will also bold the partial word (I.e. buy blue widgets in Akron) in the title and the URL as well.
6. Check your internal canonicalisation
Websites can have more than one URL. (I.e.
You do not have permission to view link
Log in or register now.
and
You do not have permission to view link
Log in or register now.
). If you have been around for a while and people are linking to you they could be linking to either URL. By designating a primary it gets 100% of the above benefits. Go to Google's Webmaster Central and in the tools section designate one as your primary. Do a 301 redirect on the non-primary page to pass on any backlink juice, PR and authority that the page has to the primary page. Internal navigation needs to be checked to be sure all links go to the correct version as well. Especially with websites constantly being populated or worked on by many individuals, bad navigation is common. Pick one and use it throughout.7. Finding What Terms Are Converting Into Sales/Tracking Keywords to Conversion With Weighting
Having 100,000 unique visitors a day really doesn't matter in the end if you aren't getting any conversions (new members, info requests,
sales).
Measuring successes and failures for landing pages, on-page content like CTA's, and especially keyword to sale are some of the most important pieces of information that you can gather and use to improve and optimise your overall website.
Here are two scenarios to better illustrate this point;
Paid Advertising A car insurance company starts a paid advertising campaign on Google and after a week or so they see that the name of their company or their 'brand' seems to be converting the majority of their sales. Because of this discovery, they target the majority of their budget on their brand terms like ABC Insurance and ABC Insurance Company.
A week later they see that their CPA (cost per acquisition) has sky-rocketed almost two-fold and can't figure out why this is. When they look at Google analytics and other third-party tracking software, they both say the same thing.
So why is this?
Let's take a look at the buying process (also called funnel tracking) to see where they went wrong; Mrs.INeedInsurance hopped online while enjoying her morning java to look for insurance because last night when Mr.INeedInsurance opened his renewal notice he got a significant premium hike. At dinner they decided to start shopping around for insurance. Mrs.INeedInsurance searched 'car insurance' between 6-8am that day, going in and out of different companies websites, learning what she was up againsttens of 1000's of results. So at work (11a-2pm is the #1 time people shop online not necessarily making purchases) Mrs.INeedInsurance has learned a bit about search and decides to add her city in the query. This time she searches 'car insurance London', and still gets several thousand results, but at least they are localised, and there are a few that she recognizes from this morning so she goes in and fills a few of the forms out to get quotes. Throughout the rest of the day she gets the quotes either immediately from the website or via email. Now she's getting somewhere. Jump forward to after dinner that evening. Mr.INeedInsurance looks through the notes his wife brought home and decides that ABC Insurance offers the best deal for the money, then goes to Google and searches for ABC Insurance and makes the purchase.
See what happened here? I use this as an example because this is exactly what I identified for a client a few years back that inevitably led to changes that doubled their conversions.
The problem is that all the data pointed to ABC Insurance's brand name as being the top converting term, so that's where they concentrated the bulk of their budget. In actuality, 'car insurance' and then 'car insurance London' were the terms that actually led up to the sale.
The reason that this is important for PPC campaigns, or any paid advertising, is that many will allow you to do keyword weighting. This is where you increase your bids or decrease your bids by a percentage according to day parting. Day parting is turning your ads up or down according to the time table that you put in place.
In this instance I would turn my bids up to 125% on 'car insurance' and 'car insurance London' in the morning and afternoon, then down at night. On 'ABC Insurance' I would turn the bids down in the morning to 50%, and then back up to 125% in the evening.
Keyword weighting also allows you to weight your keywords and track them to conversion. It places a cookie on the end-users computer to track what keyword brought them to the sight, what keyword resulted in a quote, and what keyword resulted in a sale.
This is beneficial because I can further adjust my bidding strategies according to demographics and geographical metrics.
With these cookies I can also successfully measure and establish LTV (Lifetime Values) of the average customer. This allows me to adjust the conversion value, which allows me to go back to my company/client and potentially get a higher advertising budget.
Using this same insurance company as an example; initially they gave me a conversion value of $25. Now, since we were able to identify other sales made by this customer, the conversion value is $40.
Offline this company spends 100,000 on advertising through different venues, acquiring customers at a cost average of /$56. Guess what happened the next month? They increased the budget by 100,000.
Organic Advertising Same scenario as above, except ABC Insurance Company identifies through log files or Google Analytics that his top converting keyword that is getting sales is car insurance.
In light of this, the decision maker decides to create a landing page that is fully optimised so that the relevancy grade that all 3 search engines use will increase their organic positions, which it will.
The problem here is that the term that was actually bringing them to the website to buy was 'cheap car insurance'. If they had identified this they could have built the page around the term, 'cheap car insurance' rather than just 'car insurance'. This would have served double-duty and acted as a great landing page for both keyword phrases.
This is why tracking your keywords to conversion is so important. It can save thousands on paid advertising and identify the actual keyword phrases that need pages built around for improving organic rankings.
If you are experiencing a high bounce rate or what you feel is high cart abandonment, you might be surprised to find that many didn't buy elsewhere; they actually came back to you and bought. This is also helpful in refining your stats. Rather than show this customer as 3 separate visitors, it identifies (through the cookies) that they were actually just one visitor, and the bounce rate or cart abandonment is significantly reduced. This information can very invaluable as well.
For instance, maybe I was getting high unique cart abandonment from unique users that was significantly higher once they went to checkout. I know that happens when I add shipping costs into the total. So I might try to do some A/B testing with and without shipping costs listed separately, added into the price initially and adding it during checkout and see which converts better. Or I may set the website up to recognize the cookie and create a drop down that offers free shipping today with any purchase over $/XX.XX.
There are endless possibilities to use this information for.
8. Bump Your Competitors Multiple Listings Out of Google and Pick up a Position or Two
Every wonder why during a search you find a competitor that has two pages listed above you? I call them kicker listings. The home page is always the second listing, and the first is an internal page that actually has relevant content.
Here is why this happens. When you submit a query Google looks at its rank and if they are close to each other in their results, they group them together. If you are showing up in the SERP's first couple pages then it is most likely that you are listed again much deeper in the results. But when two pages are close, like top ten, or top 20, then Google shows them side-by-side. The second, usually the index page, will be listed below and also indented.
By going into 'advanced search' the number of default result can be changed, or you can add this bit of code to the end of the url string that it shows after a search for your keyword, just after the search? And the results will be more refined. Add this 'num=8&' to the end of the url. This number may change the results, but if not reduce the number. This will show you where your competitor's second page should
actually
be.
Okay, so now should go back to the original search that showed the double listing. Within the search results look where your competitor is showing up, then look below his listings for a non-competitor. It could be anything, a video, a news story or a Wikipedia or eBay listing. Use the guide in Tip #11 to do some social bookmarking, or even link to the page from your website (preferably on a second level subdirectory).
What this will do is add a little boost to the non-competitive website and bump the 'kicker' listing that your competitor has, back to where he belongs, below your listing.
This is surprisingly easy and quick using a combination of bookmarks and back links. It may even boost your trust rating with Google by having an outbound link to a high ranking website.
Using this method on eBay sometimes provides a double-boost because if it is an auction rather than a store item it may drop off the SERP's once the auction is over.
9. Avoid common penalties
HTML Validation/W3C - may cause crawl issues with the spiders. If they can't get past the error then the pages beyond it may not get crawled, and therefor they won't get indexed. There are free W3C compliance tools free on the web.
302 redirects - Black Hatters use these for various reasons other than what they were intended and you may end up getting a penalty if you keep one in place over time. Do a site crawl with Xenu and verify that any 302 (temporary) redirects are identified and changed to 301 (permanent) redirects.
Duplication - As mentioned above in #6, be sure you only have one primary URL, but if you already have more than one in the search engines, be sure to do a page strength comparison (You can use Firefox browser with the SEOQuake extension for this) and use a 301 with nofollows everywhere to pass on the maximum amount of linkjuice. Some websites, especially older websites may have several versions of the same page, and some search engines may have cached versions of both pages as well. You may have a .com, a .com/index and a .com/index.asp all in your root and wide open to be crawled. If these are duplicates I believe that all of the pages will suffer a penalty to some extent so either do a 301 redirect to pass on any backlinks, PR and authority that the page has to one primary page. Internal navigation needs to be checked to be sure all links go to the correct version as well. Especially with websites constantly being populated or worked on by many individuals, bad navigation is common. Pick one and use it throughout. The other problem with onsite duplication is many sites are using product feeds to populate their stores. Others may populate their content with an RSS feeds. The problem with these and other dynamic websites is that other websites may being using the same content. If this is the case you run the risk of being penalised for duplicate content. Use Copyscape to search for other duplicate content on the web. If you are not ranking well for the term/terms you'll need to change the content on the page, especially if you are not the originator of the content.
XML Sitemaps and Robots.txt - Universal XML Sitemaps - Providing an XML sitemap is one of the easiest things you can do to help search engines traverse your site. Google, Yahoo and MSN have all adopted this "standardized" tool. Having a sitemap and then submitting it through WEBMASTER Central will tell you not only when the crawl (usually 1-2 days) is complete, but also if there are any errors that the bots found. Robots.txt file - By defining rules in this file, you can instruct robots to not crawl and index certain files, directories within your site, or at all. For example, you may not want Google to crawl the /images directory of your site, as it's both meaningless to you and a waste of your site's bandwidth.
Spiderability - There are many tools available (Google's Webmaster Central for one) that will crawl your site and identify any problems. Things like JavaScript, Java applets and Flash navigation all create spiderability issues, among many others. Be sure to check these.
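One quick way to see roughly what a spider sees is to fetch the raw HTML and list only the plain <a href> links, ignoring anything that exists only behind JavaScript or Flash. A rough sketch using just the Python standard library (the URL is a placeholder):

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkExtractor(HTMLParser):
        # Collects the href of every plain <a> tag - roughly what a spider can follow
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value and not value.startswith("javascript:"):
                        self.links.append(value)

    # Hypothetical URL - substitute a page from your own site
    html = urlopen("http://www.example.com/").read().decode("utf-8", errors="ignore")
    parser = LinkExtractor()
    parser.feed(html)
    print("Crawlable links found:", len(parser.links))
    for link in parser.links:
        print("  ", link)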
10. Use social bookmarking websites for short-term ranking boost and blogs/forums to establish long term trust and authority
Social Bookmarking - Wikipedia defines it: In a social bookmarking system, users store lists of Internet resources that they find useful. These lists are either accessible to the public or a specific network, and other people with similar interests can view the links by category, tags, or even randomly. Most social bookmarking services allow users to search for bookmarks which are associated with given "tags", and rank the resources by the number of users which have bookmarked them. Many social bookmarking services also have implemented algorithms to draw inferences from the tag keywords that are assigned to resources by examining the clustering of particular keywords, and the relation of keywords to one another.
GaryTheScubaGuy defines it this way:
One of the best free ways to get increased ranking, back links and traffic, for very little time commitment other than setup.
Right now most search engine algorithms are placing a ton of weight on end-user bookmarking, tagging and the various other types of user-generated highlighting.
Before doing any of this, run a rank report to track your progress. I have tested this on terms showing on page one, on terms ranked 11th through 12th, and on others buried around pages 5-10. It works on them all in different time frames, and the effects last for different periods of time; this you will need to test yourself. Be careful, because you don't want to be identified as a spammer. Be sure to use genuine content that provides a benefit to the user.
Here is how I recommend doing this;
1. Download this: Roboform. (It says it will limit you, but I've had as many as 30+ passwords created and stored in the trial version.) This will allow you to quickly fill out signup forms and store passwords for the bookmark sites that I am going to be sending you to.
2. Within Roboform, go to the custom area and put in a username and password, as well as the other information that sites usually ask for when you register. This way, when you are using these different bookmark sites it's a one-click login and the whole thing becomes a relatively quick and painless procedure.
3. Establish accounts with these Social Bookmark Sites;
a. Digg
b. Technorati
c. Del.icio.us
d. NowPublic
e. StumbleUpon
f. BlinkList
g. Spurl
h. Furl
i. Slashdot
j. Simpy
k. Google Toolbar (w/Google Bookmarking)
4. Internet Explorer, Firefox and most other browsers have an "add a tab" option, but I use Firefox because I can bookmark the login pages in one folder, then "open all in tabs" in one click. From here I click on each tab and in most cases, if you set it up right, Roboform will have already logged you in. Otherwise you're on the login page, and by clicking the Roboform button everything is prefilled; all you need to do is click submit. (Some of the bookmark sites will let you add their button to your browser bar, or you can get a Firefox extension like the Digg add-on to make things quicker.)
5. Lastly, install the Google Toolbar. It has a bookmark function as well, and you can import all your bookmarks from Firefox directly into it. Google looks at many different things when assigning rank and trust. For instance, when you search for something and go into a website, Google will remember how long you stayed, how deep you went, and whether you came back out to the search to select another site, which means you didn't find what you were looking for. This is all part of the privacy issues that have been in the news.
Here's what Google actually says! "The Google Toolbar automatically sends only standard, limited information to Google, which may be retained in Google's server logs. It does not send any information about the web pages you visit (e.g., the URL), unless you use Toolbar's advanced features."
They practically spell it out for you. Use their bookmark feature just like you were doing the social bookmarking I outlined above. This is just one more click.
Some of the elements that Google looks at when grading a website are;
How much time did the average visitor spend on the site?
What is the bounce rate on the landing page?
How many end-users bookmarked the page?
How many users returned to the search query and then on to a different site?
Each time you publish an article, put a Google Alert on a unique phrase from it. Each time Google sends you an alert, bookmark it on every bookmark site. This will take some getting used to, but it will eventually become second nature. Remember what I said in the beginning: "One of the best free ways to get links and traffic, for very little time commitment other than setup".
When you start seeing traffic coming in and your SERPs getting better, you will use the heck out of this. I'm waiting for someone to come out with software that will automate this process completely, but by the time that hits, nofollows may come into play. For the time being, though, it works and it works well.
(Update: found some) Bookmark Demon and Blog Comment Demon automate the process.
Blogs - When looking for blogs to post on, I use Comment Sniper. It takes your chosen keyword, searches through MSN and Google blogs (WordPress too!) and comes back with a long list of blogs on which it found your keyword. You can then sign up to monitor them, and every 15 minutes or so it will update you on any posts that have been made on the blogs you selected. You can then see whether it is a post you can reply to and maybe get a link to one of your sites. This generally works great because these are aged pages that have earned PageRank, so you immediately reap the benefit of a backlink on a relevant page. Many have pointed out that some blogs use nofollow, but that doesn't take away the fact that you have a link to your site; it just means the sender (blog/page) doesn't endorse your site.
One more thing regarding posting to blogs and forums. When I create an account, I sign up with a unique member name (eg. GaryTheScubaGuy). This is because many blogs and forums use nofollow, which means the link in your signature or on your member name won't pass any value. So I also sign, or add my member name to, the bottom of my post, then set a Google Alert on my signature so that when Google finds the post it will alert me, and I can then start bookmarking the forum page.
Hot Tip #3 - Comment Sniper can also be used to find aged blogs that have been abandoned but still have great PR. Posting on these is a quick way to get a boost in the SERPs. Many of these older blogs were created before the addition of the nofollow tag and were never updated when that (Blogger) update was released, so you get 100% of the benefit rather than just the link. You can also filter the results it returns by PR, so you spend your time posting on pages that actually have PR.
Forums - I use the search engines' 'allinurl' feature with forum and "my keywords" to locate these forums. This will take a bit more time to establish, as you need to sign up and become a member in good standing before posting any links. Some will allow you to have a screen name that you can link back to your website from, so I'll use my keywords or just a unique name (like GaryTheScubaGuy) so that I can identify the links as they become indexed.
I recommend spending the time to find a few good forums and regularly posting good recommendations or advice. I've done this on SEO forums (like SEO Chat) for a long time and have 1000's of links back to my websites and forum.
11. Optimise Your 404 Page
The search engines look at traffic in their algorithms when grading a page. If you have a complicated URL, one that is commonly misspelled, or anything else that puts existing links published around the web at risk of breaking, the 404 page is where those visitors will land. If it has your template and navigation from the rest of the site, it will get indexed like a normal page. Change its title and meta to one of your keyword strings, and add an image and relevant content that reflects your keywords as well. I avoid placing the actual term 404 on the page.
Some of the ways 404 pages are reached are:
Bookmarked sites that have since been moved
The end-user made an error when typing in a URL
A moved page is still indexed in the SERPs
There are broken links in your link structure
What are some tips when customizing your 404 error pages?
1. Put a link to your FAQ page
2. Put a link to your top level categories
3. Put a link to your sitemap
4. Create a template 404 page that blends with your site
5. Add a search box
6. Make your 404 pages look as close to your site theme as possible
7. Add true navigation to it.
8. Optimise this page with the same elements as your other pages (See Tip #21)
A simple statement like "You have found this page in error, please select from the menu on the left side of this page" will do here, and you will retain more traffic.
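How you actually serve the custom 404 depends on your server or framework (on Apache it's usually a one-line ErrorDocument directive). Purely as an illustration, here is a minimal Flask-style sketch in Python, with a hypothetical 404.html template, that returns a themed page using the site's normal layout while still sending the 404 status:

    from flask import Flask, render_template

    app = Flask(__name__)

    @app.errorhandler(404)
    def page_not_found(error):
        # 404.html is a hypothetical template that reuses the site's normal layout,
        # with an optimised title/meta, a search box and links to the sitemap and top categories
        return render_template("404.html"), 404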
12. Get Your Pages Out of Supplemental Results - What They Are, How to Find Them and How to Get Out of Them
"Supplemental sites are part of Google's auxiliary index. Google is able to place fewer restraints on sites that we crawl for this supplemental index than they do on sites that are crawled for the main index. For example, the number of parameters in a URL might exclude a site from being crawled for inclusion in the main index; however, it could still be crawled and added to Google's supplemental index.
The index in which a site is included is completely automated; there's no way for you to select or change the index in which your site appears. Please be assured that the index in which a site is included does not affect its PageRank."
Nonsense!
At the time of this article, Google was already starting to remove the supplemental-results label from their search results. Until recently, all you had to do was go to the last few pages of your query and locate the pages that had ' - Supplemental Result' just after the page size. They aren't showing this anymore. Here's what they had to say:
"Since 2006, we've completely overhauled the system that crawls and indexes supplemental results. The current system provides deeper and more continuous indexing. Additionally, we are indexing URLs with more parameters and are continuing to place fewer restrictions on the sites we crawl. As a result, Supplemental Results are fresher and more comprehensive than ever. We're also working towards showing more Supplemental Results by ensuring that every query is able to search the supplemental index, and expect to roll this out over the course of the summer.
The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we've been able to make so far, and thinking ahead to future improvements, we've decided to stop labeling these URLs as "Supplemental Results." Of course, you will continue to benefit from Google's supplemental index being deeper and fresher."
Google then said that the easiest way to identify these pages is this: "First, get a list of all of your pages. Next, go to the webmaster console [Google Webmaster Central] and export a list of all of your links. Make sure that you get both external and internal links, and concatenate the files.
Now, compare your list of all your pages with your list of internal and external backlinks. If you know a page exists, but you don't see that page in the list of sites with backlinks, that deserves investigation. Pages with very few backlinks (either from other sites or internally) are also worth checking out."
Nonsense!
The easiest way to identify your supplemental pages is by entering this query 'site:www.yoursite.com/&'
Okay, so now you have identified the pages that are in the supplemental results and not showing up anywhere in the normal results.
Now we need to identify why they are there. The main reasons that a page goes to supplemental results are;
1. Duplicate Content
2. 301's - Redirected pages that have a cache date prior to the 301 being put in place
3. A 404 was returned when Google attempted to crawl it
4. New Page
5. Bad Coding
6. Page Hasn't Been Updated in a While
7. Pages That Have Lost Their Back Links
8. And according to Matt Cutts of Google, "PageRank is the primary factor determining whether a URL is in the main web index vs. supplemental results"
Now this isn't the end-all, but it covers about 95% of the reasons that you may be in the supplementals.
So now we know what they are, how to find them and why they are most likely in the supplemental results. Now let's get them out of there.
Here are the different methods that I use when I find that a page has gone supplemental;
1. Add fresh content to the page
2. Add navigation to the page from the main page
3. Move the pages to the first subdirectory if they are not already there
4. Get a back link to the page and/or create a link from an existing internal page with the anchor text containing the keywords for that page
5. Do some social bookmarking on the page
6. Make sure the page is included in my xml sitemap and then resubmit it to Webmaster Central.
7. Lastly, if none of the above seem to be working after 90 days, and I have another page that is relevant and does have PageRank and isn't listed in the supplemental, I do a 301 (permanent redirect) to it from the supplemental page.
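For step 7, the mechanics depend on where your site runs: on Apache it's a one-line Redirect 301 rule in .htaccess, and in application code it's a permanent-redirect response. As an illustration only, a minimal Flask-style sketch in Python with hypothetical paths:

    from flask import Flask, redirect

    app = Flask(__name__)

    # Hypothetical paths: the page stuck in the supplementals permanently redirects
    # to a relevant page that already has PageRank
    @app.route("/old-supplemental-page")
    def old_supplemental_page():
        return redirect("/blue-widgets", code=301)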
This is the first of many posts I'll be doing here on Bryan's site. We met in Amsterdam at CAC and he's a good guy, so since I resigned from CRAP, sorry, CAP (sorry, it's just too easy), as a Moderator I've found some time to post in other places, and I'll be popping back and forth between here and GPWA.
Please feel free to PM me with any questions or just post them here.
Anything on SEO, PPC, Viral or so on. If I don't have the answer I'll research it and get it.
One last point... if I post anything here, you can count on the fact that I've spent either the time, money or resources to validate it. No regurgitated propaganda here.
Cheers,
GaryTheScubaGuy