The New SEO strategies

Chatmaster

Hi everyone

I have been in the SEO industry for a very long time, and the past 2 years have left us with some really scary updates. But what do we know about the big Google right now? Please note that in the discussion below I am leaving out the SEO basics and strictly talking about the latest stuff. I made the following list and would like to get some feedback on it from everyone else...

Onpage factors

Site structure
At this stage onpage factors basically mean page construction and usability. It also means we need to consider our website structure: it should be deep and themed according to our content.

Copy changes
Based on the Google patent on historical data, it has become clear that Google is looking at frequently updated pages and pages with a lot of content.

W3C
Yah I know, W3C compliance is bs, but at the end of the day it is about the basic principle we are talking about: how does the SE interpret the HTML code when it spiders your site's pages? Open tags or Flash can really ruin your site's chances.

Off-page factors
We all know that we need backlinks. The backlinks need to be themed and industry related to really give us strength.

The latest
TrustRank
The latest Google registration bowled a few people over, I am sure. There are so many conspiracy theories on TrustRank... My take on TrustRank? Well, I see TrustRank as an algorithm that will definitely get rid of a lot of crappy websites and spam. The core difference between TrustRank and PageRank: PageRank focuses on awarding a point for incoming links, and for a long period of time this has caused the SERPs to be flooded with spam. TrustRank bases its algorithm on the assumption that good sites reference good sites in their content. So the future of SEO seems to be about having websites that are as natural as possible. But how do we define natural? :D
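
To make the "good sites reference good sites" idea concrete, here is a minimal sketch of trust propagating along outgoing links from a hand-picked seed set of good sites. Everything in it (the graph, the seed set, the damping factor, the function name) is invented for illustration; nobody outside Google knows how they actually compute it.

Code:
def trust_rank(out_links, seeds, damping=0.85, iterations=20):
    # Collect every page that appears anywhere in the link graph.
    pages = set(out_links) | {t for targets in out_links.values() for t in targets}
    # Only the hand-reviewed seed sites start with any trust at all.
    trust = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    for _ in range(iterations):
        new_trust = {p: (1 - damping) * (1.0 / len(seeds) if p in seeds else 0.0)
                     for p in pages}
        for page, targets in out_links.items():
            if not targets:
                continue
            share = damping * trust[page] / len(targets)
            for target in targets:
                # Trust flows OUT of a page to the pages it links to,
                # so who you link to matters, not just who links to you.
                new_trust[target] += share
        trust = new_trust
    return trust

# Hypothetical graph: a seed site links to a decent site and to a spammy one.
graph = {
    "seed_site": ["decent_site", "spam_site"],
    "decent_site": ["seed_site"],
    "spam_site": [],
}
print(trust_rank(graph, seeds={"seed_site"}))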
 
Will "Trust Rank" be incorporated into the Google toolbar? How important will this be from webmasters to track, or will they be able to track this and/or influence its outcome?


I want to be trusted :D
 
Bryan, I think that is a difficult one to answer, but as I see it TrustRank will essentially be displayed in the SERPs. So if you rank #1, that represents your TR. TR also works with a score between 0 and 1, so I doubt that Google will display that, but who knows? I said the same of PR years ago and look what happened... :D

To be honest... TrustRank is a scary algorithm! Here are the reasons why...

TrustRank makes use of a couple of strategies to clean up its data sets. The most important point, though, is that every single page linking to you, and every page you link to, will affect your TR score. However, I know from experience that Google will not allow incoming links to penalise a site, as that would allow blackhats to have a jol! So backlinks will only count positively towards a site. What will hurt you, though, is the sites you use to reference your own content.

One thing is for sure: it is wise to gradually look at replacing your links pages with content pages that reference other sites, as links pages will soon do more damage than good, in my view anyway. TR has many benefits for serious outfits, though, as it will definitely make it a lot more difficult for black hats and fly-by-nights to get into the SERPs.

But let's address a couple of concepts and facts about the Google algo. Please note, once again, this is my version and view! Very little of it is proven. The Google algo as we know it actually consists of different filters and algorithms that clear the datasets before the actual TR algo is used to reveal the final results.

When you do a search for, e.g., 'online casino' there are currently 109,000,000 results! That is amazing! The result was returned within 0.31 seconds. That is even more amazing! So the first thing you probably ask is: how do they do that? It is actually quite simple. They work with a dataset that is much smaller! In fact, in this case, a dataset that only had 772 results :D

That is very clever I'd say. How do they do that? Please note the rest is filled with speculation! The following takes place before you actually do a search!

Google checks all pages indexed for backlinks and exit links. They immediately drop about 80% of all their pages, as those have neither. At this stage, let's say they have 20,000,000 pages left to work with. They filter out new domains and domains from the same IP and/or C block. They just dropped a further 80% of the pages and are now at about 4,000,000 pages. This process continues with other filters and algos until they have only a couple of thousand pages left. This is when the TrustRank algo comes into play, because TR cannot work with that much data or the objective of TR will not be reached.
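
Purely to illustrate the cascade I am describing (and I stress again, this is speculation), the idea is something like the toy sketch below; the filters, field names, and thresholds are all made up, not Google's actual pipeline.

Code:
def has_links(page):
    # Pages with no backlinks and no exit links are dropped straight away.
    return page.get("backlinks", 0) > 0 and page.get("exit_links", 0) > 0

def established_domain(page):
    # New domains, and domains sharing an IP / C block, are dropped next.
    return page.get("domain_age_months", 0) >= 12 and not page.get("shared_c_block", False)

def candidate_set(pages, filters):
    remaining = list(pages)
    for keep in filters:
        remaining = [p for p in remaining if keep(p)]
        print(f"{keep.__name__}: {len(remaining)} pages left")
    # Only this tiny surviving set would be handed to a TR-style scoring pass.
    return remaining

# candidate_set(indexed_pages, [has_links, established_domain])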

At the end of the day, I believe that you need to focus on who you are linking to and get good quality sites to link to your site. There are onpage factors and technical factors that play a very important role as well, but those we are all aware of. In effect, what this means is that we as webmasters will assist Google in rooting out evil sites, as we will use our expertise to identify junk pages and only link to the quality that our industry has to offer. This is genius!
 
My rant on how idiotic Google is

Google really sucks as a search engine IMO - unfortunately we're stuck with it. Yesterday, Casinomeister was on page 2 for "online casinos" today it's page 5, and I didn't do a damn thing.

"online casinos" and "online casino" are essentially the same damn thing; one is plural whoopdeedoo. But the results are totally different. This indicates a big problem in how this search engine lists sites. Users are unaware of crappy results.

If a user types in "online casino watchdog" - I'm number one. But if a user types in the plural form (watchdogs), I'm not even on the radar. This is a disservice to the user - crap results.

Type in "online casino information" and I'm usually number one or two - but Riverbelle Casino is right below me. That's downright stupid because this is a casino - it doesn't have much to do with "information".

Chatmaster said:
...What will hurt you though is the sites you use to reference your own content.
Could you expand on this? Please :D
 
Basically what TR says is...

It creates a huge hierarchy of sites that are linking to each other. Each site is awarded a trust score according to its incoming and external links. This means that by linking to a spammy (non-trusted) site you can damage your trust score within this hierarchy. However, the same cannot be true for most incoming links, as that would clearly make it possible for blackhat SEOs to influence the rankings of competitors.
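
A tiny illustration of that asymmetry, with numbers I have made up: the trust you pass on is split across every site you link out to, so every slice pointed at a non-trusted site is simply wasted, while a spam site linking to you just contributes nothing rather than subtracting anything.

Code:
def outgoing_trust_shares(my_trust, outgoing_targets, trusted):
    # Your trust is split evenly across everything you link out to.
    if not outgoing_targets:
        return {}
    share = my_trust / len(outgoing_targets)
    # Slices pointed at untrusted sites never flow back through the trusted
    # part of the graph -- that is the "damage" from linking to spam.
    return {t: (share if t in trusted else 0.0) for t in outgoing_targets}

print(outgoing_trust_shares(0.6, ["quality_site", "spam_directory"], {"quality_site"}))
# {'quality_site': 0.3, 'spam_directory': 0.0} -- half the trust you pass on is wasted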
 
Sorry Bryan, it just dawned on me that you want to know about the sites you reference... These university-educated, multiple-degree techies feel that good content references other sites to prove and support its claims, similar to what a student does when writing a paper.
 
It's still the same, really - you should have good content and relevant links, you shouldn't link to trashy sites (bad neighborhoods; sites with huge link pages are suspect these days), and you should make each page so it can be read by users easily and fast.

Google also likes to see natural development - Bryan and I and other old, established sites don't worry about that part anymore.

But my guess is that new sites that add too many content pages and too many links in a hurry will get the opposite of the results they aspire to.

Really, all Google is trying to do is identify legitimate, informational sites that load fairly quickly and are respected by other sites of the same caliber.

They will forever have to work at accomplishing this, because there are forever new people coming on the scene trying to get to that spot faster and trying to outsmart Google - so you can read SEO forums until you have steam coming out of your ears. Then you implement some of the stuff and eventually find out that some time ago Google secretly decided it could detect this SEO stuff and changed its algo to punish you for trying to outsmart it.

The search engines and SEO are always going to be at odds, because essentially SEO is trying to outsmart the engines, and the engines are just trying to find a way to identify true quality.

As long as you stick with real quality and visitor service, you will always end up on top in the long run.
 
You are right, Dominique, a truly outstanding post! The changes applied recently are subtle. They have been gradually tweaking their algos over the past few months. But what is important to note with TrustRank is that its basic principle is to be a spam fighter, using the opposite of the PageRank algo to accomplish that. PageRank focuses on incoming links and TrustRank on outgoing links, which means there needs to be a mindset change amongst webmasters. There are other issues that also become a lot more important, which I can't seem to find on any SEO forums. Those that know about them either keep quiet or don't know they are accidentally doing the right thing.
 
Bear in mind we are in the middle of a massive update, CM, with four sets of differing results being offered. I see you in the top 20 for online casinos and making progress on many other SERPs.

These wild fluctuations are normal during a google update. All in all though it is looking good for you ;)
 
