Think Human, Think Google!

The biggest problem I see in today's SEO professionals and companies is that many of them have yet to figure out how Google thinks. Given that there are over 200 different criteria in Google's algorithm for indexing and assigning a site "authority", it may seem pretty tough for anyone to grasp the philosophy of Google. But that's not true: anyone who can think human can understand how Google sees a website, without knowing much about what those algorithm factors are or how they work, trust me on that.
The base of a strong SEO foundation, for anyone who wants to achieve mastery in this subject, is understanding search engines, and mainly the big "G". As I said before, you don't need a Ph.D. in search engines to understand how their search algorithms work; rather, you need to think rationally, more like a human than a robot, and think outside the box. Let me make it very simple for those who want to come away from this article with a profound understanding of how Google thinks. Further reading, and later Google updates, will then make much more sense to your SEO foundation.
Did you ever wonder why Google outranked the other search engines? Simply because it gives you the best-filtered results for your queries, which means less time wasted finding what you are looking for, and because it brings the "right" sites to the top of the SERP (Search Engine Results Page). I have been following Google's moves for many years, and I see a regular, continuous process of improvement in how it indexes sites and how it qualifies results for its users. Many sites that used to rank high (and still do in Yahoo/Bing) are no longer in the top results, simply because Google found that their landing pages were not providing what a user "expected" to find during the search process. For example, eBay used to enjoy very high rankings for dozens of its sub-domains back when Google treated a sub-domain as a separate website, but that obviously hurt the quality and diversity of results on the first page of the SERP. So Google took a step: a sub-domain is now NOT considered a separate domain, but rather treated as another page of that website.
If you want to understand Google, just follow the rules of common sense. Anything and everything that raises a "shadow of doubt" will be filtered out of the top results. We all know how duplicate content is handled by search engines, and that's one example of how Google thinks. Suppose you were a teacher who had to evaluate 10 students on their essay-writing skills. Imagine that one of those students wrote a perfect essay and the rest just copied it. As the teacher, it's your job to find out who did the original work and who copied, and based on your evaluation, the ones who copied will never get 10/10. Again, as I said: think human and you will think Google.
Other than its very sophisticated algorithm, Google has the technology AND the people to monitor sites that prove to be "too good to be true". I recently saw a case where a site in the loans vertical achieved extraordinary rankings within three months for highly competitive keywords, and it was minting money. For Google, that sets off an alarm (how exactly, I have yet to discover). This particular website had very thin content and very few pages; all of its rankings were achieved through heavy link building. I know the site owner, and I know the SEO company that did the job, and I can assure you they did only white-hat, legitimate SEO. Still, that "shadow of doubt", which Google can sniff from miles away, doesn't pass easily. Google will never jeopardize the user experience, even if its algorithm put that website in the top results. Within a matter of weeks the site was penalized, and now it is nowhere to be found in Google. Interesting! It's similar to a human example: suppose your city has ten top restaurants that earned their reputation over years of good food and customer service. Suddenly, everyone is talking about a restaurant that opened just last month. Humanly, it's possible this restaurant is excellent, but your shadow of doubt will always tell you "there is something wrong": this guy must have paid the media and hired people to talk him up. If you had to judge, the trust earned gradually over the years would win. This is human, and this is how Google thinks.
While dealing with Google in your SEO efforts, whether for your own site or a client's, always try to be natural and think human. Consider your reader before the search engine. Provide quality content that attracts visitors and produces a viral effect through your readers' networks (the power of social media). Always mix up your link-building strategy in terms of both authority and frequency: a lot of high-PR links pointing to a URL in a short time will trigger an alert at Google. The R&D department in my company has produced a list of hundreds of high-PR blogs that allow posting without admin approval, using Scrapebox, SEOmoz, Blekko and various other tools. Over the years we have built a list of thousands of high-PR "do-follow" blogs and sites that can give you an instant back-link, and I could get a site onto the first page within weeks using them, but I never will, because it's not human and it's not natural. If I did, I would be doing a disservice to my client and to the profession as well. It always has to be natural, with a natural blend of mixed links from a big pool of available sites.
With the release of the Panda update, a lot of websites with poor-quality content but strong back-link profiles have already been penalized. This has also affected sites built on data feeds. I will write a separate blog post on how the Panda update affected data-feed-dependent sites and how to fix them; please subscribe to our RSS feed so you get that article as soon as it's up. A last word of advice: always do your best on on-page SEO first, with unique content, diversified pages, internal linking and valuable information, and only then start developing a back-linking strategy with a gradual increase. In my experience, if you are targeting a highly competitive keyword, give it 8-12 months on average to become an authority site in the eyes of Google. Low- to medium-competition keywords can take somewhere between 4 and 8 months.