
Google Penguin 1.1 Update

August 7, 2012


Google did its disco dance again on May 25, 2012, when Matt Cutts pushed out a "minor weather report" on the Penguin algorithm. It's a "data refresh," he said: part of Google's webspam fight, targeting sites that violate the Google Quality Guidelines. Speculation over whether Penguin was getting things right was far from over, and Penguin 1.1 triggered an immediate clamor across the SEO industry. Although Matt announced it would affect less than 0.1% of English-language searches, many webmasters claim the new update hit their sites by mistake. On the other hand, some sites with quality content recovered the rankings they had lost to the original Penguin, launched back in April 2012.

So what is this Penguin 1.1 anyway?
First of all, it's a data refresh of the original Penguin. Penguin's emphasis was on filtering pages with low-quality links out of Google's SERPs (Search Engine Result Pages). Since its launch we have seen webmasters crying foul, often over websites with quality content. Google intends to reveal little about Penguin 1.1, and we expect more of these updates in the near future, but as far as I understand it, the new refresh focuses on the following:

Keyword Stuffing: Sites loaded with randomly repeated keywords in meta tags, content, and articles are no longer welcome. This is spamdexing: it offers little or no benefit to users and skews results deceptively. Google is keen to filter such irrelevant pages out of its SERPs so they stop contaminating search results.
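
As a rough illustration, keyword stuffing can be spotted with a crude density check. This is only a sketch of the idea, not anything Google has published; the 8% threshold and the function names are illustrative assumptions.

```python
# Illustrative sketch of a keyword-density check (threshold is a made-up value).
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return the fraction of words in `text` that equal `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

def looks_stuffed(text: str, keyword: str, threshold: float = 0.08) -> bool:
    """Flag text whose density for one keyword exceeds the threshold."""
    return keyword_density(text, keyword) > threshold

page = "cheap shoes cheap shoes buy cheap shoes cheap shoes online cheap shoes"
print(looks_stuffed(page, "cheap"))  # True: 5 of 12 words are "cheap"
```

A natural paragraph mentioning a keyword a couple of times stays well under any such threshold; only mechanical repetition trips it.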

Cloaking: Cloaking means a webpage shows Googlebot content that is different from what it presents to site users. Usually this is done by detecting the crawler's IP address or user agent: when a search engine robot crawls the site, a different version of the page is served, deceiving the spiders. Since the announcement of Penguin 1.1, Google has been swift to remove and blacklist sites practicing this black-hat technique.
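
To make the practice concrete, here is a minimal sketch of what user-agent cloaking looks like on the server side. `serve_page` is a hypothetical handler shown purely to explain the technique, not to recommend it.

```python
# Illustrative only: server-side user-agent cloaking in its simplest form.
def serve_page(user_agent: str) -> str:
    """Return different HTML depending on who appears to be asking."""
    if "Googlebot" in user_agent:
        # Keyword-stuffed version shown only to the crawler.
        return "<html>best cheap shoes best cheap shoes best deals</html>"
    # Thin page that real visitors actually see.
    return "<html>Welcome! Click the ads below.</html>"

crawler_view = serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)")
visitor_view = serve_page("Mozilla/5.0 (Windows NT 6.1)")
print(crawler_view != visitor_view)  # True: two audiences, two different pages
```

Google can catch this by occasionally crawling from addresses and user agents that do not identify as Googlebot and comparing the responses.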

Duplicate Content: Sites with content copied from other websites can be detected very easily. Google has always discouraged duplicated content, and Penguin 1.1 is a killing machine for pages practicing this black-hat tactic.
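
One common way duplication is detected is by comparing overlapping word n-grams ("shingles") between two pages. This is a generic sketch of that idea with an illustrative shingle size; it is not Google's actual method.

```python
# Sketch of near-duplicate detection via word shingles and Jaccard similarity.
def shingles(text: str, n: int = 3) -> set:
    """Break text into overlapping n-word tuples."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    """Similarity of two texts as overlap of their shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "penguin targets sites that violate google quality guidelines"
copied = "penguin targets sites that violate google quality guidelines"
print(jaccard(original, copied))  # 1.0 for a verbatim copy
```

A verbatim copy scores 1.0, a lightly reworded copy still scores high, and unrelated pages score near zero, which is what makes shingling useful at scale.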
Penguin 1.1 also updates how link-building techniques are evaluated. Google is now hungrier than ever to devour sites with low-quality links. A little advice on building links in light of the new Google dance: avoid paid links, irrelevant links, hidden links, and backlinks from disreputable sites. Vary your anchor text, try to earn backlinks from a range of different IP addresses within your niche, and weed out low-quality links pointing at your site from link farms.
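
The anchor-text advice above can be sanity-checked with a quick audit of your backlink data. The link list and the 60% exact-match threshold below are made-up illustrative values, not figures from Google.

```python
# Sketch of an anchor-text diversity audit over a backlink profile.
from collections import Counter

def anchor_distribution(anchors: list) -> dict:
    """Map each anchor text to its share of the total profile."""
    total = len(anchors)
    counts = Counter(a.lower() for a in anchors)
    return {a: c / total for a, c in counts.items()}

backlinks = ["cheap shoes", "cheap shoes", "cheap shoes", "cheap shoes",
             "cheap shoes", "ExampleShop", "click here"]
dist = anchor_distribution(backlinks)
over_optimized = max(dist.values()) > 0.6
print(over_optimized)  # True: 5 of 7 anchors are the same exact-match phrase
```

A natural profile mixes brand names, bare URLs, and generic phrases; when one exact-match phrase dominates, the profile looks manufactured.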

Rayman Corporation has always followed the Google Webmaster Guidelines, so we have never been penalized by changing algorithms, and our track record of predicting Google updates has been strong. Penguin and its refreshes are a positive change for those who practice Google's quality guidelines, so try to embrace these algorithm changes and stay out of harm's way.

Keep checking your website's organic traffic; if your rankings dropped by more than 50% around the refresh date, Penguin is the likely cause. If you believe Penguin hit your site by mistake, you can reach Google's recovery form through this link,
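
The 50% check described above is simple arithmetic on your analytics numbers. The figures below are placeholders.

```python
# Sketch of the before/after organic-traffic comparison described above.
def pct_drop(before: float, after: float) -> float:
    """Percentage drop from `before` to `after`; 0.0 if there was no baseline."""
    return (before - after) / before * 100 if before else 0.0

weekly_organic_before = 1200  # hypothetical analytics figures
weekly_organic_after = 480
drop = pct_drop(weekly_organic_before, weekly_organic_after)
print(f"{drop:.0f}% drop")  # 60% drop: above the 50% mark, worth investigating
```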

Steven Adams


From → Uncategorized

  1. James Jordan permalink

    Simply wish to say your article is great. The clarity of your post is excellent, and I can tell you are a professional on this subject. With your permission, let me grab your feed to keep up to date with forthcoming posts. Thanks a million, and please continue the great work.

  2. Paul permalink

    I was worried about the Google updates, especially now that I've learned why my website lost its ranking on the first page of Google. Thanks a lot for this wonderful information.

  3. Kristy permalink

    Thanks for this information about Google updates. But Mr. Steven, I hope you know Google just recently cracked down on paid links, so how can we get inbound links for our websites these days? My website was recently affected by a Google update, and I don't know whether it was because of content or paid links.

    • Google pays attention to a site's inbound links because people link to a site only when they find it interesting enough, in terms of quality, user-friendly content. For Google, inbound links signify a website's importance based on user experience. If one can't buy links, how can inbound links be earned? The answer lies in that same basic idea: make your website worth linking to. Content is king. Blogging, articles on top article sites, social media, and anything good and interesting enough to attract a crowd will entice people to link to your site and share what you publish on the web. Reciprocal links can also help. This takes time and a significant amount of effort, but it works.

  4. Abubakar Zahid permalink

    Yes, very true. I stopped backlinking strategies months ago and decided to just concentrate on writing long, meaningful posts. I've been getting more organic views since.

    Very good article. I’ll bookmark this.

  5. Alice Fox permalink

    Google has been making some big changes in their algorithm lately and the latest is something called the Penguin update which seems to have wiped out many internet businesses overnight. Unfortunately we are slaves to Google if we want free SEO traffic and we need to adapt if we want to remain in business for the long term!

  6. Kristy permalink

    Thanks for sharing; this information is important to know. It's always challenging to keep up with Google's changes, but you seem to have done so. Again, thanks. I agree with you and already implement most of these, but I will be adding some of the other techniques to my SEO ranking campaigns.

    This is awesome advice, Steven. Will be posting this article to my blog and Facebook. Keep up the excellent work! Yours, Kristy Allen ♥

  7. Jenifier.. permalink

    If Penguin simply ignored these spammy links, then even if a publisher spammed up their own website it would have no negative effect, although some publishers might think such sites should be punished. Trust me, they would be punished simply by Google ignoring the spammy links. I don't think Google gives much thought to how its algorithms can be gamed to hurt sites on purpose; its focus seems to be punishing sites that try to game the system and rewarding the ones that don't. How would Google correct that mistake? By manually ignoring all the spammy links? Google fails to see both sides of the coin: its algorithms can be manipulated not just to increase a site's rankings but also to decrease a competitor's.

  8. Luca permalink

    Jenifer is right. Google could simply drop unworthy or irrelevant links from spammy sites out of its calculations. Instead of penalizing legitimate sites, Google could make its search experience more relevant by quietly discounting spam sites without touching the search ranking of the primary site.

    Now that Google has given clear warnings to Web publishers, I assume they’ll be more careful with their backlinks.
