For the First Time, Google Charts Its “Manual Actions” Against Spam

3/10/2013 06:30:00 pm



These days, pure spam is the biggest category of manual actions Google takes for guideline violations. It covers sites that break the rules not in any one specified way but through a mix of techniques, such as cloaking or unnatural links; those narrower categories, taken on their own, generate relatively few actions.

Google provides an associated page with definitions for all of these spam categories. Oddly, pure spam is listed as also involving several other types of spam that are enumerated separately. In essence, a pure spam action means the site appears to use aggressive spam techniques such as cloaking, automatically generated gibberish, content scraped from other websites, and/or egregious or repeated violations of Google's Webmaster Guidelines. Hacked sites generate the most actions after pure spam: Google penalizes sites that have been hacked and no longer serve the content for which they originally earned their rankings. Unnatural-link actions, which have attracted huge attention, sit well below these on the list.
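Cloaking, for instance, means serving different content to Googlebot than to regular visitors. A minimal way to spot-check your own pages for this is to fetch the same URL with a Googlebot user agent and with a normal browser user agent and compare the responses. The sketch below is only illustrative (the URL is hypothetical, and it uses just the Python standard library); dynamic pages can legitimately differ between requests, so a mismatch is a hint, not proof.

```python
import hashlib
import urllib.request

# Hypothetical URL used for illustration only.
URL = "https://example.com/page"

USER_AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0 (Windows NT 6.1; rv:19.0) Gecko/20100101 Firefox/19.0",
}

def fetch(url, user_agent):
    """Fetch the page body using the given User-Agent header."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.read()

# Compare a hash of the two responses.
digests = {name: hashlib.sha256(fetch(URL, ua)).hexdigest()
           for name, ua in USER_AGENTS.items()}

if digests["googlebot"] != digests["browser"]:
    print("Responses differ between Googlebot and a browser: possible cloaking.")
else:
    print("Responses are identical for both user agents.")
```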

Part of this is because Google also takes automatic action against pages and sites, such as the Panda update in 2011 to counter thin content or the Penguin update in 2012 to counter unnatural links. No notifications are sent in those cases; the penalty is applied automatically. The chart of spam actions therefore does not reflect every action Google takes against spam. It lacks a total line, but the pure spam and legacy lines give a pretty good idea of when manual actions have risen, and they suggest that manual spam actions peaked around June 2011.

Those hit by Google's automatic actions can only amend their sites and hope that Google spots the changes and restores them to good standing. Those hit by manual actions, however, can send Google a reconsideration request. Doing so takes only a few steps: log into Webmaster Tools and check for any errors, such as 'URLs restricted by robots.txt' or 'URL unreachable'. If errors appear, dig into them, follow the recommendations, and then submit the reconsideration request. Records suggest the highest number of reconsideration requests was sent in October 2010, largely because Google upgraded its notification system around that time: more messages about manual actions went out, which in turn produced more reconsideration requests.
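Before filing a request, one quick local sanity check is to confirm that your own robots.txt is not blocking Googlebot from important URLs. Here is a minimal sketch using Python's standard urllib.robotparser; the domain and paths are hypothetical placeholders:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site and URLs used for illustration only.
SITE = "https://example.com"
URLS_TO_CHECK = [
    "https://example.com/",
    "https://example.com/products/widget",
    "https://example.com/private/report.html",
]

# Download and parse the site's robots.txt.
parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()

# Report which URLs Googlebot is allowed to crawl.
for url in URLS_TO_CHECK:
    allowed = parser.can_fetch("Googlebot", url)
    status = "allowed" if allowed else "restricted by robots.txt"
    print(f"{url}: {status}")
```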

How Google's search works basically involves three major parts. The first, crawling and indexing, is how Google finds and stores web pages in order to make them searchable. The second deals with how matches are returned in response to a search and how Google decides which pages should rank at the top. The third and final part deals with how Google fights spam.
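To make the crawling and indexing idea concrete, here is a deliberately tiny sketch, not Google's actual pipeline: a handful of hypothetical "crawled" pages are turned into an inverted index mapping words to URLs, which a crude keyword search can then query.

```python
import re
from collections import defaultdict

# Hypothetical "crawled" pages, stored as URL -> page text.
# A real crawler would fetch these over HTTP and follow their links.
crawled_pages = {
    "https://example.com/a": "Google takes manual actions against pure spam",
    "https://example.com/b": "Hacked sites may lose the rankings they earned",
    "https://example.com/c": "Reconsideration requests go through Webmaster Tools",
}

# Indexing: map each word to the set of URLs that contain it.
inverted_index = defaultdict(set)
for url, text in crawled_pages.items():
    for word in re.findall(r"[a-z]+", text.lower()):
        inverted_index[word].add(url)

def search(query):
    """Return URLs containing every word of the query (a crude AND search)."""
    words = re.findall(r"[a-z]+", query.lower())
    if not words:
        return set()
    results = inverted_index[words[0]].copy()
    for word in words[1:]:
        results &= inverted_index[word]
    return results

print(search("manual actions"))   # {'https://example.com/a'}
print(search("rankings"))         # {'https://example.com/b'}
```

A real search engine would also score and rank the matching pages, which is where the second part of the process comes in.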
