I was busy elsewhere these days, and in the meantime Google Webmaster Tools sent me a severe site health alert about my blog. The alert had already landed in my Webmaster Tools inbox, but I hadn't noticed it.
When I opened Google Analytics, the stats really shocked me. Do you know what my first reaction was? Google might have rolled out a Penguin update or released a new algorithm. Why not? These days Google can scare us anytime, anywhere.
My second reaction was: if Google had released a new algorithm or refreshed the Penguin update, why wasn't I aware of it?
Uff haa... but it was a big relief when, after searching the internet, I confirmed that Google had taken no such action.
At last I found the problem: the robots.txt file. Google couldn't read it. I had actually changed the robots.txt file about a week earlier, and somewhere I missed writing the rules properly to allow Googlebot.
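For reference, the mistake was roughly of this shape (a reconstructed sketch, not my exact file; the `/wp-admin/` path is just an illustrative example of a private area you might legitimately block):

```
# Broken version - a blanket Disallow blocks every crawler from the whole site:
User-agent: *
Disallow: /

# Corrected version - only the private area stays blocked:
User-agent: *
Disallow: /wp-admin/
Allow: /
```

A single stray `Disallow: /` line is enough to wipe an entire site out of the index, which is exactly the kind of slip I made.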
And this created a big problem for bloggerbonus.com. Meta descriptions disappeared from many posts, and titles were partially changed in Google search results.
Because of that, many pages were blocked, and blog traffic went down 30-40% over two consecutive days.
Webmaster Tools displayed a warning message for bloggerbonus.com: severe health issues were found on the site, and some important page(s) were blocked by robots.txt.
When I opened the warning message, I found the site's status and a warning sign next to 'Is robots.txt blocking important pages?', with a sample link attached to the message.
Two other items were also listed there, 'Important pages removed?' and 'Malware detected?', but the site was safe from both of those warnings.
This issue also created a problem with the XML sitemap that I had submitted to Google in Webmaster Tools: it was showing seven warnings.
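Those warnings make sense once you remember what a sitemap is: just a list of URL entries that Google cross-checks against robots.txt, flagging any listed URL the crawler isn't allowed to fetch. A minimal sketch of one entry (the URL and date here are placeholders, not from my real sitemap):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://bloggerbonus.com/sample-post/</loc>
    <lastmod>2013-05-17</lastmod>
  </url>
</urlset>
```

If `<loc>` points at a page that robots.txt blocks, Webmaster Tools reports exactly the kind of sitemap warning I was seeing.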
The next task was to identify all the webpages that had disappeared from Google search. To do this, I checked every page manually, one by one.
Oh, I forgot to tell you that during this work I used Fetch as Google, but it always gave me a 'Denied by robots.txt' error. See the screenshot of the fetch attempt on 17 May.
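You can reproduce that 'denied by robots.txt' behaviour locally with Python's standard `urllib.robotparser`, which applies the same Allow/Disallow matching that crawlers do. This is a sketch with assumed file contents and an example URL, not my actual configuration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical broken robots.txt: blocks every path for all bots.
broken = """User-agent: *
Disallow: /
"""

# Hypothetical fixed robots.txt: only a private path stays blocked.
fixed = """User-agent: *
Disallow: /wp-admin/
Allow: /
"""

def is_allowed(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """Check whether `agent` may fetch `url` under the given robots.txt text."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

print(is_allowed(broken, "https://bloggerbonus.com/some-post/"))  # False: everything denied
print(is_allowed(fixed, "https://bloggerbonus.com/some-post/"))   # True: posts allowed again
```

A quick check like this before uploading a new robots.txt would have saved me the whole episode.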
Then I edited all the affected webpages and resubmitted the sitemap to Google. Before submitting the sitemap in Webmaster Tools, I re-indexed all blog posts through the WordPress sitemap generator plugin and Fetch as Google. Now it works fine, with no more 'Denied by robots.txt' errors.
I fetched all the affected webpages again, one by one. Google re-indexed all of them, and every post got its old ranking back. And today, 21st May, traffic rocks again in Google Analytics :).