Home Page Deindexed Only at Google after Recovering from Hack Attack
-
Hello. I'm facing a strange issue. My WordPress blog, hghscience[dot]com, was hacked. When I checked, I found that the index.php file had been changed and was showing a page with a hacked message, and an index.html file had also been added to the cPanel account. All pages were showing the same message. When I found it, I replaced index.php with the default WordPress index.php file and deleted index.html. I could not find any other file that looked suspicious.
The site started working fine and was still indexed, but the cached version was the hacked page. I used Webmaster Tools to fetch and render it as Googlebot and submitted it for indexing. After that, I noticed the home page got deindexed by Google. All other pages are indexing like before. The site was hacked around 30th July and I fixed it on 1st Aug. Since then the home page is not getting indexed; I have tried to fetch and submit it multiple times via Google Webmaster Tools, but no luck so far.
One more thing I noticed: when I use info:mysite.com on Google, it shows some other hacked site (www.whatsmyreferer.com/) when searching from India, but when the same info:mysite.com is searched from the US, a different hacked site shows up (sigaretamogilev.by). However, when I search "mysite.com", my site's home page appears in Google search, but when I check the cached URL it shows the hacked sites mentioned above.
To my knowledge I have checked all the SEO plugins and the code of the home page, and I can't find anything that would keep the home page from being indexed.
PS: Webmaster Tools has received no warnings for a penalty or malware.
I also noticed that I had disallowed index.php via robots.txt earlier, but I have since removed that rule. (Screenshots attached.)
-
.htaccess file has nothing but:
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
Installed Plugins
Yoast SEO, Google XML Sitemaps, Akismet, Udinra All Image Sitemap, Social Share Bar (Digg Digg Alternative), Jetpack by WordPress.com, Author hReview.
Apart from Yoast, none of these should be able to block indexing, and the Yoast settings look fine; I have only disabled indexing of tags, subpages, and author archives.
The problem is something else, I guess.
-
Hi Ankit,
Though I have checked the pages you're serving to bots, could you please have a look at your .htaccess file? Does it contain something like:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (google|yahoo) [OR]
RewriteCond %{HTTP_REFERER} (google|aol|yahoo)
Do you have a copy of your code in GitHub, Bitbucket, or any other source control tool? If yes, please scan the last few commits thoroughly.
You can create a list of recently installed plugins. Remove them one by one, and submit your home page URL to GWT to fetch a fresh copy each time. Not sure what the issue is here; let's do some trial and error to dig deeper.
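One quick way to test the cloaking theory is to fetch the home page twice with different User-Agent headers and compare the responses. A minimal sketch (the URL placeholder, the 10% tolerance, and the length-based comparison are illustrative assumptions; rewrite rules like the ones above can also key off the Referer header, so it is worth retrying with a Google referer set):

```python
import urllib.request

GOOGLEBOT = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)")
BROWSER = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent, referer=None):
    """Fetch a URL with a given User-Agent (and optional Referer)."""
    headers = {"User-Agent": user_agent}
    if referer:
        headers["Referer"] = referer
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req, timeout=15) as resp:
        return resp.read()

def looks_cloaked(body_a, body_b, tolerance=0.1):
    """Crude check: flag if the two bodies differ in length by more
    than `tolerance` (10% by default). Dynamic pages always differ a
    little (nonces, timestamps), so a small delta is normal."""
    longer = max(len(body_a), len(body_b)) or 1
    return abs(len(body_a) - len(body_b)) / longer > tolerance

# Usage (replace with the affected home page):
#   as_bot  = fetch("http://example.com/", GOOGLEBOT)
#   as_user = fetch("http://example.com/", BROWSER,
#                   referer="https://www.google.com/")
#   print(looks_cloaked(as_bot, as_user))
```

A large difference does not prove cloaking, and a small one does not rule it out (some hacks only trigger on Google's own IP ranges), but it is a cheap first check.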
-
Hey Alan,
Do let me know if you find some solution or identify the problem.
-
Exactly. I'm not able to find any good information to take the next step on this, but I'm still checking random things in hope.
-
The DomainTools domain report shows no further info that could be helpful, leaving me at a complete loss as to what else to check.
-
More info.
Because Nitin was able to run a ping and traceroute without problems, I went to DomainTools.com, the world's leading resource for forensic digital investigative research. I use it whenever I am doing investigations for the expert witness work I do.
When I ran the domain there, it had a screen-capture of the home page from June. So I submitted a refresh, and it came back as not being able to provide a screen-shot of the home page.
While not a smoking gun, it further clouds my confidence that the domain is actually functioning properly in its hosting environment, as I originally suspected it might not be.
I will run a deeper test to see if I can get more information, however I wanted to post this update because I believe it relevant.
-
Well, this is probably one of the most interesting issues an SEO can come across: Google is showing different cached versions in different countries. That's strange to me too. Is that usual?
-
Nitin
Thanks for doing that - Now I'm stumped - I've never had Pingdom fail before with both ping and traceroute. And I now wonder if it's a non-issue, or part of the confused mess that Ankit referenced somehow.
-
That's right, it's showing different cached versions in different countries. Just checked from the US here. Screenshot attached.
-
I don't think the index.php disallow was the issue. I took the suggestion and removed it, but many sites disallow index.php via robots.txt to avoid a duplicate-content issue between site.com and site.com/index.php.
Here is an example: http://www.shoutmeloud.com/robots.txt
In any case, I removed it about 10-12 days ago, then fetched and submitted the page for indexing and also put in a rendering request.
Attaching current Screenshot of last rendering request.
I think it's some other issue. What's your view on info:site.com showing other hacked sites? How is this happening, and why do the sites keep changing? It's different in India and different in the US.
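For reference, the duplicate-content concern can also be handled with a 301 redirect instead of a robots.txt disallow; a redirect consolidates the two URLs without blocking crawlers from either one. A rough sketch, assuming Apache with mod_rewrite (untested, and it would go above the standard WordPress block in .htaccess):

```apache
<IfModule mod_rewrite.c>
RewriteEngine On
# Match only direct external requests for /index.php, not WordPress's
# internal rewrites, by testing the raw request line.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php [NC]
RewriteRule ^index\.php$ / [R=301,L]
</IfModule>
```

With this in place, site.com/index.php permanently redirects to site.com, so search engines see a single canonical URL.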
-
Ping and traceroute worked for me when I tried using my terminal (screenshot is attached).
Well, I agree that the problem is actually bigger. If you look at its cached version on Google, it was last cached on 16th Aug, i.e. after the index.php/index.html issue was fixed by the admin (another screenshot attached).
I tried viewing the page as Googlebot as well and couldn't find the issue (I wanted to check it for cloaking, too).
-
UPDATE TO MY ORIGINAL COMMENT
I initially found a problem doing a ping and traceroute test using Pingdom.com - both returned an "invalid host name" error, something I have not seen previously for both ping and traceroute simultaneously.
Nitin (see his comment below) did a similar test locally and found both to be okay. Though he has other thoughts.
I just wanted to clarify here now, that my original finding may not be a key to this issue, though I want to understand why my test came back that way...
-
You said you removed index.php from robots.txt. I just wanted to ask: when did that happen? After removal, it usually takes some time to get back into the index (the crawler needs to recrawl the website).
My advice is to resubmit your robots.txt and an updated sitemap.xml to the Webmaster console and wait for the next crawl; that should fix this.
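Before resubmitting, it is easy to sanity-check locally that the current robots.txt no longer blocks the home page for Googlebot. A small sketch using Python's standard-library parser (the OLD rules below are a reconstruction of the rule described earlier in this thread, not the site's actual file):

```python
import urllib.robotparser

def homepage_allowed(robots_txt, agent="Googlebot"):
    """Parse a robots.txt body and report whether `agent` may fetch /."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, "/")

# Reconstruction of the earlier rule discussed in this thread.
OLD = """User-agent: *
Disallow: /index.php
"""

# "Disallow: /index.php" blocks /index.php itself but not /, so on its
# own it would not explain a deindexed home page.
print(homepage_allowed(OLD))
```

This is consistent with the conclusion above that the old robots rule was probably not the cause of the deindexing.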
Hope this helps!
-
Just sent the screenshot. Nothing has helped so far. It's quite strange that info:domain.com is now showing some other hacked URL. Screenshot attached.
-
It was quite strange for me as well. I just attached a screenshot after fetching one more time.
One more thing I noticed: info:mysite.com is now showing some other hacked domain. Not sure how or why that's happening.
Sorry for the delay in replying; I was not getting email updates, so I thought no one had answered my question.
-
Hi Ankit! Did Nitin's suggestions help at all? And are you able to share the screenshot he asked for?
-
Check the following; maybe it'll help you resolve the issue:
https://mza.seotoolninja.com/community/q/de-indexed-homepage-in-google-very-confusing
https://mza.seotoolninja.com/community/q/site-de-indexed-except-for-homepage
-
That's really strange. Could you please share a screenshot of what happens when you try to fetch it as Google in GWT?