Google Indexing - what did I miss??
-
Hello, all SEOers~
I renewed my website about 3 weeks ago, and in order to preserve SEO value as much as possible, I set up 301 redirects, an XML sitemap, and so on to minimize any possible data loss.
But the problem is that about a week after the site renewal, my team somehow made a mistake and removed all the 301 redirects. So now my old site URLs are all gone from Google's index, and my new site is not getting indexed by Google. My traffic and rankings are also gone... OMG
I checked Google Webmaster Tools, but it didn't show any special message other than that Googlebot found an increase in 404 errors, which is obvious.
I also used "Fetch as Googlebot" from Webmaster Tools to improve the chances of getting indexed, but it doesn't seem to be helping much.
I am re-doing the 301 redirects today, but I am not sure it means anything anymore.
Any advice or opinions?? Thanks in advance~!
-
Thanks for your kind advice.
I will try to follow your suggestions~ thanks!
-
Hi there,
Complete your 301 redirects, but do them on a 1-to-1 basis - one old URL to one new URL. DO NOT redirect all your URLs to your home page. (After you do that, verify that each one is indeed a 301 redirect and not another type of redirect, like a 302.)
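One way to verify the redirect type is to request an old URL without following redirects and look at the raw status code. This is a minimal sketch using only Python's standard library; the example.com URL is a placeholder for one of your old URLs.

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can see the raw status code."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def classify_redirect(status):
    """Map an HTTP status code to the kind of redirect Google sees."""
    if status in (301, 308):
        return "permanent"   # passes link signals; what you want here
    if status in (302, 303, 307):
        return "temporary"   # signals the move may be undone
    return None              # not a redirect at all

def check_url(url):
    """Fetch one old URL without following redirects; return (status, Location)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as e:  # urllib raises on 3xx when not followed
        return e.code, e.headers.get("Location")

if __name__ == "__main__":
    status, target = check_url("https://www.example.com/old-page")
    print(status, classify_redirect(status), target)
```

Run this over your full list of old URLs and flag anything that doesn't come back as "permanent" with the correct new target.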
a) The most beneficial approach is to 301 redirect as much as possible in a structural way: the old categories to the new categories, and so on. Don't worry, there is no limit on how many 301 redirects you can use; just don't chain them through intermediary redirects, like: old URL -> 301 -> intermediary URL -> 301 -> final active URL. Go directly from the old URL to the new, final, active URL in one step if possible.
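If your redirect map has accumulated intermediary hops, you can flatten the chains programmatically before generating the server rules. A small sketch (the URL paths are made-up examples):

```python
def flatten_redirects(redirects):
    """Collapse chains like old -> intermediary -> final into old -> final.

    `redirects` maps each source URL to its redirect target.
    Raises ValueError if the map contains a redirect loop.
    """
    flat = {}
    for src in redirects:
        seen = {src}
        target = redirects[src]
        while target in redirects:          # follow the chain to its end
            if target in seen:
                raise ValueError(f"redirect loop at {target}")
            seen.add(target)
            target = redirects[target]
        flat[src] = target
    return flat

rules = {
    "/old-category": "/interim-category",
    "/interim-category": "/new-category",
}
print(flatten_redirects(rules))
# every source now points straight at /new-category in one hop
```

The loop check also catches the "old URL -> ... -> old URL" cycles that would otherwise make Googlebot give up.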
b) Check whether there are old sitemaps in your Webmaster Tools. If there are, delete the old ones and create new ones that contain only new URLs.
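If your CMS doesn't generate the new sitemap for you, it's simple to build one from a list of the new URLs. A sketch using Python's standard library (the URLs are placeholders for your new pages):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Render a minimal XML sitemap containing only the given (new) URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

new_urls = [
    "https://www.example.com/",
    "https://www.example.com/new-category",
]
print(build_sitemap(new_urls))
```

Double-check the output contains none of the old URLs before you submit it, otherwise you'll keep pointing Googlebot at pages that now 404.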
c) Do the same for the robots.txt file. (If you don't have a robots.txt file, create one and place it in the root of your domain, e.g. www.example.com/robots.txt.)
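Before publishing the new robots.txt, you can sanity-check it locally with Python's built-in `urllib.robotparser`. The rules and paths below are placeholders; the point is to confirm Googlebot can reach the pages you want re-indexed:

```python
from urllib import robotparser

# Draft robots.txt for the renewed site: block only private areas and
# advertise the new sitemap. The paths here are example placeholders.
robots_txt = """\
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Sanity-check that Googlebot can fetch the pages you want indexed.
print(rp.can_fetch("Googlebot", "https://www.example.com/new-category"))  # True
print(rp.can_fetch("Googlebot", "https://www.example.com/admin/login"))   # False
```

A stray `Disallow: /` left over from the development phase is a common way to keep a relaunched site out of the index, so this check is cheap insurance.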
d) If possible, use all available instances of "Fetch as Googlebot" and then submit those URLs for crawling, but do this as much as possible for the main node pages of your website (e.g. the main categories). Don't waste this function on final product pages, as Googlebot will go link by link from the categories and re-discover all your URLs.
e) Be patient; the PageRank and old traffic flow won't come back overnight. It can take up to 3 months for Googlebot to re-discover and re-index all the pages of your website (I know that's a long time, but it usually happens a lot sooner).
f) Keep a close eye on your Webmaster Tools account and make sure you solve any problems that appear in due time.
g) Scan your entire new website with crawling software to make sure you don't have broken links; it's important. If you find any broken links, fix them immediately.
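The core of such a scan is just extracting every link from a page and checking it against the URLs that actually resolve. A minimal sketch with Python's standard-library HTML parser (the page markup and URL set are made-up examples; a real crawler would fetch each page and build `live_urls` from the responses):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in a page's <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_links(html, live_urls):
    """Return the hrefs on a page that are not in the set of live URLs."""
    parser = LinkExtractor()
    parser.feed(html)
    return [link for link in parser.links if link not in live_urls]

page = '<a href="/new-category">Shop</a> <a href="/old-category">Old</a>'
live = {"/new-category", "/contact"}
print(find_broken_links(page, live))  # ['/old-category']
```

In this relaunch scenario, any link the scan flags is either a leftover old URL (fix the link or cover it with a 301) or a genuine typo.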
I hope it helps.
Related Questions
-
Why does Google not remove my page?
Hi everyone, last week I added a "noindex" tag to my page, but that site still appears in organic search. What else can I do to remove it from Google?
Technical SEO | Jorge_HDI
-
How long does Google take to re-index title tags?
Hi, we have made changes to our website's title tags. However, when I search for these pages on Google, I still see the old title tags in the search results. Is there any way to speed this process up? Thanks
Technical SEO | Kilgray
-
Test site got indexed in Google - what's the best way of getting the pages removed from the SERPs?
Hi Mozzers, I'd like your feedback on the following: the test/development domain where our site builder works got indexed, despite all warnings and advice. The content on these pages is in active use by our new site. Thus, to prevent duplicate content penalties, we have put a noindex in our robots.txt. However, of course, the pages are currently still visible in the SERPs. What's the best way of dealing with this? I did not find related questions, although I think this is a mistake that is often made. Perhaps the answer will also be relevant for others besides me. Thank you in advance, greetings, Folko
Technical SEO | Yarden_Uitvaartorganisatie
-
How to stop google from indexing specific sections of a page?
I'm currently trying to find a way to stop Googlebot from indexing specific areas of a page. Long ago, Yahoo search created the tag class="robots-nocontent", and I'm trying to see if there is a similar mechanism for Google, or if they have adopted the same tag? Any help would be much appreciated.
Technical SEO | Iamfaramon
-
Google indexing despite robots.txt block
Hi, this subdomain has about 4,000 URLs indexed in Google, although it's blocked via robots.txt: https://www.google.com/search?safe=off&q=site%3Awww1.swisscom.ch&oq=site%3Awww1.swisscom.ch This has been the case for almost a year now, and it does not look like Google tends to respect the blocking in http://www1.swisscom.ch/robots.txt Any clues why this is, or what I could do to resolve it? Thanks!
Technical SEO | zeepartner
-
Google Update Frequency
Hi, I recently found a large number of duplicate pages on our site that we didn't know existed (our third-party review provider was creating a separate page for each product whether it was reviewed or not - the ones not reviewed are almost identical, so they have been noindexed). Question - how long do you typically have to wait for Google to pick this up on our site? Is it a normal crawl, or do we need to wait for the next Panda review (if there is such a thing)? Thanks much.
Technical SEO | trophycentraltrophiesandawards
-
Site removed from Google Index
Hi Mozzers, two months ago we published http://aquacion.com We registered it in Google Webmaster Tools, and after a few days the website was in the index, no problem. But now Webmaster Tools tells us the URLs were manually removed. I've looked everywhere in Webmaster Tools in search of more clues but haven't found anything that would help me. I sent the access details to the client, who might have been stupid enough to remove his own site from the Google index, but now, even though I delete and add the sitemap again, the website won't show in Google SERPs. What's weird is that Google Webmaster Tools tells us all the pages are indexed. I'm totally clueless here... PS: Added screenshots from Google Webmaster Tools. Update: Turns out it was my mistake after all. When my client developed his website a few months ago, he published it, and I removed the website from the Google index. When the website was finished, I submitted the sitemap, thinking it would void the removal request, but it doesn't. How to solve: in Webmaster Tools, on the [Google Index => Remove URLs] page, you can re-include pages there.
Technical SEO | RichardPicard
-
Will Google Continue to Index the Page with NoIndex Tag Upon Google +1 Button Impression or Click?
The FAQs for the Google +1 button suggest as follows: "+1 is a public action, so you should add the button only to public, crawlable pages on your site. Once you add the button, Google may crawl or recrawl the page, and store the page title and other content, in response to a +1 button impression or click." If my page has a noindex tag while at the same time having the Google +1 button inserted on the page, will Google recognize the noindex tag (and not index the page) despite the +1 button's impressions or clicks sending signals to Google's spiders?
Technical SEO | globalsources.com