Accidental No Index
-
Hi everyone,
We manage several client sites at my company. During the HTTPS upgrade, the developers accidentally deployed a noindex robots tag in the site code without realizing it (yes, really). Ten days later we noticed traffic was falling. Within a couple of days we found the noindex tags, removed them, and resubmitted the sitemaps. The sites started ranking for their own keywords again within a day or two, but organic traffic is still down considerably, and for other keywords they are either not ranking in the same spots as before or not ranking at all.
If I look in Google Search Console, it says, for example, that we submitted 4,000 URLs and only 160 have been indexed. I suspect Google is simply taking a long time to re-index the remainder of the sites. Has anyone had this issue? We're starting to get very concerned, so any input would be appreciated. I read an article on here from 2011 about a company that did the same thing and was ranking for its keywords again within a week. It's been 8 days since our fix.
-
- Make sure the redirects from http --> https are 301 redirects
- Make sure the canonical URLs have been updated to https
- Make sure your sitemap URLs have been updated to https
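Those three checks can be scripted rather than spot-checked page by page. Below is a minimal Python sketch (the helper names and example.com URLs are illustrative, not from any particular tool) that inspects a redirect's status line, a page's canonical tag, and a sitemap's protocol:

```python
import re
import xml.etree.ElementTree as ET

def is_301(status_line):
    """True if an HTTP status line (e.g. from `curl -sI`) reports a 301."""
    parts = status_line.split()
    return len(parts) > 1 and parts[1] == "301"

def canonical_url(html):
    """Extract the rel=canonical href from a page's HTML, or None."""
    m = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html)
    return m.group(1) if m else None

def non_https_sitemap_urls(sitemap_xml):
    """Return any <loc> entries in a sitemap that are not https yet."""
    ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.iter(ns + "loc")
            if not loc.text.startswith("https://")]
```

Feed it the status line from `curl -sI http://yoursite.com/`, the fetched HTML, and your sitemap XML; any URL returned by `non_https_sitemap_urls` still needs updating.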
-
Thank you for your reply, and sorry for the delay in mine. We have the https versions of our sites in GA and GSC. The https versions were not fully indexed; we only noticed the problem when traffic fell significantly across all the sites.
Unfortunately, according to GSC, Google has still only crawled a small fraction of each client site, and rankings/organic traffic are improving slightly but are still not back to normal. We submitted the sitemaps and have re-submitted some of them. Should we try again?
Any other input would be appreciated.
-
Great suggestions by Oleg.
Yoast wrote an article last fall that talks about some of the reasons a site might be slow getting indexed and how to get it indexed faster. They have some worthwhile suggestions.
Did you submit the new (https) version of your website to both Google Search Console and Google Analytics? Is that what you're looking at? Was the https version of the site fully indexed before you noticed the noindex tags?
Have you confirmed every page on the site was converted to https and there aren't any remaining pages (images or pdfs, for example) that are still http and therefore not showing up in your reports?
-
- Crawl the entire site and make sure no noindex tags remain
- Keep resubmitting the sitemap(s)
- If you have an HTML sitemap, you can do "Fetch as Google" then "Index this page and internal links" which will help recrawl the pages
Don't think there is much else you can do but send Google signals to recrawl the site and see that the noindex tag was removed.
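For the first point above, the noindex check itself is easy to script once your crawler hands you each page's HTML. A minimal sketch using only Python's standard library (note this won't catch a noindex sent via the X-Robots-Tag HTTP header, so check response headers as well):

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags <meta name="robots"|"googlebot"> tags whose content includes 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        content = (a.get("content") or "").lower()
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex = True

def has_noindex(html):
    """True if the page's HTML carries a robots noindex meta tag."""
    detector = NoindexDetector()
    detector.feed(html)
    return detector.noindex
```

Run it over every crawled page and alert on any `True`; it's a cheap regression check to keep in place after the cleanup.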
Related Questions
-
No Longer Indexed in Google (Says Redirected)
Just recently my page, http:/www./waikoloavacationrentals.com/mauna-lani-terrace, was no longer indexed by Google. The sub-pages from it still are. I have not done anything sketchy with the page. When I went into the Google fetch it says that it is redirected. Any ideas what this is all about? Here is what it says for the fetch: Http/1.1 301 moved permanently
Technical SEO | | RobDalton
Server: nginx
Date: Tue, 07 Mar 2017 00:43:26 GMT
Content-Type: text/html
Content-Length: 178
Connection: keep-alive
Keep-Alive: timeout=20
Location: http://waikoloavacationrentals.com/mauna-lani-terrace
<title>301 moved permanently</title>
<center>301 moved permanently</center>
<center>nginx</center>
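One quick way to make sense of fetch output like the above is to pull out the status code and the Location header, which together tell you where the page is being sent. A small sketch (illustrative only; `curl -I` against the URL gives you the same information):

```python
def parse_redirect(raw_response):
    """Parse the status code and Location header out of a raw HTTP response."""
    lines = raw_response.strip().splitlines()
    status = int(lines[0].split()[1])  # e.g. "HTTP/1.1 301 ..." -> 301
    headers = {}
    for line in lines[1:]:
        if not line.strip():
            break  # blank line ends the header block
        key, _, value = line.partition(":")
        headers[key.strip().lower()] = value.strip()
    return status, headers.get("location")
```

If the Location target differs from the URL you expect Google to index (www vs non-www, for instance), that mismatch is the first thing to fix.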
Why is seomoz.org still in the Google index?
I searched in Google for the number of URLs still indexed on the seomoz.org domain since it changed to moz.com. I am surprised that after all this time more than 15,000 URLs are still indexed: https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site%3Aseomoz.org%20inurl%3Aseomoz.org If I click on any of the results, it redirects (301) to the new domain, so the redirects are working, but Google still keeps these URLs in the index.
Technical SEO | | Yosef
What could be the reason? Will this not cause a duplicate content issue on moz.com?
How To Clean Up the Google Index After a Website Has Been HACKED
We have a client whose website was hacked, and some troll created thousands of viagra pages, which were all indexed by Google. See the screenshot for an example. The site has been cleaned up completely, but I wanted to know if anyone can weigh in on how we can clean up the Google index. Are there extra steps we should take? So far we have gone into Webmaster Tools and submitted a new sitemap.
Technical SEO | | yoursearchteam
Does Google Parse the Anchor Text While Indexing?
Hey moz fans, I'm here to ask a somewhat technical and open-ended question.
Technical SEO | | atakala
In the Google's paper http://infolab.stanford.edu/~backrub/google.html
They say they parse the page into hits, which are basically word occurrences.
But I want to know whether they do the same thing when building the anchor text database.
I mean do they parse the anchor text or keep it as it is .
For example, let's say my anchor text is "real car games".
When they index my link with its anchor text, do they parse the anchor text into hits, like
"real" distinct hits
"car" distinct hits
"games" distinct hits,
OR do they just use it as it is, as "real car games"?
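As an illustration of what "parsing into hits" would look like (a toy sketch of the concept described in the paper, not Google's actual implementation), each word in the anchor becomes a separate occurrence with a position:

```python
def anchor_hits(anchor_text):
    """Split an anchor into lowercase word 'hits', keeping each word's position."""
    words = anchor_text.lower().split()
    return [(word, position) for position, word in enumerate(words)]
```

So under this toy model, "real car games" yields three distinct hits rather than one opaque phrase, which is what lets each word be matched independently.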
Should I put meta descriptions on pages that are not indexed?
I have multiple pages that I do not want to be indexed (and they are currently not indexed, so that's great). They don't have meta descriptions on them, and I'm wondering if it's worth my time to go in and insert them, since they should hypothetically never be shown. Does anyone have any experience with this? Thanks! The reason this is a question is that one member of our team was linking to this page through Facebook to send people to it and noticed random text on the page being pulled in as the description.
Technical SEO | | Viewpoints
Quickest way to remove content from Google index?
We had some content on our own website indexed by Google, and the content was changed later. But the old content is still showing up in Google results, of course, because it was indexed. It's very important for us that this content should not show up in Google. So how do we remove that content quickly from the Google index? I know that normally, when Google crawls the page again, it will show the new content. The Google URL removal tool, Google URL fetch, or anything else?
Technical SEO | | Personnel_Concept
Sitemap coming up in Google's index?
I apologize if this question's answer is glaringly obvious, but I was using Google to view all the pages it has indexed of our site, by searching for our company and then clicking the link that says to display more results for the site. On page three, it has the sitemap indexed as if it were just another page of our site. <cite>www.stadriemblems.com/sitemap.xml</cite> Is this supposed to happen?
Technical SEO | | UnderRugSwept
Domain restructure, sitemaps and indexing
I've got a handcoded site with around 1500 unique articles and a handcoded sitemap. Very old school. The url structure is a bit of a mess, so to make things easier for a developer who'll be making the site database-driven, I thought I'd recategorise the content. Same content, but with a new url structure (I thought I'd juice up the urls for SEO purposes while I was at it). To this end, I took categories like:
Technical SEO | | magdaknight
/body/amazing-big-shoes/
/style/red-boots/
/technology/cyber-boots/
And rehoused all the content like so, doing it all manually with ftp:
/boots/amazing-boots/
/boots/red-boots/
/boots/cyber-boots/
I placed 301 redirects in the .htaccess file like so:
redirect 301 /body/amazing-boots/ http://www.site.co.uk/boots/amazing-boots/
(not doing redirects for each article, just for categories, which seemed to make the articles redirect nicely.) Then I went into sitemap.xml and manually overwrote all the entries to reflect the new url structure, keeping the old dates of the original entries, like so:
<url><loc>http://www.site.co.uk/boots/amazing-boots/index.php</loc>
<lastmod>2008-07-08</lastmod>
<changefreq>monthly</changefreq>
<priority>0.5</priority></url>
And resubmitted the sitemap to Google Webmasters. This was done 4 days ago. Webmaster Tools said that the 1400 of 1500 articles indexed had dropped to 860, and today it's climbed to 939. Did I adopt the correct procedure? Am I going about things the right way? Given a little time, can I expect Google to re-index the new pages nicely? I appreciate I've made a lot of changes in one fell swoop, which could be a bit of a no-no...? PS Apologies if this question appears twice on Q&A - hopefully I haven't double-posted.
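As an aside, for a migration like the one described above, the per-category rules can be generated instead of hand-typed, which avoids typos across 1500 articles' worth of categories. A minimal sketch using the poster's example paths (the function name and mapping dict are made up for illustration):

```python
# Old category path -> new category path (the poster's examples).
CATEGORY_MAP = {
    "/body/amazing-big-shoes/": "/boots/amazing-boots/",
    "/style/red-boots/": "/boots/red-boots/",
    "/technology/cyber-boots/": "/boots/cyber-boots/",
}

def htaccess_rules(mapping, domain="http://www.site.co.uk"):
    """Emit one 'redirect 301' .htaccess line per old category."""
    return ["redirect 301 {} {}{}".format(old, domain, new)
            for old, new in mapping.items()]
```

Print the result of `htaccess_rules(CATEGORY_MAP)` and paste it into .htaccess; the same mapping can then drive the sitemap rewrite so the two never drift apart.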