Noindex Success?
-
Has anyone had success implementing noindex,follow on pages of a site that has been hit by a Panda penalty? Our site has a lot of duplicate content in its product descriptions, which we had permission to use from our distributor (who also sells online). We went ahead and applied noindex,follow to those pages in the hope that Google will focus on the products we carry that do have original descriptions (about a third of our catalog).
We didn't want to simply remove those products, since they are genuinely useful to our customers. Most of the duplicated content is in the form of ingredients lists.
-
I would just use a cross-domain canonical, and look into the other issues with your website that may have triggered Panda.
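For reference, the two options being discussed here are both head-section tags; a minimal sketch, with illustrative URLs (the distributor domain and product path are placeholders):

```html
<!-- Option described in the question: keep the page live but ask Google not to
     index it, while still following its links. -->
<meta name="robots" content="noindex, follow" />

<!-- Option suggested in the answer: a cross-domain canonical on the duplicated
     product page, pointing at the distributor's original description. -->
<link rel="canonical" href="https://distributor.example.com/products/widget-123" />
```

Note that the two shouldn't normally be combined on the same page: a canonical asks Google to consolidate signals into the target URL, while noindex asks it to drop the page entirely, and the signals conflict.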
Related Questions
-
Unsolved Using NoIndex Tag instead of 410 Gone Code on Discontinued products?
Hello everyone, I am very new to SEO and I wanted to get some input and second opinions on a workaround I am planning to implement on our Shopify store. Any suggestions, thoughts, or insight you have are welcome and appreciated!
For those who aren't aware, Shopify as a platform doesn't allow us to send a 410 Gone code under any circumstance. When you delete or archive a product/page, it becomes unavailable on the storefront. Unfortunately, the only thing Shopify natively allows me to do is set up a 301 redirect. So when we are forced to discontinue a product, customers currently get a 404 error when trying to go to that old URL.
My planned workaround is to automatically detect when a product has been discontinued and add the noindex meta tag to the product page. The product page will stay up but be unavailable for purchase. I am also adjusting the LD+JSON to list the product's availability as Discontinued instead of InStock/OutOfStock.
Technical SEO | BakeryTech
Then I let the page sit for a few months so that crawlers have a chance to recrawl and remove the page from their indexes (I think that is how that works?).
Once 3 or 6 months have passed, I plan on archiving the product and setting up a 301 redirect pointing to our internal search results page. The redirect will send the user to a search with a query aimed at similar products. That should prevent people with open tabs, bookmarks, and direct links to that page from receiving a 404 error. I do have Google Search Console set up and integrated with our site, but manually telling Google to remove a page obviously only affects their index. Will this work the way I think it will?
Will search engines remove the page from their indexes if I add the noindex meta tag after it has already been indexed?
Is there a better way I should implement this? P.S. For those wondering why I am not disallowing the page URL in robots.txt: Shopify won't allow me to call collection or product data from within the template that assembles robots.txt, so I can't automatically add product URLs to the list.
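A sketch of the detection-plus-tag step in Shopify Liquid, assuming a hypothetical `discontinued` metafield as the flag; the metafield name and the JSON-LD fields are illustrative, not the asker's actual setup:

```liquid
{%- comment -%} In the product template's <head> {%- endcomment -%}
{%- if product.metafields.custom.discontinued -%}
  <meta name="robots" content="noindex" />
{%- endif -%}

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": {{ product.title | json }},
  "offers": {
    "@type": "Offer",
    "price": {{ product.price | divided_by: 100.0 | json }},
    "priceCurrency": "USD",
    "availability": {% if product.metafields.custom.discontinued %}"https://schema.org/Discontinued"{% else %}"https://schema.org/InStock"{% endif %}
  }
}
</script>
```

`https://schema.org/Discontinued` is a valid ItemAvailability value, so the structured-data part of the plan is sound.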
Why are these blackhat sites so successful?
Here's an interesting conundrum. Here are three sites with their respective rankings for "dental implants [city]":
http://dentalimplantsvaughan.ca - 9 (on google.ca)
http://dentalimplantsinhonoluluhi.com - 2 (on google.com)
http://dentalimplantssurreybc.ca - 7 (on google.ca)
These markets are not particularly competitive; however, all of these sites suffer from: duplicate content, both internally and across sites (all of this company's implant sites have the exact same content, minus the bio pages and the local modifier); an average speed score; no structured data; and no links. And these sites are ranking relatively quickly: the Vaughan site went live 3 months ago. What's boggling my mind is that they rank on the first page at all. It seems they're doing the exact opposite of what you're supposed to do, yet they rank relatively well.
Technical SEO | nowmedia1
Does adding a noindex tag reduce duplicate content?
I've been working under the assumption for some time that if I have two (or more) pages which are very similar, I can add a noindex tag to the pages I don't need and that will reduce duplicate content. As far as I know, this removes the tagged pages from Google's index and stops any potential issues with duplicate content. It's the second part of that assumption that I'm now questioning. Despite pages having the noindex tag, they continue to appear in Google Search Console as duplicate content, soft 404s, etc. That is, new pages are appearing regularly that I know to have the noindex tag. My thoughts on this so far are that Google can still crawl these pages (although it won't index them), so it shows them in GSC due to a crude issue-flagging process. I mainly want to know: a) Is the actual Google algorithm sophisticated enough to ignore these pages even though GSC doesn't? b) How do I explain this to a client?
Technical SEO | ChrisJFoster
404 not found page appears as 200 success in Google Fetch. What to do to correct?
We have received messages in Google Webmaster Tools that there is an increase in soft 404 errors. When we check the URLs, they redirect to the 404 Not Found page:
Technical SEO | Madlena
For example, http://www.geographics.com/images/01904_S.jpg
redirects to http://www.geographics.com/404.shtml.
When we used Fetch as Google, here is what we got:
#1 Server Response: http://www.geographics.com/404.shtml
HTTP/1.1 200 OK Date: Thu, 26 Sep 2013 14:26:59 GMT
What is wrong and what shall we do? The soft 404 errors are mainly for images that no longer exist on the server. Thanks!
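The usual cause of a soft 404 like this is the server redirecting missing URLs to an ordinary page that returns 200, instead of serving an error document on the original URL. On Apache, the conventional fix is an ErrorDocument directive with a local path, which serves the custom page while preserving the 404 status:

```apache
# .htaccess - serve /404.shtml as the body of a genuine 404 response.
# Do NOT write "ErrorDocument 404 http://www.geographics.com/404.shtml":
# a full URL makes Apache issue a redirect, and the error page then
# returns 200 - exactly the soft-404 pattern described above.
ErrorDocument 404 /404.shtml
```

After the change, fetching a missing image URL should return HTTP 404 directly, with the custom page as the response body.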
Medium-sized forum with 1000s of thin-content gallery pages. Disallow or noindex?
I have a forum at http://www.onedirection.net/forums/ which contains a gallery with 1000s of very thin-content pages. We currently have these photo pages disallowed for the main Googlebot via robots.txt, but we do allow the Google Images crawler access. Now I've been reading that we shouldn't really use disallow, and should instead add a noindex tag on the page itself. It's a little awkward to edit the source of the gallery pages (and to keep any amends the next time the forum software gets updated). What's the best way of handling this? Chris.
Technical SEO | PixelKicks
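One detail worth flagging whichever route is taken: a page must be crawlable for a noindex tag to be seen, so the robots.txt disallow has to be lifted when the tag goes in. A sketch of the switch, with illustrative paths; the X-Robots-Tag header variant avoids editing the gallery templates at all:

```apache
# 1. Remove the existing rule from robots.txt, e.g.:
#      Disallow: /forums/gallery/
#    (Googlebot cannot see an on-page noindex on a URL it is forbidden to fetch.)
#
# 2. Either add to the gallery page template's <head>:
#      <meta name="robots" content="noindex, follow">
#    or, without touching templates, send the equivalent HTTP header from the
#    Apache server config (requires mod_headers):
<LocationMatch "^/forums/gallery/">
  Header set X-Robots-Tag "noindex, follow"
</LocationMatch>
```

The header approach also survives forum-software updates, which addresses the "keeping any amends" concern in the question.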
Ok to internally link to pages with NOINDEX?
I manage a directory site with hundreds of thousands of indexed pages. I want to remove a significant number of these pages from the index using noindex, and I have two questions about this:
1. Is noindex the most effective way to remove large numbers of pages from Google's index?
2. The IA of our site means that we will have thousands of internal links pointing to these noindexed pages if we make this change. Is it a problem to link to pages with a noindex directive on them?
Thanks in advance for all responses.
Technical SEO | OMGPyrmont
Will Google Continue to Index the Page with NoIndex Tag Upon Google +1 Button Impression or Click?
The FAQs for the Google +1 button suggest as follows: "+1 is a public action, so you should add the button only to public, crawlable pages on your site. Once you add the button, Google may crawl or recrawl the page, and store the page title and other content, in response to a +1 button impression or click." If my page has a noindex tag while also carrying a Google +1 button, will Google recognise the noindex tag (and not index the page) despite the +1 button's impressions or clicks sending signals to Google's spiders?
Technical SEO | globalsources.com
After entire site is noindex'd, how long to recover?
A programmer 'accidentally' put <meta name="robots" content="noindex" /> into every single page of one of my sites (articles, landing pages, home page, etc.). This happened on Monday, and we just noticed today. Ugh... We've fixed the issue; how long will it take to get reindexed? Will we instantly retain the same positions for our keywords? Any tips?
Technical SEO | EricPacifico
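After fixing an accidental site-wide noindex, it's worth spot-checking key templates before waiting on a recrawl. A minimal stdlib-only checker (the URLs in the usage comment are placeholders, not real pages):

```python
import re

# Match a robots meta tag whose content contains "noindex", in either
# attribute order and regardless of case or quote style.
_NAME_FIRST = re.compile(
    r'<meta\s+[^>]*name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
    re.IGNORECASE,
)
_CONTENT_FIRST = re.compile(
    r'<meta\s+[^>]*content=["\'][^"\']*noindex[^"\']*["\'][^>]*name=["\']robots["\']',
    re.IGNORECASE,
)

def has_noindex(html: str) -> bool:
    """Return True if the page source carries a robots noindex meta tag."""
    return bool(_NAME_FIRST.search(html) or _CONTENT_FIRST.search(html))

# Example spot-check (URLs are illustrative):
# import urllib.request
# for url in ["https://example.com/", "https://example.com/articles/"]:
#     page = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
#     print(url, "STILL NOINDEXED" if has_noindex(page) else "ok")
```

This only catches the meta-tag form; a noindex can also arrive as an X-Robots-Tag HTTP header, which would need a separate header check.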