Is Noindex Enough To Solve My Duplicate Content Issue?
-
Hello SEO Gurus!
I have a client who runs 7 web properties. 6 of them are satellite websites, and the 7th is his company's main website. For a long while, my company has, among other things, blogged on a hosted blog at www.hismainwebsite.com/blog, and when we were optimizing for one of the satellite websites, we would simply link to it in the article.
Now, however, the client has gone ahead and set up separate blogs on every one of the satellite websites as well, and he has a nifty plug-in on the main website's blog that pipes the articles we write out to the corresponding satellite blogs.
My concern is duplicate content.
In a sense, this is like autoblogging -- the only thing that keeps it from being heinous is that the client is autoblogging himself. He thinks it will be a great feature for giving visitors to his satellite websites some fresh content to read -- and I agree, as I think the combination of publishing and e-commerce is a thing of the future -- but I really want to avoid the duplicate content issue and a possible SEO/SERP hit.
I am thinking that noindexing each of the satellite websites' blog pages might suffice. But I'd like to hear from all of you whether you think even this may not be a foolproof solution.
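Just so we're all picturing the same thing: what I have in mind is a page-level robots meta tag in the head of each satellite blog post, roughly like the sketch below (the markup and title are purely illustrative, not pulled from the client's actual templates):

```html
<!-- Hypothetical sketch: a page-level noindex on a satellite blog post.
     "noindex, follow" asks search engines not to index the page while
     still following the links it contains. -->
<head>
  <meta name="robots" content="noindex, follow">
  <title>Reposted article on a satellite blog</title>
</head>
```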
Thanks in advance!
Kind Regards,
Mike
-
Definitely deal with the security issues! Good find there...
Regarding the client who wants to republish the same article on multiple sites, I think that noindexing it on all but the original site is perfectly fine.
Or, alternatively, place a canonical tag on the duplicate pages to let Google know where the true source lies.
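A minimal sketch of what that would look like in the head of each satellite copy, assuming the original lives on the main blog (the domain is the www.hismainwebsite.com/blog example from the question, and the post slug is made up):

```html
<!-- Hypothetical sketch: on the satellite copy of the article, the canonical
     points back at the original post on the main blog. The slug is made up. -->
<head>
  <link rel="canonical" href="http://www.hismainwebsite.com/blog/example-post/">
</head>
```

Google does support cross-domain canonicals, so this should consolidate ranking signals on the original rather than hiding the copies entirely.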
-
Good thread, and I agree with everything Brian has already said. One additional option that hasn't been mentioned is Repost.us. If your client's blogs are on WordPress, there is a nifty Repost.us plugin that is very easy to install. He could then use it to repost the content on his other blogs without creating duplicate content issues or problems for his SEO. It would get the content where he wants it, preserve authorship, and give a link back to his main site. He would also have the opportunity to monetize his posts if that were something he wanted to do. Hope this is helpful!
Dana
-
Wow, that's new! Yes, I wouldn't be surprised if the plug-in is at fault.
Well, as usual, issues compound into new issues.
My many thanks for your help and insight, Brian.
Kind Regards,
Mike
-
Wasn't able to visit the site, got this warning, attached.
Kinda poignant that the warning on the Fiji site references the Pacific site, which is exactly the kind of thing we're talking about.
Wonder if the very plugin your client is using is causing this issue too.
-
Sure, here's an example: the main website is beautifulpacific.com, with the blog located at beautifulpacific.com/blog. One of the satellite sites is beautifulfiji.com, with its blog at beautifulfiji.com/blog.
-
_To me, the best-case scenario would be to use these blogs to pump out fresh, authoritative content for each satellite site blog -- a more intensive undertaking, to be sure, but a best practice -- and include an RSS feed._
Agreed. Also, there's no reason he can't write a post for one audience that references a post he made on another domain. It's hard to get a good feel for the whole situation without viewing the sites and blogs themselves.
-
Many thanks for your reply, Brian.
The satellite websites are not where conversions/sales take place; they feed his main site. I agree that providing a feed via the blog's RSS would make more sense. And when you say, "but if the point of the content is to be consumed, enjoyed, attract social shares and links, build traffic and then convert, then there's really little if any gain to be had in [noindexing]," I wholeheartedly agree. Even if it were to solve the duplicate content issue, it would preclude us from putting fresh content up on that blog and leveraging it accordingly.
I can tell you that there is nothing nefarious in the client's idea here: his intentions are purely to give users fresh content to explore on the satellite sites. But as he relies on me to guide him in terms of SEO implications, I don't think he thought through how duplicate content could hurt him.
To me, the best-case scenario would be to use these blogs to pump out fresh, authoritative content for each satellite site blog -- a more intensive undertaking, to be sure, but a best practice -- and include an RSS feed.
-
Have you suggested he use an iframe to host the content from one site into the satellites?
Or maybe simply a feed to show the fresh content to visitors?
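To sketch the two options (purely illustrative -- I'm borrowing the beautifulpacific.com domain mentioned elsewhere in this thread, making up a post slug, and assuming the default WordPress feed path):

```html
<!-- Option 1 (hypothetical): embed the original post from the main blog in an
     iframe, so the satellite page shows the content without duplicating it in
     its own HTML. The post slug is made up. -->
<iframe src="http://beautifulpacific.com/blog/example-post/"
        width="100%" height="800"
        title="From the Beautiful Pacific blog"></iframe>

<!-- Option 2 (hypothetical): advertise the main blog's feed on the satellite
     site instead, assuming the standard WordPress /feed/ path. -->
<link rel="alternate" type="application/rss+xml"
      title="Beautiful Pacific blog"
      href="http://beautifulpacific.com/blog/feed/">
```

Either way, visitors to the satellite sites see the fresh content, while the indexable copy stays on one domain -- content loaded in an iframe generally isn't treated as part of the embedding page's own HTML.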
Does he convert on those satellite sites, or are they microsites meant to drive traffic to the main one? The thing is, this is definitely going to be duplicate content, and since the host is presumably the same... well... not good.
I would ask: "why?" Is he expecting to get links to this content on one site one day, and to the same content on another site the next? If it's a good post, what happens when someone shares it socially from one domain, and the people exposed to it then see it elsewhere?
I think noindexing is a good half measure, but if the point of the content is to be consumed, enjoyed, attract social shares and links, build traffic and then convert, then there's really little if any gain to be had in even doing that. A noindexed blog post getting links? A noindexed blog category getting social buzz?
Force your client to understand the end goal. If he just wants something for visitors to read, add a feed. Then the social shares and links will do some good for at least the most important domain.