Duplicate Content & Tags
-
I've recently added tags to my blog posts so that related blog posts are suggested to visitors.
My understanding was that my robots.txt was handling duplicate content, so I thought it wouldn't be an issue, but after Moz crawled my site this week it reported 56 instances of duplicate content in my blog.
I'm using Shopify, so I can't edit the robots.txt file, but is my understanding correct that the default rules below mean URLs with two or more tags will be ignored? I've searched the Shopify documentation and forums and can't find a straight answer. My understanding of SEO is fairly limited.
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
-
If the only option is to disallow via the robots.txt, then I would agree with your setup - disallow the slugs specific to the tags you don't want indexed. I've heard Shopify is a little rough to work with sometimes because of its limitations, so whatever you can do is better than nothing. Remember that robots exclusion is treated as a suggestion, not a command, so if it's possible to assign a noindex meta tag to those URL types, that would be the best case.
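As a side note, those Disallow lines rely on literal prefix matching, which is worth sanity-checking before depending on them. A minimal sketch using Python's standard-library robots.txt parser (which implements plain prefix matching; Googlebot additionally understands * wildcards, which this parser does not, and the example URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

# The three Disallow rules from the thread, as a minimal robots.txt.
rules = """\
User-agent: *
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

base = "http://www.tangled-yarn.co.uk"

# A path that literally begins with /blogs/+ is blocked:
print(rp.can_fetch("*", base + "/blogs/+sock-knitting"))                  # False
# But tagged URLs under a blog handle do NOT start with that prefix,
# so plain prefix matching leaves them crawlable - even with two tags:
print(rp.can_fetch("*", base + "/blogs/news/tagged/sock-knitting"))       # True
print(rp.can_fetch("*", base + "/blogs/news/tagged/sock-knitting+lace"))  # True
```

In other words, under the basic robots exclusion rules those lines only block paths that literally begin with /blogs/+, which is another reason to prefer the noindex meta tag.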
Looks like you're on the right track with the post below:
{% if handle contains "tagged" %}
<meta name="robots" content="noindex">
{% endif %}
The one suggestion I would make is to use "noindex, follow" so the content will still be crawled but the duplicate tag pages won't get indexed. That keeps multiple paths to the content on your site without creating an index bloat issue from the tag URLs.
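For reference, the complete version of the snippet being discussed would place a robots meta tag in the theme's head section whenever the page handle contains "tagged" (the handle check comes from this thread, not from official Shopify documentation):

```liquid
{% if handle contains "tagged" %}
  <!-- Tag-filtered blog pages: keep them crawlable but out of the index -->
  <meta name="robots" content="noindex, follow">
{% endif %}
```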
-
Yoast is a WordPress plugin, not a Shopify one, so that option isn't available with the current CMS. Just wanted to chime in to make sure others aren't looking for Yoast SEO in the Shopify app store.
-
I'm using Meta Tagger as the SEO plugin. I've not heard of Yoast SEO but will certainly check it out.
I understand that I need to stop the tag pages from being indexed, and I think I might have worked it out, but I'm not 100% sure; as I mentioned, my understanding is fairly limited.
The URL being flagged as duplicate content looks like this:
http://www.tangled-yarn.co.uk/blogs/news/tagged/sock-knitting
If I exclude the handle 'tagged' from being indexed, this should work. I think the code should be:
{% if handle contains "tagged" %}
<meta name="robots" content="noindex">
{% endif %}
Do you think this will work?
-
Do you use Yoast SEO or another plugin? The key is to set tags to noindex so that the crawler only goes through your category links. The issue is that your tag URLs are being indexed, and you don't want that. The option is under the XML sitemap settings.