Duplicate content question
-
Hey Mozzers!
I received a duplicate content notice from my Cycle7 Communications campaign today. I understand the concept of duplicate content, but none of the suggested fixes quite seems to fit.
I have four pages with HubSpot forms embedded in them. (Only two of these pages have shown up so far in my campaign.) Each page contains a title (Content Marketing Consultation, Copywriting Consultation, etc.), plus an embedded HubSpot form. The forms are all outwardly identical, but I use a separate form for each service that I offer.
I’m not sure how to respond to this crawl issue:
- Using a 301 redirect doesn't seem right, because each page/form combo is independent and serves a separate purpose.
- Using a rel=canonical link doesn't seem right, for the same reason.
- Using the Google Search Console URL Parameters tool is clearly contraindicated by Google's documentation (I don't have enough pages on my site).
Is a meta robots noindex the best way to deal with duplicate content in this case?
Thanks in advance for your help.
AK
-
@seoelevated Thanks. I see your reasoning. It's also valid.
-
@andykubrin I would like to add that another valid approach is to ignore the "issue". Are all 4 of your form pages currently indexed? If so, then this Moz-reported issue is not necessarily an actual issue. There is no "penalty" for duplicate content like this. The situation we all want to avoid is the search engine seeing the pages as duplicates and choosing only one of them to index, which may not be the one we want, or failing to associate each page with the keywords we've individually targeted. But if all 4 of your pages are currently indexed, and if they rank for the terms that you want, then it would be OK to ignore the issue.
As well, you might think about whether you want these pages to be indexed and ranking at all. If your desire is for traffic to go to the service description pages and then flow to the forms, and if the service description pages are the ones actually ranking, the issue may not even matter to you. So again, you might decide to ignore it, and that would be a valid choice.
-
Hi @nigel_carr ,
You've pointed the way to a solution.
I offer four services. I have a service description page for each one, with around 300 words on each page and a link to a separate HubSpot form. I use four separate HubSpot forms because that arrangement allows me to tailor the automated responses I send to each person who requests more information on a given service.
So I think the two best options for a solution are:
- Keep the service descriptions and forms separate, but noindex each individual form
- Incorporate the forms into the service description pages
The first option provides a neater appearance, but the second would shorten the visitor's path to the form, so that's probably the better choice.
Thank you for your advice.
Regards,
Andy
-
Hi Andy
The reason these pages are triggering a duplicate content warning is likely that there is very little content on them other than the forms.
You have a couple of options:
- Flesh out the content on the pages so that the subject is clearly defined in each case. This would involve writing 300+ words and adding an image with alt text, an H1, a meta description, etc., for each subject:
  - Content Marketing Consultation
  - Copywriting Consultation
  - The other two services
- If you can't write unique content for each form because the subjects are too close (Copywriting vs. Content Marketing), then it raises the question of why you have the 4 forms in the first place.
If you want to keep the 4 forms, then you can canonicalize 3 of them to the main one, so only 1 is set to rank.
Note: If you do canonicalize, there is no guarantee that Google won't still index or feature one or all of the other forms. You are simply telling Google your preference.
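For reference, canonicalization is just a single link element in the head of each secondary page, pointing at the page you want to rank (the URL below is a placeholder, not Andy's actual form URL):

```html
<!-- Placed in the <head> of each of the three secondary form pages -->
<link rel="canonical" href="https://www.example.com/consultation-form" />
```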
- You could noindex three of the four forms, which would be a perfectly acceptable solution, even though most scanning software will warn you about the noindexed pages.
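A minimal sketch of that noindex directive, assuming you can edit the head of each form page (the "follow" value keeps link equity flowing through the page even though it's excluded from the index):

```html
<!-- Placed in the <head> of each form page you want kept out of the index -->
<meta name="robots" content="noindex, follow">
```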
I hope that helps
Nigel