How detrimental is duplicate page content?
-
We have a local site with multiple advanced search parameters based on the facilities available at a particular place. For instance, we list a set of fun places to take kids to in a city, and we have a page for this. We now have the ability to select the fun places that have parking available, or that are "outdoor". We use URL parameters to handle these additional search criteria. Would search engines treat these parameterized pages as duplicates, and if so, how detrimental would that be?
-
As others have answered, if the pages with parameters are just the result of a filter and don't add anything relevant (i.e. they substantially duplicate the non-parameterized URLs), or add nothing at all, then the best approach is to give those URLs a noindex meta robots tag.
This will ensure that those pages, if they have been crawled, disappear from the index.
But this is just a general rule, because there can be many variations on it (and we don't know how your site has actually been developed).
For instance, if those pages cannot physically be crawled because the filters sit behind a JavaScript selector (something you can verify by disabling JavaScript in the browser), then you should not run into problems, and the "noindex" meta robots tag is just a precaution rather than a fix for something that has already happened.
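For reference, the noindex meta robots tag is a single line in the page's head; a minimal sketch on a hypothetical filtered URL might look like this:

```html
<!-- In the <head> of a filtered page, e.g. /fun-places?parking=true (hypothetical URL) -->
<!-- noindex: keep this parameterized page out of the search index -->
<meta name="robots" content="noindex">
```

If editing the HTML of these pages is impractical, the same directive can also be sent by the server as an `X-Robots-Tag: noindex` HTTP response header.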
-
If you noindex a page, any link pointing to it wastes its link equity.
If you must noindex, use noindex,follow so the link equity can flow back out.
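In markup, that noindex,follow combination is just a different value in the same meta tag:

```html
<!-- Keep the page out of the index, but let crawlers follow its links -->
<!-- so link equity can still flow out to the pages it links to -->
<meta name="robots" content="noindex, follow">
```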
If your site is mainly duplicates, then you have a problem; but if it is just a few pages, don't worry.
Google will give credit to one page and disregard the others.
-
I guess it depends on how much duplication there is. If the pages contain completely duplicate content with no unique content at all, then the best move would be to noindex or nofollow them. Otherwise rel=canonical is probably fine.
-
Does rel="canonical" only indicate to Google the preferred page, or does it also indicate that the content on the current page is duplicate in nature? Would it be better to actually remove these pages from the index by adding a "noindex" tag to them?
-
Duplicate content is detrimental but the issue is relatively easy to solve. Just ensure you add rel="canonical" tags to the duplicate pages to allow Google to identify and rank the preferred page.
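As a concrete sketch (the URLs here are hypothetical examples), the canonical tag goes in the head of each duplicate or parameterized page and points at the preferred version:

```html
<!-- In the <head> of each parameterized page, e.g. /fun-places?parking=true -->
<!-- Tells Google which URL is the preferred version to index and rank -->
<link rel="canonical" href="https://www.example.com/fun-places-for-kids/">
```

Unlike noindex, rel="canonical" is a hint rather than a directive: it consolidates signals onto the preferred URL, but Google may ignore it if the pages differ substantially.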
Related Questions
-
How to handle no ad pages or no search result pages for a classifieds website?
We have a classified website. We have started doing SEO for classifieds search pages, so I have submitted some pages to Google using sitemap.xml, e.g. www.domain.com/search/austin. If there aren't any Ads in the Austin location, then Google treats the page as a soft 404 error in GWT. I am submitting them to Google even though there are no Ads yet, because at some point users may add Ads, and by that time my URL needs to be in Google. My question is: how should I handle a page which doesn't have any Ads? Please let me know and guide me if I am wrong.
Local Listings | SirishaNueve
-
What's the best way to identify duplicate listings?
I'm doing manual duplicate research for an account and wanted to know if anyone has a resource to share on how to find duplicate listings for GMB and other citations. I've been working off of this article from Joy Hawkins (https://searchengineland.com/definitive-guide-duplicate-research-local-seo-238719), but she mentions using Map Maker to search a phone number, and Google has since shut Map Maker down. Maps doesn't seem to work the same way: I've searched a phone number which I know has duplicate listings and they don't come up. Any tips on a better tool or process?
Local Listings | formandfunctionagency
-
Help - my boss wants me to duplicate websites for local SEO targeting
My boss is insisting that I duplicate a site that is ranking well and then roll it out across the UK on new domain names beginning with targeted city names. I will then go through each duplicate site changing the location keywords, images, etc. to the target city. What effect will this have? Do you have any advice on the best way to tackle this? Thanks
Local Listings | platinumhouse
-
Multiple Local Domains and Location Pages Question
Hello Everyone, So we have a priority site (domain.com) but also a geo-specific site for another location we have (domainNYC.com). Assuming both have completely unique content and different contact information, and it's justifiable to have a second domain (i.e. resources, brand/link equity, etc.), would it be recommended to also use the sub-folder approach on our primary (meaning domain.com/nyc)? And then potentially link to domainNYC.com (just once, not overdoing it)? Or just play it safe and keep them separate? Our concern is that doing both a sub-folder and a separate domain might cannibalize local searches, resulting in us essentially competing with ourselves for those terms. The benefit would be leveraging the priority domain and driving visitors there. We could always 'noindex, follow' the sub-folder page so users have access to the address on the primary domain too, but wanted to see if anyone had any thoughts or suggestions, as well as how it could pertain to linking (scarcely). We have found a lot of information on choosing one over the other but not as much on whether both is recommended, so any extra insight would be very appreciated. Looking forward to hearing from all of you! Thank you in advance for the help! Best,
Local Listings | Ben-R
-
I have 2 locations and 6+ Google Business pages... How can I combine the duplicates without losing maps rankings?
I have 2 locations and 6+ different Google Business pages due to a company merger and automatic page creation. Some of the GMB pages even rank in maps above the ones we use for certain terms, and most bring traffic to my site, but I know the dupes are hurting our maps rankings. Is there a way I can consolidate these pages by combining them? Or am I better off just biting the bullet and deleting the pages I don't want to use?
Local Listings | formandfunctionagency
-
How best to delete a duplicate Google My Business listing
Hi Mozzers, I am in need of help please. What is the best way to remove a duplicate GMB listing? Is best practice to a) mark it as permanently closed and wait for Google to update, b) call GMB in India and ask them to delete the redundant listing (do they even do this for you?), or c) delete the page from within my dashboard settings? I am worried my NAP data is being diluted by inaccurate historical listings (most of which I have been able to claim ownership of). Any help appreciated. Ben
Local Listings | Bendall
-
Can we place a FB business page link instead of direct domain link in a Google My Business listing?
A client of ours asked if we could place a link to their local Facebook page instead of a link to the direct domain in their Google My Business listing. Will Google allow this?
Local Listings | RosemaryB
-
G+ Local Business Page vs. Brand Page Problems
I'm struggling a bit with a Brand page vs. Local page on G+ and wondering if anyone here has had this same problem and found a solution... This is related to a business that does have a physical address for a head admin office, but it provides a financial service to people across Canada over the phone. So although the business has an address and local phone number for admin purposes, it doesn't want people showing up at that address and definitely doesn't want to be considered a "Local" business. However, Google automatically creates the local listing in Google Maps, which the business has claimed but otherwise does not want to maintain. Instead the business has a Brand page on G+ (not local) which it has linked to the domain and actively maintains as its G+ business page. The trouble is, Google is showing the local listing as the rich snippet in their organic result instead of the Brand page. Is there anything the company can do to further help Google associate the Brand G+ page with the website instead of the local listing? I already tried removing the link to the website from the local listing in hopes that would dis-associate it from the domain. That got rid of the rich snippet, but now the local listing shows up as a separate organic result just below the main company website, which is just as bad or maybe worse. To confirm, the website IS linked to the BRAND page using rel=publisher, and the brand page does have a verified link to the company domain. Thanks for the help!
Local Listings | PlusROI