Affiliate URL & duplicate content
-
Hi
I have checked past Q&As and couldn't find anything on this, so I thought I would ask.
I have recently noticed my URLs adding the following to the end: mydomain.com/?fullweb=1. I can't seem to locate where these URLs are coming from or how they are being created.
This is causing duplicate content on Google. I wanted to know if anyone has had any previous experience with something like this?
If anyone has any information on this, it would be a great help.
Thanks
E
-
I'm seeing a lot of that in the SERPs with no particular pattern, even on the BBC's site. Are you running WordPress? Could it be a plugin you've added?
-
Often, pages on a website can be reached by more than one URL. As an example, the logo in a site's top-left corner usually links to the homepage, but the link might initially be something like websitename.com/ref=logo or something to that effect. This is the same sort of issue as websitename.com/ and websitename.com/home being the same page. What's recommended in these cases is to create a 301 redirect from every duplicate URL to the canonical one. That is, you would redirect websitename.com/home to websitename.com/, which tells search engines like Google that the former should be replaced, both in search results and in access, by the latter.
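The redirect mapping described above can be sketched in a few lines. This is a minimal illustration using only Python's standard library; the domain and paths are the placeholder examples from this answer, and the `resolve` helper is hypothetical, standing in for whatever your server or framework uses to issue 301s:

```python
from urllib.parse import urlsplit, urlunsplit

# Every known duplicate path maps to its canonical path.
REDIRECTS = {
    "/home": "/",
    "/index.html": "/",
}

def resolve(url: str):
    """Return (status, url): 301 plus the canonical URL for a known
    duplicate path, or 200 plus the URL unchanged."""
    parts = urlsplit(url)
    target = REDIRECTS.get(parts.path)
    if target is None:
        return 200, url
    # Drop the query and fragment so the canonical form is clean.
    canonical = urlunsplit((parts.scheme, parts.netloc, target, "", ""))
    return 301, canonical
```

In practice you would express the same mapping in your server config (e.g. Apache `Redirect 301` or nginx `return 301`) rather than application code, but the idea is identical: one canonical URL per page, everything else permanently redirected to it.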
You may find other duplicates by searching **site:websitename.com** in Google (all one word, no spaces). This shows you every indexed page on a site and can help you keep track of which pages you need to take care of.
Related Questions
-
We have 2 versions of URLs: mobile and desktop. Is that duplicate content?
Hi, our website has two versions of URLs. Desktop: www.myexample.com, and mobile: www.myexample.com/m. If you go to our site from a mobile device you will land on the mobile URL; if you go from a desktop computer you will land on the regular URL. Both URLs have the same content. Is that considered duplicate? If yes, what can I do to fix it? Also, both URLs are indexed by Google, and we have two separate XML sitemaps, one for desktop and one for mobile. Is that good SEO practice?
Technical SEO | Armen-SEO -
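For a separate-mobile-URL setup like the one in this question, the pattern Google documents is bidirectional annotations: desktop pages declare `rel="alternate"` pointing at the /m version, and /m pages declare `rel="canonical"` pointing back at the desktop version. A hedged sketch, using the example domain from the question; the `link_tags` helper is hypothetical:

```python
def link_tags(path: str) -> str:
    """Build the <link> annotation a page at `path` should carry,
    assuming mobile pages live under /m and mirror desktop paths."""
    desktop = "https://www.myexample.com"
    if path == "/m" or path.startswith("/m/"):
        # Mobile page: canonical points back at its desktop equivalent.
        suffix = path[2:] or "/"
        return f'<link rel="canonical" href="{desktop}{suffix}">'
    # Desktop page: alternate points at its mobile equivalent.
    return (f'<link rel="alternate" media="only screen and (max-width: 640px)" '
            f'href="{desktop}/m{path}">')
```

With these annotations in place, the two versions are treated as one page rather than duplicates, and only the desktop URL needs to be in the main sitemap.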
Value in Consolidating Similar Sites / Duplicate Content for Different URLs
We have 5 ecommerce sites: one company site with all products, and then four product-specific sites with relevant URL titles and products divided up between them (www.companysite.com, www.product1.com, www.product2.com, etc). We're thinking of consolidating the smaller sites into our most successful site (www.product1.com) in order to save management time and money, even though I hate to lose the product-specific URLs in search results. Is this a wise move? If we proceed, all of the products will be available on both our company site and our most successful site (www.company.com & www.product1.com). This would unfortunately give us two sites of duplicate content, since the products will have the same pictures, descriptions, etc. The only difference would be the URL. Would we face penalties from Google, even though it would make sense to continue to carry our products on our company site?
Technical SEO | versare -
Duplicate Content Issues
We have a "?src=" tag in some URLs which is treated as duplicate content in the crawl diagnostics errors. For example, xyz.com?src=abc and xyz.com?src=def are considered duplicate content URLs. My objective is to make my campaign free of these crawl errors. First of all, I would like to know why these URLs are considered to have duplicate content, and what's the best solution to get rid of this?
Technical SEO | RodrigoVaca -
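The URLs in this question differ only in a tracking parameter, so one way to derive the canonical form is simply to strip that parameter: xyz.com?src=abc and xyz.com?src=def both collapse to xyz.com, which is what a `rel="canonical"` tag or a parameter rule in Search Console would declare. A standard-library sketch; the parameter name `src` is taken from the question, and the set of tracking parameters is an assumption you would adjust for your own site:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that change tracking, not content.
TRACKING_PARAMS = {"src"}

def canonicalize(url: str) -> str:
    """Return the URL with tracking-only query parameters removed,
    preserving any parameters that do affect the page content."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

Placing the canonicalized URL in each page's `<link rel="canonical">` tells crawlers that all the `?src=` variants are the same document.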
Duplicate Page Content
Hi, I just had my site crawled by the SEOmoz robot and it came back with some errors. Basically it seems the categories and dates are not crawling correctly. I'm an SEO newbie here. Below is a capture of the video of what I am talking about. Any ideas on how to fix this?
Technical SEO | mcardenal -
Localized domains and duplicate content
Hey guys, in my company we are launching a new website, and there's an issue that's been bothering me for a while. I'm sure you guys can help me out. I already have a website, let's say ABC.com. I'm preparing a localized version of that website for the UK, so we'll launch ABC.co.uk. Basically the websites are going to be exactly the same, with the exception of the homepage; they have a slightly different proposition. Using GeoIP, I will redirect UK traffic to ABC.co.uk, and the rest of the traffic will still visit the .com website. Might Google penalize this? The sites will be almost identical apart from the homepage. This may count as duplicate content, even though I'm geo-targeting different regions so they will never overlap. Thanks in advance for your advice.
Technical SEO | fabrizzio -
Duplicate Content
The crawl shows a lot of duplicate content on my site. Most of the URLs it's showing are categories and tags (WordPress). So what does this mean exactly? Are categories too much like other categories? And how do I go about fixing this the best way? Thanks
Technical SEO | vansy -
Duplicate Content For Trailing Slashes?
I have several websites in campaigns, and I consistently get flagged for duplicate content and duplicate page titles from the domain and domain/ versions of the sites, even though they are properly redirected. How can I fix this?
Technical SEO | RyanKelly -
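The usual fix for the domain vs. domain/ flag in this question is to pick one form as canonical and 301 everything else to it; the same rule generalizes to trailing slashes on any path. A sketch of that normalization rule, assuming the no-trailing-slash form is the canonical one (the choice itself is arbitrary; consistency is what matters):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url: str):
    """Return (301, fixed_url) if the path carries a redundant trailing
    slash, else (200, url). The bare root path "/" is left alone, since
    domain and domain/ are the same URL by spec."""
    parts = urlsplit(url)
    if len(parts.path) > 1 and parts.path.endswith("/"):
        fixed = urlunsplit(parts._replace(path=parts.path.rstrip("/")))
        return 301, fixed
    return 200, url
```

If the crawler still flags domain vs. domain/ after the redirects are in place, a `rel="canonical"` tag on the page usually resolves the remaining ambiguity.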
Thin/Duplicate Content
Hi guys, so here's the deal: my team and I just acquired a new site built with some questionable tactics. Only about 5% of the entire site is actually written by humans; the rest of the 40k+ pages (and it is increasing by 1-2k auto-generated pages a day) are all auto-generated, thin content. I'm trying to convince the powers that be that we cannot continue to do this. Now, I'm aware of the issue, but my question is: what is the best way to deal with it? Should I noindex these pages at the directory level? Should I 301 them to the most relevant section where actual valuable content exists? So far it doesn't seem like Google has caught on to this yet, and I want to fix the issue without raising any more red flags in the process. Thanks!
Technical SEO | DPASeo
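The directory-level noindex option raised in this question is typically implemented as an `X-Robots-Tag` response header applied by path prefix, which covers whole directories without editing each page. A minimal sketch of that per-path decision; the directory names here are invented placeholders, not taken from the question:

```python
# Paths under these prefixes are assumed to hold the auto-generated pages.
NOINDEX_PREFIXES = ("/auto/", "/generated/")

def robots_header(path: str):
    """Return the X-Robots-Tag value for a request path, or None to
    leave the page indexable. "follow" keeps link equity flowing even
    while the page itself is dropped from the index."""
    if path.startswith(NOINDEX_PREFIXES):
        return "noindex, follow"
    return None
```

In production this would be a server-config rule (e.g. an nginx `location` block adding the header) rather than application code, but the shape is the same: one prefix list, one header.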