How to Handle Franchise Duplicate Content
-
My agency handles digital marketing for about 80 Window World stores, each with its own separate site. For the most part, the content across all of these sites is exactly the same, though over the past year we have slowly but surely been getting new, unique content up on some of the top pages, including resource pages and specific product pages. I'm trying to figure out the best temporary solution while we work through this process. Previously, we tried to keep the pages we knew were duplicates out of the index, but some have still managed to slip through the cracks during redesigns.
- Would canonicals be the route to go? (Keep in mind that there isn't necessarily one "original version," so there's no clear answer as to which page/site all the duplicated pages should point to.)
- Should we just continue to use robots.txt/noindex for all duplicate pages for now?
- Any other recommendations?
Thanks in advance!
-
It sounds like you are already doing about as well as you can. Since there's no clear canonical page, noindexing the duplicate pages is probably the way to go. Don't panic if you see some duplicate pages still sneak into the index after you've noindexed them; this is common, and it's unlikely that Google will see it as a Panda-worthy problem on your part.
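For what it's worth, one practical worry at 80-site scale is verifying that the noindex actually made it onto every duplicate page, and that those pages aren't also blocked in robots.txt (Google can't see a meta noindex on a page it isn't allowed to crawl). Here's a minimal sketch of a check you could run against fetched page HTML; it uses only Python's standard library, and the sample markup is made up:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            self.robots_directives.append(attrs.get("content", "").lower())

def has_noindex(html):
    """Return True if the page carries a meta robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.robots_directives)

# Example: a duplicate product page that should stay out of the index
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True
```

You could loop a list of known-duplicate URLs through this on a schedule and flag any page where the tag has gone missing after a redesign.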
The one drawback to noindexing the pages is that once unique content is up and they're ready to be indexed, it may take a while for Google to get the message that a page is supposed to be indexed now. I've seen it take anywhere from an hour to a week for a page to appear in the index. One thing you can do in the meantime is make sure each site is accruing some good links. Not an easy task with 80 websites, I know, but the higher authority will help once the unique content is ready to go. Sounds like a herculean task - good luck!
-
Solid insight, but unfortunately we have the 80 websites because each store's owner manages their site separately. Some stores offer different products or services than others and are completely separate entities. Each store owner we work with is an individual client; we do not work with corporate. Plus, since we don't do marketing for ALL the stores in the franchise, just a large chunk of them, one big site just wouldn't work. It's also not realistic to have all of these store owners write their own content for an entire site.
We really appreciate your thoughts on this and totally agree with your logic, but unfortunately we would not be able to implement either solution. Right now, we just need some kind of band-aid solution to use while we work through rewriting the most important pages on each site (probably either de-indexing them or some kind of canonical strategy).
Thanks!
-
Hey There!
Important question ... why does the company have 80 websites? Are they being individually managed by the owner of each store, or are they all in the control of the central company?
If the latter, what you are describing is a strong illustration of the typical advice that it is generally better to build one powerhouse website for your brand than a large number of thin, weak, duplicative sites.
If this company were my client, I would be earnestly urging them to consolidate everything into a single site. If they are currently investing in maintaining 80 websites, there's reason to hope they've got the funding to develop a strong, unique landing page for each of the 80 locations on their main corporate website and redirect the old sites to the central one. Check out how REI.com surfaces unique pages for all of their locations. It's inspiring how they've made each page unique. If your client could take a similar approach, they'd be on a better road for the future.
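If consolidation were ever on the table, the mechanical part is a one-to-one redirect map from each old location site to its landing page on the central domain, so no old URL is left returning a 404. Here's a rough sketch of that mapping; all the domain names and slugs are illustrative, not real Window World URLs:

```python
# Hypothetical mapping of old standalone location sites to landing-page
# slugs on a consolidated corporate domain (all names are made up).
LOCATION_SLUGS = {
    "windowworld-dallas.example": "dallas-tx",
    "windowworld-tulsa.example": "tulsa-ok",
}

def redirect_target(old_host, main_domain="windowworld.example"):
    """Return the 301 target for an old location site, or None if the
    site isn't part of the migration plan.

    Every page on the old site folds into that location's landing page
    on the main domain.
    """
    slug = LOCATION_SLUGS.get(old_host)
    if slug is None:
        return None
    return f"https://{main_domain}/locations/{slug}/"

print(redirect_target("windowworld-dallas.example"))
# https://windowworld.example/locations/dallas-tx/
```

A table like this doubles as the checklist for updating citations, since every old domain and its replacement landing page sit side by side.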
You would, of course, need to update all citations to point to the landing pages once you had developed them.
If, however, the 80 websites are controlled by 80 different franchise location managers, what needs to be developed here is a policy that prevents those managers from copying the corporation's content. If they each want to run a separate website, they need to take on the responsibility of creating their own content. And, of course, the corporate website needs to make sure it has no internal duplicate content and is not taking content from its franchise managers, either. 80 separate websites should = 80 totally separate efforts. That's a lot to have going on, which points back to consolidation as the preferred method wherever possible.
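Policing a "write your own content" policy across 80 sites by hand doesn't scale, so a rough automated duplicate check helps. This is just a sketch of one common approach (word-shingle Jaccard similarity), not anything Google publishes or endorses, and the sample sentences are invented:

```python
def shingles(text, k=3):
    """Break text into overlapping k-word shingles (order-sensitive chunks)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(text_a, text_b, k=3):
    """Jaccard similarity of two pages' word shingles, from 0.0 to 1.0."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Two near-duplicate product blurbs differing by a single word
copy = "Window World offers premium replacement windows for your home"
near_dupe = "Window World offers premium replacement windows for your house"
print(round(similarity(copy, near_dupe), 2))  # 0.75
```

Scores near 1.0 between two franchise pages would flag a manager who lifted corporate copy; you'd tune the threshold on a sample before acting on it.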
Hope this helps!