On-page optimisation for multiple sites
-
I’ve been given the task of optimizing a company’s 15 websites, all of which sell the same product. In terms of optimizing them, can I use the same set of meta descriptions, page title tags and keywords for them all, or do I need to produce a different set for each?
The sites are for independently branded companies that are set up in a franchise-like arrangement. They all exclusively sell the parent company's joinery products.
-
No matter the model, the verdict is the same - each website needs fairly unique content.
P.S. Look at any well-established franchise - they have one website where the products are shown, and you can choose your closest store/office and go get the product there. Then the locations themselves have their own little subpages, which are unique to them - maybe their story, their staff, etc.
-
Thanks for getting back to me. Sorry, I should have explained: the sites are for independently branded companies that are set up in a franchise-like arrangement. They all exclusively sell the parent company's joinery products.
-
The first question you need to ask is why your company has 15 websites selling the same things. Would having one site rank, with the others canonicalized to it (or eliminated), be an acceptable solution? If the sites have different audiences, you can try gearing the descriptions, keywords, etc. to the particular audience of the site you are working on, but doing that 15 times in ways unique enough for all of them to rank well on the same search isn't possible. If you ask about the business reason for the 15 sites, you may be able to work with the company on a realistic path to their desired outcome.
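If consolidation does turn out to be the answer, the canonical approach is just a link element in the head of each duplicate page, pointing at the equivalent page on the one site chosen to rank. A minimal sketch - the domain and path here are placeholders, not from the original question:

```html
<!-- Hypothetical example: placed in the <head> of a product page on a
     secondary franchise site, pointing at the matching page on the site
     chosen to rank. Domain and path are illustrative placeholders. -->
<link rel="canonical" href="https://www.main-joinery-site.example/products/oak-doors" />
```

Note that a cross-domain canonical is a hint to search engines, not a directive, so the business conversation about why 15 sites exist still has to happen first.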
-
Hi there.
Only one word comes to mind as I read your question - "duplication".
- No, you shouldn't use the same meta descriptions and title tags - it's going to create duplication issues;
- Yes, you can use the same keyword research, but you'll need to write unique content for each website;
- Why would you have 15 websites which are basically the same thing? Are they all already ranking for the same product? Do they all have great traffic? If so, why fix something that ain't broken? If not, why spread all of your efforts across 15 different websites when you can concentrate on one and make it the best thing in the world?
Cheers!
Related Questions
-
301 Redirects for Multiple Language Sites in htaccess File
Hi everyone, I have a site on a subdomain that has multiple languages set up at the domain level: https://mysite.site.com, https://mysite.site.fr, https://mysite.site.es, https://mysite.site.de, etc. We are migrating to a new subdomain, and I am trying to create 301 redirects in the htaccess file, but I am a bit lost on how to do this, as it seems you have to go from a relative URL to an absolute one - which would be fine if I were only doing this for the English site, but I'm not. It doesn't seem like I can go from an absolute URL to an absolute URL - but I could be wrong. I am new to editing the htaccess file, so I could definitely use some advice here. Thanks.
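For what it's worth, RewriteRule matches only the path (a relative pattern), but its target can be an absolute URL, and you can branch on the requested hostname with a RewriteCond. A hedged sketch, assuming Apache with mod_rewrite enabled and that the language domains are served from the same document root (the new hostnames are placeholders):

```apache
# Hypothetical .htaccess on the old subdomain; hostnames are placeholders.
RewriteEngine On

# English site: redirect every path to the same path on the new .com subdomain
RewriteCond %{HTTP_HOST} ^mysite\.site\.com$ [NC]
RewriteRule ^(.*)$ https://newsite.site.com/$1 [R=301,L]

# French site
RewriteCond %{HTTP_HOST} ^mysite\.site\.fr$ [NC]
RewriteRule ^(.*)$ https://newsite.site.fr/$1 [R=301,L]

# ...repeat one RewriteCond/RewriteRule pair per language domain (.es, .de, etc.)
```

If each language domain actually has its own separate htaccess file, the RewriteCond lines aren't needed - a single RewriteRule per file pointing at that language's new absolute URL would do.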
Intermediate & Advanced SEO | amberprata
-
Redirecting Ecommerce Site
Hi, I'm working on a big site migration and I'm setting up redirects for all the old categories to point to the new ones. I'm doing this based on relevancy; the categories don't match up exactly, but I've tried to redirect each to the most relevant alternative. Would this be the right approach?
Intermediate & Advanced SEO | BeckyKey
-
Pull multiple link data for multiple pages at once?
Hi guys, I was wondering if there is a tool or way to pull link data for a list of URLs/pages at once into one single file with Ahrefs or Majestic. I know Scrapebox can do this with OSE, but I'm looking for a way to do it with the other backlink databases. Any ideas? Cheers.
Intermediate & Advanced SEO | jayoliverwright
-
Only the mobile version of the site is being indexed
We've got an interesting situation going on at the moment where a recently onboarded client's site is being indexed and displayed, but it's the mobile version of the site that is showing in SERPs. A quick rundown of the situation:
- Retail shopping center with approximately 200 URLs
- Mobile version of the site is www.mydomain.com/m/
- XML sitemap submitted to Google with 202 URLs, 3 URLs indexed
- Doing site:www.mydomain.com in a Google search brings up the home page (desktop version), and then everything else is /m/ versions
- There is no rel="canonical" on mobile site pages pointing to their desktop counterparts (working on fixing that)
- We have limited CMS access, but developers are open to working with us on whatever is needed
- Within the desktop site source code, there are no "noindex, nofollow", etc. issues on the pages
- No manual actions, link issues, etc.
Has anyone ever encountered this before? Any input or thoughts are appreciated. Thanks
Intermediate & Advanced SEO | GregWalt
-
How To Handle Duplicate Content Regarding A Corp With Multiple Sites and Locations?
I have a client that has 800 locations; 50 of them are mine. The corporation has a standard website for each of its locations, and the only thing different is the location info on each page - the majority of the content is the same on every location's website. What can be done to minimize the impact/penalty of having "duplicate or near-duplicate" content on their sites, assuming corporate won't allow the pages to be altered?
Intermediate & Advanced SEO | JChronicle
-
So What On My Site Is Breaking The Google Guidelines?
I have a site that I'm trying to rank for the keyword "Jigsaw Puzzles". I was originally ranked around #60 or so, and then all of a sudden my site stopped ranking for that keyword (my other keyword rankings stayed). I contacted Google via a reconsideration request and got the generic response... So I went through and deleted as many links as I could find that I thought Google may not have liked... heck, I even removed links that I don't think I should have JUST so I could have this fixed. I responded with a list of all the links I removed and also any links that I've tried to remove but couldn't, for whatever reasons. They are STILL saying my website is breaking the Google guidelines, mainly around links. Can anyone take a peek at my site and see if there's anything on it that may be breaking the guidelines? (Because I can't.) Website in question: http://www.yourjigsawpuzzles.co.uk UPDATE: Just to let everyone know that after multiple reconsideration requests, this penalty has been removed. They stated it was a manual penalty. I tried removing numerous different types of links, but they kept saying no, it's still breaking the rules. It wasn't until I removed some website directory links that they lifted this manual penalty. Thought it would be interesting for some of you guys.
Intermediate & Advanced SEO | RichardTaylor
-
Not ranking well after site was hacked
My site was hacked, and that seemed to have a pretty big effect on search rankings. I'm pretty sure I've gotten the hack completely removed (as of 10/16). During the hack, and even now, none of my posts created after Aug 1 rank in regular Google searches at all, even if I search for their titles in quotes. But I can tell they are indexed, because I see them when I do a site: search in Google, and I see that they are cached. Posts from before Aug 1 rank well in Google searches. The issue with the August-and-after posts no longer appearing started in mid-September; prior to that, some of the August posts were actually performing very well. The September date corresponds to when we first started working on removing the hack. It looks like the August-and-after posts all have a PageRank of 0, whereas the mozRank is much higher. I've requested reconsideration from Google and was told that no manual action was taken against my site. I know that's a lot of background, but I'm wondering how long I need to wait before the August-and-after posts start ranking. Is there anything I can do in the meantime to address this PageRank issue?
Intermediate & Advanced SEO | Chris-at-Magoosh