What strategies can you use when you're optimizing for 10 locations x 20+ services?
-
We have a client site (a dentist) that has 10 locations and 20+ services (braces, teeth whitening, cosmetic dentistry, and so on). We're trying to figure out the best approach to cover all their locations and services, but each option we've considered has drawbacks:
- Optimize each service page for the service name plus each location name (or at least the biggest locations), with the service name and location names in the title tag. That produces an overlong title tag, plus possible user confusion: someone searching for "braces richmond" sees a title listing other cities, some of which are in a different state.
- Optimize service pages for service name + each location name, but leave the locations out of the page title. This is the option currently in use, but not having the location name in the page title appears to be hurting rankings at least a bit.
- Create a page for each service + location combo. That means 200+ pages, which will sit deeper in the site, with less link juice.
- Create a new domain for each location/state covered. But then we have to start over building link juice.
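For what it's worth, the title-length tension in the first two options disappears under the combo-page option, since each page targets exactly one service and one city. A rough sketch of how those titles could be templated (the brand name and the ~60-character budget are hypothetical, not the client's actual values):

```python
def title_tag(service, city, brand, max_len=60):
    """Build a 'Service in City | Brand' title for a combo page,
    dropping the brand if the result would exceed the roughly
    60-character budget that search results typically display."""
    title = f"{service} in {city} | {brand}"
    if len(title) > max_len:
        title = f"{service} in {city}"
    return title

# Each combo page gets one tight, unambiguous title:
print(title_tag("Braces", "Richmond", "Smile Dental Group"))
# → Braces in Richmond | Smile Dental Group
print(title_tag("Teeth Whitening", "Atlanta", "Smile Dental Group"))
# → Teeth Whitening in Atlanta | Smile Dental Group
```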
How have other sites dealt with this? What has worked best and what hasn't worked?
-
Hi Adam,
My short and sweet answer to this scenario is:
A page for every city and a page for every service
So, you'd have a total of 30 pages to budget and plan for (one for each of the 10 cities and one for each of the 20 services).
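Mapped onto a flat URL structure, that 30-page model might look something like this (the domain and slugs are hypothetical):

```
example.com/services/braces
example.com/services/teeth-whitening
example.com/services/cosmetic-dentistry
example.com/locations/richmond
example.com/locations/atlanta
```

Keeping every one of those pages a click or two from the homepage via the main navigation also keeps them from sitting deep in the site.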
Most small local businesses are not going to have the funding to develop 200+ exceptional pages. What I've seen when small businesses try to go the route of developing a page for every possible service/city combo is that they end up with a collection of so-so pages at best, and thin or duplicate pages at worst.
So, for a client like a dental practice, I believe a sterling-quality page for every city and for every service is an achievable goal if the work is structured over a reasonable contract time frame.
I definitely do not recommend developing a different website for each city. Build a powerhouse and keep working on improving it for the life of the business. Hope this helps!
-
Since nobody has responded, I'll share what we are currently doing with only two locations and multiple services. It's option 3 on your list. The caveat here is that we're still implementing this, so the final results are not in. Here is what we're doing:
- Make sure you have a Google+ business page for each physical location, so that Google knows you're "local" and you can pop up in their local results snippet (hopefully!).
- On the contact-us or locations page (whichever you have), we list each location with the physical/mailing address, phone number, and a "Directions" link that navigates to that location's "city-office" page (or however you want to name it; atlanta-office, for example).
- On the city-office page we have a nice write-up about the city and the office. We also include a Google Map of the location, the full address, phone numbers, email, and the Google+ profile link for that specific location. Now here is the magic: below that we add a list headed "Local [city] Services," where each entry links to a page optimized for that city and service. For your client the heading might be "Local Atlanta Dental Services," for example. You want each service listed to have the appropriate keywords/phrases in the anchor text.
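To reinforce the location signals on a city-office page like that, structured data can spell out the address, phone, and profile links for search engines. A hypothetical sketch using schema.org's Dentist type; every name, address, URL, and coordinate below is a placeholder, not the poster's actual markup:

```html
<!-- Hypothetical structured data for an atlanta-office page. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Dentist",
  "name": "Example Dental - Atlanta Office",
  "url": "https://www.example.com/atlanta-office",
  "telephone": "+1-404-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Peachtree St NE",
    "addressLocality": "Atlanta",
    "addressRegion": "GA",
    "postalCode": "30303"
  },
  "geo": { "@type": "GeoCoordinates", "latitude": 33.7490, "longitude": -84.3880 },
  "sameAs": ["https://plus.google.com/+ExampleDentalAtlanta"]
}
</script>
```

One block like this per location page, each with that office's own address and phone, mirrors the one-Google+-page-per-location setup described above.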
- Create each service page per location and optimize it like a pro. WARNING: this method runs the risk of duplicate content once you have multiple cities with similar pages. It is therefore imperative that each page contains unique content. The "Atlanta Teeth Whitening" page, although identical in purpose to the "L.A. Teeth Whitening" page, must have content unique to its city. This is where the opportunity presents itself to create 10x content for each city (https://mza.seotoolninja.com/blog/why-good-unique-content-needs-to-die-whiteboard-friday).
I suggest you start with one major city at a time, measure results, make any necessary adjustments, and move on to the next city. The key is that the content is unique for each service in each city. Sure, the pages can follow the same format; however, put in the time to make each service page genuinely specific to its city. It may seem like a bit of a gray line we're walking, but in my opinion it's a logical path to expansion. Again, the big risk is duplicate content, but that can be avoided if done correctly.
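One cheap way to police that duplicate-content risk as the city pages multiply is to compare the body copy of each pair of pages before publishing. A minimal sketch (the slugs and copy are made up, and this is no substitute for actually writing unique content):

```python
import re
from itertools import combinations

def shingles(text, n=3):
    """Lowercase word n-grams; a rough fingerprint of the page copy."""
    words = re.findall(r"[a-z']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard overlap of n-gram sets: 0.0 (distinct) to 1.0 (identical)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def flag_near_duplicates(pages, threshold=0.5):
    """Return (slug_a, slug_b, score) for page pairs that overlap too much."""
    flagged = []
    for (slug_a, text_a), (slug_b, text_b) in combinations(pages.items(), 2):
        score = similarity(text_a, text_b)
        if score >= threshold:
            flagged.append((slug_a, slug_b, round(score, 2)))
    return flagged

pages = {
    "atlanta-teeth-whitening": "Our Atlanta office offers professional teeth "
                               "whitening near Midtown and Buckhead",
    "la-teeth-whitening": "Our Atlanta office offers professional teeth "
                          "whitening near Midtown and Buckhead",  # copied verbatim!
}
print(flag_near_duplicates(pages))
# → [('atlanta-teeth-whitening', 'la-teeth-whitening', 1.0)]
```

Any pair scoring above the threshold is a signal that the second city's page needs a real rewrite, not a find-and-replace of the city name.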
Hopefully this helps! I would love to see others chime in on this and give feedback as I'm sure we're not the only ones in the world with this problem.
Cheers!