What strategies can you use when you're optimizing for 10 locations x 20+ services?
-
We have a client site (a dental practice) with 10 locations and 20+ services (braces, teeth whitening, cosmetic dentistry, etc.). We're trying to figure out the best approach to cover all of their locations and services, but each option we've considered has drawbacks:
- Optimize each service page for the service name + each location name (or at least the biggest location names), with the service name and location names in the title tag. That makes the title tag too long and risks confusing users: someone searching for "braces richmond" sees a title listing other cities, some of which are in a different state.
- Optimize each service page for the service name + each location name, but leave the locations out of the page title. This is what we're doing now, but not having the location name in the page title appears to be hurting rankings at least a bit.
- Create a page for each service + location combo. That would mean 200+ pages, which would sit deeper in the site with less link juice.
- Create a new domain for each location/state covered. But then we'd have to start building link juice from scratch.
How have other sites dealt with this? What has worked best and what hasn't?
-
Hi Adam,
My short and sweet answer to this scenario is:
A page for every city and a page for every service
So, you'd have a total of 30 pages to budget and plan for (one for each of the 10 cities and one for each of the 20 services).
Most small local businesses are not going to have the funding to develop 200 exceptional pages. What I've seen when small businesses try to build a page for every possible service/city combo is that they end up with a collection of so-so pages at best and, at worst, thin or duplicate pages.
So, for a client like a dental practice, I believe a sterling-quality page for every city and every service is an achievable goal if the work is structured over a reasonable contract time frame.
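To put the page budget in plain numbers, here's a quick sketch (the city and service names are placeholders, not real slugs):

```python
# Placeholder names; substitute the practice's real cities and services.
cities = [f"city-{i}" for i in range(1, 11)]       # 10 locations
services = [f"service-{i}" for i in range(1, 21)]  # 20 services

one_page_each = len(cities) + len(services)  # 10 + 20 = 30 pages
every_combo = len(cities) * len(services)    # 10 x 20 = 200 pages

print(one_page_each, "pages with a page per city plus a page per service")
print(every_combo, "pages if you build every city/service combination")
```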
I definitely do not recommend developing a separate website for each city. Build one powerhouse site and keep improving it for the life of the business. Hope this helps!
-
Since nobody has responded, I'll share what we're currently doing with two locations and multiple services. It's number 3 on your list. The caveat is that we're still implementing this, so the final results aren't in. Here's what we're doing:
- Make sure you have a Google+ business page for each physical location, so that Google knows you're "local" and you can (hopefully!) pop up in their location snippet.
- On the contact us page or locations page (whichever you have), list each location with its physical/mailing address, phone number, and a "Directions" link that navigates to the "city-office" page (or however you want to name it: atlanta-office, for example).
- On each city-office page we have a nice write-up about the city and the office. We also include a Google map of the location, the full address, phone numbers, email, and the Google+ profile link for that specific location. Now here is the magic: below that we have a list with a heading of "Local [City] Services", listing each service with a link to a page optimized for that city and service (there's a rough sketch of this link structure after this list). For your client the heading might be "Local Atlanta Dental Services", for example. You want each service listed to have the appropriate keywords/phrases in the anchor text.
- Create each service page per location and optimize it like a pro. WARNING: this method runs the risk of duplicate content once you have multiple cities with similar pages. It is therefore imperative that each page contains unique content. The "Atlanta Teeth Whitening" page, although identical in purpose to the "L.A. Teeth Whitening" page, must have content unique to its city. This is where the opportunity presents itself to create 10x content for each city (https://mza.seotoolninja.com/blog/why-good-unique-content-needs-to-die-whiteboard-friday).
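To make that link structure concrete, here's a rough sketch of how the per-city service list could be generated. The slugs, URL pattern, and anchor format are assumptions for illustration; adapt them to however the site's URLs are actually structured:

```python
# Hypothetical city/service slugs, purely for illustration.
cities = ["atlanta", "richmond"]
services = ["braces", "teeth-whitening", "cosmetic-dentistry"]

def service_links_for_city(city: str) -> list[tuple[str, str]]:
    """Return (anchor text, URL) pairs for one city-office page."""
    links = []
    for service in services:
        # Keyword-rich anchor text, e.g. "Atlanta Teeth Whitening"
        anchor = f"{city.title()} {service.replace('-', ' ').title()}"
        url = f"/{city}/{service}/"  # assumed URL pattern
        links.append((anchor, url))
    return links

# Print the link list for the Atlanta office page
for anchor, url in service_links_for_city("atlanta"):
    print(f'<a href="{url}">{anchor}</a>')
```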
I suggest you start with one major city, measure results, make any necessary adjustments, and then move on to the next city. The key here is that the content is unique for each service in each city. Sure, the pages can follow the same format, but make sure you put in the time to make each service page genuinely specific to its city. It may seem like a bit of a gray line we're walking but, in my opinion, it's logical for expansion. Again, the big risk is duplicate content, but that can be avoided if done correctly.
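If you want a crude safety net for that uniqueness, a quick similarity check like the sketch below can flag two city pages that have drifted too close together before they go live. The filenames and the 0.8 threshold are assumptions, not a standard; pick a cutoff that makes sense for you:

```python
import difflib

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0..1 similarity ratio between two page texts."""
    return difflib.SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical draft files for two city versions of the same service page
atlanta_page = open("atlanta-teeth-whitening.txt").read()
la_page = open("la-teeth-whitening.txt").read()

score = similarity(atlanta_page, la_page)
if score > 0.8:  # assumed threshold; tune to taste
    print(f"Warning: pages are {score:.0%} similar - rewrite before publishing")
```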
Hopefully this helps! I would love to see others chime in on this and give feedback as I'm sure we're not the only ones in the world with this problem.
Cheers!