Subdomain question for law firm in Indiana, Michigan, and New Mexico.
-
Hi Gang,
Our law firm has offices in the states of Indiana, Michigan, and New Mexico. Each state is governed by unique laws, and each state has its own "flavor," etc.
We are currently set up with the main site as:
http://www.2keller.com (Indiana)
Subdomains as:
http://michigan.2keller.com (Michigan)
http://newmexico.2keller.com (New Mexico)
My client questions this strategy from time to time, and I want to see if anyone can offer some reassurance, or raise considerations I haven't thought of.
Our reason for setting up the sites in this manner is to ensure that each site speaks to state-specific practice areas (for instance, New Mexico does nursing home abuse, whereas the other states don't) and state-specific ethics law (for instance, in some states you can advertise your dollar-amount recoveries, and in others you can't). There are so many differences between the states that the content would seem to warrant it.
Local citations and listings are another reason these sites are set up in such a fashion. The firm is a member of several local state directories and memberships, and by having these links go directly to the subdomain they reference, I can see this being another advantage.
Also, inside each state there are separate pages set up for specific cities. We geo-target major cities in each state, and trying to do all of this under one domain for 3 different states would seemingly get very confusing, very quickly.
I had thought of setting up the various state pages through folders on the main domain, but again, there is too much state-specific info for this to seem like a logical approach. Granted, the linking and content creation would be easier for one site, but I don't think we can accomplish this cleanly with the offices being in such different locales.
I guess I'm wondering if there are some things I'm overlooking here?
Thanks guys/gals!
-
Crazy, I have quite a bit of experience with this exact scenario: law firms using geo subdomains to target specific areas.
Here are my findings and suggestions, based on actual results and experience:
- SEO on domain.com benefits atlanta.domain.com. This is a fact. If Starbucks decided to create subdomains tomorrow for every location, those subdomains would benefit from its 91 DA. That's how FindLaw, Lawyers.com, and all those guys get first-page placement with high DA and low PA.
- Digital Diameter is right: subdomains are more effective and folders are more efficient. UNLESS you have a really good multi-site CMS. Then you can be equally efficient and more effective.
I hope this answers your question. If you want some help or have any other questions, PM me.
-
Much appreciated... Can you see the reply I sent to Mike above and offer your thoughts?
-
Thanks, Mike. I agree with your reply, but I suppose my main concern is more about whether our site becomes too convoluted as we begin geo-targeting states and the major cities within them. It would seem to be an organizational nightmare, making sure that users get the experience they expect when visiting the site. Users in New Mexico don't care about Indiana law or copy, and vice versa. There are so many topics related to specific states, and so much content, that I worry about it becoming haphazard when restricted to one domain. Thoughts?
-
Subdomains (more effective):
In short, the benefit is that Google will see each subdomain as a locally focused, independent site.
However, this is also the disadvantage of subdomains.
While they are more likely to be seen as locally focused, each subdomain will have to be managed and provided with its own unique content and links, so the approach can quickly become much more effort.
Folders (more efficient):
Folders offer much more synergy because they are seen as a single site, but they are also seen as less locally focused / independently targeted than subdomains.
-
Randal,
I think in this instance, first and foremost, let's talk about URL structure. From an organic search perspective, structuring URLs in this way (http://michigan.2keller.com) will hinder any positive SEO you do on your main URL. Google would view your current URL structure as individual domains, so none of the SEO strategy done on 2keller.com will transfer to the other domains.
How the URL is structured should not have any effect on how you add the content. We deal with national clients with multiple locations all the time. How you want to structure this is http://www.2keller.com/michigan or http://www.2keller.com/newmexico. This would allow your team to do search marketing work only once and would add efficiencies to your workflow.
I know your main concern is the amount of state-specific content. You can still create the pages exactly the same way as before from a content perspective. Just have a solid internal linking structure on 2keller.com guiding people to the proper relevant pages, or you could use geo-targeting, allowing the site to recognize IP addresses and auto-redirect people to the right area. Hope this helps. Let us know if you have any questions.
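The geo-targeting idea above can be sketched minimally. This is a hypothetical illustration, not the firm's actual setup: it assumes the visitor's state has already been resolved from their IP address by a separate geolocation service, and that the folder paths match the /michigan and /newmexico structure proposed in the reply.

```python
# Hedged sketch of folder-based geo-routing for a multi-state site.
# Assumption: `detected_state` comes from an upstream IP geolocation
# lookup; this function only maps a state name to a site section.

STATE_SECTIONS = {
    "indiana": "/",            # the main site's content targets Indiana
    "michigan": "/michigan/",
    "new mexico": "/newmexico/",
}

def landing_section(detected_state: str) -> str:
    """Map a detected state to its site section, defaulting to the main site."""
    return STATE_SECTIONS.get(detected_state.strip().lower(), "/")
```

In practice, because IP geolocation is imperfect, a real implementation would typically issue an HTTP redirect or show a "looking for our Michigan office?" banner rather than forcing visitors into a section.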