Local SEO - Rich content and list of towns enough?
-
I'm working on creating pages to target local SEO. I've created pages, for example 'wedding band london', with useful content and optimised them with title tags, alt tags, etc., and will continue to do so for major cities and queries where there is significant search volume. However, I also want to pick up long-tail local search queries such as 'wedding band camden' etc. Will adding a list of towns somewhere on the page for each city or county help drive traffic to the site from such queries? If so, what's the best way to structure the page?
-
Hi Samuel,
Google will consider you most relevant to whatever your physical address is - this is the locale for which you can work towards your main local/blended rankings.
Beyond this, yes, you can work for organic visibility for other towns you serve in. Having a unique landing page for each of your major service cities is a smart way to go. You can then linkbuild to these pages to go after organic visibility.
I agree that you should not simply list towns on a page. Have you considered blogging? Instead of putting too many geo terms on one page, you could use a blog to showcase weddings you've played in Camden, etc. A few photos of your band, folks dancing, pics of the venue, and 400-600 words of text describing the event, the songs the couple requested, and testimonials from the bride and groom or family members would make a strong blog post on the subject and could capture the long-tail traffic you seek. The nice thing about this approach is that it naturally generates unique content. Though I'm sure there are similarities in all your gigs, each one must be a little bit different, right? And it would enable you to optimize parts of your site for things like regional and neighborhood/district names.
I hope this is a helpful suggestion.
-
Hi Gerry,
Thanks for your reply. It makes sense not to include too many towns, so as not to risk diluting the value of the main keyword. Most of the long-tail search terms I'm talking about are likely to bring me 2-5 clicks a week, as an estimate. It does seem, and IS, a lot of work to build pages for low-traffic keywords, so I need to figure out whether it's worth my time or whether my time is better spent building links and improving domain/page authority. The reason I think building local pages is the best option is simply that the industry is so competitive for the main keyword terms, and I suspect highly targeted local terms would convert better anyway.
I think I'll go with the strategy of short but useful content (maybe around 100 words), which might be a good compromise as these pages wouldn't take too long to build. As the long-tail keywords are not competitive (most people don't bother with them at all), hopefully they'll rank on page one pretty easily. I could potentially get a few hundred highly targeted leads this way.
-
So I'm no expert, but I've spent a lot of energy and money with several revisions related to local SEO. Here it is for what it's worth and you'll have to judge for yourself.
The more city and town names you add to any one page, the more you dilute its organic value for the one town or city that is most important to you. You may want to include the names of a few towns within a city page if that is the logical, natural way people talk about the place. For example, on a page about New York you could reasonably include info on Brooklyn, Queens, The Bronx, and Manhattan (Staten Island may not be worth the bother), but you certainly would not want to mention services in Nassau and Suffolk, which are well outside the city boundaries. That wouldn't rule out a separate page for each borough, though.
So how do you deal with services across a large geographical area without massive duplication? More and more SEO placement problems are arising from duplicate and "near-duplicate" content. On my core website I have used templates with some success: I generate pages for different cities and towns using some common content, but leave room for custom content on each page - more than just a city-name variable. (Make sure the meta data is unique too.) For my business, which is pest control, weather has a big impact on pest pressure, so I have a template for coastal, inland, mountain and valley towns - plus a few more. Then I also leave room for additional custom content on each city/town page, both to eliminate near-duplicate content and to address real differences between towns. Each town has its own demographics and culture, so I address that as it relates to the products and services I offer.
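To make the template idea concrete, here is a minimal sketch of how such page generation might work. This is not Gerry's actual setup; the town names, copy, and function names are all hypothetical, and a real site would use a proper templating engine. The key points it illustrates are shared copy per region type, plus a required unique content block and unique meta data for every town, so no two generated pages come out as near-duplicates.

```python
# Sketch of template-generated city/town pages (hypothetical example).
from string import Template

PAGE = Template(
    "<title>$meta_title</title>\n"
    '<meta name="description" content="$meta_desc">\n'
    "<h1>Pest Control in $town</h1>\n"
    "<p>$regional_copy</p>\n"
    "<p>$custom_copy</p>"
)

# Shared copy keyed by region type (coastal, inland, mountain, valley, ...).
REGIONAL_COPY = {
    "coastal": "Humid coastal weather keeps ant and roach pressure high year-round.",
    "inland": "Hot, dry summers drive spiders and scorpions indoors.",
}

def build_page(town, region, custom_copy, meta_title, meta_desc):
    """Render one town page; refuse to build it without unique custom content."""
    if not custom_copy.strip():
        raise ValueError(f"{town}: custom copy is required to avoid near-duplicate pages")
    return PAGE.substitute(
        town=town,
        regional_copy=REGIONAL_COPY[region],
        custom_copy=custom_copy,
        meta_title=meta_title,
        meta_desc=meta_desc,
    )

page = build_page(
    town="Santa Monica",
    region="coastal",
    custom_copy="Beachfront homes here see heavy rodent activity near the pier.",
    meta_title="Pest Control in Santa Monica, CA | Example Co",
    meta_desc="Local pest control tuned to Santa Monica's coastal climate.",
)
```

The important design choice is the guard clause: the generator fails loudly when a town has no custom copy, which forces every page to carry something genuinely unique rather than silently shipping a near-duplicate.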
Still, the best bet is to create totally unique pages for each targeted geographic area. That takes time. I am gradually addressing this, but wow, serving all of Southern California means creating a lot of unique local pages. In your business too, I am sure you will find that customers in different cities and towns have different preferences and budgets. You can address that with unique pages for each geo.
I came across a company that has had very good SEO placement for a while, but with a tactic that I think will get detected. This company has massively duplicated content, but it is not all on the same website: they have purchased a separate domain name for every town and populated identical content across these websites. I am sure Google will start smacking these websites soon enough.
I hope this helps.
Gerry