Many Regional Pages: Bad for SEO?
-
Hello Moz-folks
We are relatively well listed for "Edmonton web design" - the city we work out of. In an effort to reach new clients, we created about 15 new pages targeting other cities in Alberta, BC and Saskatchewan. Although we began to show up quite well in some of these regions, we have recently seen our rankings in Edmonton drop by a few spots. I'm wondering if setting up regional pages that have lots of keywords for that region can be detrimental to our overall rankings. Here is one example of a regional page:
http://www.web3.ca/red-deer-web-design
Thanks,
Anton T
Web3 Marketing Inc.
-
Hi Anton,
This is a good question. On visiting your Red Deer example page, a few concerns come up:
-
The text content is quite thin on this page. If it's this thin on the other pages, yes, that could be a problem.
-
If the text on the other pages is a duplicate or near-duplicate of the Red Deer page, then that is definitely a problem.
-
The optimization of the Red Deer page seems a bit awkward to me. 'Red Deer' just feels like it has been dropped into the text in a manner that doesn't read very naturally.
-
The text on the Red Deer page needs some TLC. Your main call to action contains a word-choice error:
Call 1-780-760-3333 for a free consolation.
("consolation" should read "consultation.")
These four elements give some cause for concern that these pages may have been published without much planning or effort going into them. Poorly planned and executed pages with thin or duplicate content can definitely water down the strength of your website. My view is that you need to find a reason for these landing pages to exist - a user-centric reason. What can you tell Red Deer customers about your work for Red Deer businesses that is unique? How does this differ from your Edmonton work?
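A quick way to sanity-check the thin-content and near-duplicate concerns above is to compare the extracted text of each regional page programmatically. This is a minimal sketch, assuming you have already pulled the body text of each page into strings (the page snippets and thresholds below are illustrative placeholders, not values from your site):

```python
# Hypothetical check for thin or near-duplicate regional landing pages.
# Assumes page body text has already been extracted; contents are placeholders.
from difflib import SequenceMatcher

pages = {
    "edmonton": "We build fast, conversion-focused websites for Edmonton businesses ...",
    "red-deer": "We build fast, conversion-focused websites for Red Deer businesses ...",
}

MIN_WORDS = 300          # rough threshold for "thin" content
MAX_SIMILARITY = 0.80    # above this, treat two pages as near-duplicates

def word_count(text):
    """Count whitespace-separated words in the page text."""
    return len(text.split())

def similarity(a, b):
    """Return a 0.0-1.0 similarity ratio between two page texts."""
    return SequenceMatcher(None, a, b).ratio()

# Flag thin pages.
for slug, text in pages.items():
    if word_count(text) < MIN_WORDS:
        print(f"{slug}: thin content ({word_count(text)} words)")

# Flag near-duplicate pairs.
slugs = list(pages)
for i, a in enumerate(slugs):
    for b in slugs[i + 1:]:
        score = similarity(pages[a], pages[b])
        if score > MAX_SIMILARITY:
            print(f"{a} vs {b}: {score:.0%} similar - likely near-duplicate")
```

If most of your 15 regional pages trip both checks, that is a strong signal they read as cookie-cutter pages to search engines as well.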
I think a natural fit for web design firms taking the approach you describe is to showcase their local clients in each chosen locale. Do awesome project write-ups, case studies, infographics about the community, stat sharing, etc., to make each page unique and worth visiting. Never take a cookie-cutter approach, or I think it will be readily apparent to both Google and humans that you aren't making the best possible effort to be the best answer for related queries. Hope this helps!
-
-
If you created quality pages, I'm not sure why it would have hurt your Edmonton rankings. Did you check your Domain Authority to see if it dropped? Also, did you run a comparison against the sites that are now ahead of you? Did they gain new links or improve in some other way to leapfrog you?