Launching Hundreds of Local Pages At Once or Tiered? If Tiered, In What Intervals Would You Recommend?
-
Greetings Mozzers,
This is a long question, so please bear with me.
We are an IT and management training company that offers over 180 courses on a wide array of topics. Students can attend these courses in several ways, either in person or remotely via a technology called AnyWare. We've also opened AnyWare centers where you can physically go to a location near you and log into a LIVE course that might be hosted in, say, New York, even if you're in LA. You get all the in-class benefits and interaction with the students and the instructor, as if you were in the classroom. Recently, we've opened 43 AnyWare centers, giving our website excellent local search opportunities (e.g., think "SharePoint training in New York," or whatever city we're located in). Each location has a physical address, phone number, and an employee working there, so we meet Google's requirements for a Google Places listing (which I've set up).
So, why all this background? Well, we'd like to get as much visibility as possible for queries that follow the format of "course topic area we offer" followed by "city we offer it in." We offer 22 course topic areas and, as I mentioned, have 43 locations across the US. Our IS team has created custom pages for each city and course topic area using a UI. I won't get into the specifics, but doing some simple math (22 topic areas multiplied by 43 locations), we get over 800 new pages that eventually need to be crawled and added to our site. As a test, we launched the pages for DC and New York 3 months ago and have seen great increases in visibility. For example, here are the two pages for SharePoint training in DC and NY (44 local pages are live right now in total).
http://www2.learningtree.com/htfu/usdc01/washington/sharepoint-training
http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training

So, now that we've seen the desired results, my next question is: how do we launch the remaining hundreds of pages in a "white hat" manner? I'm a big fan of white hat techniques and of not pissing off Google. Given the size of the project, we also did our best to make the content as unique as possible. Yes, there are many similarities, but courses differ, as do addresses from location to location.
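For a rough sense of the page inventory involved, here's a minimal Python sketch of how the topic/city combinations multiply into landing-page URLs. The sample topic and city lists below are hypothetical stand-ins (only the URL pattern comes from the two live pages above); on the real site it would be 22 topics x 43 locations = 946 pages.

```python
from itertools import product

# Hypothetical sample data -- the real site has 22 topic areas and 43 locations.
topics = ["sharepoint-training", "project-management-training", "sql-server-training"]
cities = [("usdc01", "washington"), ("usny27", "new-york"), ("usca15", "los-angeles")]

def page_urls(topics, cities, base="http://www2.learningtree.com/htfu"):
    """Build one landing-page URL per (city, topic) pair."""
    return [f"{base}/{code}/{city}/{topic}"
            for (code, city), topic in product(cities, topics)]

urls = page_urls(topics, cities)
print(len(urls))   # 3 cities x 3 topics = 9 in this sample
print(urls[0])
```

The same cross-product is what drives the total page count, which is why the number grows so quickly even with modest topic and city lists.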
After watching Matt Cutts's video here: http://searchengineland.com/google-adding-too-many-pages-too-quickly-may-flag-a-site-to-be-reviewed-manually-156058 about adding too many pages at once, I'd prefer to proceed cautiously, even if the example he uses in the video involves tens of thousands to hundreds of thousands of pages. We truly aim to deliver the right content to those searching in their area, so there's nothing black hat about it. But I still don't want to be reviewed manually, lol.
So, at what interval should we launch the remaining pages quickly, but without raising any red flags? For example, should we launch 2 cities a week? 4 cities a month? I'm assuming the slower the better, of course, but I have some antsy managers I'm accountable to, and even with this type of warning and research, I need to proceed somehow, the right way.
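To make the tiered options concrete, here's a hedged sketch (the city names, start date, and batch parameters are all hypothetical) of how a batch schedule could be computed, e.g. 4 cities every two weeks:

```python
from datetime import date, timedelta

def rollout_schedule(cities, batch_size, start, interval_days=14):
    """Split the remaining cities into batches and assign each batch a launch date."""
    schedule = []
    for i in range(0, len(cities), batch_size):
        launch = start + timedelta(days=(i // batch_size) * interval_days)
        schedule.append((launch, cities[i:i + batch_size]))
    return schedule

# 43 locations minus the DC and NY pages already live = 41 remaining (placeholder names).
remaining = [f"city-{n:02d}" for n in range(41)]
for launch, batch in rollout_schedule(remaining, batch_size=4, start=date(2014, 1, 6)):
    print(launch, batch)
```

At 4 cities per batch, 41 remaining cities take 11 batches, so a two-week interval stretches the launch over roughly five months; tightening `batch_size` or `interval_days` trades speed against caution.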
Thanks again and sorry for the detailed message!
-
THANK YOU, EGOL!
-
Those pages look just about identical to me. The top paragraph to the left of the map is almost identical... then the huge block of "directions and lodging information" is identical, and it's a lot of words.
If this was my site, I would do this...
-
1. Rewrite unique content for the top paragraph beside the map. It would take a bit of work, but I would do it. It is not hard writing.
-
2. For the "Directions and Lodging Information"... I would place that on a separate page and link to it. That eliminates a LOT of duplicate content from the NYC pages.
If this was my site, I would not publish the pages as I see them today... but I would feel good publishing all 800 if I did 1 and 2 above.
-
EGOL,
Thanks for your reply! The content is not entirely unique, but it was all created internally with the user in mind. For example, the main segments on all of the New York pages say similar things, with the exception of the course topic area.
For example, this New York page on SharePoint outlines our SharePoint courses in New York (http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training), and this New York page on Project Management Training (http://www2.learningtree.com/htfu/usny27/new-york/project-management-training) shows our Project Management courses in New York. You'll notice the similarities between the pages, but the content differs per course area. The UI that creates the pages simply changes a few elements of the URL to dynamically adjust the location, which provides the unique address, meta description, and all the other vital SEO elements. Otherwise, we would have had to use significant resources to create truly unique content for each and every page, something management did not want to do. So, this is as white hat as I can be given the resources that I have :)... make sense?
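As a rough sketch of the kind of template substitution described (the field names, template text, and sample address below are hypothetical, not the actual UI's), only a handful of fields change per page, which is exactly why the pages end up looking so similar:

```python
# Hypothetical page template: only {topic}, {city}, and {address} vary per page.
TEMPLATE = {
    "title": "{topic} in {city} | Learning Tree",
    "meta_description": "Attend {topic} at our {city} AnyWare center, {address}.",
}

def render_page(topic, city, address):
    """Fill the per-page fields into every templated element."""
    return {field: text.format(topic=topic, city=city, address=address)
            for field, text in TEMPLATE.items()}

page = render_page("SharePoint Training", "New York", "1601 Broadway")
print(page["title"])
print(page["meta_description"])
```

With this approach the unique address and meta description come along for free, but every element outside the substituted fields is byte-for-byte identical across the 800+ pages, which is the duplicate-content concern EGOL raises above.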
-
Honestly... if these are all pages with great, original, unique, substantive, non-duplicating content... I would blast them up right now. 800 ain't that many... and if you are a white hat, then Google should be OK with it.