What is the best way to fix legacy overly-nested URLs?
-
Hi everyone,
Due to some really poor decisions I made back when I started my site several years ago, I'm lumbered with several hundred pages that have overly-nested URLs. For example:
/theme-parks/uk-theme-parks/alton-towers/attractions/enterprise
I'd prefer these to feature at most three layers of nesting, for example:
/reviews/alton-towers/enterprise
Is there a good approach for achieving this, or is it best just to accept the legacy URLs as an unfixable problem and make sure that future content follows the new structure? I can easily knock together a script to update the aliases for the existing content, but I'm concerned about having hundreds of 301 redirects (could this be achieved with a single regular expression in .htaccess, for example?).
Any guidance appreciated.
Thanks, Nick
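(On the single-regex question: if every legacy URL follows the same pattern, one rewrite rule can cover them all. A minimal sketch for Apache .htaccess, assuming the old URLs all match the structure of the example above; the paths are illustrative, not from the actual site:)

```apache
# Sketch only: assumes every legacy URL follows this exact pattern.
# Maps /theme-parks/uk-theme-parks/<park>/attractions/<ride>
#   to /reviews/<park>/<ride> with a permanent (301) redirect.
RewriteEngine On
RewriteRule ^theme-parks/uk-theme-parks/([^/]+)/attractions/([^/]+)/?$ /reviews/$1/$2 [R=301,L]
```

(If the legacy URLs vary in depth or naming, a single pattern won't cover them and you'd need a per-page redirect map instead.)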
-
Thanks Alan and Irving, your responses are both very helpful. In reality, these pages have relatively few external links pointing to them compared to other sections of the site, so I think I will opt to redirect them. The newer sections of the site have a nice clean URL structure and good on-page optimization, so I think it's best to bite the bullet and move the older pages over to a new system.
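(Since a script for updating the aliases came up: this is one possible sketch, not the poster's actual script, of generating per-page `Redirect 301` lines for .htaccess from a list of old/new path pairs. The example paths are taken from the question; everything else is hypothetical:)

```python
# Hypothetical sketch: generate one "Redirect 301" directive per page
# from a list of (old_path, new_path) pairs, ready to paste into .htaccess.

def redirect_lines(pairs):
    """Return a list of .htaccess Redirect 301 directives, one per pair."""
    return ["Redirect 301 {} {}".format(old, new) for old, new in pairs]

# Example pair based on the URLs in the original question.
pairs = [
    ("/theme-parks/uk-theme-parks/alton-towers/attractions/enterprise",
     "/reviews/alton-towers/enterprise"),
]
print("\n".join(redirect_lines(pairs)))
```

(In practice the pairs would come from wherever the site stores its aliases, e.g. a database query or CSV export.)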
-
Except that hundreds of 301s mean hundreds of link-juice leaks.
-
There's no problem with having hundreds of 301s.
Having "theme parks" twice in the URL is slightly spammy, but probably better than your second example, where you don't have "theme parks" even once. I would make it /theme-park-reviews/ unless your domain name already has "theme parks" in it.
If the current pages have great rankings, you may want to just leave the legacy pages alone and use the new structure for posts going forward. But if they're not bringing you a ton of traffic that your business depends on, I would 301 them to the new structure. That should be fine, and you can always revert if you see negative effects.
-
I would accept it. Google does not mind; Bing is said to count folder depth as a signal, but with modern routing engines (MVC frameworks and the like) and friendly URLs it is common for a URL to have many sections, and I assume Bing takes that into account.