SEO URL best practices
-
We're revamping our site architecture and creating several service pages that are accessible from one overarching Services page. An example would be as follows:
Services
- Student Services
  - Essay editing
  - Essay revision
- Author Services
  - Book editing
  - Manuscript critique
We'll also be putting breadcrumbs throughout the site for easy navigation. However, is it imperative that we build the URLs that deep? For example, could we simply use www.site.com/essay-editing rather than www.site.com/services/students/essay-editing?
I prefer the simplicity of the former, but I feel the latter may be more "search robot friendly" and better for SEO.
Any advice on this is much appreciated.
-
Thanks, donford, that's very helpful.
After thinking it over, I feel it's best to keep the URLs as simple as possible and use something like /s/essay-editing for them (the 's' standing for services).
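Roughly what I have in mind for the routing, as a minimal sketch only (Flask is just for illustration, our actual stack isn't decided, and the slugs and titles below are placeholders): the URL stays flat at /s/<slug> while the page still renders the full Services > Student Services breadcrumb trail.

```python
# Minimal sketch: flat /s/<slug> URLs, with the category hierarchy kept in
# on-page breadcrumbs rather than in the URL path. Flask and the SERVICES
# mapping are illustrative placeholders only.
from flask import Flask, abort, render_template_string

app = Flask(__name__)

# Hypothetical registry: slug -> (page title, breadcrumb trail above the page)
SERVICES = {
    "essay-editing":       ("Essay Editing",       ["Services", "Student Services"]),
    "essay-revision":      ("Essay Revision",      ["Services", "Student Services"]),
    "book-editing":        ("Book Editing",        ["Services", "Author Services"]),
    "manuscript-critique": ("Manuscript Critique", ["Services", "Author Services"]),
}

PAGE = """
<nav class="breadcrumbs">{{ (crumbs + [title]) | join(" › ") }}</nav>
<h1>{{ title }}</h1>
"""

@app.route("/s/<slug>")
def service_page(slug):
    if slug not in SERVICES:
        abort(404)  # unknown service slug
    title, crumbs = SERVICES[slug]
    return render_template_string(PAGE, title=title, crumbs=crumbs)
```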
Thanks!
-
Hi Kibin,
Based on your situation, the two URL best practices at odds with each other are:
Length vs. content
I would say that, depending on the average overall depth, you should be perfectly fine and will likely see benefits from a structure like "www.site.com/services/students/essay-editing", as this is only three layers deep. At some point, however, there is no benefit to long URLs other than folder organization.
If you foresee your site getting more than five levels deep, you may want to consider a different structure. Long URLs, especially those containing URL parameters, can cause crawl issues. There are two basic questions to ask of any URL: (1) can a user understand it, and (2) will crawlers be able to navigate it and index it correctly? Design for users first while keeping in mind how search engines will view it.
Finally, about the difference between
www.site.com/services/students/essay-editing
and
www.site.com/essay-editing:
What you miss out on with the latter is long-tail keyword opportunities (e.g. "student essay editing", "student services essay editing"). Those can still be incorporated into the content of the page, and likely will be through the breadcrumbs, but they carry a bit more weight when the keyword appears in the URL itself.
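One more thought: if you do go with the shorter URLs, marking the breadcrumbs up with BreadcrumbList structured data hands that hierarchy to the engines anyway. A rough sketch of generating it (the intermediate /services and /services/students URLs are just placeholders; use whatever your category pages actually are):

```python
# Rough sketch: BreadcrumbList JSON-LD so crawlers still see
# Services > Student Services > Essay Editing even when the page URL is flat.
# The URLs below are placeholders for the site's real category pages.
import json

def breadcrumb_jsonld(trail):
    """trail: list of (name, url) tuples, top level first."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

print(breadcrumb_jsonld([
    ("Services", "https://www.site.com/services"),
    ("Student Services", "https://www.site.com/services/students"),
    ("Essay Editing", "https://www.site.com/essay-editing"),
]))
```

The output gets dropped into the page inside a script tag of type application/ld+json.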
Think of the user of the site first, then the search engines, then the backend administration.
As a user I like the short URL, but from an administration and SEO perspective I like the longer URLs.
Hope that helps,