Real Vs. Virtual Directory Question
-
Hi everyone. Thanks in advance for the assistance. We are reformatting the URL structure of our very content-rich website (thousands of pages) into a cleaner stovepipe model, so our pages will have a URL structure something like http://oursite.com/topic-name/category-name/subcategory-name/title.html and so on.
My question is: is there any additional benefit to having the path /topic-name/category-name/subcategory-name/title.html literally exist on our server as a real directory? Our plan was to just use .htaccess to point that URL to a single script that parses the URL structure and builds the appropriate page.
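For what it's worth, a minimal sketch of that .htaccess approach might look like the following; the script name `render.php` and the `path` parameter are illustrative assumptions, not anything from a specific setup:

```apache
# Send any clean URL ending in .html to a single rendering script,
# passing the requested path along as a query parameter.
# Skip real files and directories so assets still resolve normally.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+\.html)$ /render.php?path=$1 [L,QSA]
```

The visitor (and the spider) only ever sees the clean URL; the rewrite is internal, not a redirect.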
Do search engine spiders know the difference between these two models, and do they prefer one over the other? From our standpoint, managing a single .htaccess file and a handful of page-building scripts would be infinitely easier than a huge, complicated directory structure of real files. And while this makes sense to us, the .htaccess model wouldn't be considered some kind of black-hat scheme, would it?
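To make the question concrete, here is a toy illustration of the first thing one of those page-building scripts would do: split the clean URL into its hierarchy segments. All names here are hypothetical; this is a sketch of the idea, not anyone's actual implementation:

```python
from urllib.parse import unquote

def parse_clean_url(path):
    """Split /topic/category/subcategory/title.html into labeled parts."""
    segments = [unquote(s) for s in path.strip("/").split("/")]
    # Only treat paths ending in .html as page URLs.
    if not segments or not segments[-1].endswith(".html"):
        return None
    return {
        "topic": segments[0] if len(segments) > 1 else None,
        "categories": segments[1:-1],   # everything between topic and title
        "title": segments[-1][:-len(".html")],
    }

print(parse_clean_url("/topic-name/category-name/subcategory-name/title.html"))
```

From there the script would look up the matching content in whatever store holds it and render the page.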
Thank you again for the help and looking forward to your thoughts!
-
At a fundamental level, you are storing the data somewhere and rendering it correctly; search engines only see the rendered output. In a CMS, that data sits in a database, completely outside search engine view, so it makes no difference whether a page is backed by a database row or a physical file. There is no benefit to mirroring the URL structure on disk.
Having said that, in my own experience (we manage a website with millions of pages), managing this with a hand-rolled .htaccess script is NOT a good idea. You will be limited in what you can do, and maintenance will be quite challenging.
I strongly suggest considering a move to a CMS (like Drupal): store all your content in a database and let the CMS handle the .htaccess rules, plus you get other goodies. There are several tools available to migrate your content from disk into a database.
-
Search engines can't tell the difference, so you're all good.
-
I believe the preferred method is the .htaccess file. When we reformatted the URLs on our site, this was the most efficient, cleanest way to do it. This kind of dynamic redirect protects you from 404 pages and from losing your page values. I didn't see any negative effects using this method of restructuring. I had about 6,000 pages that each had to change URL, and it was a nightmare: we migrated to a completely new platform and file server, so we had to change every URL.
I hope that is helpful. I don't see one method benefiting you in the engines more than the other. I would suggest doing whatever is the least amount of work, is the cleanest to implement, and will keep your URLs clean and free of erroneous information in the long run.
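On the migration point above: the old-to-new mapping is typically handled with permanent redirects in the same .htaccess file, so old bookmarks and indexed URLs pass their value to the new pages instead of 404ing. A hedged sketch, with a made-up old URL pattern:

```apache
# Permanently redirect an old dynamic URL (e.g. /article.php?id=123)
# to its new clean equivalent. The patterns here are illustrative only;
# real mappings would come from your own old-to-new URL table.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^article\.php$ /topic-name/category-name/%1.html? [R=301,L]
```

The `R=301` flag tells engines the move is permanent, and the trailing `?` strips the old query string from the target URL.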