.htaccess and SEO
-
Hey Everyone,
New to SEOMOZ and I have an important question:
We launched a new version of our site about 6 months ago and had a TON of redirects in our .htaccess file due to a change in our permalink structure (over 2,000 easily).
Anyway, we recently went back and consolidated those 2,000+ individual .htaccess redirects: the ones that followed a pattern became a few regular-expression rules, and the others (30 or so) stayed as individual redirect lines.
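For context, a consolidation along those lines typically looks like the following. This is a minimal sketch with hypothetical patterns (it assumes a WordPress-style date-based permalink being flattened to a post-name-only URL), not the site's actual rules:

```apache
# Pattern-based rule: collapse date-based permalinks to post-name URLs.
# e.g. /2012/05/some-post/ -> /some-post/  (hypothetical old structure)
RedirectMatch 301 ^/\d{4}/\d{2}/([^/]+)/?$ http://thetechblock.com/$1/

# The ~30 one-offs that fit no pattern stay as explicit lines:
Redirect 301 /old-about-page http://thetechblock.com/about/
```

One `RedirectMatch` line like this can replace thousands of individual `Redirect` lines, as long as every old URL it matches really maps to the captured group.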
Since doing that, it appears our search engine traffic has dropped a bit. It's not crazy, but it's definitely noticeable. I'm not an SEO expert, so my question is: is this the reason why? And how long will we see this decline before we're back at normal levels? We're seeing a lot fewer crawl errors since making the change, so I think it's a good thing, but I just wanted to check.
The site is http://thetechblock.com if you want to take a look. Any help would be really appreciated.
-
Hi Bayan,
Sorry to hear your search engine traffic has dropped.
It might be helpful if you posted the section of the .htaccess file in question.
Here are some things I would double-check:
1. Does the .htaccess file serve a 301 response code for the redirects? (Probably, but worth double-checking.) What I might try is creating a file of all your OLD URLs, uploading it into a crawler like Screaming Frog, and testing them all to see whether each one redirects to the proper URL with the correct response code.
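As a rough illustration of that check, here's a hypothetical sketch in Python. The URL mapping is made up (substitute your real old/new pairs), and a crawler like Screaming Frog does the same job without any code:

```python
# Hypothetical sketch: verify each old URL returns a 301 to the expected
# new URL. The mapping below is made up -- substitute your real pairs.
import urllib.request
import urllib.error

EXPECTED = {
    # old permalink -> expected redirect target (hypothetical paths)
    "http://thetechblock.com/2012/05/old-post/": "http://thetechblock.com/old-post/",
}

def classify(status, location, expected_target):
    """Judge one redirect from its status code and Location header."""
    if status != 301:
        return "not-301"       # a 302/307 here would be worth fixing
    if location != expected_target:
        return "wrong-target"  # bad rewrite rule or a redirect chain
    return "ok"

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we see the first hop."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check(url):
    """Fetch one URL; return (status_code, Location header or None)."""
    opener = urllib.request.build_opener(_NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        return resp.status, None           # no redirect happened at all
    except urllib.error.HTTPError as err:  # un-followed 3xx lands here
        return err.code, err.headers.get("Location")

# Usage (makes live HTTP requests, so commented out here):
# for old, new in EXPECTED.items():
#     status, location = check(old)
#     print(old, classify(status, location, new))
```

Anything that comes back "not-301" or "wrong-target" is a candidate for the traffic drop.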
2. Did you redirect the 2,000 pages to unique URLs, or did you redirect them to a single URL (or a handful of URLs)? If you consolidated your URLs down to only a handful, this could affect your rankings.
3. Did the content and other HTML elements stay the same through the redirect? For example, did the title tags stay the same, or reasonably close to the original? Big differences could cause the URLs to lose relevance and thus rankings.
4. Fewer crawl errors = good. I would check the Index Status in Google Webmaster Tools to see whether the number of pages discovered/indexed matches up well with the number of URLs on your site.
5. Proper sitemaps submitted? Often when you change your URL structure it's good to submit two sitemaps - one listing all your old URLs and another for the new. That way, search engines will attempt to crawl the old URLs and "process" the redirects. Probably not an issue for you, though, since the change was 6 months ago.
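For what it's worth, the "old URL" sitemap is just a standard sitemap file whose entries are the pre-migration URLs. A minimal sketch (the path is hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- an OLD, now-redirected permalink (hypothetical example) -->
    <loc>http://thetechblock.com/2012/05/old-post/</loc>
  </url>
</urlset>
```

You'd submit this alongside the regular sitemap of current URLs, then remove it once the old URLs drop out of the index.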
6. Finally, I'd keep my eyes open for any other possible causes of the drop in traffic, e.g. algorithm updates, site issues, lost backlinks, and so on.
That's all I can think of, but there may be more. Let us know if you find anything!
-
Anyone?