How to block URLs with specific components from Googlebot
-
Hello,
I have around 100,000 error pages showing in Google Webmaster Tools. I want to block specific components like com_fireboard, com_seyret, com_profiler, etc.
I tried blocking them using robots.txt with the following lines:
Disallow: /com_fireboard/
Disallow: /com_seyret/
But it's not working. Can anyone suggest how to solve this problem?
Many Thanks
Shradda
-
I agree with Sha that your 404 page has a nice appearance. My main concern is that it lacks functionality.
If I click a link to your site and end up on that page, what is my next action? Most likely I would hit the back button on my browser and leave your site; it is either that or typing in a new URL.
I recommend you offer users the option to stay on your site. Your site navigation, a search box, some links, anything would be helpful.
-
Hi Shradda,
I agree with Ryan that the use of a meta noindex tag is the preferable way to block the pages, but obviously there may be difficulties with applying the tag, depending on how your pages are generated and whether you are able to alter the code.
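For reference, the tag in question looks like this; it goes inside the <head> of each page generated by those components, and it tells search engines to drop the page from the index while still letting them crawl it:

```html
<!-- Place inside the <head> of each page you want de-indexed -->
<meta name="robots" content="noindex, follow">
```

If the component templates are hard to edit, the equivalent X-Robots-Tag: noindex HTTP header set at the server level achieves the same result.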
You can also match ?option=com_fireboard (and the other components) to create 301 redirects back to a higher-order category page or a search page.
You should be able to use a single line of code to 301 all pages within each directory.
Using 301 redirects will also send a signal to search engines to de-index those pages.
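As a sketch of that single line: assuming the site runs on Apache with mod_rewrite enabled (and /forum/ here is only a placeholder for whatever higher-level page you choose), something like this would 301 every URL whose query string selects com_fireboard:

```apache
RewriteEngine On
# Match any URL whose query string contains option=com_fireboard
RewriteCond %{QUERY_STRING} (^|&)option=com_fireboard(&|$)
# Redirect it permanently to a higher-level category page;
# the trailing "?" drops the original query string from the target
RewriteRule ^ /forum/? [R=301,L]
```

You would repeat the RewriteCond/RewriteRule pair for com_seyret, com_profiler, and so on.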
Very clever 404 page too! Had to watch him go all the way across the page and back just so I knew I wasn't missing anything!
Sha
-
You can log into Google Webmaster Tools and adjust your parameter settings. It was designed for this exact purpose. Site Parameters > URL Parameters. If you use this solution, be sure to do the same in Bing WMT as well.
A better solution would be to noindex the pages. Using robots.txt should be avoided when possible.
If you do need to use robots.txt, be aware that your current disallow statement tells crawlers not to crawl a folder named "com_fireboard", while your intention is to block URLs containing the parameter ?option=com_fireboard. I know wildcards work for the trailing portion of a path, but I have not tried them at the beginning of a path.
I suggest you try the following (the rule should begin with a slash or a wildcard):
Disallow: /*?option=com_fireboard
For more on the robots.txt file, please view the following site: http://www.robotstxt.org/
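If you want to sanity-check a wildcard rule before deploying it, here is a small sketch (not Googlebot's actual implementation) of the matching behavior Google documents: * matches any run of characters, $ anchors the end, and rules otherwise match as prefixes of the URL path including the query string:

```python
import re

def googlebot_blocked(url_path: str, disallow_rule: str) -> bool:
    """Approximate Googlebot-style robots.txt matching: '*' matches any
    sequence of characters, '$' anchors the end of the URL, and rules
    otherwise match as prefixes of the path (query string included)."""
    pattern = re.escape(disallow_rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# The original folder-style rule never matches the parameterized URLs...
print(googlebot_blocked("/index.php?option=com_fireboard&func=view", "/com_fireboard/"))
# ...while a wildcard rule does.
print(googlebot_blocked("/index.php?option=com_fireboard&func=view", "/*?option=com_fireboard"))
```

The first call prints False (which is why the original robots.txt attempt had no effect) and the second prints True.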