XML Sitemap instruction in robots.txt = Worth doing?
-
Hi fellow SEOs,
Just a quick one, I was reading a few guides on Bing Webmaster tools and found that you can use the robots.txt file to point crawlers/bots to your XML sitemap (they don't look for it by default).
I was just wondering if it would be worth creating a robots.txt file purely for the purpose of pointing bots to the XML sitemap?
I've submitted it manually to Google and Bing webmaster tools, but I was thinking more of the other bots (e.g. Mozbot, the SEOmoz bot).
Any thoughts would be appreciated!
Regards,
Ash
-
Thanks for the answer and link John!
Regards,
Ash
-
I think it's worth it as it should only take a few minutes to set up, and it's good to have a robots.txt, even if it's allowing everything. Put a text file named "robots.txt" in your root directory with:
<code>User-agent: *
Disallow:
Sitemap: http://www.yourdomain.com/non-standard-location/sitemap.xml</code>
Read more about robots.txt here: http://www.seomoz.org/learn-seo/robotstxt.
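One nice side effect of declaring the sitemap in robots.txt is that it's machine-readable, so any well-behaved crawler can pick it up without a manual submission. As a quick sanity check of the snippet above, here's a minimal sketch using Python's standard-library robots.txt parser (the `example.com` URL is just a placeholder; `site_maps()` requires Python 3.8+):

```python
from urllib.robotparser import RobotFileParser

# The same three-line robots.txt suggested above: allow everything,
# and declare where the XML sitemap lives.
robots_txt = """\
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# site_maps() returns the declared Sitemap URLs (or None if there are none)
print(parser.site_maps())  # ['http://www.example.com/sitemap.xml']
```

If the parser returns your sitemap URL, any crawler that honors the Sitemap directive should be able to find it the same way.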
-
It is not going to make any difference. Time is better spent fixing the site's crawling and indexing issues.