Do I need a separate robots.txt file for my shop subdomain?
-
Hello Mozzers!
Apologies if this question has been asked before, but I couldn't find an answer so here goes...
Currently I have one robots.txt file hosted at https://www.mysitename.org.uk/robots.txt
We host our shop on a separate subdomain https://shop.mysitename.org.uk
Do I need a separate robots.txt file for my subdomain? (Some Google searches are telling me yes and some no, and I've become awfully confused!)
-
Thank you. I want to disallow specific URLs on the subdomain and add the shop sitemap in the robots.txt file. So I'll go ahead and create another!
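For what it's worth, a shop robots.txt along those lines might look like the sketch below, served at https://shop.mysitename.org.uk/robots.txt. The /checkout/ and /basket/ paths are hypothetical placeholders for the URLs you want to block, and the standard-library parser is only used here to sanity-check the rules (it handles plain path prefixes and Sitemap lines, but not wildcards):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for the shop subdomain. The disallowed
# paths are placeholders -- substitute the real URLs to block.
robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /basket/

Sitemap: https://shop.mysitename.org.uk/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Blocked and allowed paths resolve as expected for any crawler.
print(rp.can_fetch("*", "https://shop.mysitename.org.uk/checkout/"))   # False
print(rp.can_fetch("*", "https://shop.mysitename.org.uk/products/1"))  # True
print(rp.site_maps())  # ['https://shop.mysitename.org.uk/sitemap.xml']
```

Note that each host serves its own robots.txt, so this file only governs the shop subdomain and leaves the www robots.txt untouched.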
-
You should be fine without one. You only need one if you want to manage that subdomain: add specific XML sitemap links in robots.txt, or cut off access to specific folders for that subdomain.
If you don't need any of that, just move forward without one.
-
Currently we just have: User-agent: *
I'm in the process of optimising.
-
It depends on what is currently in your robots.txt. Usually it is useful to have a separate one for your subdomain.
-
Yes, I would have a separate robots.txt file.
Related Questions
-
Robots.txt Syntax for Dynamic URLs
I want to disallow certain dynamic pages in robots.txt and am unsure of the proper syntax. The pages I want to disallow all include the string ?Page=. Which is the proper syntax?
Technical SEO | btreloar
Disallow: ?Page=
Disallow: ?Page=*
Disallow: ?Page=
Or something else?
-
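A note on the syntax question above: Google documents `*` as a wildcard in robots.txt rules, and rules are matched against the beginning of the URL path, so the pattern would be `Disallow: /*?Page=` rather than a bare `?Page=`. A minimal sketch of that matching logic (my own regex translation of the documented rules, not Google's actual code):

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Google-style robots.txt matching: a rule is compared against
    the start of the URL path, '*' matches any run of characters,
    and a trailing '$' anchors the rule to the end of the path."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

# 'Disallow: ?Page=' never matches: paths start with '/', not '?'.
print(rule_matches("?Page=", "/results?Page=2"))    # False
# 'Disallow: /*?Page=' matches any path containing '?Page='.
print(rule_matches("/*?Page=", "/results?Page=2"))  # True
print(rule_matches("/*?Page=", "/results"))         # False
```

Bear in mind that wildcard support is an extension honoured by Google and Bing, not part of the original robots.txt standard, so behaviour can differ for other crawlers.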
Need help please with URL guidelines.
Hi SEO pros, I have a website and I am planning to change all the URLs. I need to know the right way of constructing them. Here is some information: we are based in Brooklyn, NY, and we sell our services to Manhattan clients, and Manhattan goes by a few names: NY, NYC, and Manhattan. So by looking at my service area I came up with a URL. My current URL is http://www.signsny.com/brooklyn-ny/awnings and I am planning to change it to http://www.signsny.com/sign-types/awnings-canopies-brooklyn-NYC. Please guide me in the right direction, so that in future I don't have to re-do them again. Thanks, Abie
Technical SEO | signsny
-
Effects of significant cross linking between subdomains
A client has noticed in recent months that their traffic from organic search has been declining, little by little. They have a large ecommerce site with several different categories of product - each product type has its own subdomain. They have some big megamenus going on, and the end result is that if you look in their Webmaster Tools for one of their subdomains, under Links to your Site, it says they have nearly 22 million links from their own domain! Client is wondering if this is what is causing the decline in traffic and wondering whether to change the whole structure of their site. Interested to hear the thoughts of the community on this one!
Technical SEO | helga73
-
Adding multi-language sitemaps to robots.txt
I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core coding so there are no sub-directories per language. The developer has created sitemaps which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as: /sitemap/uk/sitemap.xml /sitemap/de/sitemap.xml I want to add the sitemaps to the robots.txt but can't figure out how to do it. Also should they have placed the sitemaps in a single location with the file identifying each language: /sitemap/uk-sitemap.xml /sitemap/de-sitemap.xml What is the cleanest way of handling these sitemaps and can/should I get them on robots.txt?
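On the robots.txt side of the question above: `Sitemap:` lines take absolute URLs, several can be listed in one file, and the directive sits outside any `User-agent` group, so either directory layout works as long as the URLs point at the real files. A sketch, with a placeholder domain standing in for the site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical root robots.txt with one "Sitemap:" line per language,
# using the /sitemap/<lang>/sitemap.xml layout from the question.
robots_txt = """\
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap/uk/sitemap.xml
Sitemap: https://www.example.com/sitemap/de/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.site_maps())
# ['https://www.example.com/sitemap/uk/sitemap.xml',
#  'https://www.example.com/sitemap/de/sitemap.xml']
```

Whether the files live in per-language directories or side by side as uk-sitemap.xml and de-sitemap.xml makes no difference to crawlers; consistency and correct absolute URLs are what matter.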
Technical SEO | MickEdwards
-
Recommendations. Need Hosting Company
I need a new hosting company asap. I am based in Costa Rica but need a reliable international service that supports Modx. Any suggestions would be greatly appreciated!
Technical SEO | Llanero
-
Removing robots.txt on WordPress site problem
Hi, I'm a little confused: I ticked the box in WordPress to allow search engines to crawl my site (I had previously asked them not to), but Google Webmaster Tools is telling me I still have robots.txt blocking them, so I am unable to submit the sitemap. I checked the source code and the robots instruction has gone, so I'm a little lost. Any ideas, please?
Technical SEO | Wallander
-
E-commerce solution and subdomain issues
Hello All,
Technical SEO | CherieP
In light of Wil Reynolds' closing keynote at Portland's SearchFest, I thought I might try posting here to get some advice. We run a family business on the side and we're looking at starting to use volusion.com for our e-commerce solution. The catch is we currently have a WordPress site, summitmining.com, running on Thesis with great SEO, ranking #1 and #2 for our highest-trafficked terms. Ideally, I'd like summitmining.com to direct to the Volusion store and summitmining.com/blog to go to our WordPress installation, BUT since the Volusion site will be hosted with the company and they will not host our WordPress installation, we'd have to use a subdomain instead of a subdirectory, which I understand will be bad for SEO. Does anyone have any recommendation on how to set this up without totally screwing up our ranking, OR any recommendation for an easy-to-use shopping cart (I've worked on a Magento site before and it's too complex for us) that wouldn't require a separate domain or subdomain? Thank you so much!
-Cherie Prochaska
503-816-3557
[email protected]
@cherieprochaska
-
Robots exclusion
Hi All, I have an issue whereby print versions of my articles are being flagged up as "duplicate" content / page titles. In order to get around this, I feel that the easiest way is to just add them to my robots.txt document with a disallow. Here is my URL make-up: Normal article: www.mysite.com/displayarticle=12345 Print version of my article: www.mysite.com/displayarticle=12345&printversion=yes I know that having dynamic parameters in my URL is not best practice, to say the least, but I'm stuck with this for the time being... My question is, how do I add just the print versions of articles to my robots.txt file without disallowing the articles too? Can I just add the parameter to the document like so? Disallow: &printversion=yes I also know that I can add a meta noindex, nofollow tag into the head of my print versions, but I feel a robots.txt disallow will be somewhat easier... Many thanks in advance. Matt
Technical SEO | Horizon
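On the question above: a bare `Disallow: &printversion=yes` would not work, because robots.txt rules are matched from the start of the URL path, so the Google-style pattern would be `Disallow: /*&printversion=yes` with a leading wildcard. A small sketch of why, using my own regex translation of the documented matching rules:

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    # Google-style matching: rules apply from the start of the URL
    # path, with '*' as a wildcard for any run of characters.
    return re.match(re.escape(rule).replace(r"\*", ".*"), path) is not None

url = "/displayarticle=12345&printversion=yes"
# A rule is matched from the path root, so a bare parameter never matches:
print(rule_matches("&printversion=yes", url))    # False
# A leading wildcard makes it match the print versions only:
print(rule_matches("/*&printversion=yes", url))  # True
print(rule_matches("/*&printversion=yes", "/displayarticle=12345"))  # False
```

As with any wildcard rule, this is a Google/Bing extension rather than part of the original robots.txt standard, so the meta noindex fallback mentioned in the question remains the more universally respected option.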