Using one robots.txt for two sites
-
I have two sites that are hosted in the same CMS. Rather than having two separate robots.txt files (one for each domain), my web agency has created a single file which lists the sitemaps for both sites, like this:
-
You can use the same directives on both root domains, but realistically a robots.txt file can't point at anything on another host (with the exception of XML sitemaps). The only other option I see is serving one file from one server in response to requests from both sites, but that seems like a lot of engineering overkill for a very simple problem (just duplicate the file).
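As noted above, sitemap references are the one cross-host exception: the Sitemap directive takes an absolute URL, so a single robots.txt can list sitemaps that live on other domains. A minimal sketch of such a file (hypothetical domains, not the asker's actual file):

```text
# robots.txt served at https://site-one.example/robots.txt
User-agent: *
Disallow:

# Sitemap URLs are absolute, so they may point at other hosts
Sitemap: https://site-one.example/sitemap.xml
Sitemap: https://site-two.example/sitemap.xml
```

The rules above the Sitemap lines still only govern the host that served the file.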
-
@eulabrant It is advisable to use a separate robots.txt file for each property.
Note that a subdomain counts as a separate host: crawlers request robots.txt from the subdomain itself, so the parent domain's file does not apply to it.
-
Hey @eulabrant You didn't really ask a question here, but I'll assume you want to know whether you can use one robots.txt for two sites?
You could duplicate it, but you can't use a single copy for both. I also assume part of this post is missing, as you refer to "this:" without showing what that 'this' is.
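To illustrate the point above: crawlers resolve robots.txt per host, requesting /robots.txt from each domain before crawling it, so identical rules simply have to be served by each site. A small sketch with Python's standard urllib.robotparser (hypothetical domain and rules):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt body, served identically on both domains.
rules = """\
User-agent: *
Disallow: /admin/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The parsed rules apply only to whichever host served the file;
# each domain must serve its own copy at /robots.txt.
print(rp.can_fetch("*", "https://site-one.example/admin/page"))  # False
print(rp.can_fetch("*", "https://site-one.example/blog/post"))   # True
```

Duplicating the file (or generating it from one template in the CMS) gives both hosts the same behaviour with no extra engineering.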
Related Questions
-
Robots.txt in subfolders and hreflang issues
A client recently rolled out their UK business to the US. They decided to deploy with 2 WordPress installations:
UK site - https://www.clientname.com/uk/ - robots.txt location: https://www.clientname.com/uk/robots.txt
US site - https://www.clientname.com/us/ - robots.txt location: https://www.clientname.com/us/robots.txt
We've had various issues with /us/ pages being indexed in Google UK, and /uk/ pages being indexed in Google US. They have the following hreflang tags across all pages: We changed the x-default page to .com 2 weeks ago (we've tried both /uk/ and /us/ previously). Search Console says there are no hreflang tags at all. Additionally, we have a robots.txt file on each site which links to the corresponding sitemap files, but when viewing the robots.txt tester in Search Console, each property shows the robots.txt file for https://www.clientname.com only, even though when you actually navigate to this URL (https://www.clientname.com/robots.txt) you'll get redirected to either https://www.clientname.com/uk/robots.txt or https://www.clientname.com/us/robots.txt depending on your location. Any suggestions how we can remove UK listings from Google US and vice versa?
Technical SEO | lauralou82
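For reference, a working hreflang setup for a /uk/ and /us/ split would look something like the snippet below (a hypothetical reconstruction, not the client's actual tags): each page must reference itself plus every alternate, and each alternate must link back.

```html
<!-- In the <head> of every page, pointing at the equivalent page in each region -->
<link rel="alternate" hreflang="en-gb" href="https://www.clientname.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.clientname.com/us/" />
<link rel="alternate" hreflang="x-default" href="https://www.clientname.com/" />
```

Note that a geolocation redirect on https://www.clientname.com/robots.txt will confuse crawlers, which generally crawl from US IPs and will only ever see one version.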
Site Migration from One Dev. and Server to Another Dev. and Server
Hi Mozzers! I've got a client that is in the early stages of moving the development of their site to another company and therefore, a new server. The site is very large and the migration will take place over 18 months. In the beginning, smaller chunks of the site will be moved, and as that process gets dialed in, larger portions will migrate. It was brought to our attention today that they (on either side of development) have not yet worked out the logistics of keeping the domain and URL structure consistent throughout the migration. The initial proposal was that they publish newly migrated pages to a subdomain, which we obviously want to steer away from. I'm now on a mission to find a solution that will make everyone happy; client, old dev, new dev, and us (as the SEO partner). Does anyone have experience in managing SEO through a migration such as this?
Technical SEO | LoganRay
Robots.txt
Hello, My client has a robots.txt file which says this:
User-agent: *
Crawl-delay: 2
I put it through a robots.txt checker which said that it must have a **Disallow** directive. So should it say this:
User-agent: *
Disallow:
Crawl-delay: 2
What effect (if any) would not having a Disallow directive have? Thanks
Technical SEO | AL123al
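On the question above: the original 1994/1997 robots.txt conventions expect at least one Disallow line per group, which is likely why the checker complained; an empty Disallow: satisfies it and explicitly allows everything, so in practice the two files behave the same for compliant crawlers. A quick check of the proposed file with Python's standard urllib.robotparser:

```python
from urllib.robotparser import RobotFileParser

# The proposed file: an empty Disallow (allow all) plus a crawl delay.
rules = """\
User-agent: *
Disallow:
Crawl-delay: 2
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.crawl_delay("*"))                                # 2
print(rp.can_fetch("*", "https://example.com/any-page"))  # True
```

Note that Google ignores Crawl-delay entirely, so the directive only affects crawlers that choose to honour it (e.g. Bing).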
Two Domains for the Same Page
We are creating a website for a client that will have hundreds of geographically driven landing pages. These pages will all have a similar URL structure, for example www.domain.com/georgia-atlanta-fastfood-121. We want the URL to be SEO friendly, but it also needs to be print friendly for a business card (e.g. www.domain.com/121). The client has requested that we have two URLs for each page: one for the search engines and a shorter one for print/advertising purposes. If we do that, will search engines penalize the site for duplicate content? I really appreciate any recommendations. Thanks! Anna
Technical SEO | TracSoft
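One common pattern for the situation above: let both addresses resolve, but either 301-redirect the short print URL to the long one, or serve the same page at both with a canonical tag pointing at the long URL (hypothetical markup using the URLs from the question):

```html
<!-- Served at both www.domain.com/georgia-atlanta-fastfood-121 and
     www.domain.com/121; the canonical tag tells engines which URL to index. -->
<link rel="canonical" href="https://www.domain.com/georgia-atlanta-fastfood-121" />
```

A 301 redirect from the short URL is generally the more robust choice, since visitors and links then consolidate on a single address.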
Photography Sites with Same Developer - Why Is One Ranking & Other Not?
I'm currently confused about the difference in ranking between two competing sites created by the same agency. http://jmayphoto.com/index2.php#!/home (302 redirected from http://jmayphoto.com ...yeah) is not ranking well, and I'm not surprised. However, competitor http://www.shanrenee.com/ is ranking within the top 5 spots for a primary target keyword (dallas wedding photographer), and I don't understand how it's doing so well. I definitely see differences, but not enough to explain how Shan Renee ranks so well as essentially a one-page site. What am I missing?
Technical SEO | BrittanyHighland
Could multiple languages on one site be bad for SEO?
Our site has content in English and in Spanish. The Spanish side was translated by me; Spanish is my first language, so I know that the translations are good and it's original content. We were Pandalized/Penguinized pretty badly earlier this year. We have completely cleaned our site of anything that could be considered thin content or grey-hat techniques. An associate is telling me that we need to put the Spanish version of the site on its own domain; does this make sense to anyone? The Spanish side of the site gets only about 5% of the visitors, but I still don't see the logic in taking all those pages and putting them on a different domain. Would this help recover from Panda/Penguin? Thanks
Technical SEO | 858-SEO
Blocking URLs with query strings using robots.txt
Dear all, how can I block URLs/pages with query strings like page.html?dir=asc&order=name with robots.txt? Thanks!
Technical SEO | HMK-NL
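For reference on the question above: major crawlers such as Googlebot and Bingbot support * wildcards in robots.txt paths (an extension, not part of the original standard), so query-string URLs can be blocked along these lines (a sketch, matching the example parameters in the question):

```text
User-agent: *
# Block every URL that contains a query string:
Disallow: /*?
# Or, more narrowly, only the sorted listing variants:
Disallow: /*?dir=
Disallow: /*&dir=
```

Crawlers that don't support wildcards will ignore these rules, so a robots meta noindex tag on the parameterized pages is a safer belt-and-braces option.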
Mobile site - allow robot traffic
Hi, If a user comes to our site from a mobile device, we redirect to our mobile site. That is, www.mysite/mypage redirects to m.mysite/mypage. Right now we are blocking robots from crawling our m. site; previously there were concerns the m. site could rank for normal browser searches. To make sure this isn't a problem, we are planning to add rel=canonical tags on our m. site pages referencing the www pages (mobile is just a different version of our www site). From my understanding, having a mobile version of a page is a ranking factor for mobile searches, so allowing robots is a good thing. Before doing so, I wanted to see if anyone had any other suggestions/feedback (looking for potential pitfalls, issues, etc.)
Technical SEO | NicB1
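The setup described above matches Google's documented "separate URLs" mobile configuration: the desktop page advertises the mobile alternate, and the mobile page canonicals back to desktop. Hypothetical markup for the example hostnames (assuming .com domains):

```html
<!-- On the desktop page www.mysite.com/mypage: -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.mysite.com/mypage" />

<!-- On the mobile page m.mysite.com/mypage: -->
<link rel="canonical" href="https://www.mysite.com/mypage" />
```

With this bidirectional annotation in place, the m. pages should be crawlable (not blocked in robots.txt), since the canonical tag already prevents them from ranking in place of the desktop URLs.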