How to Submit an XML Sitemap with More Than 300 Subdomains?
-
Hi,
I am creating sitemaps for site which has more than 500 Sub domains. Page varies from 20 to 500 in all subdomains & it will keep on adding in coming months.I have seen sites that create separate sitemap.xml for each subdomain which they mention in separate robots.txt file http://windows7.iyogi.com/robots.txt
XML site map eg for subdomain: http://windows7.iyogi.com/sitemap.xml.gz ,
Currently in my website we have only 1 robots.txt file for main domain & sub domains.
Please tell me shall i create separate robots.txt & XML site map file for each subdomain or 1 file. Creating separate xml for each sub-domain is not feasible as we have to verify in GWT separately.
Is there any automatic way & do i have to ping separately if i add new pages in subdomain.
Please advise me.
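For reference, a per-subdomain robots.txt like the iyogi example only needs a `Sitemap:` line pointing at that subdomain's own sitemap; a minimal sketch (hostname hypothetical):

```
User-agent: *
Allow: /
Sitemap: http://gmat.example.com/sitemap.xml.gz
```

The sitemaps protocol requires the `Sitemap:` URL to be on the same host the robots.txt is served from, which is why each subdomain points at its own file rather than the main domain's.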
-
Let me know how it goes. I'm sure it can be done. It just needs the right team.
-
Yes, in WordPress that option is available, but we are on the Ruby on Rails platform, so I am not sure whether we can do it or not.
For example, http://windows7.iyogi.com/sitemap.xml.gz uses the WordPress CMS, and the page mentions that
"It was generated using the Blogging-Software WordPress and the Google Sitemap Generator Plugin by Arne Brachhold."
Anyway, thanks for your help. I will speak to my smart developers; let's see what they can do.
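To give the developers a starting point, here is a minimal plain-Ruby sketch of generating a sitemap.xml for one subdomain from a list of page paths. The domain, helper name, and page list are all hypothetical; a real Rails app would pull the paths from its database and write or gzip the file per subdomain.

```ruby
require "time"

# Build a sitemaps.org-format XML document for one subdomain.
# `paths` is the list of page paths known for that subdomain.
def build_sitemap(subdomain, paths)
  entries = paths.map do |path|
    <<~XML.chomp
      <url>
        <loc>http://#{subdomain}.example.com#{path}</loc>
        <lastmod>#{Time.now.utc.strftime("%Y-%m-%d")}</lastmod>
      </url>
    XML
  end

  <<~XML
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    #{entries.join("\n")}
    </urlset>
  XML
end

puts build_sitemap("gmat", ["/", "/practice-tests"])
```

Run on a schedule (or after publishing), this would keep each of the 500 sitemaps current without manual work.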
-
Okay, with this little bit of information it does sound like it might in fact be legitimate. If it is, then the best solution is to work with the development team to automate the creation of each sitemap.xml file and have them submitted to Google automatically. I know this is possible because I use the Google Sitemaps plug-in for WordPress, and it automatically submits to Google and Bing.
How it does that I do not know; that's for smart web developers to figure out and replicate.
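A sketch of the "automatic submission" step: the WordPress-plugin-era approach was to hit the engines' ping endpoints with the sitemap URL after each regeneration. The endpoints below are assumptions from that era; verify the current submission mechanism before relying on them.

```ruby
require "uri"
require "net/http" # a real job would GET each ping URL with Net::HTTP

# Build the search-engine ping URLs for a freshly regenerated sitemap.
def ping_urls(sitemap_url)
  escaped = URI.encode_www_form_component(sitemap_url)
  [
    "https://www.google.com/ping?sitemap=#{escaped}",
    "https://www.bing.com/ping?sitemap=#{escaped}",
  ]
end

puts ping_urls("http://gmat.example.com/sitemap.xml.gz")
```

Calling `Net::HTTP.get_response(URI(url))` on each of these after the sitemap job finishes would answer the original "do I have to ping separately?" question: the regeneration job pings for you.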
-
Hi Alan, I recently joined this company and can't change the whole structure.
I believe they have created virtual subdomains, and since site traffic is growing at a great rate they can't consider changing the structure.
Last month it was ranked the 20th most-visited website in India, so things are pretty fine. Moreover, it's an education website, and students can easily remember a subdomain URL such as http://gmat.abc.com; direct traffic to these subdomains is also very high. So how should I solve the XML sitemap problem?
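Since the subdomains are virtual (all served by one Rails app), one option is a single dynamic robots.txt endpoint that varies its body by request host, so each subdomain gets its own `Sitemap:` line without 500 static files. A minimal sketch of the handler logic, with the route wiring left to the app (names hypothetical):

```ruby
# Return the robots.txt body for whichever virtual subdomain the
# request arrived on; a Rails controller would call this with
# request.host and render the result as text/plain.
def robots_for(host)
  <<~TXT
    User-agent: *
    Allow: /
    Sitemap: http://#{host}/sitemap.xml.gz
  TXT
end

puts robots_for("gmat.abc.com")
```

The same host-based dispatch works for `/sitemap.xml.gz` itself, which sidesteps maintaining hundreds of files by hand.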
-
The more important, and urgent, issue is why there are so many subdomains, and why there are going to be more. That has to be one of the most serious and potentially harmful things you could do to your SEO efforts, unless it's an extremely rare situation that justifies the tactic.