Subdomain Research Tool
-
Does anybody know of a research tool that can track the number of subdomains on a root domain?
Maybe there is a way to manipulate a Google search to display the different subdomains that are indexed?
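(For context: there's no single search operator that returns a subdomain count, but if you can export a list of crawled or indexed URLs, a few lines of Python can tally the distinct subdomains. This is just a minimal sketch; acme.com and the URLs below are hypothetical placeholders.)

```python
from urllib.parse import urlparse

def list_subdomains(urls, root_domain):
    """Return the distinct hostnames under root_domain seen in a list of URLs."""
    subdomains = set()
    for url in urls:
        host = urlparse(url).hostname or ""
        # Keep the root domain itself and anything ending in ".root_domain"
        if host == root_domain or host.endswith("." + root_domain):
            subdomains.add(host)
    return sorted(subdomains)

urls = [
    "https://www.acme.com/page",
    "https://blog.acme.com/post",
    "https://news.acme.com/item",
    "https://other.com/x",  # different root domain, ignored
]
print(list_subdomains(urls, "acme.com"))
# → ['blog.acme.com', 'news.acme.com', 'www.acme.com']
```

The count is then just the length of the returned list.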
-
Cool, looking forward to the ad-hoc query tool!
-
Thanks for the update, Jon! Looking forward to the updates to come and to being able to manipulate the data in new ways. It should be exciting figuring out new ways to explore things like subdomain counts.
Thanks again!
-
Hey guys!
So I spoke to the MozScape team, and right now this is not possible. Although in theory we have the data to answer this question, the sheer size of the dataset means the engineering teams have to make data-structure and optimization decisions that favor certain use cases (e.g. 'show me all of the external followed links for a root domain'). Currently, MozScape is not optimized to answer the use case 'show me all of the subdomains on a given root domain'.
However, as you may know, we are working on index updates that will change the way we store data. This is a huge project, but once it is completed we will be able to run ad-hoc queries against our data and solve use cases like this one.
Hope this helps!
Jon
-
This would be a very handy addition. As Michael said, using search qualifiers is no quick (or foolproof) solution. This would be a great tool for site audits, which is what I was looking for. It would also be useful for investigating other sites (that have been crawled by Moz), for example to quickly find blog subdomains or other-language subdomains.
Anyway, thanks for the response - looking forward to seeing if this becomes an available tool!
-
Thanks, Michael - great idea! I could see this fitting into the MozBar, or maybe into OSE. Let me do some digging into how our index is structured, and I will post an update here on feasibility.
-
That's a good question... I've not seen a tool that magically does all of that, but Moz could certainly derive it from the data gathered during crawls. I'll pass the idea along to Jon White.
I would use this tool myself during site audits, when I'm looking to see if the client's site has subdomains other than www that might be worth consolidating onto the www subdomain.
Today, I do it arduously with Google site: and -inurl: queries, e.g.
site:acme.com -inurl:www -inurl:blog
and then, when a new subdomain such as news.acme.com appears in the results, I append -inurl:news to the site: search.
This doesn't work if the client has decided that the www-less version of their domain is their preferred one... in that case, I'm totally out of luck.
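The iterative workflow above (exclude each subdomain as you discover it) can be sketched as a small query builder. This is just a convenience sketch, not anything the tools provide; acme.com and the subdomain names are the hypothetical examples from the post:

```python
def build_site_query(root_domain, known_subdomains):
    """Build a Google query that excludes subdomains already discovered."""
    exclusions = " ".join(f"-inurl:{s}" for s in known_subdomains)
    return f"site:{root_domain} {exclusions}".strip()

known = ["www", "blog"]
print(build_site_query("acme.com", known))
# → site:acme.com -inurl:www -inurl:blog

# When news.acme.com shows up in the results, append it and re-run:
known.append("news")
print(build_site_query("acme.com", known))
# → site:acme.com -inurl:www -inurl:blog -inurl:news
```

One caveat worth noting: -inurl:www excludes any URL containing "www" anywhere in the URL, not just as a subdomain label, so the results can be over-filtered.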