Why doesn't Webmaster Tools show backlink data for my site?
-
Hi everyone,
I have two websites on the same domain name, each targeting a different country: for example, www.domain.com.au and www.domain.co.nz. My question is that in Webmaster Tools, one of the domains doesn't show any backlink data or search query data. Why does this happen?
-
Hi,
I activated Webmaster Tools for this site about six months ago.
-
How long has Google Webmaster Tools been activated for the site that's not working?
-
Hi Martin,
Thanks for the reply. I am doing SEO for both sites, but one site shows data correctly and the other doesn't show anything. Why does this happen? I have checked the Webmaster Tools verification and all of that.
-
How long has the website been live? How much traffic are you getting to the site? Where is that traffic coming from? Do you know if links exist?
It takes time for data to show up in GWT and sometimes the best approach is to keep on promoting your website, 'doing SEO' and working hard and then you won't have to 'be patient' to wait for data.
In the interim, use Google Analytics or maybe Clicky; then you can at least get your keyword information right up to yesterday.
Related Questions
-
Old pages not mobile friendly - new pages in process but don't want to upset current traffic.
Working with a new client. They have what I would describe as two virtual websites: same domain, but different coding, navigation, and structure. The old virtual website's pages fail the mobile-friendly test; they were not designed to be responsive (there really is no way to fix them), but they are ranking and getting traffic. The new virtual website's pages pass the mobile-friendly test but are not yet SEO-optimized, so they are not ranking and not getting organic traffic. My understanding is that NOT mobile-friendly is a "site" designation, and although the offending pages are listed, it is not a "page" designation. Is this correct? If my understanding is true, what would be the best way to hold onto the rankings and traffic generated by the old virtual website's pages and resolve the "NOT mobile friendly" problem until the new pages have surpassed the old ones in ranking and traffic? A proposal was made to redirect any mobile traffic on the old pages to mobile-friendly pages. What will happen to SEO if this is done? The pages would pass mobile-friendly because they would go to mobile-friendly pages, I assume, but what about link equity? Would they see a drop in traffic? Any thoughts? Thanks, Toni
Technical SEO | | Toni70 -
Webmaster Tools showing a 200 page load OK - all other testing tools show a 301
Hey, on https://www.xxx.co we've set up a 301 redirect to xxx.us, BUT in Webmaster Tools it's still showing a 200 load OK, whereas all other testing tools show a 301 redirect (Screaming Frog etc.). Even https://dns.google.com/query?name=www.xxx.co shows that it's 301 redirected. Any ideas? We want to trigger the Change of Address tool in WMT, and it says it can't because it still loads the homepage...
Technical SEO | | RobertN-London0 -
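For a 200-vs-301 mismatch like the one above, the first step is to confirm what a single, un-followed request actually returns. A minimal sketch: since the real domain is anonymized as xxx.co, it uses a throwaway local server as a stand-in for the redirecting site. Python's `http.client` never follows redirects, so it reports the raw first-hop status the way a crawler would:

```python
import http.client
import http.server
import threading

# Stand-in for the redirecting site in the question (www.xxx.co):
# a local server that answers every GET with a 301.
class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", "https://example.us/")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client does NOT follow redirects, so this shows the raw
# status code and Location header of the first hop.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
print(resp.status)                 # 301
print(resp.getheader("Location"))  # https://example.us/
server.shutdown()
```

Point the same check at the live domain (and at both the http/https and www/non-www variants): if any one of them still answers 200, that is likely the version Webmaster Tools is fetching.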
Selling same products under separate brands and can't consolidate sites...duplicate content issues?
I have a client selling home goods online and in-store under two different brand names in separate regions of the country. Currently, the websites are completely identical aside from branding. It is unlikely that they would have the capacity to write unique titles and page content for each website (~25,000 pages each), and the business would never consolidate the sites. Would it make sense to use canonical tags pointing to the higher-performing website on category and product pages? This way we could continue to capture branded search to the lesser brand while consolidating authority on the better performing website. What would you do?
Technical SEO | | jluke.fusion0 -
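A cross-domain canonical, as proposed in the question above, is just a single link element in the head of each page on the lesser brand's site. A sketch with placeholder domains (brand-a/brand-b are assumptions, not the client's actual names):

```html
<!-- In the <head> of https://www.brand-b.com/sofas/blue-sofa (lesser brand),
     consolidating ranking signals onto the stronger site's equivalent page: -->
<link rel="canonical" href="https://www.brand-a.com/sofas/blue-sofa" />
```

Note that Google treats rel="canonical" as a hint rather than a directive, and canonicalized duplicate pages generally drop out of the index, which is worth weighing against the goal of still capturing branded search for the lesser brand.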
404s still showing in GWT
Hi, My client recently undertook a site migration. Since the new site went live, GWT has highlighted over 2,000 'not found' errors. These were fixed nearly two weeks ago, and they're still being listed in GWT. Do I have to wait for Google to re-crawl the pages before they're removed from the list? Or do I need to go through the list, check each one individually, and mark them as fixed? Any help would be appreciated. Thanks
Technical SEO | | ChannelDigital0 -
Removing thousands of shady backlinks?
Hey guys, We've been hired to redesign a website that has thousands of backlinks created by a (possibly) shady offshore company, and I'm wondering if anyone out there has experience dealing with a deletion of this size and type. Is it as simple as just disavowing the whole lot? Thanks, Jason
Technical SEO | | JKorolenko0 -
Best practice: unique meta descriptions on blog 'tag' pages
Hi everyone, I'm curious: are there best practices for introducing unique meta descriptions on blog tag pages (I'm using WordPress)? For instance, using Platinum SEO, on an original post the meta description is either the excerpt or a specified custom sentence. It doesn't appear that Platinum SEO allows for custom descriptions on tag pages. Love to hear your thoughts. Thanks! Peter
Technical SEO | | peterdbaron1 -
/$1 URL Showing Up
Whenever I crawl my site with any kind of bot or sitemap generator, it comes up with a /$1 version of my URLs. For example: it gives me hdiconference.com and hdiconference.com/$1, and hdiconference.com/purchases and hdiconference.com/purchases/$1. Then I get warnings saying that it's duplicate content. Here's the problem: I can't find these /$1 URLs anywhere. Even when I type them in, I get a 404 error. I don't know what they are or where they came from, and I can't find them when I scour my code. So I'm trying to figure out where the crawlers are picking this up. If sitemap generators and other site crawlers are seeing them, I have to assume that Googlebot is seeing them as well. Any help? My developers are at a loss as well.
Technical SEO | | HDI0 -
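One common source of literal /$1 URLs (an assumption here, since the site's actual rewrite config isn't shown) is a redirect or rewrite rule written with $1-style backreferences for an engine that doesn't support that syntax, so "$1" is emitted verbatim into the generated link. Python's `re` module, which uses \1 rather than $1, illustrates the failure mode:

```python
import re

# A rewrite rule whose author assumed $1-style backreferences.
# Python's re.sub only understands \1 / \g<1>, so "$1" passes through
# as literal text -- exactly the kind of bug that mints /$1 URLs.
wrong = re.sub(r"^/(.*)$", r"/purchases/$1", "/old-path")
right = re.sub(r"^/(.*)$", r"/purchases/\1", "/old-path")

print(wrong)  # /purchases/$1  <- literal "$1", the broken form
print(right)  # /purchases/old-path
```

Worth grepping the server config, redirect plugins, and any sitemap-generator templates for a literal `$1` in a context where the surrounding engine expects a different backreference syntax.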
Issue with 'Crawl Errors' in Webmaster Tools
Have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st-15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and, as I expected, they don't. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors. The guidance is that if they keep returning a 404, they will automatically be removed. Well, I don't know how many times they need to get that 404 in order to get rid of a URL and link that haven't existed for 18-24 months! Thanks.
Technical SEO | | RiceMedia0
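The manual "Linked From" spot-checks described above can be scripted: fetch each source page's HTML and test whether it still contains an anchor to the broken URL. A self-contained sketch using the standard-library parser (the page content and URLs below are made-up examples, not the site's real data):

```python
import html.parser

# Collect every href from anchor tags in a page's HTML.
class LinkCollector(html.parser.HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

# HTML of a "Linked From" source page (hypothetical example content).
page_html = '<p>Old post. <a href="/articles/live-page">still here</a></p>'
broken_url = "/articles/deleted-page"

parser = LinkCollector()
parser.feed(page_html)
print(broken_url in parser.links)  # False: the stale link really is gone
```

Looping this over the exported list of 'Linked From' URLs gives documented evidence that the links no longer exist while waiting for Google's crawl-error list to age out.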