Links to my site still showing in Webmaster Tools from a non-existent site
-
We owned two sites, with every page on Site A linking to a similar page on Site B. We wanted to remove the links from Site A to Site B, so we first redirected all of those links to Site A's homepage and then took Site A down completely. Unfortunately, the links from Site A are still showing up in Google Webmaster Tools for Site B.
Does anybody know what else we can do to remove these links?
-
Google often has a long memory. You've removed the links; now you have to wait for Google to forget them. It might take weeks, but more often it takes months. I've seen links in Google Webmaster Tools that are over two years old.
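Waiting is really the only cure for stale link reports, but if you ever worry that the leftover links could be treated as unnatural, Google's disavow tool accepts a plain-text file where `#` lines are comments, `domain:` lines disavow an entire domain, and bare URLs disavow single pages. A minimal sketch that generates such a file (the `site-a.example` domain is a placeholder for your old Site A):

```python
# Sketch: build a disavow.txt in the plain-text format Google Search Console
# accepts. "site-a.example" is a hypothetical stand-in for the old Site A.

def build_disavow(domains, urls=()):
    """Return disavow-file text: a comment, whole-domain lines, then bare URLs."""
    lines = ["# Links from our old, now-removed Site A"]
    lines += [f"domain:{d}" for d in domains]   # disavow entire domains
    lines += list(urls)                         # disavow individual pages
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    text = build_disavow(["site-a.example"])
    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write(text)
    print(text)
```

To be clear, disavowing won't make the links disappear from the Webmaster Tools report any faster; it only tells Google to ignore them when evaluating your site, so treat it as a belt-and-braces option rather than a fix.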