100K Webmaster Central Not Found Links?
-
http://screencast.com/t/KLPVGTzM I just logged into our Webmaster Central account to find that it shows 100K "not found" links. After searching through all of them, they all appear to be from our search bar, with no results. Are we doing something wrong here?
-
Ya, I read through that article yesterday & see that they recommend the same setting the Yoast plugin should already be applying. Although I never got a response from him to see if there is something missing.
For now, I plan on adding this to the robots.txt file & seeing what results I get.
Do you know the time frame that it takes to get the updates in GWT? Will this update within a few weeks or would it take longer than that?
Thanks for all the help!
BJ
-
Hello BJ.
The robots.txt file must be on your server, in the document root.
Here is information about how to configure robots.txt
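As a hypothetical example (the exact paths depend on how your site builds its search URLs; `/?s=` is the WordPress search pattern used on the site in question), the entry might look like:

```txt
# Block crawlers from internal search result pages
User-agent: *
Disallow: /?s=
Disallow: /search/
```

Major crawlers treat each Disallow value as a URL prefix, so these two rules cover both the query-string and path-style search URLs.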
Note that it does have a warning at the end, about how you could possibly lose some link juice, but that is probably a much smaller problem than the problem you are trying to fix.
Nothing is perfect, and with the rate that google changes its mind, who knows what is the right thing to do this month.
Once you have edited robots.txt, you don't need to do anything.
- except I just had a thought - how to get google to remove those items from your webmaster tools. I think you should be able to tell them to purge those entries from GWT. Set it so you can see 500 to a page and then just cycle through and mark them fixed.
-
Sorry to open this back up after a month. In adding this to the robots.txt file, is there something that needs to be done within the code of the site? Or can I simply update the robots.txt file within Google Webmaster Tools?
I was hoping to get a response from Yoast on his blog post, it seems there were a number of questions similar to mine, but he didn't ever address them.
Thanks,
BJ
-
We all know nothing lasts forever.
A code change can do all kinds of things.
Things that were important are sometimes less important, or not important at all.
Sometimes yesterday's advice is no longer true.
If you make a change, or even if you make no change, but the crawler or the indexer changes, then we can be surprised at the results.
While working on this other thread:
http://www.seomoz.org/q/is-no-follow-ing-a-folder-influences-also-its-subfolders#post-74287
I did a test and checked my logs. A nofollow meta tag and a nofollow link do not stop the crawlers from following. What it does (we think) is to not pass pagerank. That is all it does.
That is why the robots.txt file is the only way to tell the crawlers to stop following down a tree. (until there is another way)
-
Ok, I've posted a question on the Yoast.com blog to see what other options we might have. Thanks for the help!
-
It is because Roger ignores those META tags.
Also, google often ignores them too.
The robots.txt file is a much better option for those crawlers.
There are some crawlers that ignore the robots file too, but you have no control over them unless you can put their IPs in the firewall or add code to ignore all of their requests.
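As a sketch of the firewall/server-level option (assuming an Apache server; the IP address below is a placeholder from the documentation-example range, not a real crawler's address):

```txt
# .htaccess (Apache 2.2 syntax): deny a misbehaving crawler by IP
Order allow,deny
Deny from 192.0.2.10
Allow from all
```

This only helps against crawlers that keep a stable IP range; user-agent strings can also be matched, but rogue bots routinely fake those.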
-
Ok, I just did a little more research into this, to see how Yoast was handling this within the plugin & came across this article: http://yoast.com/example-robots-txt-wordpress/
In the article he states that this is already included within the plugin on search pages:
I just confirmed this, by doing this search on my site & looking at the code: http://www.discountqueens.com/?s=candy
So this has always been in place. Why would I still have the 100K not found links still showing up?
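For reference, the tag the Yoast plugin typically adds to search result pages (paraphrased here, not quoted from the article) is a robots meta tag like:

```html
<meta name="robots" content="noindex,follow" />
```

Note that noindex keeps the pages out of the index but, as discussed earlier in this thread, does not stop crawlers from fetching them, which is why the "not found" reports can still pile up.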
-
We didn't have these errors showing up previously, so that's why I was really suspicious. Also, we have Joost de Valk's SEO plugin installed on our site & I thought there was an option to keep the searches from being indexed?
-
Just to support Alan Gray's response, I'll say it's very important to block crawlers from your site search, because it not only throws errors (bots try to guess what to put in a search box), but also because any search results that get into the index will cause content conflicts, dilute ranking values, and worst case scenario, potentially create the false impression that you have a lot of very thin content / near duplicate content pages.
-
The search bar results are good for searchers but not for search engines. You can stop all search engines and Roger (the seomoz crawler) from going into those pages by adding an entry to your robots.txt file. Roger only responds to his own section of the robots file, so anything you make global will not work for him.
User-agent: rogerbot
Disallow: /search/*
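If you want to sanity-check an entry like this before deploying it, Python's standard-library robotparser can simulate how a compliant crawler reads the file (a minimal sketch; note the stdlib parser does plain prefix matching, so the rule is written here without the trailing `*`):

```python
# Quick sanity check that a robots.txt entry blocks the crawler
# you intend it to block, using Python's stdlib parser.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: rogerbot
Disallow: /search/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# rogerbot is blocked from search pages...
print(rp.can_fetch("rogerbot", "http://www.example.com/search/candy"))   # False
# ...but other crawlers are unaffected, since the rule is agent-specific.
print(rp.can_fetch("Googlebot", "http://www.example.com/search/candy"))  # True
```

This also illustrates the point above: a rule placed only in rogerbot's section does nothing for other user-agents.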