100K Webmaster Central Not Found Links?
-
http://screencast.com/t/KLPVGTzM I just logged into our Webmaster Central account and found that it shows 100K links that are not found. After searching through them, they all appear to be from our search bar, with no results. Are we doing something wrong here?
-
Yes, I read through that article yesterday and see that they recommend the same setting the Yoast plugin should already be applying. I never got a response from him, though, to see if something is missing.
For now, I plan on adding this to the robots.txt file and seeing what results I get.
Do you know the time frame it takes for the updates to show in GWT? Will this update within a few weeks, or will it take longer than that?
Thanks for all the help!
BJ
-
Hello BJ.
The robots.txt file must be on your server, in the document root.
Here is information about how to configure robots.txt.
Note that it does have a warning at the end about how you could possibly lose some link juice, but that is probably a much smaller problem than the one you are trying to fix.
Nothing is perfect, and with the rate at which Google changes its mind, who knows what the right thing to do is this month.
Once you have edited robots.txt, you don't need to do anything else.
Except, I just had a thought: you can also get Google to remove those items from your Webmaster Tools. You should be able to tell it to purge those entries from GWT. Set the view to 500 per page, then just cycle through and mark them as fixed.
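For reference, a typical entry for keeping crawlers out of WordPress search-result URLs might look something like this. The exact paths depend on your permalink setup, so treat this as a sketch rather than a drop-in rule:

```
User-agent: *
Disallow: /?s=
Disallow: /search/
```

The first rule covers the default WordPress search query string, and the second covers pretty-permalink search URLs, if your site uses them.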
-
Sorry to open this back up after a month. In adding this to the robots.txt file, is there something that needs to be done within the code of the site? Or can I simply update the robots.txt file within Google Webmaster Tools?
I was hoping to get a response from Yoast on his blog post; it seems there were a number of questions similar to mine, but he never addressed them.
Thanks,
BJ
-
We all know nothing lasts forever.
A code change can do all kinds of things. Things that were important sometimes become less important, or not important at all. Sometimes yesterday's advice is no longer true.
If you make a change, or even if you make no change but the crawler or the indexer changes, the results can surprise us.
While working on this other thread:
http://www.seomoz.org/q/is-no-follow-ing-a-folder-influences-also-its-subfolders#post-74287
I did a test and checked my logs. A nofollow meta tag and a nofollow link do not stop the crawlers from following. What nofollow does (we think) is prevent PageRank from being passed. That is all it does.
That is why the robots.txt file is the only way to tell the crawlers to stop following down a tree (until there is another way).
-
OK, I've posted a question on the Yoast.com blog to see what other options we might have. Thanks for the help!
-
It is because Roger ignores those META tags.
Google often ignores them too.
The robots.txt file is a much better option for those crawlers.
Some crawlers ignore the robots.txt file as well, but you have no control over them unless you can put their IPs in the firewall or add code to reject all of their requests.
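For crawlers that ignore robots.txt, a server-level block is one option. As a sketch, on an Apache 2.2-style setup an .htaccess rule denying a specific IP might look like this (the IP here is a placeholder from the documentation range, and the directive syntax differs on newer Apache versions, which use Require instead):

```
# Block one misbehaving crawler by IP; everyone else is allowed.
Order Allow,Deny
Allow from all
Deny from 192.0.2.10
```

With Order Allow,Deny, the Deny directives are evaluated last, so the listed IP is refused while all other visitors get through.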
-
OK, I just did a little more research into this to see how Yoast handles it within the plugin, and came across this article: http://yoast.com/example-robots-txt-wordpress/
In the article he states that this is already included within the plugin on search pages:
I just confirmed this by doing this search on my site and looking at the code: http://www.discountqueens.com/?s=candy
So this has always been in place. Why would the 100K not-found links still be showing up?
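For anyone following along, what the plugin adds to search-result pages is presumably a robots meta tag along these lines (the exact attribute values may differ by plugin version, so check your own page source):

```
<meta name="robots" content="noindex,follow" />
```

This tells compliant crawlers not to index the search-result page, while still following the links on it.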
-
We didn't have these errors showing up previously, so that's why I was really suspicious. Also, we have Joost de Valk's SEO plugin installed on our site, and I thought there was an option to stop the searches from being indexed?
-
Just to support Alan Gray's response: it's very important to block crawlers from your site search, not only because it throws errors (bots try to guess what to put in a search box), but also because any search results that get into the index will cause content conflicts, dilute ranking values, and, worst case, create the false impression that you have a lot of very thin or near-duplicate content pages.
-
The search bar results are good for searchers but not for search engines. You can stop all search engines, and Roger (the SEOmoz crawler), from going into those pages by adding an entry to your robots.txt file. Roger only responds to his own section of the robots.txt file, so anything you make global will not work for him.
User-agent: rogerbot
Disallow: /search/*
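If you want to sanity-check an entry like that before deploying it, Python's standard urllib.robotparser can evaluate rules against a user agent. Note that it only does simple prefix matching, so the trailing * is dropped in this sketch (real crawlers like Googlebot do support it):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: block rogerbot from /search/,
# leave everything else open.
rules = """\
User-agent: rogerbot
Disallow: /search/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# rogerbot is blocked from search pages...
print(rp.can_fetch("rogerbot", "/search/candy"))   # False
# ...but crawlers with no section of their own are not.
print(rp.can_fetch("Googlebot", "/search/candy"))  # True
```

This mirrors the point above: because the rule sits in rogerbot's own section, only Roger is affected by it.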