100K Webmaster Central Not Found Links?
-
http://screencast.com/t/KLPVGTzM I just logged into our Webmaster Central account to find that it shows 100K links that are not found. After searching through them, they all appear to be from our search bar, with no results. Are we doing something wrong here?
-
Yeah, I read through that article yesterday and see that they recommend the same setting the Yoast plugin should already be applying. I never got a response, though, so I can't tell if something is missing.
For now, I plan on adding this to the robots.txt file and seeing what results I get.
Do you know the time frame it takes for the updates to show in GWT? Will this update within a few weeks, or would it take longer than that?
Thanks for all the help!
BJ
-
Hello BJ.
The robots.txt file must be on your server, in the document root.
Here is information about how to configure robots.txt
Note that it does have a warning at the end about how you could possibly lose some link juice, but that is probably a much smaller problem than the one you are trying to fix.
Nothing is perfect, and with the rate at which Google changes its mind, who knows what the right thing to do is this month.
Once you have edited robots.txt, you don't need to do anything.
- except I just had a thought about how to get Google to remove those items from your Webmaster Tools. You should be able to tell them to purge those entries from GWT: set the view to 500 per page, then just cycle through and mark them fixed.
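To make the robots.txt edit concrete, a block along these lines should cover WordPress-style search URLs like the ones in your screenshot. The /?s= pattern is an assumption based on your site, so double-check the exact form of your search URLs first:

```
User-agent: *
Disallow: /?s=
Disallow: /search/
```

Google's crawler also understands * wildcards in paths, so Disallow: /*?s= would catch search queries on deeper URLs too, but not every crawler supports wildcards, which is why the plain prefix forms are safer as a baseline.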
-
Sorry to open this back up after a month. In adding this to the robots.txt file, is there something that needs to be done within the code of the site? Or can I simply update the robots.txt file within Google Webmaster Tools?
I was hoping to get a response from Yoast on his blog post; there were a number of questions similar to mine, but he never addressed them.
Thanks,
BJ
-
We all know nothing lasts forever.
A code change can do all kinds of things.
Things that were important are sometimes less important, or not important at all.
Sometimes yesterday's advice is no longer true.
If you make a change, or even if you make no change, but the crawler or the indexer changes, then we can be surprised at the results.
While working on this other thread:
http://www.seomoz.org/q/is-no-follow-ing-a-folder-influences-also-its-subfolders#post-74287
I did a test and checked my logs. A nofollow meta tag and a nofollow link do not stop the crawlers from following. What they do (we think) is withhold PageRank. That is all they do.
That is why the robots.txt file is the only way to tell the crawlers to stop following down a tree (until there is another way).
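The log check I mentioned is easy to script. Here is a rough sketch, assuming a combined-format access log; the regex and the "Googlebot" match string are only illustrative, so adapt them to whatever your own server writes:

```python
import re

# Match combined-log-format lines and capture the request path and user-agent string.
LOG_LINE = re.compile(
    r'^\S+ \S+ \S+ \[[^\]]+\] '      # host, identd, user, [timestamp]
    r'"(?:GET|HEAD) (\S+) [^"]*" '   # request method and path
    r'\d+ \S+ "[^"]*" "([^"]*)"'     # status, bytes, referer, user-agent
)

def crawler_hits(lines, bot_substring="Googlebot"):
    """Count requests per path made by any client whose UA contains bot_substring."""
    hits = {}
    for line in lines:
        m = LOG_LINE.match(line)
        if m and bot_substring in m.group(2):
            path = m.group(1)
            hits[path] = hits.get(path, 0) + 1
    return hits
```

Run it over your access log and watch the search URLs: if they keep showing up in the counts after the robots.txt change has been live for a while, something is off.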
-
Ok, I've posted a question on the Yoast.com blog to see what other options we might have. Thanks for the help!
-
It is because Roger ignores those META tags.
Also, Google often ignores them too.
The robots.txt file is a much better option for those crawlers.
There are some crawlers that ignore the robots file too, but you have no control over them unless you can put their IPs in the firewall or add code to ignore all of their requests.
-
Ok, I just did a little more research into this to see how Yoast was handling it within the plugin, and came across this article: http://yoast.com/example-robots-txt-wordpress/
In the article he states that this is already included within the plugin on search pages:
I just confirmed this by doing this search on my site and looking at the code: http://www.discountqueens.com/?s=candy
So this has always been in place. Why would I still have the 100K not-found links showing up?
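For anyone following along, the check is just viewing the page source on that search URL and looking for a robots meta tag in the head along these lines (the exact attributes can differ by plugin version and settings, so treat this as a typical example rather than an exact copy):

```
<meta name="robots" content="noindex,follow">
```

That tag tells engines not to index the search results page while still allowing them to follow its links, which is distinct from blocking the crawl in robots.txt.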
-
We didn't have these errors showing up previously; that's why I was really suspicious. Also, we have Joost de Valk's SEO plugin installed on our site, and I thought there was an option to stop the searches from being indexed?
-
Just to support Alan Gray's response: it's very important to block crawlers from your site search, not only because it throws errors (bots try to guess what to put in a search box), but also because any search results that get into the index will cause content conflicts, dilute ranking values, and, in the worst case, create the false impression that you have a lot of very thin or near-duplicate content pages.
-
The search bar results are good for searchers but not for search engines. You can stop all search engines and Roger (the SEOmoz crawler) from going into those pages by adding an entry to your robots.txt file. Roger only responds to his own section of the robots file, so anything you make global will not work for him.
User-agent: rogerbot
Disallow: /search/*
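One way to sanity-check rules like that before relying on them is Python's built-in robots.txt parser. A quick sketch; the rules and URLs here are just illustrations, not your actual file:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body and ask which URLs a given user-agent may fetch.
rules = """
User-agent: rogerbot
Disallow: /search/

User-agent: *
Disallow: /?s=
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# rogerbot is blocked from /search/, but the global rules do not apply to it,
# because a crawler with its own section ignores the * section entirely.
print(parser.can_fetch("rogerbot", "http://example.com/search/candy"))   # False
print(parser.can_fetch("rogerbot", "http://example.com/?s=candy"))       # True
print(parser.can_fetch("Googlebot", "http://example.com/?s=candy"))      # False
```

Note how the second line comes out True: that is exactly the point about Roger only obeying his own section, so any rule you want him to follow has to be repeated under User-agent: rogerbot.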