OSE link report showing links to 404 pages on my site
-
I did a link analysis on mormonwiki.com, and many of the linked-to pages were URLs like this: http://www.mormonwiki.com/wiki/index.php?title=Planning_a_trip_to_Rome_By_using_Movie_theatre_-_Your_five_Fun_Shows2052752
There happen to be thousands of them; the pages themselves no longer exist, but the links pointing to them obviously still do. I am planning to disavow the links to these nonexistent pages. Does anyone see a reason not to do this, or think it would be unnecessary?
Another issue is that Google does not seem to be crawling this site at all: WMT reports that it has not crawled a single URL. Could the link issue above have something to do with this? And do you have any insight on how to remedy it?
-
The site has had ranking issues since the first Penguin update and has really struggled over the last few months. Other than some minor things, low-quality links are the only real problem with the site.
-
Hi,
Adam is correct that the disavow tool should only be used if you think the links are causing you significant ranking problems. It's become quite common for people to disavow links without either a confirmed penalty or ranking issues, but those two factors were originally how Google recommended the tool be used.
What it sounds like has happened is that spammers created spam pages on the wiki and then pointed links at those pages from elsewhere. It's an old and very common spam tactic, used against sites that allow user-generated content (UGC).
Those pages now return 404s, so the inbound links pointing to them should not hurt your website or trigger a penalty; it's generally assumed that links to 404 pages (good or bad) neither hurt nor help. I also disagree that they'll cause a "bad user experience": they were built purely for spam, and no one is going to try to visit them.
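Before deciding anything, it's worth confirming at scale that those target URLs really do return 404s. Here's a quick sketch using only the Python standard library; the partitioning logic is separated from the fetching so it can be tested without hitting the network, and the URLs you'd feed it would come from your OSE export:

```python
from urllib.error import HTTPError
from urllib.request import Request, urlopen

def status_of(url, timeout=10):
    """HEAD-request a URL and return its final HTTP status code."""
    try:
        return urlopen(Request(url, method="HEAD"), timeout=timeout).status
    except HTTPError as err:
        return err.code  # e.g. 404, 410

def partition_by_status(urls, fetch=status_of):
    """Split URLs into (live, dead) lists; 404/410 count as dead."""
    live, dead = [], []
    for url in urls:
        (dead if fetch(url) in (404, 410) else live).append(url)
    return live, dead
```

Anything that lands in the `dead` list is the kind of link most people in this thread are saying you can safely ignore; anything still live is worth a closer look.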
If you believe these links are causing a ranking issue, the disavow tool is certainly an option, though I take it there's no chance you can negotiate their removal with the folks who built them? Having links removed is always preferable to disavowing them.
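If you do go ahead, the disavow file itself is just a UTF-8 text file: one full URL per line, `domain:example.com` to disavow every link from a domain, and `#` for comments. A minimal sketch for assembling one from a list of offending URLs (the note text, URLs, and domains here are placeholders):

```python
def build_disavow_file(urls, domains=(), note="links to deleted spam wiki pages"):
    """Assemble the contents of a Google disavow file:
    a '#' comment line, then 'domain:' lines for whole domains,
    then one full URL per line (deduplicated and sorted)."""
    lines = [f"# {note}"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"
```

Note that uploading a disavow file replaces any previously uploaded one, so keep a single master copy and re-upload the whole thing each time.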
-
If you are seeing zero pages indexed and zero traffic from search, then my guess is that you have verified, and are now looking at data for, the non-www version of the domain.
Double-check that the site listed in WMT is www.mormonwiki.com and not mormonwiki.com. If you are looking at indexation and traffic data for the www version, then something else may be going on, and unfortunately I wouldn't be able to diagnose it without looking at the WMT account.
Have your rankings been significantly affected? You would need to perform a fair amount of analysis before you can conclude that the site has been affected algorithmically. You would also need to be sure that any negative impact to rankings is a result of poor quality links and not something else, such as on-page factors.
Using the disavow tool should really be a last resort, and only after it has proven impossible to get the troublesome links removed. As Google's warning states, the disavow feature 'can potentially harm your site's performance', so I would not recommend using it until you have performed more in-depth analysis.
-
Right, so if the pages no longer exist, the links to them need to be dealt with somehow, right? Most of these links won't be removed by the webmasters who placed them, so they'll need to be disavowed, right?
These pages were UGC and essentially spam, entirely irrelevant to anything on the site itself, so I don't think 301 redirects would be wise or useful.
-
It hasn't received a manual action, no. But that doesn't mean the site isn't being affected algorithmically.
So you're saying not to worry about these links at all?
They offer nothing in terms of value; if they pointed to live pages, they would be considered very spammy and completely irrelevant. But since the pages don't even exist, you're saying it's unnecessary to bother with them at all?
I'm seeing the crawlability issue in WMT itself. The strange thing is that I know some pages have been indexed (we get most of our traffic organically from Google), yet WMT shows zero pages indexed, zero traffic from search, and so on. The site has been verified as well.
-
I agree with Adam: if the links are natural, there is no need to disavow them.
However, if the links go to pages that no longer exist, that creates a poor user experience, which can harm your rankings; think of it like having dead links on your own website. Have you set up 301 redirects for the pages that are now inactive? If not, set them up, and make sure to redirect each page to a relevant area of the website (not everything to the homepage). Do this and the links should pass more juice, and your website's performance should improve.
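To make the "redirect to relevant areas" advice concrete, here's a sketch of what such a mapping could look like. The paths are entirely hypothetical, and pages with no sensible live equivalent (like the spam pages discussed above) are arguably better served with a 410 Gone than a blanket redirect to the homepage:

```python
# Hypothetical map from retired pages to their closest live equivalents.
REDIRECT_MAP = {
    "/wiki/Old_Temple_List": "/wiki/Temples",
    "/wiki/2009_Conference_Notes": "/wiki/General_Conference",
}

def respond_to_dead_page(path):
    """Return (status, location) for a request to a removed page:
    301 to a mapped relevant page if one exists, else 410 Gone."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]
    return 410, None
```

The same logic would normally live in your web server or CMS redirect rules rather than application code; the point is only that each dead URL gets an individually chosen target or an explicit "gone" response.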
-
Are you performing a link analysis because the site received a manual action notification in WMT? If the site hasn't received a penalty then there is no need to use the disavow feature. As Google states:
'This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.'
In terms of the crawlability of the site, where are you seeing WMT report that it has not crawled a single page? A simple site: search on the mormonwiki.com domain returns about 65,600 results, and I can't see any major issues that would prevent search engines from crawling the site. However, I would fix the issue with the robots.txt file: currently, www.mormonwiki.com/robots.txt 301-redirects to www.mormonwiki.com/Robots.txt, which returns a 404 error.
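Once the robots.txt resolves properly, you can sanity-check its rules offline with Python's standard-library parser. The rules below are only an illustration of a MediaWiki-style setup (blocking index.php script URLs while keeping article pages crawlable), not a recommendation for this specific site:

```python
from urllib.robotparser import RobotFileParser

# Illustrative MediaWiki-style rules; adjust to the site's real needs.
ROBOTS_TXT = """\
User-agent: *
Disallow: /wiki/index.php
Allow: /wiki/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Article pages should be crawlable; the script endpoint should not.
print(parser.can_fetch("*", "http://www.mormonwiki.com/wiki/Main_Page"))
print(parser.can_fetch("*", "http://www.mormonwiki.com/wiki/index.php"))
```

Running this sort of check against the live file after you fix the redirect would quickly confirm whether the rules say what you think they say.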
Hope that helps.