Increase in 404 errors for URLs that don't exist
-
Hi
First of all, I should say that these errors appear in the old Webmaster Tools, not the new one.
I have two WordPress blogs: one in the root and one in a subfolder.
Today I checked Webmaster Tools and noticed 100 (404) errors that were first detected a few days ago.
My root WordPress is fine, but the subfolder WordPress has errors. Let me show you an example.
http://example.com/subfolder/article15245
I get a 404 error for this page:
http://example.com/article15245
It looks as if the subfolder has been dropped from the URL.
I checked my links, but all of them were OK and pointed to the right URLs.
Unfortunately, these errors don't have a "linked from" section.
-
Thanks for your reply
Today I redirected most of these links to the right posts, but it was such a boring task.
-
The page-by-page redirection is necessary to preserve any link juice from incoming links to the pages in question. You can throw the URLs into Ahrefs or look them up in Moz to see whether the pages have any links that are worth saving. Also, SEOPress has an .htaccess editor built into the plugin, but you can do WordPress-level redirects as well. It's pretty awesome.
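If you do go the .htaccess route, a one-off rule per broken URL does the job. Just as a rough sketch (assuming an Apache server with mod_alias enabled, and reusing the hypothetical article15245 URL from your example):

# 301 the broken root-level URL to the real post inside the subfolder
Redirect 301 /article15245 /subfolder/article15245

You would repeat that line for each broken URL, so a bulk approach is worth considering if you have a lot of them.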
Hope that helps.
-
You can put the domain here; I'm sure lots of people would like to weigh in on this, as it's an interesting problem.
I have replied to your email
-
Redirect them one by one? It's so boring!
Does the "linked from" section get updated for these links?
I use the Redirection plugin instead of .htaccess. It's safer.
What happens if I leave them unredirected while I try to work out why these errors occur?
Thanks for your response.
-
Thanks for your perfect answer.
I checked these links in Moz Link Explorer but no links were found. I think this is an internal problem, because most of my subfolder links (over 70%) have become 404s.
I have the Redirection plugin. It has a 404 section that shows recent visitors who landed on 404 pages, but no report matching these errors was found!
As you said, it seems I should redirect them with .htaccess.
Thanks, I emailed my domain to you.
Can I put my domain here for others to check?
-
I would just 301 all the broken pages to their final URLs in production, verify that each one works individually, then Fetch & Render. Many plugins such as SEOPress or Yoast will allow you to upload the redirects in bulk to help save time, or you can always add the redirects to your .htaccess file. If you are working in Excel or Sheets, using Find/Replace to bulk-edit the list can be a life saver. It is usually pretty boring, but not the worst in the world. Cheers!
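P.S. If every broken URL really is just the subfolder post with /subfolder stripped out (as in the article15245 example earlier in the thread), one pattern-based rule in .htaccess can cover the whole lot. This is only a sketch, assuming an Apache server with mod_alias enabled and that hypothetical /articleNNNNN URL pattern:

# Catch any root-level /articleNNNNN request and 301 it into the subfolder
RedirectMatch 301 ^/(article[0-9]+)$ /subfolder/$1

Anything that doesn't fit the pattern would still need its own one-off Redirect 301 line.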
-
It's so annoying when things like that happen! When Google refuses to give the 'linked from' data, it's a real headache working out where the links are coming from. Did you know that the links could even be coming from other websites, not just your own? When a user follows a link to your site (regardless of where that link is from), Google considers it your error if a valid page isn't returned.
Since this error is only occurring in the old area of WMT, it probably doesn't matter much. That being said, one simple fix would be to 301 redirect all the broken links to the functional article pages. After that you can just bulk-mark them all as fixed.
Usually I tell people to fix the actual link, but if it's an external link which you have no control over (or if Google can't even be bothered to tell you what the linking page is), then a 301 and marking as fixed is probably your best bet. Especially since these are only individual article pages (it's not like a malformed version of your homepage or something).
If you email me the domain (check my profile page), I might be able to crawl your site for you to determine whether there are any obviously broken internal links. Regardless, you'd want the 301s as a back-stop anyway.
Hope that helps
Related Questions
-
Google ranking content for phrases that don't exist on-page
I am experiencing an issue with negative keywords, but the “negative” keyword in question isn’t truly negative and is required within the content – the problem is that Google is ranking pages for inaccurate phrases that don’t exist on the page. To explain, this product page (as one of many examples) - https://www.scamblermusic.com/albums/royalty-free-rock-music/ - is optimised for “Royalty free rock music” and it gets a Moz grade of 100. “Royalty free” is the most accurate description of the music (I optimised for “royalty free” instead of “royalty-free” (including a hyphen) because of improved search volume), and there is just one reference to the term “copyrighted” towards the foot of the page – this term is relevant because I need to make the point that the music is licensed, not sold, and the licensee pays for the right to use the music but does not own it (as it remains copyrighted). It turns out however that I appear to need to treat “copyrighted” almost as a negative term because Google isn’t accurately ranking the content. Despite excellent optimisation for “Royalty free rock music” and only one single reference of “copyrighted” within the copy, I am seeing this page (and other album genres) wrongly rank for the following search terms: “free rock music”
“Copyright free rock music”
“Uncopyrighted rock music”
“Non copyrighted rock music”
I understand that pages might rank for “free rock music” because it is part of the “Royalty free rock music” optimisation, what I can’t get my head around is why the page (and similar product pages) are ranking for “Copyright free”, “Uncopyrighted music” and “Non copyrighted music”. “Uncopyrighted” and “Non copyrighted” don’t exist anywhere within the copy or source code – why would Google consider it helpful to rank a page for a search term that doesn’t exist as a complete phrase within the content? By the same logic the page should also wrongly rank for “Skylark rock music” or “Pretzel rock music” as the words “Skylark” and “Pretzel” also feature just once within the content and therefore should generate completely inaccurate results too. To me this demonstrates just how poor Google is when it comes to understanding relevant content and optimization - it's taking part of an optimized term and combining it with just one other single-use word and then inappropriately ranking the page for that completely made up phrase. It’s one thing to misinterpret one reference of the term “copyrighted” and something else entirely to rank a page for completely made up terms such as “Uncopyrighted” and “Non copyrighted”. It almost makes me think that I’ve got a better chance of accurately ranking content if I buy a goat, shove a cigar up its backside, and sacrifice it in the name of the great god Google! Any advice (about wrongly attributed negative keywords, not goat sacrifice) would be most welcome.
On-Page Optimization | JCN-SBWD
-
How can I check which inbound links to my site go to 404 pages?
I have external links coming into my site going to 404 pages, but I can't seem to find a way to search for all the broken links pointing at my website.
On-Page Optimization | NickJPearse
-
Best process for expired webinars advertised as events 301, 404
Looking for input please on the best process from an SEO point of view:
1. We hold a webinar.
2. We promote the webinar on the website (WordPress) as an event.
3. When the webinar finishes, we un-publish the event, create a resource page for the recorded webinar, and copy the content of the original event post.
I'm seeing 404's due to the webinar event pages being unpublished. Should I be 301'ing the events to the resource page, or keep both? Or some other proposal? Many thanks!
On-Page Optimization | w4rdy
-
Creating a .cn site with the existing site content
Hi all, I'm planning to create a .cn site. If I simply translate the existing content on my site (.com.au) into Chinese, do you think Google will see the .cn site as a duplicate of the main site? Will this cause any duplicate content issues? Thanks
On-Page Optimization | QuantumWeb62
-
Can I have schema.org links as relative URLs on my site? Getting an HTML validation error.
I'm getting an HTML validation error on relative schema.org links: "Bad value //schema.org/Organization for attribute itemtype on element div: The string //schema.org/Organization is not an absolute URL." This is my code on the https site: <div itemtype="//schema.org/Organization"><a itemprop="url" class="navbar-brand" …
On-Page Optimization | RoxBrock
-
How to increase SERP rankings for long-tail keywords?
Let me describe my situation. I currently run an e-commerce site that aggregates items from various e-commerce websites. I am focusing my SEO on long-tail keywords, so that when a user searches for 'hermes birkin bag croco black' or another specific item, the matching page on my site pops up. What are some tips for improving this in a general sense, so that I don't have to go through all 10,000 items on my site and optimize each and every page for keywords? Right now what I am thinking is to increase the domain authority of the overall site. Any other tips?
On-Page Optimization | herlamba
-
400 error - Phone number link.
I am getting 400 errors for all my pages that have a phone number that links to Skype etc. on click. Is this a genuine issue, or am I OK? How do I resolve it? Any bright ideas? Here is an example of the issue: http://www.arts1.co.uk/5-reasons-to-choose-arts1 There are pages of these and I am not sure what to do. Many thanks, James Grimsey
On-Page Optimization | jamesgrimsey
-
What reasons exist to use noindex / robots.txt?
Hi everyone. I realise this may appear to be a bit of an obtuse question, but that's only because it is an obtuse question. What I'm after is a cataloguing of opinion - what reasons have SEOs had to implement noindex or add pages to their robots.txt on the sites they manage?
On-Page Optimization | digitalstream