How To Find and Delete Erroneous Pages From My WordPress Site
-
I've downloaded the SEOmoz CSV file from the crawl data on my site, and it found lots of 404 errors, duplicate content, etc.
The problem is that when I go to my wp-admin and look for the pages to delete them, I don't see them. Can anyone point me in the right direction? I've checked with HostGator and they say it's a WP problem.
I need help locating where these pages are so I can clean them up or delete them.
Thanks
Mike
-
Hi Mike,
Did Jeffrey's answer help you, or would you still like some more assistance getting this cleared up?
-
If you're getting a 404 error, the page probably doesn't exist anymore. You might have a link somewhere that still points to that old page, resulting in the 404 error.
You can locate broken links on your site using something like this: http://www.seomoz.org/blog/xenu-link-sleuth-more-than-just-a-broken-links-finder
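If you prefer to script the check, here's a rough sketch of the same idea (assuming Python 3 with the requests package installed; www.example.com is a placeholder for your own domain):

```python
# Rough sketch of a broken-link check: crawl pages on one domain, test each
# outgoing internal link, and report which page contains which dead link.
# Assumes Python 3 with the "requests" package; www.example.com is a placeholder.
import re
import requests
from urllib.parse import urljoin, urlparse

START = "https://www.example.com/"
HOST = urlparse(START).netloc

to_visit, visited, broken = [START], set(), []

while to_visit and len(visited) < 500:           # cap the crawl for safety
    page = to_visit.pop()
    if page in visited:
        continue
    visited.add(page)

    resp = requests.get(page, timeout=10)
    if resp.status_code != 200:
        continue

    for href in re.findall(r'href="([^"#]+)"', resp.text):
        link = urljoin(page, href)
        if urlparse(link).netloc != HOST:
            continue                              # skip external links
        status = requests.head(link, timeout=10, allow_redirects=True).status_code
        if status == 404:
            broken.append((page, link))           # remember where the dead link lives
        else:
            to_visit.append(link)

for page, link in broken:
    print(f"{page} links to missing page {link}")
```

Once you know which pages contain the dead links, you can edit or remove those links (or 301-redirect the missing URLs) rather than hunting for pages that no longer exist in wp-admin.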
For duplicate content issues, see here: http://www.seomoz.org/learn-seo/duplicate-content
Related Questions
-
Our protected pages 302-redirect to a login page if you're not a member. Is that a problem for SEO?
We have a membership site with links to protected pages from our unprotected pages. If a non-member clicks one of these links, the server responds with a 302 redirect to the login/join page. Is this an issue for SEO? Thanks!
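One quick way to see exactly what a crawler receives on those links is to request a protected URL without being logged in and inspect the raw response. A minimal sketch, assuming Python 3 with the requests package and a placeholder URL:

```python
# Minimal sketch: see what an anonymous visitor (or crawler) gets back from a
# protected URL. Assumes Python 3 with "requests"; the URL is a placeholder.
import requests

resp = requests.get(
    "https://www.example.com/members/article-1",
    allow_redirects=False,   # keep the raw redirect instead of following it
    timeout=10,
)
print(resp.status_code)              # e.g. 302
print(resp.headers.get("Location"))  # e.g. https://www.example.com/login
```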
Technical SEO | | rimix1 -
Bigcommerce only allows us to have https for our store, not the other pages on our site, so we have a mix of https and http. How is this hurting us, and what's the best way to fix it?
We aren't interested in paying a thousand dollars a month just to have https when we feel it's the only selling point of that package, so we have https for our store while the rest of the site, blogs and all, is http. I'm wondering if this counts as duplicate content or carries some other unforeseen penalty because of the halfway approach to implementing https. If this is hurting us, what would you recommend as a solution?
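To see whether the duplicate-content risk is real, you can check whether the same page answers with a 200 over both schemes instead of redirecting to one version. A minimal sketch, assuming Python 3 with the requests package and a placeholder path:

```python
# Sketch: check whether the same page is reachable over both http and https.
# Two 200s with no redirect between them is the duplicate-content situation
# to avoid. Assumes Python 3 with "requests"; the path is a placeholder.
import requests

for scheme in ("http", "https"):
    url = f"{scheme}://www.example.com/blog/some-post/"
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, resp.status_code, resp.headers.get("Location", ""))
```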
Technical SEO | | Deacyde0 -
Titling Category Pages Like You Would a Blog Page?
So, with our 600 or so category pages, I was curious... on each of these category pages we show the top 12 products for that category. In trying to increase click-through rate, I wonder if it would be prudent to use some of the strategies I see used for blog posts on these category pages. i.e., instead of "Category Name - Website Name", how about "Top 12 Kitty Litters We Carry - View the Best and the Rest!" Or something like that. And then in the description, I could put, "Number 8 made my jaw drop!!!" (OK, kidding about that one...) But serious about the initial question... Thanks! Craig
Technical SEO | | TheCraig0 -
Can view pages of site, but Google & SEOmoz return 404
I can visit and view every page of a site (and can see the source code too), but Google, SEOmoz, and others say anything other than the home page is a 404, and Google won't index the sub-pages. I have checked robots.txt and .htaccess and can't find anything wrong. Is this a DNS or server setting problem? Any ideas? Thanks, Fitz
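A page can render fine in a browser while the server still sends a 404 status line, which is what crawlers react to. Here is a quick sketch to print the raw status codes (assuming Python 3 with the requests package; the URLs are placeholders, and the Googlebot-style User-Agent is there in case the server treats bots differently):

```python
# Sketch: print the raw HTTP status a crawler would see for a few URLs.
# Assumes Python 3 with "requests"; the URLs are placeholders.
import requests

# Googlebot-style User-Agent, in case the server serves bots differently.
UA = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}

for url in ("https://www.example.com/", "https://www.example.com/about/"):
    resp = requests.get(url, headers=UA, timeout=10)
    print(url, resp.status_code, resp.headers.get("Content-Type", ""))
```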
Technical SEO | | FitzSWC0 -
Duplicate Page Title for multilingual wordpress site
Hello all, I have received my first crawl report and I'm finding a lot of duplicate page title errors. On the WordPress site I use the qTranslate plugin to publish the site in two languages. I also use the Yoast SEO plugin to set titles, descriptions, and keywords for each web page. Looking more deeply into the duplicate page title errors, I think the problem is that every web page uses the same SEO title for each language, but I'm not 100% sure. I tried to use some qTranslate shortcodes, like the following: ABOUT [:en]About, to give different titles per language for one web page, but that doesn't seem to work. Has anybody here experienced the same problem? Do you have any suggestions on how to resolve the duplicate page title problem? I can give you the URL of the website if you need it to have a look. Thank you in advance for your help, I really appreciate it. Regards, Lenia
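One way to confirm what the crawler is reporting is to fetch each language version of a page and compare the <title> tags directly. A rough sketch, assuming Python 3 with the requests package; the URLs are placeholders and should be adjusted to however qTranslate exposes your two languages:

```python
# Sketch: fetch each language version of a page and compare the <title> tags
# to confirm what the crawl report is flagging. Assumes Python 3 with
# "requests"; the URLs are placeholders - adjust to your qTranslate setup.
import re
import requests

urls = [
    "https://www.example.com/en/about/",
    "https://www.example.com/el/about/",
]

titles = {}
for url in urls:
    html = requests.get(url, timeout=10).text
    match = re.search(r"<title>(.*?)</title>", html, re.S | re.I)
    titles[url] = match.group(1).strip() if match else "(no title found)"

for url, title in titles.items():
    print(url, "->", title)

if len(set(titles.values())) == 1:
    print("Both languages serve the same title - that is the duplicate being reported.")
```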
Technical SEO | | tevag0 -
My report only says it crawled 1 page of my site.
My report used to crawl my entire site, which is around 90 pages. Any idea why this would happen? www.treelifedesigns.com
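A few common reasons a crawler stops after the home page are a blocking robots.txt, an X-Robots-Tag header, or a meta robots tag. A quick sketch to check all three, assuming Python 3 with the requests package (the domain is the one from the question):

```python
# Sketch: check three common crawl blockers - robots.txt rules, an
# X-Robots-Tag response header, and a meta robots tag in the home page HTML.
# Assumes Python 3 with "requests"; the domain is taken from the question.
import re
import requests

site = "http://www.treelifedesigns.com"

robots = requests.get(f"{site}/robots.txt", timeout=10)
print("robots.txt status:", robots.status_code)
print(robots.text[:300])

home = requests.get(site, timeout=10)
print("X-Robots-Tag header:", home.headers.get("X-Robots-Tag"))
print("meta robots tags:", re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', home.text, re.I))
```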
Technical SEO | | nathan.marcarelli0 -
Two different page authority ranks for the same page
I happened to notice that trophycentral.com and www.trophycentral.com have two different Page Authority scores even though there is a 301 redirect. Should I be concerned?
http://trophycentral.com - Page Authority: 47, Domain Authority: 42
http://www.trophycentral.com - Page Authority: 51, Domain Authority: 42
Thanks!
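It's worth confirming that the bare domain really answers with a single 301 to the www version. A minimal sketch, assuming Python 3 with the requests package (both URLs are the ones from the question):

```python
# Sketch: confirm the bare domain answers with a single 301 to the www version.
# Assumes Python 3 with "requests"; the URLs come from the question.
import requests

resp = requests.get("http://trophycentral.com", allow_redirects=False, timeout=10)
print(resp.status_code)              # expect 301 if the redirect is permanent
print(resp.headers.get("Location"))  # expect the www.trophycentral.com URL
```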
Technical SEO | | trophycentraltrophiesandawards0 -
Is robots.txt a must-have for a 150-page, well-structured site?
Looking in my logs, I see dozens of 404 errors each day from different bots trying to load robots.txt. I have a small site (150 pages) with clean navigation that allows the bots to index the whole site (which they are doing). There are no secret areas I don't want the bots to find (the secret areas are behind a login, so the bots won't see them). I have used rel=nofollow for internal links that point to my login page. Is there any reason to include a generic robots.txt file that contains "user-agent: *"? I have a minor reason: to stop getting 404 errors and clean up my error logs so I can find other issues that may exist. But I'm wondering if not having a robots.txt file is the same as some default blank file (or a 1-line file giving all bots all access)?
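A small sketch of both halves of that question: what /robots.txt currently returns, and the two-line allow-all file that would stop the 404s (assuming Python 3 with the requests package; the domain is a placeholder):

```python
# Sketch: check what /robots.txt currently returns, and show the minimal
# allow-all file that would silence the bots' 404s. An empty Disallow value
# permits crawling of everything. Assumes Python 3 with "requests";
# www.example.com is a placeholder.
import requests

MINIMAL_ROBOTS_TXT = "User-agent: *\nDisallow:\n"

resp = requests.get("https://www.example.com/robots.txt", timeout=10)
if resp.status_code == 404:
    print("No robots.txt - uploading this to the web root would silence the 404s:")
    print(MINIMAL_ROBOTS_TXT)
else:
    print(f"robots.txt returns {resp.status_code}:")
    print(resp.text)
```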
Technical SEO | | scanlin0