2,000 pages indexed in Yahoo, 0 in Google, no PR. What is wrong?
-
Hello Everyone,
I have a friend with a blog site that has over 2000 pages indexed in Yahoo but none in Google and no page rank. The web site is http://www.livingorganicnews.com/ I know it is not the best site but I am guessing something is wrong and I don't see it.
Can you spot it? Does he have some settings wrong? What should he do?
Thank you.
-
The site just looks like part of a blog network. The domain is five years old and the home page has a DA and PA of 34, yet it still isn't indexed by Google. A search for site:livingorganicnews.com in Google returns no results, which suggests the site has been penalized. Use Google Webmaster Tools to verify this and find the reason.
Most probably it's penalized for being part of a paid blog network.
-
LOL, the fact that there's a tonne of clearly spun content won't help. I gather this is part of a content scraping or sharing network like LinkVine?
Have you tried reading the articles published? The network could do with some quality guidelines for what gets accepted, imho.
Even when it gets indexed, it's not going to rank anywhere... this is exactly the kind of site that Panda wanted to stop. Regurgitated, nonsensical, spun, tosh that looks as if it was written by a lunatic and only really exists for the sake of its outgoing links, that point to other rubbish.
I'd tell your friend to give up on this site entirely and start looking at less automated ways of doing things. Google is only going to get tougher and tougher on these sites so he's fighting a losing battle.
I don't mean to be rude but I hope it doesn't get indexed ever, what value does it offer to anyone for anything? Most people don't want stuff like that clogging up the web. I don't mean to sound harsh but tell your friend the problem with the site is.... it's crap.
-
Another of the many not-quite-right things on the site is that some of the older posts, like http://www.livingorganicnews.com/games/2010/panasonic-announced-the-jungle-handheld-gaming-platform/1965/, end with "incoming search terms" followed by several search terms that all hyperlink back to that exact same article. Search engines will not see that as providing any value to the user (users are already on that page; they don't need a link to it), and they will see it as just another attempt to manipulate the engines.
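If you wanted to audit a couple of thousand posts for this self-linking pattern rather than eyeball them, a small stdlib-only script can count the links on a page that resolve back to the page's own URL. This is just an illustrative sketch; the helper names are my own, and you would still need to fetch each post's HTML yourself:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class SelfLinkCounter(HTMLParser):
    """Counts <a href> links that resolve back to the page's own URL."""

    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url.rstrip("/")
        self.self_links = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        # Resolve relative hrefs against the page URL, then compare.
        if href and urljoin(self.page_url, href).rstrip("/") == self.page_url:
            self.self_links += 1


def count_self_links(page_url, html):
    """Return how many anchors in `html` point back at `page_url` itself."""
    parser = SelfLinkCounter(page_url)
    parser.feed(html)
    return parser.self_links
```

Any post where this count is more than one or two (the "incoming search terms" blocks produce several) would be worth a manual look.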
-
It is interesting to have a fresh set of eyes here. I had noticed his unusual writing style but figured it was because English is not his first language. I will ask whether he is actually writing these articles himself.
-
Keri is absolutely right.
I just took a look at the site's content, and it couldn't be much worse. It is a 100% spam site which should never be indexed. Clearly the site is under a penalty.
Google's job is to satisfy a user's search query by giving them the content they seek. If you create a site like that, NO ONE will ever want to get that site as the result of a search query. Google correctly recognizes this fact and removes the site from their database.
-
When there are a couple of thousand other pages like this, yes.
http://www.livingorganicnews.com/games/2011/get-cool-with-selected-berber-carpet-tiles-now/3215/
The subject of the article is about berber carpet tiles, yet the text has links (I used bold) that are totally off base and make no sense. For example:
"The berber carpet tiles might also be renowned for the durability and stain resistance at extended stay motel rates."
"To get rid of the difficult to vacuum Provillus scam dust particles..."
"An important benefit in using berber carpet tiles is a likelihood to eliminate the damaged location alone and replace it with a new carpet tile, a comparatively low-cost way of capatrex scam damage control, to make your ground look just like new."
-
Absolutely.
It is entirely possible he has been removed from Google's index as a result of a penalty. If he links to sites which receive a penalty (mobile casinos would be a very bad choice of sites to link to) then his site could receive a penalty as well.
My suggestion is not to jump to the conclusion that the site is under a penalty. Start by checking WMT; if nothing is discovered there, submit the sitemap. If you don't see any results after a few days, then inquire with Google about whether the site is under a penalty.
-
The text doesn't really seem like a human wrote it. The current most recent article has the title "Religious Credit card debt Enable Provides You With the Meaningful and Economical You Need". Other posts are about acne treatment reviews, alcoholism, and other seemingly random things.
It really looks like it's been through an article spinner. The article about alcoholism ends with "So, Think before you Beverage." Uh..really? Or what about "As emission safety glasses are put on in the office, they need to provide ease and comfort, safe healthy and crystal clear eyesight to make sure they are usually not golf clubs to the wearer." An article I found that wasn't spun is instead indexed 94 other times on the web.
I would say the content is why Google has not indexed it. They can't find the value to the user for returning this in a search result. Is this truly the content that your friend has put up, or has the site gotten hacked?
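For anyone curious how spun or duplicated text like this can be flagged mechanically, here is a rough sketch of one common approach (my own illustration, not how Google actually does it): compare word n-gram "shingles" between two texts and compute their Jaccard overlap. An article indexed 94 other times will score near 1.0 against any copy, and even lightly spun rewrites still share far more shingles than unrelated text:

```python
def shingles(text, n=3):
    """Return the set of lowercased word n-grams ("shingles") in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}


def jaccard_similarity(a, b, n=3):
    """Jaccard overlap of the two texts' shingle sets, in [0.0, 1.0]."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A threshold (say, above 0.3 against a known copy) is enough to flag near-duplicates; real systems use hashed shingles and indexes to do this at web scale, but the idea is the same.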
-
Hello Bryce,
That could certainly lose him credibility, but could it be the reason for the site not being indexed at all?
-
Thank you Ryan,
I will ask him about GWT. Perhaps it is just a sitemap issue, but I wonder why Yahoo would spot it and Google would totally miss it. I often see a difference in the number of pages they index, but this is the first time I have seen thousands versus zero.
-
I'm thinking that by linking out to mobile casinos and Polish rock bands, he's probably losing credibility.
-
I didn't notice any obvious problem with your site. Have you logged into Google Webmaster Tools and looked at the site? That would be the logical next step.
The robots.txt file looks fine, there is not any "noindex" tag on the home page, a GA code is present on the page, etc. I would suggest reviewing the site in Google's WMT and look for any issues.
If none are present, the next step would be to submit a sitemap. If your friend does not have a sitemap already set up, you can use http://www.xml-sitemaps.com/. I think the free version only maps 500 pages, but that is enough to get you started.
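The two manual checks above (no "noindex" tag, robots.txt looks fine) can be sketched in a few lines of stdlib Python. This is illustrative only; the function names are my own, and in practice you would fetch the live robots.txt and page HTML first:

```python
import re
from urllib.robotparser import RobotFileParser


def has_noindex_meta(html):
    """True if the page HTML carries a robots "noindex" meta tag.

    Note: this simple regex assumes the name attribute comes before
    content, which is the usual ordering but not guaranteed.
    """
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None


def googlebot_blocked(robots_txt, path="/"):
    """True if the given robots.txt text disallows Googlebot from the path."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch("Googlebot", path)
```

Run both against the home page and a sample of post URLs; if neither fires, the robots/meta side is clear and a WMT penalty or content-quality problem becomes the more likely explanation.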