Pages Getting Deindexed
-
My question is: I have 16 pages on my site that were all indexed until yesterday; now only 3 are indexed. I tried resubmitting my sitemap, and when I did, the result was the same as before: 3 pages indexed and 13 pages deindexed.
I was wondering if someone could explain to me why this is happening and what I can do to fix it? Keep in mind my site is almost three months old, and this has happened before, but it fixed itself over time. Thanks.
-
Okay, thank you so much for your great insights. I will certainly look into it. Thanks once again.
-
Hi Peter - having a quick look at your backlink profile, it would appear that they are all nofollowed, so they are not passing any link equity - essentially worthless in the eyes of the search engines in terms of increasing your domain and page authority. I would really work on building a decent internal link structure: proper navigation that is the same on each page, with relevant anchor text for your internal menu links. I would also set to work on building up your link profile and start looking for relevant links that are not nofollowed, so that link equity starts being passed to your site. If you go about this properly, you should find that your page indexing issues go away.
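One quick way to sanity-check this yourself is to fetch a linking page and look at the `rel` attribute on its anchors. A rough stdlib-only Python sketch (the sample HTML and domain are illustrative, not taken from the real pages):

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collects every <a> tag pointing at a target domain and records
    whether it carries rel="nofollow" (i.e. passes no link equity)."""

    def __init__(self, target_domain):
        super().__init__()
        self.target = target_domain
        self.links = []  # list of (href, is_nofollow) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if self.target in href:
            rel = (attrs.get("rel") or "").lower().split()
            self.links.append((href, "nofollow" in rel))

# Illustrative snippet of a linking page (e.g. a recent-comments widget)
page = """
<div class="comments">
  <a href="http://www.peterrota.com" rel="nofollow">Peter Rota</a>
  <a href="http://example.com/other-commenter">Someone Else</a>
</div>
"""

checker = NofollowChecker("peterrota.com")
checker.feed(page)
for href, is_nofollow in checker.links:
    print(href, "-> nofollow" if is_nofollow else "-> followed")
```

Running this over a sample of your backlink export would tell you what fraction of your profile is actually passing equity.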
Also, look for high-quality links, and be careful with blog and forum comments, as these tend to be spammy and hold little to no value in terms of links and link equity. In the post-Penguin world, I would also be careful not to overdo your anchor text from external links.
Hope this helps...
-
Thanks for your response, Matt. This is my backlink profile: http://www.opensiteexplorer.org/links?site=www.peterrota.com
My traffic has gone down since this happened, but today I checked GWT and 13 of my 16 pages are now reindexed. I have few external links, if anything. I need to add something so that if someone is on a subpage they can get back to the home page, because there is no way of doing that unless they type in my URL. The only reason it's like that is that I'm working on a logo which, when clicked, will direct back to the main page.
-
My link profile is here: http://www.opensiteexplorer.org/links?site=www.peterrota.com
The only thing I can think of that might have caused it is this: I left a blog comment on one article on a site, and each page of that site shows recent comments, so it has my link on every page. Basically, my site's link is on 70 or so pages and growing, all with the same anchor text, "Peter Rota", which I think may be causing the problem even though all these links are nofollow.
PS: I checked GWT today and now 13 of 16 pages have been indexed. Thanks.
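The sitewide recent-comments scenario is easy to quantify: dump your backlinks (e.g. from an Open Site Explorer export) and count how often each anchor text repeats. A short sketch with made-up data, just to show the shape of the check:

```python
from collections import Counter

# Hypothetical backlink export: (linking page, anchor text, nofollow?)
backlinks = [
    ("http://someblog.example.com/article", "Peter Rota", True),
    ("http://someblog.example.com/page-2", "Peter Rota", True),
    ("http://someblog.example.com/page-3", "Peter Rota", True),
    ("http://other.example.org/review", "web design blog", False),
]

# Tally how concentrated the anchor text is across the whole profile
anchors = Counter(anchor for _, anchor, _ in backlinks)
total = sum(anchors.values())
for anchor, count in anchors.most_common():
    print(f"{anchor!r}: {count}/{total} ({count / total:.0%})")
```

A profile where one exact-match anchor dominates the distribution is the pattern the post-Penguin advice above warns about.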
-
I would say, "I don't care" (to myself, not to you!) and work on building authority to the site with links. Give Google a reason to trust each page. Honestly, there are so many reasons this could have happened that speculation is just not worth the effort (e.g. content duplicated off-site or throughout the site, reused title tags, lack of trust, etc.).
Just curious, what does your link profile look like right now?
-
I have seen this happen when new sites initially submit a sitemap: most of the pages are indexed, but over time this drops, and it usually comes down to the perceived value of the pages that are no longer indexed. Are these pages thin on content? Do they have any external links pointing to them? Do you have a good internal link structure between all your pages? A new site is more vulnerable to index fluctuations due to factors such as these and the fact that it is likely to have low domain and page authority. I would suggest you look at the factors mentioned above and also make sure you have all the on-page factors covered, making sure they are unique. If none of the above apply, the other times I have experienced a sudden drop in pages indexed have been when a new enhancement caused page load speed to decrease. What have your traffic levels been like since this change - up, down, or constant?
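For the "thin on content" check above, a crude first pass is simply counting the visible words on each page. A rough stdlib-only sketch; the 300-word threshold and the sample pages are arbitrary assumptions, not a published Google number:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Strips markup so we can count the visible words on a page."""

    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def word_count(html):
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

THIN_THRESHOLD = 300  # arbitrary cutoff, tune for your niche

# Hypothetical pages to audit
pages = {
    "/": "<h1>Home</h1><p>" + "word " * 500 + "</p>",
    "/contact": "<h1>Contact</h1><p>Email me.</p>",
}

for url, html in pages.items():
    n = word_count(html)
    print(url, n, "THIN" if n < THIN_THRESHOLD else "ok")
```

Pages that consistently fall under your own threshold are good candidates for the index-drop culprits the answer describes.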
Related Questions
-
Is this campaign of spammy links to non-existent pages damaging my site?
My site is built in WordPress. Somebody has built spammy pharma links to hundreds of non-existent pages. I don't know whether this was inspired by malice or an attempt to inject spammy content. Many of the non-existent pages have the suffix .pptx. These now all return 403s. Example: https://www.101holidays.co.uk/tazalis-10mg.pptx A smaller number of spammy links point to regular non-existent URLs (not ending in .pptx). These are given 302s by WordPress to my homepage. I've disavowed all domains linking to these URLs. I have not had a manual action or seen a dramatic fall in Google rankings or traffic. The campaign of spammy links appears to be historical and not ongoing.
Questions:
1. Do you think these links could be damaging search performance? If so, what can be done? Disavowing each linking domain would be a huge task.
2. Is 403 the best response? Would 404 be better?
3. Any other thoughts or suggestions?
Thank you for taking the time to read and consider this question. Mark
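On question 2, one common approach is to distinguish the never-existing spammy URLs from ordinary missing pages. The routing below is a hypothetical sketch of that decision logic (the page list is made up), not a claim about what WordPress does out of the box:

```python
def status_for(path, known_pages=frozenset({"/", "/about", "/holidays"})):
    """Pick a response code for a requested path. Hypothetical routing:
      - never-existing .pptx pharma URLs -> 410 Gone (permanently dead,
        marginally stronger than 404 for crawlers)
      - other unknown URLs -> 404, rather than a 302 to the homepage,
        which Google tends to treat as a soft 404
      - real pages -> 200
    """
    if path in known_pages:
        return 200
    if path.endswith(".pptx"):
        return 410
    return 404

print(status_for("/tazalis-10mg.pptx"))  # -> 410
print(status_for("/no-such-page"))       # -> 404
```

The key point is replacing the homepage 302s with a hard 404/410 so crawlers can drop the URLs cleanly.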
White Hat / Black Hat SEO | MarkHodson
-
Link with Anchor to header of the page: Keyword is ranking
I saw something interesting this week. I am doing research and spec-ing out a content page we are creating, and one of our competitors, Office Depot, on their phone repair page created exact-match keyword links that lead to an anchor taking you to the header of that page. They were ranking first for all of those keywords with little to no links. Their strategy is the more local long tail that includes "near me". Have you guys ever seen this?
This is the URL: https://www.officedepot.com/a/content/customer-service/samedayrepair/ They are ranking for these keywords (top 3 nationally): "iphone 6 repair near me", "iphone 7 repair near me". I am assuming this is due to their PA and DA shifting authority to the page, but it does not make sense how they are lacking a lot of SEO low-hanging fruit, like H1/H2 keyword usage, the URL, and the title tag, within this content page. Anyone up for discussing this?
White Hat / Black Hat SEO | uBreakiFix
-
Why is my homepage not getting cached by Google?
It has been more than 2-3 months, and I only just noticed that our website's homepage is not getting cached by Google. I don't know why. Please help; thanks in advance. Regards,
White Hat / Black Hat SEO | spellblaster
-
When you get a new inbound link, do you submit a request to Google to reindex the page pointing at you?
I'm just starting my link-building campaign in earnest, and received my first good-quality inbound link less than an hour ago. My initial thought was that I should go directly to Google and ask them to reindex the page that linked to me. If I make a habit of that (getting a new link, then submitting that page directly to Google), would that signal to Google that this might not be a natural link-building campaign? The links are from legitimate (non-paid, non-exchange) partners, which Google could probably figure out, but I'm interested to know opinions on this. Thanks, -Eric
White Hat / Black Hat SEO | ForForce
-
Some pages of my website http://goo.gl/1vGZv stopped being crawled by Google
Hi, I have a five-year-old website, and some pages of my website http://goo.gl/1vGZv have stopped being indexed in Google. I have asked Google to ignore low-quality links via the disavow tool. What should I do?
White Hat / Black Hat SEO | unitedworld
-
Can our white hat links get a bad rap when they're alongside junk links busted by Panda?
My firm has been creating content for a client for years - video, blog posts and other references. This client's web vendor has been using bad links and link farms to bolster rank for key phrases - successfully, until last week when Google slapped them. They have been officially warned in WMT for possibly using artificial or unnatural links to build PageRank. They went from page one for the most popular term in Chicago for their industry, where they had been for over a year, to page 8 overnight. Other, less generic terms that we were working on felt the sting as well. I was aware of and had warned the client about the possibility of repercussions from these black hat tactics (http://www.seomoz.org/blog/how-google-makes-liars-out-of-the-good-guys-in-seo#jtc170969), but didn't go as far as to recommend they abandon them. Now I'm wondering if one of our legitimate sites (YoChicago.com), which has more than its share of the links into the client site, is being considered a source of bad links. All of our links are legitimate, i.e., anchor text matches the description of the destination, and video links describe the entity being linked to. Are we vulnerable? Any insight would be appreciated.
White Hat / Black Hat SEO | mikescotty
-
Link Building: Location-specific pages
Hi! I've technically been a member for a few years, but just recently decided to go Pro (and I gotta say, I'm glad I did!). Anyway, as I've been researching and analyzing, one thing I noticed a competitor is doing is creating location-specific pages. For example, they've created a page that has a URL similar to this: www.theirdomain.com/seattle-keyword-phrase They have a few of these for specific cities. They rank well for the city-keyword combo in most cases. Each city-specific page looks the same and the content is close to being the same except that they drop in the "seattle keyword phrase" bit here and there. I noticed that they link to these pages from their site map page, which, if I were to guess, is how SEs are getting to those pages. I've seen this done before on other sites outside my industry too. So my question is, is this good practice or is it something that should be avoided?
White Hat / Black Hat SEO | AngieHerrera
-
My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
On Friday, 4/29, we noticed that we suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we were being penalized for something. We immediately went through the list of things that changed, and the most obvious is that we were migrating domains. On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues.

When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not unpenalize bbqguys. We'd been looking for two days and had not been able to find what we did wrong, at least not until tonight. I just logged back in to Webmaster Tools to do some more digging, and I saw that I had a new message: "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/".

It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of pages on my site that it does not like. If I could see even one or two pages, I could probably figure out what I am doing wrong. I find this most shocking, since we go out of our way not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could possibly have triggered this message and the penalty.

Does anyone know how to go about figuring out which pages specifically are causing the problem, so I can change them or take them down? We are slowly canonicalizing URLs and changing the way different parts of the sites build links to make them all the same, and I am aware that these things need work.

To recap: we were in the process of discontinuing some sites and 301 redirecting their pages to a more centralized location to try to stop duplicate content. The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects. Since the Webmaster Tools notifications are different (i.e. too many URLs is a notice-level message and doorway pages is a separate alert-level message), and the too-many-URLs one has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages are a content problem with a specific page. The architecture suggestions are helpful, and they reassure us that we should be working on them, but they don't help me solve my immediate problem.

I would really be thankful for any help identifying the pages that Google thinks are "doorway pages", since this is what I am being immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong; I just don't know what it is! It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content. As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem, and we got nailed almost immediately when we instituted the 301 redirects.
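The one-to-one migration described above (each old page 301ing to the same page on the new domain) boils down to a simple URL mapping. A minimal sketch, assuming a straight host swap with paths and query strings preserved:

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.thegrillstoreandmore.com"
NEW_HOST = "www.bbqguys.com"

def redirect_target(url):
    """Map a URL on the retired domain to the same path on the new
    domain (the one-to-one 301 scheme described above). Returns None
    for URLs that are not on the old host, so they are left alone."""
    parts = urlsplit(url)
    if parts.netloc != OLD_HOST:
        return None
    return urlunsplit(("http", NEW_HOST, parts.path, parts.query, ""))

print(redirect_target("http://www.thegrillstoreandmore.com/grills?page=2"))
```

The mapping itself is the easy part; the question of why Google reacted to the redirects the way it did is separate from the mechanics shown here.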
White Hat / Black Hat SEO | CoreyTisdale