Pages Getting Deindexed
-
My question is: I have 16 pages on my site that were all indexed until yesterday; now only 3 are indexed. I tried resubmitting my sitemap, and when I did it was the same result as before: 3 pages indexed and 13 pages deindexed.
I was wondering if someone could explain to me why this is happening and what I can do to fix it. Keep in mind my site is almost three months old, and this has happened before, but it fixed itself over time. Thanks.
-
Okay, thank you so much for your great insights. I will certainly look into it. Thanks once again.
-
Hi Peter - having a quick look at your backlink profile, it would appear that they are all nofollowed, so they are not passing any link juice - essentially worthless in the eyes of the search engines in terms of increasing your domain and page authority. I would really work on building a decent internal link structure: some proper navigation that is the same on each page, with relevant anchor text for your internal menu links. I would also set to work on building up your link profile and start looking for relevant links that don't have nofollow on them, so that link juice starts being passed to your site. If you go about this properly, you should find that your page index issues go away.
Also, look for high-quality links, and be careful with blog and forum comments, as these are spammy and hold little to no value in terms of links and link juice. In the post-Penguin world I would also be careful not to overdo your anchor text from external links.
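If you want to check this on your own pages, here's a rough stdlib-only Python sketch that splits a page's links into followed vs. nofollowed - the sample markup and URLs below are just placeholders, not your actual site:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect <a href> targets, split by whether rel contains 'nofollow'."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

# Placeholder markup standing in for a fetched page:
sample = ('<p><a href="https://example.com/a" rel="nofollow">comment link</a> '
          '<a href="/about">internal nav link</a></p>')
audit = LinkAudit()
audit.feed(sample)
print("followed:", audit.followed)      # can pass link juice
print("nofollowed:", audit.nofollowed)  # will not
```

Run that over your key pages and you can see at a glance which links actually count.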
Hope this helps...
-
Thanks for your response, Matt. This is my backlink profile: http://www.opensiteexplorer.org/links?site=www.peterrota.com
My traffic has gone down since this happened, but today I checked GWT and 13 of my 16 pages are now reindexed. I have few external links, if anything. I need to add something so that if someone is on a subpage they can get back to the home page, because there is no way of doing that unless they type in my URL. The only reason it's like that is I'm working on a logo which, when clicked, will direct back to the main page.
-
My link profile is here: http://www.opensiteexplorer.org/links?site=www.peterrota.com
The only thing I can think of that could have caused it is this: I left a blog comment on one article on a site, and that site shows recent comments on every page, so my link appears on every page because of it. So basically my site link is on 70 or so pages and growing, with the same anchor text, which is "Peter Rota". I think that may be causing the problem, even though all these links are nofollow.
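If it helps, here's a quick Python sketch for measuring how concentrated your anchor text actually is - the backlink rows below are made up; you'd swap in an export from Open Site Explorer or GWT:

```python
from collections import Counter

# Hypothetical backlink export rows: (linking page, anchor text, nofollow?)
backlinks = [
    ("http://blog.example.com/post-1", "Peter Rota", True),
    ("http://blog.example.com/post-2", "Peter Rota", True),
    ("http://other.example.org/review", "peterrota.com", False),
]

# Tally anchor text regardless of which page it came from
anchors = Counter(anchor.lower() for _, anchor, _ in backlinks)
total = sum(anchors.values())
for anchor, count in anchors.most_common():
    print(f"{anchor!r}: {count}/{total} ({count / total:.0%})")
```

If one phrase dominates the list, that's the over-optimized anchor profile people worry about post-Penguin.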
PS: I checked GWT today and now 13 of 16 pages have been indexed. Thanks.
-
I would say, "I don't care" (to myself, not you!) and work on building authority for the site with links. Give Google a reason to trust each page. Honestly, there are so many reasons this could have happened that speculation is just not worth the effort (e.g., content duplicated off-site or throughout the site, reused title tags, lack of trust, etc.).
Just curious, what does your link profile look like right now?
-
I have seen this happen with new sites: when they initially submit a sitemap, most of the pages are indexed. However, over time this drops, and it usually comes down to the perceived value of the pages that are no longer indexed. Are those pages thin on content? Do they have any external links pointing to them? Do you have a good internal link structure between all your pages? A new site is more vulnerable to index fluctuations due to factors like these, and to the fact that it is likely to have low domain and page authority. I would suggest you look at the factors mentioned above and also make sure you have all the on-page factors covered, making sure they are unique. If none of the above apply, the other times I have experienced a sudden drop in pages indexed have been when a new enhancement caused page load speed to decrease. What have your traffic levels been like since this change – up, down or constant?
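One quick way to compare what you submitted against what got indexed is to parse the sitemap yourself. A rough Python sketch - the sitemap contents and the indexed count here are invented placeholders, not real data from the site:

```python
import xml.etree.ElementTree as ET

# Placeholder standing in for the sitemap.xml you actually submitted:
sitemap_xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about</loc></url>
  <url><loc>http://www.example.com/services</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
submitted = [loc.text for loc in
             ET.fromstring(sitemap_xml).findall("sm:url/sm:loc", ns)]

indexed_count = 1  # hypothetical number reported by GWT
print(f"{len(submitted)} URLs submitted, {indexed_count} indexed")
print(f"{len(submitted) - indexed_count} to review for thin content or weak internal links")
```

The gap between the two numbers gives you the exact list of pages to audit against the questions above.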
Related Questions
-
Diminishing Returns for Links to an Unrelated Page
Suppose I have a new website about cars, and I had created a page about something completely unrelated - like cupcakes. However, I found that it was very easy to get high-quality sites to link to the cupcakes page, whereas it was very difficult to get people to link to the homepage about cars. If my goal is to increase the SEO for the homepage (which, again, is related to cars), is there a point where additional high-quality links to my cupcakes page are no longer useful for it? What if I created another page - about frosted cupcakes - which was also easy to get high-quality links to?
White Hat / Black Hat SEO | | wlingke10 -
If I am getting links on competitor websites, is it safe to assume those competitors are doing this to hurt our SEO?
We have received a few notifications from Google Webmaster Tools and Moz that our competitors have "mentioned" our page on their website. This is incredibly odd, as you wouldn't think they'd want to do this. Further, when I go to the page that we are supposedly mentioned on, the link to our site is not on the page. What is going on? Thank you in advance for your insights!!
White Hat / Black Hat SEO | | brits0 -
Competitor website, how come they get away with it?
Hi, we have been looking at competitors' websites to see how we can improve. This website jumped out at me straight away as spammy: gateway pages where 3 words were the only difference across all of the pages. Why does Google still give them so much weight and rank them so highly? I thought this is what G was trying to avoid? Am I missing something here in terms of a great SEO opportunity? I checked for noindex or canonical tags and I cannot see any. Love to hear some feedback. Cheers
White Hat / Black Hat SEO | | PottyScotty0 -
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small scale test on this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages." So, Mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "if Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | | CSawatzky1 -
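For what it's worth, the randomized-template idea the engineer describes could look something like this in Python - a sketch only, with the template strings paraphrased from the question and the topic/city values invented:

```python
import random

# Paraphrased from the engineer's proposal; slots get filled per page.
templates = [
    "Our {topic} training is easy to find in the {city} area.",
    "Find your {topic} training course in {city} with ease.",
]

def intro_paragraph(topic, city, seed=None):
    """Pick one template variant and fill in the topic/city slots."""
    rng = random.Random(seed)  # seed only makes the pick repeatable
    return rng.choice(templates).format(topic=topic, city=city)

print(intro_paragraph("SharePoint", "New York, NY", seed=7))
```

Two templates across 700+ pages is still very repetitive, though; the more variants (and the more genuinely local content around them), the less template-like the pages read.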
How to Handle Sketchy Inbound Links to Forum Profile Pages
Hey Everyone, we recently discovered that one of our craft-related websites has a bunch of spam profiles with very sketchy backlink profiles. I just discovered this by looking at the Top Pages report in OpenSiteExplorer.org for our site, and noticed that a good chunk of our top pages are viagra/levitra/etc. type forum profile pages with loads of backlinks from sketchy websites (porn sites, sketchy link farms, etc.). So, some spambot has been building profiles on our site and then building backlinks to those profiles. Now, my question is...we can delete all these profiles, but how should we handle all of these sketchy inbound links? If all of the spam forum profile pages produce true 404 Error pages (when we delete them), will that evaporate the link equity? Or, could we still get penalized by Google? Do we need to use the Link Disavow tool? Also note that these forum profile pages have all been set to "noindex,nofollow" months ago. Not sure how that affects things. This is going to be a time waster for me, but I want to ensure that we don't get penalized. Thanks for your advice!
White Hat / Black Hat SEO | | M_D_Golden_Peak0 -
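If the disavow tool does become necessary for cases like the one above, the file format is plain text: one `domain:` or URL entry per line, with `#` comments. A minimal Python sketch for generating one - the domains listed are made up:

```python
# Made-up domains pulled from a backlink report; replace with your own list.
spam_domains = ["bad-pills.example", "link-farm.example", "warez.example"]

lines = ["# links pointing at deleted spam profile pages"]
lines += [f"domain:{d}" for d in sorted(spam_domains)]
disavow = "\n".join(lines) + "\n"

# This is the file you would upload via the disavow tool
with open("disavow.txt", "w") as f:
    f.write(disavow)
print(disavow)
```

Using `domain:` entries rather than individual URLs covers every page on a spam domain at once.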
Index page de-indexed / banned ?
Yesterday Google removed our index page from the results. Today they also removed the language subdomains (fr.domain.com). The index page and subdomains are not indexed anymore. Any suggestions? -- No messages in GWT. No malware. Backlink diversification was started in May. Never hit by Penguin or Panda. Last week we had an all-time record of daily UVs. Other pages are still indexed and driving traffic; around 40% of the total is left. Never used any black SEO tool. 95% of backlinks are related; sidebar, footer links. No changes made to the index page for a couple of months.
White Hat / Black Hat SEO | | bele0 -
Would the same template landing page (placed on 50+ targeted domains) help or hurt my ranking?
Scenario: Company ABC has 50 related domains that are being forwarded to the main company URL. Q1: Would there be SEO value in creating a template landing page for each domain that includes product info, photos and keyword links to the main URL? Q2: If all 50+ landing pages were the same, would that penalize the main site due to duplicate content?
White Hat / Black Hat SEO | | brianmeert0 -
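One way to gauge "how duplicate" templated landing pages like those really are is word-shingle overlap. A small Python sketch with two invented page intros that differ only by city name:

```python
def shingles(text, n=3):
    """Set of n-word shingles from a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Two made-up landing-page intros differing only in the city name:
page_a = "Company ABC offers fast reliable widget repair in Denver and nearby towns"
page_b = "Company ABC offers fast reliable widget repair in Austin and nearby towns"

score = jaccard(page_a, page_b)
print(f"3-word shingle overlap: {score:.0%}")
```

A score near 100% means the pages are, for practical purposes, the same document with a find-and-replace applied.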
For traffic sent by the search engines, how much personalization/customization is allowed on a page if any?
I want to better target my audience, so I would like to be able to address the exact query string coming from the search engine. I'd also like to add relevant sections to the site based on the geo area visitors live in. Can I customize a small portion of the page to fit my visitor's search query and geo area per the IP address? How much can I change a web page to better fit a user and still be within the search engine's guidelines?
White Hat / Black Hat SEO | | Thos0030