Getting 260,000 pages re-indexed?
-
Hey there guys,
I was recently hired to do SEO for a big forum: move the site to a new domain and get it back up to its old rankings after the move. This all went quite well, except that we lost about a third of our traffic. I expected some drop, but this is quite a lot and I'm wondering what's causing it. The big keywords are still pulling the same traffic, but I feel that a lot of the smaller threads on the forum have been de-indexed. Now, with a site with 260,000 threads, do I just take the loss and focus on new keywords? Or is there something I can do to get all these threads re-indexed?
Thanks!
-
Great, I'm going to try that, thanks a lot!
-
Link to your category pages... Or a good idea might be to prepare pages by topic that feature (and link to) some of the most informative and popular threads.
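If it helps, here's a rough sketch of how you might pick which threads to feature on those topic pages. All of the thread data, field names, and the views-based scoring below are made up for illustration; in practice you'd pull this from your forum's database or analytics:

```python
from collections import defaultdict

# Hypothetical thread records: (topic, title, url, views) -- sample data for illustration
threads = [
    ("hosting", "Best VPS providers?", "/threads/101", 5400),
    ("hosting", "Shared vs dedicated", "/threads/102", 900),
    ("seo", "301 vs 302 redirects", "/threads/201", 7200),
    ("seo", "Canonical tags explained", "/threads/202", 3100),
    ("seo", "Robots.txt basics", "/threads/203", 800),
]

def top_threads_by_topic(threads, n=2):
    """Pick the n most-viewed threads per topic to feature (and link to) on a hub page."""
    by_topic = defaultdict(list)
    for topic, title, url, views in threads:
        by_topic[topic].append((views, title, url))
    return {
        topic: [(title, url) for views, title, url in sorted(items, reverse=True)[:n]]
        for topic, items in by_topic.items()
    }
```

Each hub page then gives the spiders a crawl path down into threads that would otherwise be many clicks deep.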
-
-
We didn't actually do a 404, we 301'd everything, and I do mean everything, to our new domain.
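For reference, the mapping behind that kind of blanket 301 is basically a straight host swap that preserves the path and query string. A minimal sketch of the idea (domain names are placeholders, not our actual domains):

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "old-forum.example"   # placeholder domains for illustration
NEW_HOST = "new-forum.example"

def rewrite(old_url):
    """Map an old-domain URL to its new-domain equivalent, keeping path and query."""
    parts = urlsplit(old_url)
    if parts.netloc != OLD_HOST:
        return old_url  # leave URLs on other hosts untouched
    return urlunsplit((parts.scheme, NEW_HOST, parts.path, parts.query, parts.fragment))
```

In production this would live in the web server's redirect rules rather than application code, but the one-to-one URL mapping is the important part: every old thread URL 301s to the same thread on the new domain.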
-
Yes
-
Aye, that's what I thought as well
-
Nothing changed except for the ads, which we placed better. The site speed is the same because we didn't move hosts; it actually improved lately thanks to someone we hired to optimize it. The inbound backlinks have transferred and we are building new ones. The thing is, the site itself is ranking really well for its new keywords; it's just these old ones that have apparently died.
-
-
260,000 threads indeed, though they go back to 2006, so we've had some time to accumulate posts.
Throwing those PR5 links in there would help, of course, but where do I point them? How deep do I link? I could link to all 260,000 threads, but I believe that would be a little crazy.
-
Check list:

- 404s: done
- 301s: done
- It's been two months, so by now Google must have settled down with the traffic.

How about on-page factors?

- Page title
- Layout
- Ads
- Site speed
- Outbound linking

You need to check whether they are all the same. If it's not one of these, then I'm afraid I can't come up with any more points to help you with.
-
-
While this may be true in a general sense, I would like to point out that the loss of traffic is caused by the move to the new domain.
-
Almost two months now.
-
How long has it been since you have moved your site ?
-
260,000 threads?
How many inbound links do you have to hold all of that page mass in the index?
If you don't have lots of high PR deep links into the site the spiders will visit obscure pages infrequently and will forget about them.
You need to link deep into these pages at multiple points with heavy PR. That will force a continuous and recurring stream of spiders down into the mass and require them to chew their way out. I think that you need a few dozen PR5 links at least for healthy indexing.
-
We've checked Google Webmaster Tools for 404s and crawl errors, all of which we fixed a day after moving. I can't check all the pages in the SEOmoz tools because of the limit. We did do a complete 301, actually, redirecting every page to its new location.
-
I would check Google Webmaster Tools for 404s and crawl errors and fix those first.
I would then do the same using the SEOmoz tools.
After all that, I would do a complete 301 from the old domain to the new domain.
Hope this helps
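If you're sorting through a big crawl export, a quick triage script can help prioritize the fixes. A rough sketch; the report rows below are an invented format for illustration (a real Webmaster Tools export would need parsing first):

```python
# Hypothetical crawl-report rows: (url, http_status) -- invented format for illustration
crawl_report = [
    ("/threads/1", 200),
    ("/threads/2", 404),
    ("/old/threads/3", 301),
    ("/threads/4", 404),
    ("/threads/5", 500),
]

def triage(rows):
    """Bucket crawled URLs so 404s and server errors can be fixed first."""
    buckets = {"ok": [], "redirect": [], "not_found": [], "server_error": []}
    for url, status in rows:
        if status == 404:
            buckets["not_found"].append(url)
        elif 300 <= status < 400:
            buckets["redirect"].append(url)
        elif status >= 500:
            buckets["server_error"].append(url)
        else:
            buckets["ok"].append(url)
    return buckets
```

The "not_found" bucket is the list to 301 (or restore) before worrying about anything else.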