Adding Orphaned Pages to the Google Index
-
Hey folks,
How do you think Google will treat adding 300K orphaned pages to a 4.5-million-page site? The URLs would resolve, but there would be no on-site navigation to those pages; Google would only know about them through sitemap.xml files.
These pages target super-low-competition terms.
The plot thickens: what we are really after is getting 150k real pages back on the site. Those pages do have crawlable paths on the site, but for technical reasons, to do that we would also have to push these other 300k orphaned pages live (it's an all-or-nothing deal).
a) Do you think Google will have a problem with this, or will it just decide not to index some or most of these pages since they are orphaned?
b) If these pages will just fall out of the index or never get included, and have no chance of ever accumulating PR anyway since they are not linked to, would it make sense to just noindex them? (Rough sketch of what I mean below.)
c) Should we skip submitting sitemap.xml files for them entirely, take our 150k, and just ignore these 300k, hoping Google ignores them as well since they are orphaned?
d) If Google is OK with this, maybe we should submit the sitemap.xml files and keep an eye on the pages; maybe they will rank and bring us a bit of traffic. But we don't want to do that if it could cause an issue with Google.
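To clarify what I mean by noindexing in (b): we would not hand-edit 300k templates; it would be something like an X-Robots-Tag response header applied to the orphaned section. A rough sketch of the idea, purely illustrative (our stack is not necessarily Flask, and the URL prefix below is made up):

```python
# Illustrative sketch only: the real stack isn't specified anywhere in this
# thread, and the "/orphaned/" prefix is invented for the example.
# The idea: the 300k orphaned URLs keep resolving (HTTP 200), but every
# response from that section carries an X-Robots-Tag header telling Google
# not to index it, while the 150k linked pages stay indexable.
from flask import Flask, request

app = Flask(__name__)

ORPHANED_PREFIX = "/orphaned/"  # hypothetical path pattern for the 300k pages

@app.after_request
def add_noindex_header(response):
    if request.path.startswith(ORPHANED_PREFIX):
        response.headers["X-Robots-Tag"] = "noindex"
    return response
```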
Thanks for your opinions, and if you have any hard evidence either way, that would be especially appreciated.
-
It's not a strategy; it's due to technical limitations on the dev side. I agree, though. Thanks.
So, I asked this question of a very advanced SEO guru, and he said the pages could be seen as doorways and present some risk, so he advised against it. That, combined with the probability that they will most likely get dropped from Google's index anyway, plus the fact that Google says it wants pages to be part of the site's architecture, has me leaning towards noindexing all of them and maybe experimenting with allowing 1,000 to get indexed to see what happens with them.
Thanks for your input, folks.
-
I'd go back to the drawing board and rework your strategy.
Do you need additional sites? 150K orphaned pages that you want indexed sounds like spam or poor site architecture to me.
-
Yikes, I didn't know the site was that big. Even so, if you're worried about how Google would "react" to those orphaned pages, I'd still test small, regardless of how large your overall site is.
-
Yeah, 1,000 is probably a big enough sample.
10,000 seems like a lot, I guess, but not when you've got a site with 4.5 million pages.
-
Yeah, submitting sitemap.xml files for 300k pages that are not part of the site does seem a bit obnoxious.
-
We definitely want the 150k in the index, since they are legitimate pages and are linked to on the site. It's the 300k orphaned ones we have to take along as a package deal that I am worried about; that may be too many orphaned pages for Google.
-
That's a good idea. 10,000 is still a lot, though; you could even test with fewer pages. Why not try 1,000?
-
Hmmm. I am leaning towards the following solution, since I would rather err on the cautious side. Maybe this makes sense:
a) We noindex these 300k orphaned pages and do not submit sitemap.xml files for them.
b) We experiment with, say, 10,000 pages: we allow only those to get indexed and submit sitemap.xml files for them (see the sketch after this list).
c) We closely monitor their indexing and ranking performance so we can determine whether these pages are even worth opening up to Google and taking any risk on.
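To make (b) concrete, here is a minimal sketch of pulling a 10,000-URL sample out of the orphaned set and writing it to its own sitemap file, so the test group can be submitted and watched separately. It's Python, and the function and file names are invented; a single sitemap file can hold up to 50,000 URLs, so one file covers the whole test group:

```python
# Rough sketch of point (b): pick a 10,000-URL test sample from the orphaned
# set and write it into its own sitemap file, then keep the sample list so
# the same URLs can be tracked later.
import random
from xml.sax.saxutils import escape

def write_test_sitemap(orphaned_urls, sample_size=10_000,
                       out_path="sitemap-orphan-test.xml"):
    sample = random.sample(orphaned_urls, min(sample_size, len(orphaned_urls)))
    with open(out_path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in sample:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")
    return sample  # the tracked test group
```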
-
In my opinion, add the 150k pages to the sitemap along with the 300k pages, let Google index all of them, and once they are all indexed you can take a call on de-indexing the 300k orphaned pages based on their traction.
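If you do go that route, one practical detail: the 50,000-URL-per-file limit means both groups need several sitemap files anyway, and keeping the 150k linked pages and the 300k orphaned pages in separately named files makes it much easier to compare how many from each group actually get indexed. A rough Python sketch, with all file and domain names invented for illustration:

```python
# Rough sketch: 150k + 300k URLs won't fit in one sitemap file (50,000-URL
# limit per file), so each group gets its own numbered files and one sitemap
# index ties them together. File and domain names here are invented.
from xml.sax.saxutils import escape

def sitemap_file_names(group_name, url_count, max_per_file=50_000):
    files_needed = -(-url_count // max_per_file)  # ceiling division
    return [f"sitemap-{group_name}-{n}.xml" for n in range(1, files_needed + 1)]

def write_sitemap_index(base_url, file_names, out_path="sitemap-index.xml"):
    with open(out_path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in file_names:
            f.write(f"  <sitemap><loc>{escape(base_url + '/' + name)}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

# e.g. 3 files for the linked pages, 6 for the orphaned ones:
names = sitemap_file_names("linked", 150_000) + sitemap_file_names("orphaned", 300_000)
write_sitemap_index("https://www.example.com", names)
```

Submitting just the index file then covers everything, and the sitemap report breaks the indexed counts down per file, so each group can be judged on its own.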
-
I have no hard evidence, but if it were my site, I would do option C but keep an eye on what happens, and if I noticed anything strange happening, I would implement option B. But if option C makes you nervous, I see no reason you couldn't or shouldn't noindex them right off the bat.
That's merely one person's opinion, however.