Geolocation and Indexing
-
Hi all,
Our company owns a site that has over 5 million pages in the Google index. We are located in Germany, but our business is aimed at the US market.
So, recently I checked our site's index using region targeting for the US and there were only 150k pages, but when I checked targeting for Germany there were almost 5 billion pages.
Our server/IP is located in the US, and all our backlinks are from US sites.
So why is only a small part of the site indexed in the US?
Regards,
Dmitry
-
What exactly do you mean by "when I checked targeting for Germany there were almost 5 billion pages"? I know you are referring to "million", but how did you arrive at that number?
Other things to check:
-
Submit an updated sitemap to Google. How many pages show in the sitemap?
-
What type of navigation does your site offer? Is all of the navigation visible in HTML?
-
Some sites offer dozens of versions of the same page: a print-friendly version, sorted ascending by price, sorted descending by price, sorted by size, and many other properties. Each sort variant is a different URL on your site. You can have a site with 150k canonical pages but 5 million actual pages; Google will not list the duplicate pages.
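To make that last point concrete, here is a rough Python sketch (the URLs and parameter names are hypothetical, not taken from the poster's site) of how presentation-only parameters multiply one canonical page into many crawlable URLs, and how stripping them collapses the duplicates back down:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Query parameters that only change presentation, not content (hypothetical names)
PRESENTATION_PARAMS = {"sort", "order", "view", "print"}

def canonicalize(url: str) -> str:
    """Strip presentation-only query parameters so duplicate
    variants of a page collapse to one canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in PRESENTATION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "https://example.com/widgets?sort=price_asc",
    "https://example.com/widgets?sort=price_desc",
    "https://example.com/widgets?view=print",
    "https://example.com/widgets",
]
print(len(urls))                             # 4 crawlable variants
print(len({canonicalize(u) for u in urls}))  # 1 canonical page
```

Multiply that ratio across a large catalog and a 150k-page site can easily expose millions of URLs, most of which Google will fold away as duplicates.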
-
-
Sorry, that was my mistake; I meant 5 million pages.
Unfortunately I can't share the domain name. Our site is 100% US-focused, with English content. I'm asking whether there are other issues (besides Panda, content, etc.) that could cause this situation with regional indexation.
-
Our company owns a site that has over 5 billion pages in the Google index.
In order to help you, some specifics would be needed. What is the URL of the site?
Off the top of my head I would think Amazon.com is one of the biggest sites around and they have around 320 million pages indexed. The largest forum site in the world has about 16 million pages indexed by Google.
The only site I can think of with billions of indexed pages would be a scraper or other form of content manipulation website.
You mentioned you are located in Germany, so naturally your pages are going to be considered most relevant there. If you wish to be more relevant to US searchers, the content would need to be presented in US English and use US measurements, currency, references, etc. You would want links from US sites as well. You could go into Google WMT and set the US as your preferred country, but that would mean losing a significant amount of your German indexing.
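If both markets matter, one alternative to committing the whole domain to a single preferred country is hreflang annotations, which tell Google which regional version of a page to serve where. A minimal sketch, assuming hypothetical example.com URLs (adjust to your own structure):

```html
<!-- In the <head> of each regional version of the page (hypothetical URLs): -->
<link rel="alternate" hreflang="en-US" href="https://www.example.com/us/page/" />
<link rel="alternate" hreflang="de-DE" href="https://www.example.com/de/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/" />
```

Each regional version should list all alternates, including itself, and the annotations must be reciprocal across versions or Google may ignore them.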
Also consider that the US has fully implemented Panda. It is coming to Germany but has not been rolled out there yet (to the best of my knowledge; I could be mistaken). If you have a billion or more pages, I am going to speculate that a huge percentage of them are duplicated, both internally within your site and externally across the internet. If that is the case, the number of indexed pages will take a huge hit.
If your site is deemed untrustworthy due to scraped content, your entire site may be de-indexed until the issue is resolved.
Related Questions
-
Hreflang tags and canonical tags - might be causing indexing and duplicate content issues
Hi, let's say I have a site located at https://www.example.com, with subdirectories set up for different languages. For example:
https://www.example.com/es_ES/
https://www.example.com/fr_FR/
https://www.example.com/it_IT/
My Spanish version currently has hreflang tags and a canonical tag implemented. My robots.txt file is blocking all of my language subdirectories. For example:
User-agent: *
Disallow: /es_ES/
Disallow: /fr_FR/
Disallow: /it_IT/
This setup doesn't seem right. I don't think I should be blocking the language-specific subdirectories via robots.txt. What are your thoughts? Does my hreflang tag and canonical tag implementation look correct to you? Should I be doing this differently? I would greatly appreciate your feedback and/or suggestions.
International SEO | Avid_Demand
Issues with Baidu indexing
I have a few issues with one of my sites being indexed in Baidu and I'm not too sure how to resolve them:
1. Two subdomains were redirected to the root domain, but both (www. and another) subdomains are still indexed after ~4 months.
2. A development subdomain is indexed, despite no longer working (it was taken down a few months back).
3. There's conflicting information on the best approach to get HTTPS pages indexed in Baidu, and we can't find a good solution.
4. There are hundreds of variations of the home page (and a few other pages) on the main site, where Baidu has indexed lots of parameters. There doesn't appear to be anywhere in their webmaster tools to stop that happening, unlike with Google.
I'm not the one who deals directly with this site, but I believe that Baidu's equivalent of Webmaster Tools has been used where possible to correctly index the site. Has anyone else had similar issues and, if so, were you able to resolve them? Thanks
International SEO | jobhuntinghq
Web Site Migration - Time to Google indexing
Soon we will do a website migration from .com.br to .com/pt-br. We will do this migration when we have lower traffic. We are trying to follow Google's guidelines, applying 301 redirects, a sitemap, etc. I would like to know how long Google generally takes to transfer the relevance of .com.br to .com/pt-br/ via the 301 redirects.
International SEO | mobic
Google does not index UK version of our site, and serves US version instead. Do I need to remove hreflanguage for US?
Webmaster Tools indicates that only 25% of pages on our UK domain with GBP prices are indexed. We have another US domain with identical content but USD prices, which is indexed fine. When I search in Google for site:mydomain I see that most of my pages appear, but in the rich snippets Google shows USD prices instead of the GBP prices we publish on the page (the USD price is not published on the page, and I tested with a US proxy; the US price is nowhere in the source code). Then I clicked on the result in Google to see the cached version of the page, and Google shows me the US product page as the cached version of the UK product page. I use the following hreflang code:
<link rel="alternate" hreflang="en-US" href="https://www.domain.com/product" />
<link rel="alternate" hreflang="en-GB" href="https://www.domain.co.uk/product" />
The canonical of the UK page correctly refers to the UK page. Any ideas? Do I need to remove the hreflang for en-US to get the UK domain properly indexed in Google?
International SEO | lcourse
Is there any reason to get a massive decrease on indexed pages?
Hi, I'm helping with SEO for a big e-commerce site in LatAm, and one thing we've experienced during the last months is that our search traffic has dropped and the number of indexed pages has decreased dramatically. The site had over 2 million indexed pages (which was way too much, since we believe that around 10k would be more than enough to hold the over 6K SKUs), but now this number has decreased to less than 3K in less than 2 months. I've also noticed that most of the results in which the site still appears are .pdf or .doc files, not actual content on the website. I've checked the following: robots.txt (there is no block), Webmaster Tools, penalties, duplicated content. I don't know where else to look. Can anyone help? Thanks in advance!
International SEO | mat-relevance
IP Address Geolocation SEO - Multiple A records, implications?
Hi, we operate an ecom site with a .com TLD. The IP address of the hosting is based in France, and indeed we seem to see quite a lot of traffic from France. How relevant is the A record of the domain for SEO? Is it still an important signal to help Google geolocate? And if that is the case, is there a case for having multiple A records for the domain, e.g. an IP address in France, an IP address in Italy, etc.? Thank you
International SEO | bjs2010
Google Indexing Part two
Hi everybody, I am trying to understand how Google works, so I've been reading and researching a lot, but I still have a problem I cannot solve. My website is in several languages, but its main language is Catalan, so if you visit my website "www.vallnord.com" the default language will be Catalan. But if someone searches on Google.es in Spanish, I would like the Spanish version of the site to be the main result, not the Catalan one. Unfortunately, it does not work like this: for a search query like "Esqui Andorra" the Catalan version is on the 1st page and the Spanish version (www.vallnord.com/es) is on the 4th page. Does anybody know why this is happening or how I can solve it? Regards.
International SEO | SilbertAd
Geolocation Questions
I'm looking to combine my company's US web presence and its United Kingdom web presence under one common look and feel and company name. Seeing as we are fairly small, I'm thinking the best way to do this would be to simply create a "uk" folder and create UK-specific content in there. I would also like to have some geolocation on the site to make sure users receive the content that is relevant to them. With that in mind, here are my questions:
1. Would creating a "locations" page with links between the UK and US versions of the site be enough so that Google is sure to crawl all content? (As I understand it, Google would appear as an American visitor to my geolocation script, and wouldn't see UK content unless there was a page that explicitly directed it there, correct?)
2. I've read elsewhere that I can target specific folders to a specific geographic region using Google Webmaster Tools. However, if the "main" site is US-specific (there would not be a "us" folder), would setting the geographic target for just the "uk" folder still work?
3. Finally, there will unfortunately be some duplicate content between the two sites. (We have a catalog of courses, for example, with different groupings of courses between the two sites, but the individual courses will appear with the same descriptions on both.) What would be the best way to deal with something like that? I would hate to point all canonical links back to the US "main" site on every instance of duplicates, but I'm not sure how else to deal with it.
Thanks for any help you can give. I know this is all a bit top level, but I'm a bit paralyzed with fear of starting, seeing as I've never had to deal with these questions before...
International SEO | TroyCarlson