Can't get Google to index our site, although everything seems fine
-
Hi there,
I am having issues getting our new site, https://vintners.co, indexed by Google, although all the technical and content requirements seem to be in place. In the past I've had far poorer websites, with very bad setups and performance, get indexed faster.
What concerns me, among other things, is that Google Search Console shows Google's crawler visiting from time to time, but it doesn't seem to make progress or even follow any links, and things don't evolve the way GSC's help documentation describes. For instance, after our sitemap.xml was submitted it seemed to have an impact for a few days, with many pages appearing in the coverage report as "detected but not yet indexed"; now they have disappeared from the coverage report, as if the sitemap were no longer detected.
Does anybody have advice on how to speed up the indexing of a new website like ours? It launched almost two months ago, and I expected to get indexed quickly, at least on some core keywords.
-
Hello, I just saw this thread of comments. Good indexing by Google takes time, but more importantly, make sure those pages are of high quality, from the content to the keywords and structure.
In some cases, when a page is translated, Google can detect a low-quality translation or one done by automatic translators, and it may not index that page properly. It is advisable to have your web content translated by agencies that offer professional website translation services, such as Blarlo.
-
@pau4ner Thanks, very helpful and somewhat comforting. For the FR translations, indeed we are doing them "a posteriori". Since we have a canonical, I was thinking it could not hurt, even if of course it is better once everything has been translated.
For the category pages that you suggest to noindex, are you thinking of pages like https://vintners.co/regions/france-greater-than-loire-greater-than-anjou, which are probably good as internal links but are just content listings?
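For reference, if those listing pages end up noindexed, the usual pattern is a robots meta tag that still allows link-following, so the pages keep working as internal-link hubs even though they stay out of the index. A minimal sketch:

```html
<!-- On each thin listing page: keep it crawlable for link discovery,
     but ask search engines not to index it -->
<meta name="robots" content="noindex, follow" />
```

This goes in the page's head; `follow` is the default anyway, so `noindex` alone behaves the same.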
-
I have the same problem. I created a gross-net salary calculator for Germany, and I proactively reply to questions on Quora, Facebook groups, and Reddit with links to my site. But my site is still not indexed. Also, when I check my site's speed score, I only get 30 for mobile. How much does this influence indexing speed?
-
Hi, nowadays new sites can take longer to get indexed by Google. Your site is indexed, although not all pages.
I also see that you have submitted two sitemaps. It is not really necessary in that case. Having a quick look at other technical issues everything seems fine to me.
I would try to get some more external links and also add more internal linking. This website has many outbound links in your posts but almost no internal links, and that is something I would change. I would also noindex thin content pages, such as some not necessary category pages.
I assume the content is 100% original and hreflang is implemented across your language variations. I have noticed, however, that in the French version you still keep your titles and blog posts in English. They are canonical to your English originals, but I'd probably have them translated and add hreflang instead.
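A sketch of what the hreflang annotations could look like in the head of each post, with hypothetical URLs for the English and French versions (each page lists all of its language alternates, including itself, and the annotations must be reciprocal across versions):

```html
<!-- Hypothetical URLs; repeat the same set on both the EN and FR pages -->
<link rel="alternate" hreflang="en" href="https://vintners.co/blog/sample-post/" />
<link rel="alternate" hreflang="fr" href="https://vintners.co/fr/blog/sample-post/" />
<link rel="alternate" hreflang="x-default" href="https://vintners.co/blog/sample-post/" />
```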
Apart from that, getting more links to your website and improving the ratio of internal to outbound links should work; I've had some recent cases where this helped, although it took 4-5 months to solve the indexing issues.
-
@tom-capper Thanks for the reply. The concern is indeed that only a few pages have been indexed (ranking will be a concern later), and that although the sitemap has been discovered by Google, and Google Search Console says all the pages have been discovered, they do not seem to be getting indexed.
I'm surprised, as I've had websites in the same domain, also with very few external links, that got indexed much faster!
-
The site appears to be indexed, but maybe not all pages.
Is your concern that the other pages are not being indexed, or that the pages that are already indexed are not ranking for any keywords?
I suspect that in either case it doesn't help that this site has almost no external links (DA 1); at this level of obscurity, Google will not prioritise crawl resources for it.
Related Questions
-
What Should We Do to Fix Crawled but Not Indexed Pages for Multi-location Service Pages?
Hey guys! I work as a content creator for Zavza Seal, a contractor out of New York, and we're targeting 36+ cities in the Brooklyn and Queens areas with several home improvement services. We got about 340 pages into our multi-location strategy, targeting our target cities with each service we offer, when we noticed that 200+ of our pages were "Crawled but not indexed" in Google Search Console. Here's what I think we may have done wrong. Let me know what you think...
- We used the same page template for all pages. (We changed the content and sections, formatting, targeted keywords, and entire page strategy for areas with unique problems, trying to keep the user experience as unique as possible to avoid duplicate content or looking like we didn't care about our visitors.)
- We used the same featured image for all pages. (I know this is bad and wouldn't have done it myself, but hey, I'm not the publisher.)
- We didn't use rel canonicals to tell search engines that these pages were made specially for the areas.
- We didn't use alt tags until about halfway through.
- A lot of the URLs don't use the target keyword exactly.
- The NAP info and Google Maps embed are in the footer, so we didn't use them on the pages.
- We didn't use any content about the history of the city or anything like that. (On some pages we did use content about historic buildings, low water tables, flood-prone areas, etc., if the area was known for that.)
We were thinking of redoing the pages, starting from scratch and building unique experiences around each city, with testimonials, case studies, and content about problems that are common for property owners in the area, but I think they may be fixable with a rel canonical, city-specific content, and unique featured images on each page. What do you think is causing the problem? What would be the easiest way to fix it?
I knew the pages had to be unique, so I switched up the page strategy every 5-10 pages out of fear that duplicate content would creep in, because you can only say so much about, for example, "basement crack repair". Please let me know your thoughts. Here is one of the pages that is indexed, as an example: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/ Here is one like it that is crawled but not indexed: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/ I appreciate your time and concern. Have a great weekend!
Local SEO | everysecond0
-
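On the canonical point in the question above: for near-template location pages, a self-referencing canonical on each page (rather than pointing them all at one master page) tells Google that each city page is the preferred version of itself. A minimal sketch, using the indexed URL from the question:

```html
<!-- In the <head> of each location page: a self-referencing canonical -->
<link rel="canonical" href="https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/" />
```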
Unsolved What would the exact text be for robots.txt to stop Moz crawling a subdomain?
I need Moz to stop crawling a subdomain of my site, and am just checking what the exact text should be in the file to do this. I assume it would be:

User-agent: Moz
Disallow: /

But just checking so I can tell the agency who will apply it, to avoid paying for their time with the incorrect text! Many thanks.
Getting Started | Simon-Plan
-
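On the robots.txt question above: as far as I know, Moz's crawlers don't identify themselves as "Moz"; the site-audit crawler is rogerbot and the link-index crawler is DotBot. Also, robots.txt is read per host, so the file has to be served from the subdomain itself, not the main domain. A sketch, with a hypothetical subdomain:

```
# Served at https://subdomain.example.com/robots.txt
User-agent: rogerbot
Disallow: /

User-agent: dotbot
Disallow: /
```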
Should I split long form content?
I have quite a long piece of content on my site: around 8,000-9,000 words. I optimized it to cover almost all searches related to the topic, but this length makes me uneasy for some reason. I don't think users will find what they are looking for in such long content. However, I don't want to neglect the SEO aspect. Without sharing the exact keywords, the structure is something like this:
- Title + for girls
- Title + for boys
- Title + for kids
- Title + for girlfriend
- Title + for boyfriend
- Title + for students
As I said, at the moment these are all subheadings (H2) of the 8,000-9,000-word piece. If I made separate content for each of them, I could bring them all to around 1,500-2,000 words each. However, I am undecided whether this is the right step in terms of SEO and content optimization. What are your views?
SEO Tactics | mozasea0
-
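If the long page above were split, one common pattern is a hub page linking to each sub-page, with each sub-page linking back, so the topical cluster stays internally connected and no page is orphaned. A hypothetical sketch with invented URLs:

```html
<!-- Hub page at /title/: links out to each split-off sub-page -->
<ul>
  <li><a href="/title/for-girls/">Title for girls</a></li>
  <li><a href="/title/for-boys/">Title for boys</a></li>
  <li><a href="/title/for-kids/">Title for kids</a></li>
  <li><a href="/title/for-students/">Title for students</a></li>
</ul>

<!-- Each sub-page: link back to the hub -->
<a href="/title/">Back to the full Title guide</a>
```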
Sudden Drop in Mobile Core Web Vitals
[Image: Mobile Core Web Vitals screengrab] For some reason, after all URLs were previously classified as Good, our Mobile Web Vitals report suddenly shifted to the above, and it doesn't correspond with any site changes on our end. Has anyone else experienced something similar, or have any idea what might have caused such a shift? Curiously, I'm not seeing a drop in session duration, conversion rate, etc. for mobile traffic despite the seemingly sudden change.
Technical SEO | | rwat0 -
'noindex' detected in 'robots' meta tag
Pages on my site show the error "'noindex' detected in 'robots' meta tag". However, when I inspect the pages' HTML, it does not show noindex; in fact, it shows index, follow. The majority of pages show the error and are not indexed by Google... Not sure why this is happening. The page below in Search Console shows the error above...
Technical SEO | Sean_White_Consult0
-
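One possibility worth ruling out for the question above: Google also honours a noindex sent as an X-Robots-Tag HTTP response header (sometimes injected by a plugin, server rule, or CDN), which viewing the page source will never show. A small sketch of the check, using hypothetical response headers:

```python
def robots_header_directives(headers):
    """Collect robots directives delivered via HTTP headers
    (case-insensitive match on the header name)."""
    return [v.strip() for k, v in headers.items()
            if k.lower() == "x-robots-tag"]

# Hypothetical response headers for an affected page: the HTML meta tag
# can say "index, follow" while this header still blocks indexing.
headers = {
    "Content-Type": "text/html; charset=utf-8",
    "X-Robots-Tag": "noindex",
}

print(robots_header_directives(headers))  # → ['noindex']
```

In practice you would grab the real headers with your HTTP client of choice (or your browser's network tab) and look for that header.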
What tool can I use to get the true speed of my site?
Hi, I am trying to get the true speed of my site. I want to know how fast www.in2town.co.uk is, but the tools I am using give me different readings. http://tools.pingdom.com/fpt/#!/DkHoNWmZh/www.in2town.co.uk says the speed is 1.03s, http://gtmetrix.com/reports/www.in2town.co.uk/i4EMDk34 says 2.25s, and http://www.vertain.com/m.q?req=cstr&reqid=dAv79lt8 says 4.36s, so as you can see I am confused. I am trying to get the site as fast as possible, but I need to know the correct speed so I can work on the things that need changing to make it faster. Can anyone also let me know what speed I should be aiming for? Many thanks.
Technical SEO | ClaireH-1848860
-
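On the speed question above: the tools disagree partly because each tests from a different location, browser, and connection profile, so there is no single "true" speed. One hedged way to compare is to collect several samples per tool and look at the median and the spread rather than any single run. A minimal sketch, using the three (made-up here) readings from the question converted to milliseconds:

```python
import statistics

def summarize_timings(samples_ms):
    """Summarize repeated page-load samples, in milliseconds."""
    return {
        "median_ms": statistics.median(samples_ms),
        "mean_ms": round(statistics.fmean(samples_ms), 1),
        "spread_ms": max(samples_ms) - min(samples_ms),
    }

# Hypothetical readings (Pingdom, GTmetrix, Vertain), in ms
print(summarize_timings([1030, 2250, 4360]))
```

A large spread like this usually says more about the test conditions than about the site itself.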
For Google + purposes, should the author's name appear in the Meta description or title tag of my web site just as you would your key search phrase?
Relative to Cyrus Shepard's January 4th article regarding Google's superior SEO strategy: if I'm the primary author of all blog articles and website content, and I have a link showing authorship going back to Google Plus, is a site-wide link from the home page enough, or should that show up on all blog posts, editorial comment pages, etc.? Conversely, should the author's name appear in the meta description or title tag of my website just as you would your key search phrase, since Google appears to be trying to make a solid connection between my name and all content?
Technical SEO | lwnickens0
-
Can someone break down 'page level link metrics' for me?
Sorry for the, again, basic question - can someone define page level link metrics for me?
Technical SEO | Benj250