Discrepancy between Search Console & Lighthouse - CLS shift
-
Curious if anyone else is having this problem. I have, for example, a page that Search Console lists with a CLS of 0.44 and flags as a "CLS issue." The same page rendered in Lighthouse shows a CLS of 0 for field data and 0.02 for lab data (both in the "green"). It has been over a month since I made updates to the page to improve CLS. I tried to submit a validation in Search Console, but it came back "validation failed." I'm not sure what else to fix on the page when the Lighthouse data shows it in the green! I have the same issue with other pages as well.
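One likely cause of this kind of gap: Search Console's CLS comes from CrUX field data, which scores CLS over "session windows" of layout shifts across the whole page lifetime, while a Lighthouse lab run only traces the initial load. A shift that fires late (lazy-loaded content, an ad slot, an interaction) can push field CLS up while the lab number stays green. A minimal Python sketch of the session-window math (the timestamps and scores below are made-up illustrations, not your page's data):

```python
def session_cls(shifts):
    """Compute CLS as the worst 'session window' of layout shifts.

    shifts: (timestamp_seconds, score) pairs, sorted by time.
    A session window groups shifts separated by < 1 s gaps,
    capped at 5 s total duration (a simplified sketch of the rule).
    """
    best = current = 0.0
    window_start = prev_time = None
    for t, score in shifts:
        if (window_start is not None
                and t - prev_time < 1.0
                and t - window_start <= 5.0):
            current += score          # shift joins the current window
        else:
            window_start = t          # gap too long: start a new window
            current = score
        prev_time = t
        best = max(best, current)
    return best

# A shift burst long after load: field data sees it, a lab trace may not.
late_shifts = [(0.5, 0.02), (12.0, 0.20), (12.4, 0.24)]
print(round(session_cls(late_shifts), 2))  # 0.44
```

Also worth noting: CrUX is a rolling window of roughly the last 28 days of real-user visits, so even a successful fix takes about a month to fully drain out of the field number, which can make Search Console validation fail in the meantime.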
Related Questions
-
How to recover search volume after domain name change?
On the 3rd of November we changed our company name and domain. The new site was not changed at all, so the 301 process was quite straightforward. The changeover was successful: no downtime, and all pages redirected correctly (with a few minor exceptions). However, after a few days we started to see more and more links into the new site from the old site; they now stand at over 3 million, with over 200K links from the new site to the old site. The links from the new site back to the old were due to us having left a lot of links tucked away on various pages, which were possibly causing loops with the 301 redirects on the old site. We fixed these and there are now no remaining links back to the old site, though we are still showing just over 200K links back to it. We are also seeing a LOT more backlinks on the new site from old junk sites, which are not showing for the old site. A couple of years ago we spent about a year trying to track down and remove thousands of spam backlinks. We did what we could, got a lot removed, showed Google the evidence, and Google lifted the penalty, saying they had made some changes that meant the links were no longer causing it. I added the old disavow file to the new site, but it doesn't cover a fraction of the sites being displayed as providing backlinks, many of which are clearly spammy. Is it possible that Google made some manual actions to lift the penalties but failed to associate these changes with the new domain? Changes that were not included in the disavow file? All help appreciated.
Technical SEO | Exotissimo
-
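When stray links on the new site point back at URLs that 301 to the new site again, the crawler can bounce between the two domains. A quick loop check over a crawl export can confirm whether that was actually happening. A minimal sketch (the `redirect_map` table and URLs are hypothetical; a real check would issue HEAD requests instead):

```python
def trace_redirects(start, redirect_map, max_hops=10):
    """Follow a chain of 301s and flag loops.

    redirect_map is a {source_url: target_url} table, e.g. built
    from a crawl export of the old and new sites.
    Returns (chain_of_urls, loop_detected).
    """
    chain = [start]
    url = start
    while url in redirect_map and len(chain) <= max_hops:
        url = redirect_map[url]
        if url in chain:
            return chain + [url], True  # revisited a URL: loop
        chain.append(url)
    return chain, False

redirects = {
    "https://old.example/page": "https://new.example/page",
    "https://new.example/page": "https://old.example/page",  # stray link back
}
chain, looped = trace_redirects("https://old.example/page", redirects)
print(looped)  # True
```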
ATG & Endeca Integration & SEO implications
Does anyone have any first-hand experience or must-have recommendations around ATG & Endeca integration? I am somewhat familiar with ATG and the Oracle ATG guide, but does anyone have specific SEO considerations they'd like to share? e.g. the jumpservlet and SEO URLs. Thanks!
Technical SEO | ACNINTERACTIVE
-
Backlink density & disavow tool
I am cleaning up my backlink profile for www.devoted2vintage.co.uk, but before I start removing links I wanted some advice on the following: 1) I currently have over 2,000 backlinks from about 200 domains. Is this a healthy ratio, or should I prune it? 2) Is there a recommended maximum number of backlinks per domain? 3) Should I delete links to all or some of the spun PR articles? (Some of the article web pages have over 40 articles with links back to us.)
Technical SEO | devoted2vintage
-
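There is no published "healthy" links-per-domain ratio, but sitewide and spun-article links are the usual pruning targets. If you do end up disavowing, Google's file format accepts one `domain:` line per host. A sketch of flagging referring domains by link count (the 50-link threshold here is an arbitrary illustration, not a Google rule):

```python
from collections import Counter

def disavow_lines(backlinks, max_links_per_domain=50):
    """Build disavow-file entries for domains that link excessively.

    backlinks: list of referring domains, one entry per link.
    Returns lines in Google's disavow format ('domain:example.com').
    """
    counts = Counter(backlinks)
    return [f"domain:{d}"
            for d, n in sorted(counts.items())
            if n > max_links_per_domain]

links = ["spun-articles.example"] * 60 + ["legit-blog.example"] * 3
print(disavow_lines(links))  # ['domain:spun-articles.example']
```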
How to search HTML source for an entire website
Is there a way for me to do a "view source" for an entire website without having to right-click every page and select "view source" for each of them?
Technical SEO | SmartWebPros
-
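One common approach is to mirror the site locally first (e.g. with `wget --mirror https://example.com`) and then search the downloaded files, which avoids viewing source page by page. A minimal sketch, assuming the mirror already exists on disk:

```python
import os

def grep_sources(root, needle):
    """Search every downloaded HTML file under `root` for `needle`.

    Returns the sorted list of file paths whose source contains it.
    """
    hits = []
    for dirpath, _, files in os.walk(root):
        for name in files:
            if not name.endswith((".html", ".htm")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="ignore") as f:
                if needle in f.read():
                    hits.append(path)
    return sorted(hits)
```

Usage would be something like `grep_sources("example.com", 'rel="canonical"')` after the wget run; a desktop crawler such as Screaming Frog offers the same kind of source search without scripting.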
On-Page Report Card & Rel Canonical
Hello, I ran one of our pages through the On-Page Report Card. Among the results we are getting a lower grade due to the following "critical factor": Appropriate Use of Rel Canonical.

Explanation: "If the canonical tag is pointing to a different URL, engines will not count this page as the reference resource and thus, it won't have an opportunity to rank. Make sure you're targeting the right page (if this isn't it, you can reset the target above) and then change the canonical tag to reference that URL."

Recommendation: "We check to make sure that IF you use canonical URL tags, it points to the right page. If the canonical tag points to a different URL, engines will not count this page as the reference resource and thus, it won't have an opportunity to rank. If you've not made this page the rel=canonical target, change the reference to this URL. NOTE: For pages not employing canonical URL tags, this factor does not apply."

This is for an e-commerce site, and the canonical links are inserted automatically by the cart software. The cart is also creating the canonical URL as a relative link, not an absolute URL. In this particular case it's a self-referential link. I've read a ton on this and it seems that this should be okay (I also read that Bing might have an issue with this). Is this really an issue? If so, what is the best practice to pass this critical factor? Thanks, Paul
Technical SEO | rwilson-seo
-
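A relative canonical href is resolved against the page's own URL like any other link, which is why a self-referential relative canonical usually works; emitting an absolute URL simply removes the ambiguity (and the reported Bing edge cases). A sketch of that resolution, with hypothetical URLs:

```python
from urllib.parse import urljoin

def absolutize_canonical(page_url, canonical_href):
    """Resolve a (possibly relative) canonical href against the
    URL of the page it appears on, yielding the absolute target."""
    return urljoin(page_url, canonical_href)

# A cart emitting <link rel="canonical" href="/widgets/blue"> on a
# parameterized URL resolves to the clean absolute page:
print(absolutize_canonical("https://shop.example/widgets/blue?sort=price",
                           "/widgets/blue"))
# https://shop.example/widgets/blue
```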
Iframe & pulling data from higher ranked domain
Hi, I have a question regarding iframes and SEO. I know iframes are bad practice, but I've heard that if you have a brand-new domain and want to improve its ranking more quickly, you can host the website files on a higher-authority domain and load them in an iframe on the new domain. Is this true? For example, if I build and host the website files on www.masterdomain.com (domain authority 48) and then load the pages within an iframe on www.newdomain.com (domain authority 5), will that help increase the domain rank for www.newdomain.com? What are the advantages (if any) and disadvantages for each domain, www.newdomain.com and www.masterdomain.com, if we do this? Thanks
Technical SEO | Essentia
-
Discrepancy between # of pages and # of pages indexed
Here is some background:
1) The site in question has approximately 10,000 pages, and Google Webmaster Tools shows that 10,000 URLs (pages) were submitted.
2) Only 5,500 pages appear in the Google index.
3) Webmaster Tools shows that approximately 200 pages could not be crawled for various reasons.
4) SEOmoz shows about 1,000 pages that have long URLs or page titles (which we are correcting).
5) No other errors are being reported in either Webmaster Tools or SEOmoz.
6) This is a new site launched six weeks ago. Within two weeks of launching, Google had indexed all 10,000 pages and showed 9,800 in the index, but over the last few weeks the number of pages in the index kept dropping until it reached 5,500, where it has been stable for two weeks.
Any ideas of what the issue might be? Also, is there a way to download all of the pages that are included in that index, as this might help troubleshoot?
Technical SEO | Mont
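To get a concrete list of the missing pages, a practical first step is to diff the submitted sitemap against whatever indexed-URL list you can assemble (Google does not offer a full index download, so a `site:` sample or the Webmaster Tools reports are the usual proxies). A sketch of pulling submitted URLs out of the sitemap XML (the sitemap content and URLs below are hypothetical):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> from a sitemap so the submitted URL set
    can be diffed against an indexed-URL list from another source."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://site.example/a</loc></url>
  <url><loc>https://site.example/b</loc></url>
</urlset>"""
submitted = set(sitemap_urls(sitemap))
indexed = {"https://site.example/a"}  # e.g. scraped or exported elsewhere
print(sorted(submitted - indexed))  # ['https://site.example/b']
```

The resulting "submitted but not indexed" list can then be spot-checked for patterns (thin content, duplicate titles, crawl depth) that might explain the drop from 9,800 to 5,500.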