Did Google just release another massive update in September?
-
Our number of external links has dropped by over 50% in mid-September!
So far our domain authority hasn't been impacted and traffic is only slightly down.
I did not hear of any major Google changes . . . did this happen to anyone else?
-
I don't think SEOmoz has that option, but you can check out Ahrefs and Majestic SEO - they both track backlinks gained/lost.
-
Our link building has always been legitimate so I don't think that would explain it. It is very odd to see 20,000 links (half of our total count) disappear in a single week.
It has not seemed to impact much though. At least, not yet . . .
-
How would I go about checking the links that have been dropped? From what I have been told, SEOmoz has no way of looking back at individual historic links . . .
-
Google went through 65 changes during August and September (not all of them related to search, mind you), but some of these changes were aimed at "helping find more high-quality content from trusted sources". So if half of your external links were coming from link farms or from sites Google deems untrustworthy, that would explain it.
Hope that helps.
Mike
-
Have you checked the links that dropped? Maybe your link, or the page it was on, simply no longer exists. Alternatively, Google could be discrediting some negative links, and therefore you aren't seeing them. There were many updates last month (October) and a few in September.
If your traffic is only slightly down, I would just continue to build high quality links to your site and not do anything drastic.
Related Questions
-
Google not detecting Hreflang
Hey everybody, We recently migrated our .co.uk to .com/en. For some reason Google is saying that the .com/en version has no hreflang tags - even though they are clearly there and use the same implementation as the other language versions of the website. We also did a previous migration six months ago for the German version of our website, with no hreflang problems there. We add our hreflang tags to our sitemap - which you can find here:
Technical SEO | | mooj
https://camaloon.com/en/web-sitemap.xml Any help or suggestions would be greatly appreciated!! Thanks 🙏
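For context, hreflang annotations in an XML sitemap generally take the form below. This is an illustrative sketch only - the domain, paths, and language codes are placeholders, not taken from the actual Camaloon sitemap. Note that every URL entry must list all alternates, including a self-referencing one; a missing self-reference is a common reason Search Console reports hreflang as absent.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/en/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/"/>
  </url>
  <url>
    <loc>https://www.example.com/de/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/"/>
  </url>
</urlset>
```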
Does Google read dynamic canonical tags?
Does Google recognize a rel=canonical tag if it's loaded dynamically via JavaScript? Here's what we're using to load it (reformatted for readability; a missing declaration of canonicalLink and an unclosed script tag have been fixed):

<script>
// Inject canonical link into page head
var canonicalLink = window.location.href;
if (window.location.href.indexOf("/subdirname1") != -1) { canonicalLink = window.location.href.replace("/kapiolani", ""); }
if (window.location.href.indexOf("/subdirname2") != -1) { canonicalLink = window.location.href.replace("/straub", ""); }
if (window.location.href.indexOf("/subdirname3") != -1) { canonicalLink = window.location.href.replace("/pali-momi", ""); }
if (window.location.href.indexOf("/subdirname4") != -1) { canonicalLink = window.location.href.replace("/wilcox", ""); }
if (canonicalLink != window.location.href) {
  var link = document.createElement('link');
  link.rel = 'canonical';
  link.href = canonicalLink;
  document.head.appendChild(link);
}
</script>
Technical SEO | | SoulSurfer80 -
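As an aside, the four near-identical branches in the snippet above could be collapsed into a loop. This is only a sketch of the same idea, and it assumes the indexOf placeholders ("/subdirname1" etc.) were meant to match the same path fragments that get replaced - whether Google honors the injected tag remains the open question:

```javascript
// Path fragments whose removal yields the canonical URL.
// These names come from the replace() calls in the snippet above.
var fragments = ["/kapiolani", "/straub", "/pali-momi", "/wilcox"];

// Return the canonical URL for a given href, or the href unchanged
// if no known fragment appears in it.
function canonicalFor(href) {
  for (var i = 0; i < fragments.length; i++) {
    if (href.indexOf(fragments[i]) != -1) {
      return href.replace(fragments[i], "");
    }
  }
  return href; // unchanged: no canonical link needed
}
```

If canonicalFor(window.location.href) differs from the current URL, the link element would be created and appended exactly as in the original snippet.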
Google Sitemap - How Long Does it Take Google To Index?
We changed our sitemap about a month ago and Google has yet to index it. We have run a site: search and we still have many pages indexed, but we are wondering: how long does it take for Google to index our sitemap? The last sitemap we put up had thousands of pages indexed within a fortnight, but for some reason this version is taking much longer. We are also confident that there are no errors in this version. Help!
Technical SEO | | JamesDFA0 -
Google bot notification
Hi there! I've just made some changes to my website in order to optimize it, but I don't know if there's a way to notify Googlebot that some aspects of the configuration (metas) have changed and must be "taken into account". The spider visited my site two days ago and obviously processed the sitemap file. I've heard that it's possible to ping certain websites. Is this the way to proceed? I should say that there are not many updates on the site (just one-way information) as the social media activity is still low. Thanks in advance.
Technical SEO | | juanmiguelcr0 -
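Pinging is indeed the usual approach: Google has historically accepted a plain GET request to its sitemap-ping endpoint with the sitemap's address as a query parameter. A minimal sketch of building that request URL (the sitemap address is a placeholder; substitute your own):

```javascript
// Build the historical Google sitemap-ping URL for a given sitemap.
function sitemapPingUrl(sitemapUrl) {
  return "https://www.google.com/ping?sitemap=" + encodeURIComponent(sitemapUrl);
}

// The sitemap address below is illustrative only.
var ping = sitemapPingUrl("https://www.example.com/sitemap.xml");
// Fetching this URL (e.g. with curl) notifies Google; resubmitting the
// sitemap in Webmaster Tools achieves the same thing.
```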
Google Analytics - Custom Variables
Hi guys, I'd appreciate any advice on this one. At the moment I'm in the process of arranging a URL re-structure. I was wondering what the best way would be to track the performance of the old URLs against the new ones? We will be amending the URLs for any new property pages which go live on our website but leaving the old URLs in place for any old properties listed. We're taking this approach for the moment so we can conduct analysis on the change. It has been mentioned to me that placing a 'setvariable' in the code of pages with the old URLs and ones with the new URLs would be a way of tracking performance. However, my knowledge in this area is a little grey. Any advice? Cheers, Mark
Technical SEO | | MarkScully0 -
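The 'setvariable' suggestion most likely refers to classic (ga.js) Google Analytics' _setCustomVar call. A sketch of tagging old versus new URL pages so they can be segmented in reports - the slot index, the variable name "url-version", and its values are arbitrary illustrative choices, not anything GA requires:

```javascript
// Classic (ga.js) asynchronous tracking queue.
var _gaq = _gaq || [];

// Page-scoped custom variable distinguishing URL versions.
_gaq.push(['_setCustomVar',
  1,              // slot index (classic GA allows 1-5)
  'url-version',  // variable name (illustrative)
  'new',          // use 'new' on re-structured pages, 'old' on legacy ones
  3               // scope: 3 = page-level
]);
_gaq.push(['_trackPageview']);
```

In GA you could then build a custom segment or report on that variable to compare the two URL sets.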
Site being indexed by Google before it has launched
We are currently coming towards the end of migrating one of our retail sites over to Magento. To our horror, we found out today that some pages are already being indexed by Google, and we have started receiving orders through the new site. Do you have any suggestions for what may have caused this? Or, similarly, what the best solution would be to de-index ourselves? We most recently excluded anything with a certain parameter from robots.txt - could implementing this incorrectly have caused the issue? Thanks
Technical SEO | | Sayers0 -
How to block google robots from a subdomain
I have a subdomain that lets me preview the changes I put on my site. The live site URL is www.site.com; the working preview version is www.site.edit.com. The contents of both are almost identical. I want to block the preview version (www.site.edit.com) from Google's robots so that they don't penalize me for duplicated content. Is this the right way to do it? User-Agent: * Disallow: .edit.com/*
Technical SEO | | Alexey_mindvalley0
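For reference, robots.txt directives are scoped per host, so a pattern like .edit.com/* in the live site's file won't match another hostname. The usual approach is to serve a separate robots.txt on the preview subdomain itself (www.site.edit.com, the hostname from the question) with a blanket disallow, along the lines of:

```
User-agent: *
Disallow: /
```

Note that this blocks crawling but does not guarantee de-indexing of already-known URLs; restricting the preview host behind authentication is the more robust option.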