Why do expired domains still work for SEO?
-
Hi everyone
I’ve been running an experiment for more than a year to see whether buying expired domains still works for SEO.
I know it's considered black hat, but like I said, I wanted to experiment; that is what SEO is about.
What I did was buy domains that had just expired, immediately set them up on WordPress, filled them with content relevant to the expired domain, and then started building links from these domains to other relevant sites. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains )
This is nothing new, and SEOs have been doing it for a long time.
There are a lot of rumors in the SEO world that domains become worthless after they expire. But after trying it out for more than a year with about 50 different expired domains, I can conclude that it DOES work, 100% of the time.
Some of the domains are of course better than others, but I can't see any sign that the expired domains, or the sites I link to, have been penalized by Google. The sites I'm linking to rank great with ONLY those links.
So to the question:
WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them?
Is there anyone here who knows how this works technically?
-
Greetings, I am going to weigh in here, not because I am any kind of Yoda at all, but purely from a common sense point of view. I hope that's okay.
I would deduce that if anyone were able to know when a domain was released and how soon it sold thereafter, it would have to be the domain registrar. So let's say, hypothetically, that some domain registrar decides to start publishing a list of domains that were released for sale and then sold immediately. Then let's say Google gets a feed of that list and, via the algorithm, automatically discounts every single one of those domains down to PR 0 and strips them of all potential link authority value...
I'm sure you can see dozens of problems with that scenario. Here are just a few:
1. No one can evaluate the new owner's purpose without knowing the new owner's identity. If registrars disclosed that information, I can't even imagine the number of privacy issues that would arise.
2. Google would be assuming that the new owner is not the same as, or related to, the old company. I'm sure there are plenty of cases where the buyer is connected to the original owner.
3. Google would be assuming that the domain was sold to a new owner in order to end the business. Again, there are probably many, many instances where this is not the case.
It seems to me that neither Google nor any other search engine can reasonably deduce the motives of a new domain owner. I mean, there are some smart folks at Google, but I don't think clairvoyance has entered the algorithm yet. Consequently, it probably seems more reasonable to let expired domains retain some of their value, with the belief that most business owners will only buy domains relevant to their business, and that end users will cast their "votes" on how well these new owners use the real estate by either engaging or bouncing to view another site. Eventually, the algorithm will more or less accurately sift through the results and serve up results that visitors find engaging.
Sure, maybe it works for a year, two years, hell, even three years. So maybe this approach is viable, for now, for a website or a page that just seeks short-term benefits. But if what you are building is a business that you want to last, a brand that you want to matter to people 20, 50, or 100 years from now? Then I think there are far better uses of your time, effort, and resources.
-
Please take that sarcastic tone somewhere else, Keri. And I'm not asking for the algorithm.
I guess it's me who asked the question the wrong way, and I apologize for that. Let's take Google out of the picture completely for the most important question.
Is there ANYONE in the whole wide world who can in some way see that a domain has expired and then been bought again just seconds later? If yes, HOW?
The next question would then be why Google doesn't just reset the PR to 0 and "block" all the link value that the domain name had before it expired, because it's not very likely that the same owner buys the domain back after it has expired (the domain doesn't just drop immediately; it sits in a quarantine period for a few months). But as I said, don't ask yourself that yet; answer the first question.
Is there any technical yoda in here?
-
The only people who would know exactly how it works technically would be the people at Google who work on that section of the algorithm. They don't tend to hang out in forums and give away the inner workings of how things rank, and they are likely under many NDAs, so they couldn't say even if they wanted to.
-
Thanks for the answers but I'm afraid that doesn't answer the question. How does it work technically?
-
With questions like this, I tend to look at it not from Google's point of view but from a person's point of view. The spiders are getting smarter, after all, and Google always says to write content and create websites for people, not for the spiders.
So to answer your questions, you might want to ask yourself these questions:
- How am I supposed to know that the links on my website are broken, because a site I was linking to is now down?
- How do I know that the domain I am now visiting was down for a month or even a year?
- How do I know said blog is being used for black hat purposes? If it has relevant content and helps me, that's all that matters.
-
One reason it's difficult is that a domain may have expired simply because the owner forgot to renew it, and once it expired, the owner quickly renewed it. Should they begin from square one? Probably not, and that's why the link value isn't deleted (which may be the answer you're looking for). If a domain expires and no site goes back up, its value will eventually, gradually "disappear" (although perhaps never fully).
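As for the asker's first question, the re-registration itself is publicly observable without any help from the registrar: a domain's WHOIS record includes a "Creation Date" field, and that date resets when a fully dropped domain is registered again. Anyone who records the date on one lookup and compares it on a later one can flag a drop-and-catch. Here is a minimal sketch of the idea; the function names are illustrative, and the record format is simplified compared to real WHOIS output:

```python
from datetime import datetime

def parse_creation_date(whois_text: str) -> datetime:
    """Pull the 'Creation Date:' field out of a (simplified) WHOIS response."""
    for line in whois_text.splitlines():
        if line.strip().startswith("Creation Date:"):
            value = line.split(":", 1)[1].strip()
            return datetime.strptime(value, "%Y-%m-%d")
    raise ValueError("no Creation Date field found")

def was_reregistered(old_whois: str, new_whois: str) -> bool:
    """True if the creation date moved forward between two lookups,
    which implies the domain fully expired and was registered anew."""
    return parse_creation_date(new_whois) > parse_creation_date(old_whois)

# A crawl recorded the domain in 2005; a fresh lookup shows a 2014
# creation date, so the domain dropped and changed hands in between.
old = "Domain Name: EXAMPLE.COM\nCreation Date: 2005-03-01\n"
new = "Domain Name: EXAMPLE.COM\nCreation Date: 2014-06-12\n"
print(was_reregistered(old, new))  # True
```

Of course, seeing *that* a domain changed hands says nothing about *why*, which is exactly the ambiguity the earlier answer points out: a reset creation date can't distinguish a link-building play from a legitimate new business.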