Site Launching, Not SEO-Ready
-
Hi,
So, we have a site going up on Monday that, in many ways, hasn't been prepared for search. The focus has been on functionality and UX rather than SEO, which is fair enough.
As a result, I have a big list of things for the developer to complete after launch (like sorting out duplicate pages and replacing the titles that currently just read "undefined").
So, my question is: would it be better to noindex the site until the main issues are sorted, so that we essentially present search engines with the best version we can, or to let the site be indexed (duplicate pages and all) and fix these issues "live", as it were?
Is either approach clearly preferable, or is there another solution? I just want us to start ranking as well as possible, as quickly as possible, and I don't know which way to go.
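(For context, the "undefined" titles mentioned above are the sort of thing that's easy to flag with a short script. The helper below is just an illustrative sketch, not part of our actual build; the function name and the "problem" labels are my own invention, and it uses only the Python standard library.)

```python
from html.parser import HTMLParser


class TitleExtractor(HTMLParser):
    """Collects the text of the first <title> element in an HTML document."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = None

    def handle_starttag(self, tag, attrs):
        # Only track the first <title> we encounter.
        if tag == "title" and self.title is None:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title = (self.title or "") + data


def title_problem(html):
    """Return a short description of the title problem, or None if it looks OK."""
    parser = TitleExtractor()
    parser.feed(html)
    title = (parser.title or "").strip()
    if not title:
        return "missing title"
    if title.lower() == "undefined":
        return "placeholder title"
    return None
```

Running `title_problem` over each page's HTML (fetched however you like) gives the developer a concrete punch list instead of a vague "fix the titles" ticket.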
Thanks so much!
-
It seems the general consensus is to launch the "good enough" site without blocking Google, and to fix the SEO issues as soon as possible.
However, I'd say it really depends on what those SEO issues are. For example, if you're about to release thousands of non-canonical URLs into the SERPs without any fixes in place, it could take a long time to get them out of the index once they're fixed, especially on a new site with no deep external links. If waiting a couple of weeks before allowing the site to be indexed could save me from creating thousands of individual redirects (i.e. ones not easily handled by regular expressions), and could keep the site from launching with thousands of thin and near-duplicate pages, I would seriously consider blocking everything but the home page in the robots.txt file. Why start off on the wrong foot when you could start off in Google's good graces?
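(As an aside, the "regular expression" redirects mentioned here are the pattern-based kind. On Apache, for instance, a single rule can sweep a whole directory of moved URLs in one go; the paths below are purely hypothetical.)

```apache
# Send every URL under /old-products/ to the same slug under /products/
# with a permanent (301) redirect. $1 carries the captured path across.
RedirectMatch 301 ^/old-products/(.*)$ /products/$1
```

When the old and new URL structures don't map onto each other this cleanly, each redirect has to be written out individually, which is exactly the tedium being described above.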
You would want the home page to be indexed no matter what, because the launch will likely coincide with lots of press, advertising, etc., and people will be searching for your domain and/or brand. This lets the domain itself get indexed, which takes care of the date-of-indexation ranking factor discussed elsewhere in this thread (though in the grand scheme of things a few weeks is not going to matter), and lets you show up for a large proportion of your searches (i.e. brand and navigational queries), since you would be unlikely to rank for many big non-brand searches out of the box anyway.
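(To make the "everything but the home page" idea concrete: a robots.txt along these lines would do it. This is only a sketch, and two caveats apply: the `$` end-of-URL anchor in Allow/Disallow rules is supported by Google but not by every crawler, and robots.txt blocks crawling rather than indexing, so it is not a true substitute for a noindex directive.)

```text
User-agent: *
Disallow: /
Allow: /$
```

Here `Disallow: /` blocks everything, and `Allow: /$` carves out an exception for exactly the root URL; Google resolves the conflict in favour of the more specific rule.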
Then again, if you are just concerned with some small SEO issues, such as adding alt attributes or improving internal linking, I'd go ahead and launch.
-
The debate between UX and SEO has long been a concern within the internet marketing community. Years ago the two were treated as separate disciplines; over time the industry has realized they are not independent of one another and should work together.
That being said, I am always an advocate of launching a website as soon as it is ready. Of course, that assumes the duplicate content, low-quality links, and black-hat SEO tactics have already been removed; any of these can have a negative impact on site performance and should be cleaned up wherever possible.
As mentioned elsewhere in this thread, how long the website has been up can influence rankings, along with other factors you can start earning credit for by not postponing the launch. In addition, SEO is a continuous effort that is never completely done, so I would recommend launching the website and then implementing your changes.
-
I would not "noindex" the site.
Once you do that, Google may crawl less often, and you might have to wait a while for the noindex to be undone after you remove it, especially on a new site with very low PageRank.
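(For reference, the noindex being discussed here is the page-level robots meta tag, or the equivalent `X-Robots-Tag` HTTP response header for non-HTML files; both are standard directives, shown below purely as illustration.)

```html
<!-- In the <head> of each page that should stay out of the index: -->
<meta name="robots" content="noindex">
```

The header form, sent alongside the response, is `X-Robots-Tag: noindex`. Either way, the directive only works if crawlers can actually fetch the page, which is why combining noindex with a robots.txt block is counterproductive.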
-
I thought this was an interesting question. I have a lot of admiration for one particular guy who knows a lot about launching a Website before it's perfect. His company's motto is "Done is better than perfect."
He's Mark Zuckerberg.
Yeah. I'd launch it and then make gosh darn sure you follow up and clean up after the explosion.
-
Hi,
Unless the SEO issues you are talking about are very serious, I would rather let search engines index the website from the start, to gain time. History is a factor in SEO, and a new website may take time to get noticed by search engines.
What I mean is that Google gives positive weight to a website that has been out there for a longer time, compared with a brand-new one. Moreover, if you implement Google Analytics from the start, you can begin optimizing with some data already in hand (versus having no data at all when you start).
The only strong case for keeping a website out of the index is if you think people should not see it yet, which does not seem to be your situation.
SEO is a process and a game of adaptation.
Wish you good luck.
-
Since I'd guess you're only talking about a matter of days or a few weeks, I really don't think it matters, so I would lean towards getting it indexed as early as possible and dealing with the SEO once the site is "live".
-
Thanks guys, I appreciate it. I didn't even consider that Google would evaluate a site with a noindex, just not display it.
If that's the case, it seems best to rank lowly at first and let the engines crawl when they will, picking up the changes we implement over the coming weeks. As you say, it would make no difference to how the site is viewed at the point we removed the noindex (unless the gaps between crawls were massive!), but we would lose out on potential traffic by staying hidden in the meantime.
-
I could be wrong on this, but I have always thought of noindex as meaning "don't display". I have never actually tested it, but I would be willing to bet that Google crawls and rates your site even with a noindex tag; the only difference is that it is not displayed in the SERPs.
If I were you, I would leave the noindex tag out and just get things squared away after launch. In my opinion, as Google keeps crawling the site it will see that the content has changed, which will help you more in the long run than a noindex tag would. You might rank low at first, but as you make the SEO changes your rankings should go up. In my mind, it is better to rank low at first than not to rank at all.
-
Hey Philip,
Hope you are well...
I would focus on getting the site up and ready, removing the duplicate content and so on, then have Google index your site through Google Webmaster Tools (GWT).
Hope this helps
Dave