Do I need to verify my site in Webmaster Tools both with and without the "www." at the start?
-
As per title, is it necessary to verify a site in Webmaster Tools twice, with and without the "www"?
I only ask as I'm about to submit a disavow request, and have just read this:
NB: Make sure you verify both the http://website.com and http://www.website.com versions of your site and submit the links disavow file for each. Google has said that they view these as completely different sites so it's important not to forget this step. (here)
Is there anything in this? It strikes me as more than a bit odd that you need to submit a site twice.
-
Yes, that's generally considered the correct way of doing it if you have links to both www and non-www.
-
Thanks, Chris. Do I need to tailor the disavow request for each version? i.e. only links that point to our domain with the www subdomain in one list, and non-www links in a separate list?
-
In GWT, if you tell Google that you have a preferred domain (www or non-www), it will require you to verify both versions. You're not able to submit a disavow file for a domain that you have not verified, so if you have links pointing to the non-www version, you need to be verified for that version before you can submit your disavow file for it--and vice versa.
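Because each version is verified separately, you upload a separate disavow file for each. The format below is Google's documented disavow file format (a plain-text file, one URL or `domain:` entry per line, `#` for comments); the domains shown are placeholders, not real examples from this thread:

```text
# Disavow file for the www version (http://www.website.com)
# A "domain:" line disavows every link from that domain
domain:spammy-directory.example
# A single bad URL can also be listed on its own line
http://link-farm.example/widgets/page1.html
```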
Related Questions
-
"Yet-to-be-translated" Duplicate Content: is rel='canonical' the answer?
Hi All, We have a partially internationalized site: some pages are translated while others have yet to be translated. Right now, when a page has not yet been translated, we publish an English-language page at the URL https://our-website/:language/page-name and add a bar to the top of the page that simply says "Sorry, this page has not yet been translated". This is best for our users, but unfortunately it creates duplicate content, as we re-publish our English-language content a second time under a different URL. When we have untranslated (i.e. duplicate) content, I believe the best thing we can do is add a rel='canonical' tag which points to the English page. However, here's my concern: someday we _will_ translate/localize these pages, and therefore someday these URLs will _not_ have duplicate content. I'm concerned that after a long time of having rel='canonical' on these URLs, if we suddenly change this, these recently translated pages that no longer point to a canonical English version will not be indexed properly. Is this a valid concern?
Technical SEO | VectrLabs
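For reference, the tag in question on an untranslated page would point back at the English version of the content, along these lines (the href is illustrative only, following the placeholder URL scheme in the question, and "en" is an assumed language code):

```html
<link rel="canonical" href="https://our-website/en/page-name" />
```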
Domain Structure - without www.
I'm working on a new project and we would prefer not to use the www - for naming/branding reasons. Are there any SEO ramifications to setting up the domain without the www and using 301 redirects to forward all home page variations to domain.com (without the www)? Furthermore, we will be hosting many profiles on this site and would like to structure them for optimal SEO. Would there be an issue with using subdomains - user.domain.com - or would subdirectories be more optimal? Thank you in advance!
Technical SEO | NickMacario
Many "spin-off" sites - 301 or 404/410?
Hi there, I've just started a new job with a rental car company with locations all over New Zealand and Australia. I've discovered that we have several websites along the lines of "rentalcarsnewzealand", "bigsaverentals" etc that are all essentially clones of our primary site. I'm assuming that these were set up as some sort of "interesting" SEO attempt. I want to get rid of them, as they create customer experience issues and they're not getting a hell of a lot of traffic (or driving bookings) anyway. I was going to just 301 them all to our homepage - is this the right approach? Several of the sites are indexed by Google and they've been linked up to a number of sites - the 301 move wouldn't be to try to derive any linkjuice or anything of that nature, but simply to get people to our main site if they do find themselves clicking a link to one of those sites. Thanks very much for your advice! Nicole
Technical SEO | AceRentalCars
Which to redirect to, www or non-www?
Please help. I have a client whose site has duplicate content issues, due mainly to the www/non-www problem. The Moz page authority of both is the same, the www version has more linking root domains, and for most of the keywords the www version shows up in Google. The developer indicated the site is on the non-www version. I need to do a .htaccess 301 redirect to eliminate the duplicate content problems. Which do I redirect to? Any help is greatly appreciated. Thanks!
Technical SEO | SteveFaber
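For what it's worth, the non-www-to-www direction would look like this in .htaccess - a minimal sketch assuming Apache with mod_rewrite enabled, with example.com standing in for the client's domain (swap the condition and target to redirect the other way):

```apache
RewriteEngine On
# If the host has no "www.", 301-redirect to the www version, preserving the path
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```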
Duplicate Content based on www.www
In trying to knock down the most common errors on our site, we've noticed we do have an issue with duplicate content; however, most of the duplicate content errors are due to our site being indexed with www.www and not just www. I am perplexed as to how this is happening. Searching through IIS, I see nothing that would be causing this, and we have no hostname records set up that are www.www. Does anyone know of any other things that may cause this and how we can go about remedying it?
Technical SEO | CredA
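On IIS, besides tracking down whatever binding or wildcard DNS record answers for the www.www host, one way to contain the damage is a web.config rule that 301s any www.www request back to www. This is a sketch that assumes the IIS URL Rewrite module is installed; the rule name is arbitrary:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Collapse www.www.host back to www.host with a permanent redirect -->
      <rule name="Collapse www.www" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^www\.www\.(.+)$" />
        </conditions>
        <action type="Redirect" url="http://www.{C:1}/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```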
NoIndex/NoFollow pages showing up when doing a Google search using "Site:" parameter
We recently launched a beta version of our new website in a subdomain of our existing site. The existing site is www.fonts.com, with the beta living at new.fonts.com. We do not want Google to crawl the new site until it's out of beta, so we have added a noindex, nofollow robots meta tag on all pages. However, one of our team members noticed that Google is displaying results from new.fonts.com when doing a "site:new.fonts.com" search. Is it possible that Google is indexing the content despite the noindex, nofollow tags? We have double-checked the syntax and it seems correct, except the trailing "/". I know Google still crawls noindexed pages; however, the fact that they're showing up in search results using the site search syntax is unsettling. Any thoughts would be appreciated!
Technical SEO | ChrisRoberts-MTI
Top pages give " page not found"
A lot of my top pages point to images in a gallery on my site. When I click on the url under the name of the jpg file I get an error page not found. For instance this link: http://www.fastingfotografie.nl/architectuur-landschap/single-gallery/10162327 Is this a problem? Thanks. Thomas. JkLej.png
Technical SEO | | thomasfasting0 -
TLD - ".com.br" vs ".com": which to use?
Hello, I'm starting SEO work on a site that has the domains "www.dominiodocliente.com" and "www.dominiodocliente.com.br". The problem is that the ".com" domain already has a low rank for the chosen keywords, while the ".com.br" domain has no rank at all. On the other hand, the ".com" domain has 224 results in Google, while the ".com.br" domain has 1,970 results. My question is: which domain should I focus the SEO work on? Thanks
Technical SEO | eder.machado