Multiple Versions of Mobile Site
-
Hey Guys,
We have recently finished the latest version of our mobile site, which means we currently have two mobile sites. Which site you are presented with depends on your device and OS.
e.g.
iPhone 3 or 4 users on iOS4 will get version 1 of our mobile site
iPhone 5 users on iOS5 will get the new version (version 2) of our mobile site.
Our old mobile site is currently indexed in Google and performing pretty well.
Since the launch of the second mobile site we have not seen any major changes to our visibility in Google. My main concern here is duplicate content, so I am curious: can Google detect that we have two mobile sites served depending on device? And if Google can detect this, why have our sites not been penalized?
Thanks,
LW
I know the first thing that comes to your mind is duplicate content.
-
Hi LW,
Sorry for the extreme delay here - the Q&A notification system went wonky for a bit and I never got the response message for this thread.
I'm sure you're past this issue by now, but yes - Googlebot Mobile should just index the mobile version of the page.
Best,
Mike
-
Hey Mike,
Thanks for your feedback, it is really helpful.
We are serving up unique source code on the same URL per device, with the user agent being detected on the server-side.
Am I right in assuming that Googlebot Mobile will only see one version of the pages and index accordingly?
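For illustration, here's a minimal, framework-free sketch of the kind of server-side user-agent detection described above. The template names and user-agent substrings are hypothetical, since the thread doesn't specify the actual stack:

```python
def pick_template(user_agent: str) -> str:
    """Return which template to serve for a given User-Agent string."""
    ua = user_agent.lower()
    if "iphone os 4" in ua:
        return "mobile_v1.html"   # e.g. iPhone 3/4 on iOS 4 -> version 1
    if "iphone" in ua or "mobile" in ua:
        return "mobile_v2.html"   # newer mobile devices -> version 2
    return "desktop.html"         # everything else -> the desktop site

def respond(user_agent: str) -> dict:
    """Serve different markup on the same URL depending on device."""
    return {
        "body": pick_template(user_agent),
        # Vary tells caches (and crawlers) that the response body
        # differs by User-Agent, so variants are cached separately.
        "headers": {"Vary": "User-Agent"},
    }
```

Sending a `Vary: User-Agent` header alongside dynamically served content is the generally recommended way to signal that the same URL returns device-specific markup.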
Cheers,
LW
-
Hi LW,
I'm wondering about some particulars of your setup for this.
How are URLs handled between the three sites (1 desktop, 2 mobile)?
Are you serving up unique source code on the same URL per device, or do you have device-specific URLs for all content?
What are you using to detect the useragent and redirect the user? Is this happening server-side, or with JavaScript?
The particulars of your setup will determine your best approach. When in doubt I would follow the instructions on this page.
I would not expect two mobile versions of your site to cause a duplicate content issue - more likely that Googlebot Mobile will only see one version of the pages and index those (but as above, the technical particulars will determine this).
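One way to see why: Google's mobile crawler announces itself in its User-Agent string, so any server-side detection that routes phones will route the crawler to exactly one variant. A rough sketch (the UA substrings here are illustrative, not guaranteed to match Google's current crawler tokens):

```python
def is_mobile_crawler(user_agent: str) -> bool:
    """Heuristic check for a mobile search crawler.

    Googlebot's smartphone crawler typically includes both a device
    token (e.g. "iPhone") and its own "Googlebot" token in the UA.
    """
    ua = user_agent.lower()
    return "googlebot" in ua and ("iphone" in ua or "mobile" in ua)
```

A crawler hitting the server with a phone-like UA would therefore be served, and index, only the variant mapped to that device class.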
Best,
Mike
-
Thanks for your response. You raise a very valid point about the time it takes Google to index a new site. The new site has been live for a couple of weeks now, so I was hoping it would have started to be indexed by Google by now!
In regards to rel="canonical": yes, we have implemented it on the mobile site, referencing the desktop site.
The reason for developing a new version rather than just updating the previous one was that we had new functionality to include and a fair few design changes based on learnings from the old site. That said, code from the first site was still being used, so it wasn't a completely new build.
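For reference, the pattern described pairs an annotation on each version; the URLs below are placeholders:

```html
<!-- On the desktop page: point to the mobile equivalent -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/page" />

<!-- On the mobile page: canonical back to the desktop URL -->
<link rel="canonical" href="http://www.example.com/page" />
```

This bidirectional annotation is Google's documented setup for separate mobile URLs; with dynamic serving on a single URL, the `Vary: User-Agent` header is the signal instead.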
-
If you have only just launched the new version of your mobile site, it may take some time before Google indexes it and detects any duplicate content with your previous version. Googlebot doesn't crawl all new sites instantly.
Just wondering, have you done anything to prevent a duplicate content penalty, such as using a rel="canonical" tag? Also, why not update your previous version instead of creating an entirely different mobile site?