Converse.com - Flash and HTML versions of the site... bad idea?
-
I have a question regarding Converse.com. I realize this ecommerce site needs a lot of SEO help; there's plenty of obvious low-hanging SEO fruit. At a high level, though, I see a very large SEO issue with the site architecture.
The site is a full-page Flash experience that uses a # in the URL, so the search engines see pretty much every Flash page as the home page. To help with this issue, an HTML version of the site was created, which is what Google crawls. For example:
Home page - Converse.com
Marimekko category page (Flash version):
http://www.converse.com/#/products/featured/marimekko
Marimekko category page (HTML version; you need Flash disabled to see it):
http://www.converse.com/products/featured/marimekko
Here is an example of the issue. This site has a great post featuring the Helen Marimekko shoes:
http://www.coolmompicks.com/2011/03/finnish_foot_prints.php
The post links to the Flash Marimekko category page (http://www.converse.com/#/products/featured/marimekko), as I would expect (ninety-something percent of visitors to converse.com have the required Flash plug-in). So the Flash page is getting the link juice, but the Flash page is invisible to Google.
When I search for "converse marimekko" in Google, the Marimekko landing page is not in the top 500 results. So I then searched for "converse.com marimekko" and see the HTML version of the landing page listed as the 4th organic result. When I click the link I get redirected to the Flash Marimekko category page, but if I do not have Flash I go to the HTML category page.
-----
Marimekko - Converse
All Star Marimekko Price: $85, Jack Purcell Helen Marimekko Price: $75 ...
www.converse.com/products/featured/marimekko - Cached
So my questions are…
1. Is Converse skating on thin SEO ice by having both an HTML and a Flash version of their site/product pages?
2. Do you think it's a huge drag on SEO rankings to have a large percentage of backlinks pointing at Flash pages when Google is crawling the HTML pages?
3. Any recommendations on what to do about this?
Thanks,
SEOsurfer
-
Tom,
Thank you for taking the time to look at the site and give a detailed response. I've been doing some research myself and my findings mirror your assessment. Thank you for the recommended action items, too. Converse uses http://www.asual.com/swfaddress/, which makes for a good site experience but, as you pointed out, is not so hot for SEO.
--SEOsurfer
-
Great question!
Firstly - unfortunately, Steve's suggestion isn't going to be viable for you. The # portion of the URL is never sent to the server, so it is not available to your code server-side, and you won't be able to determine where the rel=canonical should point.
Furthermore, if they are committed to keeping the Flash site for now, as a single unit under one URL (the homepage), then you are going to have to accept that some juice intended for subpages is going to go to the homepage. You cannot do anything about that aspect, so focus on the rest of the problem. Whilst far from ideal, at least the juice is hitting the site somehow.
So… what to do?
Firstly, I'd start getting into the mindset of thinking of the HTML site as the main/canonical site, and the Flash site as the 'enhanced experience' version. This way, the HTML version is the version that should be crawled by Google, and the version that should be linked to.
Actions:
1. Set up detection for mobile user agents (out of preference I'd say all of them, but at least those known not to support Flash, such as iPhone/iPad) and for search engine bots, and ensure they get served the HTML version. Currently your homepage requires iPad users to click through an impossible Flash download; why not serve them the HTML page off the bat?
Is this cloaking? No! The HTML version is the main version, remember? It's no more cloaking than if you detected the user agent and then chose to serve the Flash version to Googlebot.
I actually discussed this with Jane Copeland at the fantastic Distilled link building event a couple of weeks back, and she agreed with me and said if it would stand up to a manual inspection then it is the right course of action.
2. Get all links in articles, press releases, directories or whatever else that point to specific pages and originate in-house (or from any source you control) to link to the HTML pages.
3. If the user arrives via an HTML link and has Flash, you can now redirect them to the Flash URL for that page so they get the 'enhanced experience'. Don't use a 301 redirect -- remember, the HTML version is the main version!
4. If the user arrives via a Flash link but doesn't have Flash, yet does have JavaScript, you can detect the # portion of the URL and redirect them to the HTML page to help them along.
5. Educate the relevant stakeholders regarding point 2. I see you have a 'flashmode=0' option; tell them about it and how to use it to get the URLs they need.
So where does this leave us?
- The search engines can crawl all your lovely content, and they can ignore the Flash version completely.
- You are getting inbound links to specific pages. These pages have their own titles and meta descriptions… and content! Because they are the real site!
- Users with Flash arriving via these links land on the correct Flash page of the site and experience the rich site you want them to.
- Users arriving without Flash get the correct page if they arrive via an HTML URL. If they arrive via a Flash URL, they get the correct page if they have JavaScript on (e.g. iPad users), or they fall back to the homepage (rare).
I had a client with an almost identical situation, and I rolled out an almost identical solution to this, and they got crawled very quickly, shot up in Google and have stayed there for months.
Hope it helps. Let us know how you get on!
-
It's definitely a drag to have your link equity diluted between two versions of the site. There are a few solutions you could use, but the easiest would probably be to add a rel=canonical tag to the Flash version pointing back to the same or similar page on the HTML site. That way, the engines know that the version you want indexed is the HTML version.