Is white text on a white background an issue when...?
-
Hi guys,
This question was loosely answered here (http://www.seomoz.org/q/will-google-index-a-site-with-white-text-will-it-give-it-bad-ratings), but I wanted to elaborate on the concern.
The issue I have is this:
http://www.searchenginexperts.com.au/preview/white-text-white-background-issue
Of the four div elements on the page:
-
which is best practice for SEO? and
-
which of them would not be penalized by Google on the grounds of hidden text?
The reason I ask is that I have a site currently using the first div's styling, but if you either remove the image or uncheck the repeat-x rule (in the browser inspector), the text is left white on white.
I have added the transparent image on green to prove that having a background colour to back up the tiled image is not always going to work. What can be done in this scenario?
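To make the scenario concrete, here is a rough sketch of the styling under discussion (class names and image paths are hypothetical, not taken from the test page). The first rule reproduces the fragile case: the text is only legible while the tiled image loads. The second adds the usual safeguard of a solid colour in the same shorthand.

```css
/* Hypothetical reconstruction of the first styling approach:
   white text over a tiled background image with no colour fallback.
   If green-tile.png fails to load (or repeat-x is disabled),
   the result is white text on the white page background. */
.div-one {
  color: #fff;
  background: url("green-tile.png") repeat-x;
}

/* Safer variant: declare a solid fallback colour in the same
   shorthand. The image paints over the colour when available,
   and the text stays readable without it. */
.div-one-safe {
  color: #fff;
  background: #3a7d3a url("green-tile.png") repeat-x;
}
```

As the question notes, though, a fallback colour only helps where the image is opaque; it does nothing for transparent regions of the tile.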
Thanks in advance,
Dan (From my managers account)
-
-
Yes, Dan, something like that could get reported. You should do your best not to let this happen, especially on a large scale; a single incident would likely be ignored.
-
Thx Gents,
To clarify, the content in question was footer links on my client's site.
It sounds like the consensus is that the approaches in my example should be fine, since my intention is not to deceive, and it would only be flagged manually by visitors (most likely competitors) if it were.
What remains unanswered is how to handle the last two examples on my test page, which still create issues.
The third example inadvertently has a transparent section of the background image where text sits; you can see this if you click and drag over the middle section. I would imagine visitors would flag this as hidden text (it currently shows white text on white). Aside from giving either the div element or the entire site a complementary background colour (say, a pastel), is there a better way to manage this than the fourth example, where I simply offer a fallback green colour? That looks pretty bad.
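One option for the transparent-tile case, rather than a jarring solid fallback on the whole div, is to give the text element itself its own contrast. This is a sketch with hypothetical selectors, not code from the test page:

```css
/* Back the text, not the container: a semi-opaque dark panel
   behind the copy keeps white text readable over any image,
   including fully transparent regions of the tile. */
.div-three p {
  color: #fff;
  background: rgba(0, 0, 0, 0.5); /* image still shows through */
  padding: 0.5em 1em;
}

/* Alternatively, a text-shadow gives the glyphs their own
   contrast edge without adding a visible panel. */
.div-three-alt p {
  color: #fff;
  text-shadow: 0 1px 3px rgba(0, 0, 0, 0.8);
}
```

Either way the text remains legible regardless of what the background image does, which addresses both the usability problem and the hidden-text appearance.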
Thanks again...
Dan
-
Hey Dan
Ultimately, I don't think this would be a problem on an otherwise non-spammy site. There is generally a big difference between a site that is using a set of spammy or manipulative techniques and one that makes a simple mistake like this, so I doubt you have much to worry about if everything else is as it should be.
That said, I guess the simple question here is:
If you are using a background image and white text, why not use a background colour as well?
This would address the obvious usability issues relating to the image not displaying and make clear that there is no intention to trick anyone. Better for users, better for search engines, better for your SEO-penalty-related anxiety.
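In shorthand, that suggestion amounts to declaring a colour alongside the image, ideally one sampled from the dominant tone of the tile so that the no-image state still looks close to the intended design (selector and values here are placeholders):

```css
.footer-links {
  color: #fff;
  /* Fallback colour first, then the image painted over it.
     Pick a colour close to the tile's dominant tone so the
     degraded state looks intentional rather than broken. */
  background: #4a8c4a url("footer-tile.png") repeat-x;
}
```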
Hope that helps.
Marcus
-
Dan, the rule of thumb is: if the text is readable and not purposely hidden, then you're safe. The operative word there is purposely.
I will also add that, in general, crawlers are not going to find these types of problems; rather, they are reported by users or, more often than not, your competition. From there, search engines may have a human evaluate the report and make a manual ruling.
-
OK, the thing is: if text is readable by humans, you are safe. If you are using white text and something goes wrong with the styling so the text goes invisible for a few days, that will not necessarily get your website banned. However, I am assuming here that you are not stuffing keywords in there.