How can I provide titles and descriptive text for our list of USPs on the same page, optimized for both usability and SEO?
-
I am rebuilding our website together with an agency and I am stuck with the following problem:
We have a page that should give the visitor a quick and convincing impression of why they should choose our company. On this page we want to show our USPs (Unique Selling Points), each with a title and a short description. My preferred way of presenting the USPs would be a list of the titles (which lets you see all USPs without having to read a lot of text), where each title can be clicked to expand its description, in case you want to know more about that specific USP. If you then click another title, the previously expanded description collapses and the new one expands, and so on (similar to this page: http://www.berlin-city-immobilien.de/38.html, specifically the list in the middle of the page under the headline "Dabei profitieren Sie von folgenden Vorteilen"). Since I also want to use these descriptions as on-page SEO text, I checked whether Google might not index "click to expand" content, or at least value it less than plain text in the body of the page, and I stumbled over this article: https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html. According to this article, Google will definitely discount the descriptions on my page.
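For reference, the expand/collapse list described above can be sketched with native HTML details/summary elements. The titles and text here are placeholders, and the shared name attribute (which makes the group exclusive, so opening one entry closes the previously open one) is a relatively recent HTML feature that older browsers simply ignore, leaving a list where several entries can be open at once:

```html
<!-- Each USP is a collapsed entry; clicking the title expands its description.
     The shared name="usp" makes the group exclusive in supporting browsers. -->
<details name="usp">
  <summary>USP 1 title</summary>
  <p>Short description of USP 1 ...</p>
</details>
<details name="usp">
  <summary>USP 2 title</summary>
  <p>Short description of USP 2 ...</p>
</details>
```

Note that with this markup the descriptions are ordinary text in the HTML source, even while visually collapsed.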
Does anyone have an idea how to solve this problem? Either by suggesting a different way to show the titles and descriptions on the page, or by suggesting a workaround so that Google will not treat the descriptions as "click to expand" text.
Thank you in advance for your input.
Ben -
First of all, thank you both for taking the time to answer my question.
@Russ
I was also hesitant about displaying the text first and then collapsing it with some JS, but I read somewhere that Google is, or will be, analyzing JS, and of course that could lead to a penalty, if not now then at some point in the future. So I think I will follow your advice and stick with your first suggestion.
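For reference, the "output the text first, then collapse it with JS" idea being discussed would look roughly like this (the class name is illustrative); the description is ordinary text in the HTML source and is only hidden once the script runs:

```html
<div class="usp-description">Full description text ...</div>
<script>
  // Collapse the descriptions only after the page has loaded;
  // without JavaScript the text simply stays visible.
  window.addEventListener('load', function () {
    document.querySelectorAll('.usp-description').forEach(function (el) {
      el.style.display = 'none'; // a click handler (not shown) re-expands it
    });
  });
</script>
```

Whether a crawler that executes JavaScript still treats this as visible text is exactly the uncertainty discussed in this thread, so this is a sketch of the idea, not an endorsement of it.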
As to your first suggestion: in this case the user has to click more, so there is a slight usability limitation, but I guess to some extent I have to accept a compromise. Do you think it is a problem if content (in this case the headline and teaser) is repeated on the same page?
@Dimitrii
Well, what Matt is saying is that they won't count it as spam and penalize the website. But he doesn't say anything about how click-to-expand content is weighted.
The solution with separate pages will not work in my case, as I need all the descriptions on one page for SEO, and it is also a slight usability limitation because the user has to keep switching between pages.
-
Hi there.
Well, the same article you are referring to also contains this text:
"Amazon used to use a lot of tabs but now they seem to output most of the content directly on the page, making the user scroll and scroll to see the content. Google's own help documents do use click to expand, but only to see the questions."
Also there was this video from Matt: https://www.youtube.com/watch?v=UpK1VGJN4XY
I understand that a lot of this content contradicts itself, but I'd look at this problem like this: it's not a secret at all that Google puts (or at least states that it puts) user experience first. So, look at your page and ask whether users would be happy after they land on it, whether everything makes sense from the user's point of view, and whether the "expand" buttons are large enough and clearly convey that clicking them will expand the content.
Also, as Matt said: are there 8 pages of content hidden and only displayed after you click "expand", ruining your day?
I believe that as long as it looks good, makes sense to the user, and is good content, there shouldn't be any problems. The only workaround I see is, instead of expandable content, to simply link to other pages. I've seen both scenarios work.
Hope this helps.
-
This is a question that is getting a lot more attention lately. You have two choices...
1. Accept the reality that Google doesn't want to rank you for content that is hidden...
In this case, I would recommend starting with the list of your USPs at the top, maybe each with one sentence below it (like a headline and a tagline). Below that, repeat the headlines, each with a much longer description. Make the first listings link to the anchored headlines below, so that clicking the first USP takes you to its full description further down the page. Then use a "return to top" anchor to bring the visitor back to the list. This allows you to get your USPs front and center and still get the content on the page.
2. Or try to get around it.
Start with the content showing and then hide it with some JS event like a scroll, mouseover, timed event, etc.
In the end, I would recommend finding a way to accomplish #1 so you don't have to worry about losing ill-gotten gains by tricking Google.
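A minimal sketch of suggestion #1 in plain HTML (the ids, headlines, and taglines are placeholders, not the asker's actual USPs):

```html
<!-- Compact list of USP headlines at the top; each links to the full text below. -->
<ul id="top">
  <li><a href="#usp-speed">Fast delivery</a> - one-sentence tagline</li>
  <li><a href="#usp-service">Personal service</a> - one-sentence tagline</li>
</ul>

<!-- Full descriptions further down the page, anchored by id. -->
<h2 id="usp-speed">Fast delivery</h2>
<p>Longer description of this USP ...</p>
<p><a href="#top">Return to top</a></p>

<h2 id="usp-service">Personal service</h2>
<p>Longer description of this USP ...</p>
<p><a href="#top">Return to top</a></p>
```

Because all of the text sits directly in the HTML body, nothing is hidden from crawlers, and the anchor links preserve the "short list first" experience the asker wants.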