Need for a modified meta description on every page for paginated content?
-
I'm currently working on a site whose URL structure is something like www.domain.com/category?page=4, with ~15 results per page.
The pages all canonical to www.domain.com/category, with rel="next" and rel="prev" pointing to www.domain.com/category?page=5 and www.domain.com/category?page=3.
Webmaster Tools flags these all as duplicate meta descriptions, so I wondered if there is value in appending the page number to the end of the description (as we have with the title for the same reason), or if I am using a sub-optimal URL structure.
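For reference, the head of page 4 under that setup would look something like the following (the domain and category path are just the placeholder examples above):

```html
<!-- <head> of www.domain.com/category?page=4 -->
<link rel="canonical" href="http://www.domain.com/category">
<link rel="prev" href="http://www.domain.com/category?page=3">
<link rel="next" href="http://www.domain.com/category?page=5">
```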
Any advice?
-
We don't have a view-all page (we found it so slow, so long, and with so many links that we saw a notable improvement in rankings in general when switching to the quicker paginated versions). And other than the first page, none of the other pages are currently in our sitemap.
I'm not entirely sure how that would stop GWT flagging them as duplicate metas, though, unless you mean to also noindex them.
-
Do you have "View All" as an option for your paginated pages? If not, you might consider it, and then just include the "View all" version of the page in your site map. Just a thought...
-
That scale of unique descriptions is well beyond our capacity. We're actually considering dropping the number of items per page, too.
Thanks for the help.
-
Could "ignore" cause any problems (such as pages that should/shouldn't be indexed)? I was rather surprised to discover that using canonical wasn't enough.
-
I believe appending the page number, for example "(Page 3 of 5)", to the end of the meta description would suffice from the perspective of SEOmoz's crawler or GWT; however, the best option would be the ability to create completely unique meta descriptions.
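As a rough sketch, a template helper for that suffix might look like this (the function name and inputs are illustrative, not from any particular CMS):

```python
def paginated_description(base_description: str, page: int, total_pages: int) -> str:
    """Append a "(Page X of Y)" suffix so each paginated page's
    meta description is at least minimally distinct."""
    if page <= 1 or total_pages <= 1:
        # The first (canonical) page keeps the hand-written description as-is.
        return base_description
    return f"{base_description} (Page {page} of {total_pages})"
```

The same pattern works for the title tag, which is presumably what you're already doing there.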
-
It sounds like you have canonical and rel next/prev set up correctly, so you shouldn't worry about duplicate meta descriptions. You could add "page" as a URL parameter to ignore in WMT; it will then ignore those pages, and you won't get any duplicate-meta errors on them.
Hope this helps,