Implementing Schema.org on a web page
-
Hi all,
As we know, implementing Schema markup doesn't change the look & feel of a web page for users.
So here is my question: could we implement Schema markup on web pages only for bots (not visible to users in the source code), so that page load time doesn't increase?
-
Hello Anirbon,
You never want to show Google one thing in the code and show everyone else something different. That is the very definition of cloaking.
Have you looked into using JSON-LD instead of inline Schema markup? Builtvisible has a great article on microdata that includes a section about JSON-LD, which allows you to mark up content in a script instead of wrapping the HTML.
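For illustration, a JSON-LD block sits in the page head or body as a single script tag, with no changes to the visible HTML (all the values below are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Example City"
  }
}
</script>
```

Because the whole block lives in one script element, it is usually easier to generate and maintain than wrapping itemprop attributes around existing HTML.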
-
Hi,
I am not saying that Schema is bad or that you shouldn't use it - it just seems that some big players only use Schema on the detail pages of an individual product, not on the overview pages. I found an example of a site using it - but in the SERPs it's only the average rating which appears in the first result (example: http://www.goodreads.com/author/list/7779.Arthur_C_Clarke).
You can always test what the impact will be - as mentioned before, I guess even for 50 elements fully tagged with Schema the impact on page speed will be minimal. Check your current pages with webpagetest.org and look at the breakdown of load time. The HTML will probably only account for 10-20% of the load time - the rest being images, JavaScript & CSS files. Adding a few hundred lines of HTML will not fundamentally change this (text can be compressed quite well).
rgds
Dirk
-
Hi,
But using Schema to provide well-structured data will help bots understand what type of content/information is present on a page, and I think that will definitely help a page rank better in Google search, whether it's an SRP (search result page) or a JD (job detail page).
Regards,
Anirban
-
Hi,
I am not sure that adding schema.org markup on a result page adds a lot of value. If you send 50 different blocks of structured data, how should search engines understand which piece is relevant to show in the SERPs? I just did a check on 2 different sites (allrecipes.com & monster.com) - they only seem to use the Schema markup on the detail pages, not on the result pages.
If you would like to go ahead, you could always measure the impact by creating two (static) versions of a search result page - one with and one without markup - and testing both versions with webpagetest.org and Google PageSpeed Insights. An alternative would be to use "lazy loading": first load the first x results (the visible part on screen), then load the next batch when the user scrolls, and so on. This way, the impact on loading times would remain minimal.
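A rough sketch of that lazy-loading idea, assuming a hypothetical /search-results endpoint that returns the next batch of listings as an HTML fragment:

```html
<div id="results"><!-- first batch of results rendered server-side --></div>
<div id="sentinel"></div>
<script>
// When the sentinel element scrolls into view, fetch the next batch
// of results and append it. "/search-results?page=..." is illustrative.
let page = 2;
new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting) return;
  const resp = await fetch('/search-results?page=' + page++);
  document.getElementById('results')
    .insertAdjacentHTML('beforeend', await resp.text());
}).observe(document.getElementById('sentinel'));
</script>
```

Note that content loaded this way may not be seen by all crawlers, so the first server-rendered batch should carry whatever markup matters most.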
In any case, I would not try to show different pages to users & bots.
rgds,
Dirk
-
Hello Dirk,
Thanks for the reply.
Agreed that the impact of adding a few extra lines of schema.org code on the load time of the pages will be near zero. But it totally depends on what content you are going to show on a page.
I want to implement schema.org on the search result pages, where a single page contains more than 50 listings with different information like job title, company name, skills, date posted, etc. For each listing I will have to use different properties as recommended by Google, so the load time of the page will definitely increase.
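For reference, one such listing tagged with JobPosting microdata might look roughly like this (all values are placeholders); the overhead in question is this block repeated 50+ times:

```html
<div itemscope itemtype="https://schema.org/JobPosting">
  <h2 itemprop="title">Example Job Title</h2>
  <span itemprop="hiringOrganization" itemscope
        itemtype="https://schema.org/Organization">
    <span itemprop="name">Example Company</span>
  </span>
  <span itemprop="skills">Example skill, another skill</span>
  <!-- machine-readable date, invisible to users -->
  <meta itemprop="datePosted" content="2015-01-01">
</div>
```

The added attributes are a small fraction of the listing's existing HTML, and being highly repetitive they compress very well over gzip.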
Please let me know your thoughts on the above case.
Thanks
-
Try adding Schema with meta tags in the HTML, for example:
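(A sketch of the kind of markup meant here; the phone number, coordinates, and surrounding itemscope are illustrative:)

```html
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <!-- meta tags carry itemprop values without rendering anything -->
  <meta itemprop="telephone" content="+1-555-0100">
  <div itemprop="geo" itemscope itemtype="https://schema.org/GeoCoordinates">
    <meta itemprop="latitude" content="40.75">
    <meta itemprop="longitude" content="-73.98">
  </div>
</div>
```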
This way you're telling bots your phone number with Schema, but it doesn't appear visibly to users. This is commonly done with the latitude and longitude schema tags, but you can use it for other properties as well. I wouldn't rely on this as a permanent, long-term solution, though, as Google may change its policies on how it interprets content that is not visible to users.
-
It's a game of words. In the context of the question: if you provided the Schema tagging only to bots, the tagged info could still be listed in the SERPs, and the bots would get a better understanding of what the page is all about. The final goal is of course to serve the user the best answers when he's searching. On the page itself, however, the user doesn't see any difference whether the page is tagged with Schema or not.
Dirk
-
Dirk, I think you misunderstood my words. Schema being "for users" means exactly what you wrote in your last lines: "Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right Web pages."
Thanks
-
Hi Alick,
Schema.org is not for users - it is "a collection of schemas that webmasters can use to markup HTML pages in ways recognized by major search providers, and that can also be used for structured data interoperability (e.g. in JSON). Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right Web pages."
Source: http://schema.org/
rgds,
Dirk
-
Hi Anirban,
I completely agree with Dirk. Second, I would like to know what the purpose of showing Schema to bots only would be. In my limited understanding, we use Schema to show things like prices and offers to users, not to bots.
Thanks
-
Hi Anirban,
The impact of adding a few extra lines of schema.org code on the load time of your pages will be negligible.
Apart from that, serving different content to bots & human users could be considered cloaking by search engines.
Implementing schema.org on the normal pages should do just fine!
rgds,
Dirk