Implementing Schema.org on a web page
-
Hi all,
As we know, implementing Schema markup doesn't change the look & feel of a web page for users.
So here is my question:
Could we implement Schema markup on web pages only for bots (not visible to users in the source code) so that page load time doesn't increase?
-
Hello Anirban,
You never want to show Google one thing in the code and show everyone else something different. That is the very definition of cloaking.
Have you looked into using JSON-LD instead of inline Schema markup? Builtvisible has a great article on microdata that includes a section about JSON-LD, which lets you put the markup in a script tag instead of wrapping the HTML.
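As a rough illustration of the idea (the property names and values here are a hypothetical LocalBusiness sketch, not taken from the article), JSON-LD keeps the structured data in a single script block instead of weaving it through the visible HTML:

```html
<!-- Minimal JSON-LD sketch: the structured data sits in one script
     block, separate from the visible HTML it describes. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Seattle",
    "addressRegion": "WA"
  }
}
</script>
```

Because nothing in the block renders, it doesn't change the look & feel of the page, yet it is served identically to bots and users, so it isn't cloaking.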
-
Hi,
I am not saying that Schema is bad or that you shouldn't do it - it just seems that some big players only use Schema on the detail pages of an individual product, not on overview pages. I found an example of a site using it on a list page, but in the SERPs only the average rating appears (example: http://www.goodreads.com/author/list/7779.Arthur_C_Clarke - the first result).
You can always test what the impact will be - as mentioned before, I would guess that even for 50 elements fully tagged with Schema the impact on page speed will be minimal. Check your current pages with webpagetest.org and look at the breakdown of load time. The HTML will probably account for only 10-20% of the load time - the rest being images, JavaScript & CSS files. Adding a few hundred lines of HTML will not fundamentally change this (text compresses quite well).
rgds
Dirk
-
Hi,
But using Schema to provide well-structured data will help bots understand what type of content/information is present on a page, and I think that will definitely help a page rank better in Google search, whether it's a search result page (SRP) or a job detail (JD) page.
Regards,
Anirban
-
Hi,
I am not sure that adding schema.org markup to a result page adds a lot of value. If you send 50 different blocks of structured data, how are search engines supposed to know which piece is relevant to show in the SERPs? I just checked two different sites (allrecipes.com & monster.com) - they only seem to use Schema markup on the detail pages, not on the result pages.
If you would like to go ahead, you could always measure the impact by creating two (static) versions of a search result page - one with and one without markup - and testing both versions with webpagetest.org and Google's PageSpeed analyser. An alternative would be to use "lazy loading": you first load the first x results (the part visible on screen), and when the user scrolls you load the next batch, and so on. This way the impact on loading times would remain minimal.
In any case, I would not try to show different pages to users & bots.
rgds,
Dirk
-
Hello Dirk,
Thanks for the reply.
Agreed that the impact of adding a few extra lines of schema.org code on the load time of the pages will be close to zero. But it depends entirely on what content you are showing on a page.
I want to implement Schema.org on search result pages where a single page contains more than 50 listings, each with different information like job title, company name, skills, date posted, etc. For each listing I will have to use the different properties recommended by Google, so the load time of the page will definitely increase.
Please let me know your thoughts on the above case.
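For reference, a single listing tagged along the lines Google recommends for JobPosting might look roughly like this (a microdata sketch with made-up values, not the actual implementation) - repeated 50 times it is still only a modest amount of extra HTML:

```html
<!-- One search-result listing marked up as a schema.org JobPosting.
     The itemprop attributes add a few bytes per field; the visible
     content is unchanged. -->
<div itemscope itemtype="https://schema.org/JobPosting">
  <h2 itemprop="title">Senior SEO Analyst</h2>
  <span itemprop="hiringOrganization" itemscope
        itemtype="https://schema.org/Organization">
    <span itemprop="name">Example Corp</span>
  </span>
  <span itemprop="skills">SEO, analytics, HTML</span>
  <meta itemprop="datePosted" content="2015-04-01">
</div>
```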
Thanks
-
Try adding Schema with meta tags in the HTML, for example:
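A minimal sketch of the idea, using schema.org's telephone and geo properties (the values here are hypothetical):

```html
<!-- meta tags carry the structured data without rendering anything
     visible on the page -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <meta itemprop="telephone" content="+1-555-0100">
  <span itemprop="geo" itemscope
        itemtype="https://schema.org/GeoCoordinates">
    <meta itemprop="latitude" content="47.6062">
    <meta itemprop="longitude" content="-122.3321">
  </span>
</div>
```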
This way you're telling bots your phone number with Schema, but it doesn't appear visibly to users. This is normally done with the latitude and longitude Schema tags, but you can use it for other properties as well. I wouldn't rely on this as a permanent long-term solution, though, as Google may change its policies on how it interprets content that is not visible to users.
-
It's a game of words. In the context of the question: if you provided the Schema tagging only to bots, the tagged info could still be listed in the SERPs, and the bots would get a better understanding of what the page is all about. The final goal is of course to serve the user the best answers when they search. On the page itself, however, the user doesn't see any difference whether the page is tagged with Schema or not.
Dirk
-
Dirk, I think you misunderstood my words. Schema "for users" means exactly what you wrote in your last lines: "Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right Web pages."
Thanks
-
Hi Alick,
Schema.org is not for users - it is "a collection of schemas that webmasters can use to markup HTML pages in ways recognized by major search providers, and that can also be used for structured data interoperability (e.g. in JSON). Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right Web pages."
Source: http://schema.org/
rgds,
Dirk
-
Hi Anirban,
I completely agree with Dirk. Secondly, I would like to know what the purpose is of showing Schema to bots only. In my limited understanding, we use Schema to show price and offers to users, not to bots.
Thanks
-
Hi Anirban,
The impact of adding a few extra lines of schema.org code on the load time of your pages will be negligible.
Apart from that, serving different content to bots and human users could be considered cloaking by search engines.
Implementing schema.org on the normal pages should do just fine!
rgds,
Dirk