Best Schema Advice
-
Hi,
I am new here and I have searched but not found a definitive answer to this. I am sorting out a website for a scaffolding company that operates in a particular area.
They are only interested in targeting that area, and from what I have read here I need to mark the site up with schema mentioning their company name and address.
My issue is that I keep finding conflicting advice about what should go in it and how it should be laid out.
I would love to hear people's opinions on the best guide for setting up schema correctly for a site like this. They use WordPress, and I am comfortable inserting code into the site; I just want to make sure I get it right from the start.
Once I have done this, I understand that I need to build local citations using the same NAP (name, address, phone) as the site's markup.
Sorry for what might seem like a daft question, but I am a designer and still learning the ins and outs of SEO.
Thanks
-
Jayson DeMers wrote a great article regarding how to use schema markup for local SEO. This isn't a complete list, but should definitely help you get started.
http://www.searchenginejournal.com/how-to-use-schema-markup-for-local-seo/60245/
I hope that helps!
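To make the starting point concrete, here is a minimal sketch of the kind of JSON-LD LocalBusiness markup that article covers. Everything in it (business name, phone, URL, address) is a placeholder; you would swap in the client's real details and keep them character-for-character identical to the NAP used in your citations:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Scaffolding Ltd",
  "telephone": "+44 1234 567890",
  "url": "https://www.example-scaffolding.co.uk/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "Exampletown",
    "addressRegion": "Exampleshire",
    "postalCode": "EX1 2AB",
    "addressCountry": "GB"
  },
  "areaServed": "Exampletown"
}
</script>
```

In WordPress you can paste this into the theme header/footer or add it with a plugin, then validate it with Google's structured data testing tool before building citations against it.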
Related Questions
-
Best practices for making a very long URL shorter
Hi Moz folks! We are redesigning a website of 30,000+ pages and are pulling together a spreadsheet for 301 redirects. So basically this: http://www.mywildlifesite.org/site/PageServerpagename=priorities_wildlife_endangered_species_protection#.Ws54SNPwbAw/mexican-spotted-owl will redirect to here (this is the nav architecture): https://mywildlifesite.org/wildlife-conservtion/endangered-species-act-protections/endangered-species-list/birds/mexican-spotted-owl My question is: can I, and should I, truncate that new destination URL to make it easy for Google to see that the page topic is really the owl, like this: https://mywildlifesite.org/endangered-species-list/mexican-spotted-owl Your input is greatly appreciated! Jane
Technical SEO | CalamityJane77
-
What's the best way to integrate off site inventory?
I can't seem to make any progress with my car dealership client in rankings or traffic. I feel like I've ruled out most of the common problems; the only other thing I can see is that all their inventory is on a subdomain using dedicated auto dealership software. Any suggestions for a better way to handle this situation? Am I missing something obvious? The URL is rcautomotive.com. Thanks for your help!
Technical SEO | GravitateOnline
-
Schema.org markup for breadcrumbs: does it finally work?
Hi, TL;DR: Does https://schema.org/BreadcrumbList work? It's been some time since I last implemented schema.org markup for breadcrumbs. Back then, Google explicitly discouraged using the schema.org markup for breadcrumbs. In my experience it had been pretty hit or miss - sometimes it worked without issues; sometimes it did not work, for no obvious reason. Consequently, I ditched it for the data-vocabulary.org markup, which did not give me any issues. However, I prefer using schema.org, and a new site is currently being designed for a client. Thus, I'd like to use schema.org markup for the breadcrumb - but of course only if it works now. Google has dropped the previous warning/discouragement and now lists schema.org code at https://developers.google.com/structured-data/breadcrumbs based on the new-ish https://schema.org/BreadcrumbList. Has anybody here used this markup on a site (preferably more than one) and can confirm whether or not it reliably works and shows the breadcrumb trail / site hierarchy in the SERP? Thanks for your answers! Nico
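For anyone landing here with the same question, the markup Google's structured-data page documents is JSON-LD along these lines (the names and URLs below are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Home",
      "item": "https://www.example.com/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Products",
      "item": "https://www.example.com/products/"
    }
  ]
}
</script>
```

The position values start at 1 and increase by 1 down the trail; the last item represents the current page.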
Technical SEO | netzkern_AG
-
Duplicate content pages on different domains, best practice?
Hi, We are running directory sites on different domains for different countries (the country name is in each site's domain name), and we have the same static page on each one. We actually have several such pages, but I'll use one static page as an example for the sake of simplicity. So we have http://firstcountry.com/faq.html, http://secondcountry.com/faq.html and so on for 6-7 sites; the faq.html pages from one country and another show 94% similarity when checked for duplicate content. We would like an alternative to the canonical approach, because the content couldn't belong to only one of these sites; it belongs to all of them. A second option would be to noindex all but one country. It's syndicated content, but we cannot link back to the source because there is none. Thanks for taking the time to read this.
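One alternative to canonical worth weighing for same-content, different-country sites is hreflang, which tells Google the pages are locale-targeted alternates rather than competing duplicates. A sketch using the poster's placeholder domains, assuming for illustration that the first site targets the US and the second the UK:

```html
<!-- In the <head> of BOTH faq.html pages; every page in the set lists
     all alternates, including a self-referencing entry -->
<link rel="alternate" hreflang="en-us" href="http://firstcountry.com/faq.html" />
<link rel="alternate" hreflang="en-gb" href="http://secondcountry.com/faq.html" />
```

With 6-7 sites, each page would carry the full list of 6-7 link elements; hreflang annotations only work when they are reciprocal across all versions.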
Technical SEO | seosogood
-
Best way to noindex long dynamic URLs?
I just got a Moz crawl back and see lots of errors for overly dynamic URLs. The site is a villa rental site that gives users the ability to search by bedroom, amenities, price, etc., so I'm wondering what the best way is to keep these types of dynamically generated pages, with URLs like /property-search-page/?location=any&status=any&type=any&bedrooms=9&bathrooms=any&min-price=any&max-price=any, from being indexed. Any assistance will be greatly appreciated : )
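For faceted URLs like the one above, one common approach is a robots meta tag on the search-results template; another is a robots.txt rule. A sketch of both, using the path from the question (note you would generally pick one: a robots.txt block stops Google from crawling the page, so it would never see the noindex tag):

```html
<!-- On the /property-search-page/ template: keep results out of the
     index while still letting crawlers follow the links on the page -->
<meta name="robots" content="noindex, follow" />
```

Or, in robots.txt, blocking any parameterised version of the search page:

```
User-agent: *
Disallow: /property-search-page/?
```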
Technical SEO | wcbuckner
-
Manual Actions tab advice on message
Ok so I have this message in Manual Actions (with no examples of links):

Manual Actions
Site-wide matches: None
Partial matches: Some manual actions apply to specific pages, sections, or links.
Reason: Unnatural links to your site (impacts links). Google has detected a pattern of unnatural, artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole. Learn more.

I am not surprised by this, as an agency a few years ago did mass article submissions for the same anchor text. I have manually removed 119 or so domains in the last year and a half, and 4 weeks ago I disavowed the last 40-ish domains left. Obviously the backlink profile can still be seen to have an unnatural anchor-text distribution, but not as bad as before. In terms of rankings we lost some core terms on the homepage: not completely, but most have gone from, say, page one to page 2/3/4. We are still getting good traffic to internal pages, so I am assuming action was taken against the homepage, where the mass of those links point. Where do you guys recommend I go from here? Shall I go ahead and submit the reconsideration request, or wait longer for the disavow to take effect? I am also still trying to remove bad links. Any advice much appreciated.
Technical SEO | pauledwards
-
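As a side note on the disavow step the question above mentions: the file Google expects is plain text, one entry per line, with whole domains prefixed by domain: and comments starting with #. A sketch with placeholder domains:

```
# Article directories used by the old agency
domain:spammy-article-directory.example
domain:low-quality-links.example
# A single page can also be listed by full URL
http://another-site.example/paid-links-page.html
```

Domain-level entries are usually safer than listing individual URLs when a whole site's links are unnatural.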
Auto-loading content via AJAX - best practices
We have an ecommerce website and I'm looking at replacing the pagination on our category pages with functionality that auto-loads the products as the user scrolls. There are a number of big websites that do this - MyFonts and Kickstarter are two that spring to mind. Obviously if we are loading the content in via AJAX then search engine spiders aren't going to be able to crawl our categories in the same way they can now. I'm wondering what the best way to get around this is. Some ideas that spring to mind are:
- detect the user agent and, if the visitor is a spider, show them the old-style pagination instead of the AJAX version
- make sure we submit an updated Google sitemap every day (I'm not sure if this is a reasonable substitute for Google being able to properly crawl our site)
Are there any best practices surrounding this approach to pagination? Surely the bigger sites that do this must have had to deal with these issues? Any advice would be much appreciated!
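On the user-agent idea: serving spiders different content than users risks being treated as cloaking, so a safer pattern is progressive enhancement. Render an ordinary, crawlable pagination link in the HTML, then let JavaScript intercept it and auto-load the next page as the user scrolls (optionally updating the URL with history.pushState). A sketch, with a placeholder URL and class name:

```html
<!-- Crawlable fallback: spiders follow the link as normal pagination;
     your script hijacks clicks/scroll and fetches the next page via AJAX -->
<nav class="pagination">
  <a href="/category/widgets?page=2" class="js-load-more" rel="next">Next page</a>
</nav>
```

A daily sitemap helps with discovery but is not a substitute for crawlable links, since sitemaps carry no internal link equity between your category pages and products.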
Technical SEO | paul.younghusband
-
301 help: what's the best way?
Hi all, right now I have 301 redirects set up in my .htaccess file. I recently redesigned our site, so I have been redirecting all the old URLs to the new ones. I saw a post about having all your URLs in the same format, so I updated my .htaccess file to redirect all URLs from http://www.mysite.com/food to http://www.mysite.com/food/ (added a trailing slash). Now on my latest SEO crawl I see all my site URLs redirecting to the trailing-slash URL. Am I doing this right? Thanks, Will
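For reference, a typical mod_rewrite pattern for the trailing-slash rule looks something like the sketch below. The thing to double-check in your setup is that the old-URL redirects point straight at the final slashed URLs; otherwise each old URL chains through two 301s (old page to new page, then new page to new page with slash):

```apache
RewriteEngine On
# Append a trailing slash with a single 301, skipping real files like /style.css
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ /$1/ [R=301,L]
```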
Technical SEO | Will_Craig