Are the menus created by Locu crawlable?
-
As many of you might know, Locu is a company that allows restaurant owners to manage and post their menus on multiple websites. Their service is pretty slick, but it does raise the question of whether their menus are crawlable or not.
You can see an example here: http://thequarternyc.com/menus.html. The menus are embedded in the website using a simple script:
Using Fetch as Google, it doesn't look like there's any content to crawl, but Locu claims that the content IS crawlable.
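To illustrate the concern, here is a rough sketch of what a crawler that does NOT execute JavaScript would "see": strip the `<script>` blocks out of the raw HTML and look for a menu phrase in what remains. The sample markup and widget URL below are invented for illustration, not Locu's actual embed code.

```python
import re

def static_text(html: str) -> str:
    """Drop <script>...</script> blocks; a crude stand-in for a non-JS crawl."""
    return re.sub(r"<script.*?</script>", "", html, flags=re.S | re.I)

def phrase_visible_without_js(html: str, phrase: str) -> bool:
    """True only if the phrase exists in the HTML before any script runs."""
    return phrase in static_text(html)

# A Locu-style embed: the menu text only exists after the script runs.
raw_html = """
<div id="locu-menu">
  <script src="//widget.example.com/menu.js">/* injects menu here */</script>
</div>
"""
print(phrase_visible_without_js(raw_html, "Satur Farms Green Salad"))  # False
```

If the menu is injected purely client-side, a check like this comes back empty, which matches what Fetch as Google appears to show.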
I would love to get some other opinions on this question.
Thanks!
-
Out of interest, I tried Bing and the searches failed.
One more reason to add a plain text version in the noscript tag.
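For reference, a plain-text fallback could look something like this. This is a minimal sketch: the widget URL, the container `id`, and the menu items are all made up, since the actual embed code isn't shown here.

```html
<div id="menu">
  <!-- The JS widget renders the menu for visitors with scripting enabled -->
  <script src="//widget.example.com/menu.js"></script>
  <!-- Plain-text copy for non-JS user agents -->
  <noscript>
    <h2>Dinner</h2>
    <p>Satur Farms Green Salad &ndash; 12</p>
    <p>Mahogany Style Catfish Fillet &ndash; 24</p>
  </noscript>
</div>
```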
-
Could be. They could also be linked to on those phrases from other sites.
So I tested a different string from both menus:
"goat bucheret, carmody, dry aged jack, pt." -> success
and
"Satur Farms Green Salad" -> success
Perhaps you can confirm with your own test, but it appears the claim is true.
However, as a backup, it couldn't hurt to include noscript content, since that's literally the purpose of the tag. Just remember to maintain the content.
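On that last point, keeping the noscript copy in sync could even be scripted. A minimal sketch, assuming invented helper names and sample markup (this is not how Locu's widget actually works):

```python
import re

def noscript_text(html: str) -> str:
    """Return the concatenated contents of all <noscript> blocks."""
    blocks = re.findall(r"<noscript>(.*?)</noscript>", html, flags=re.S | re.I)
    return " ".join(blocks)

def fallback_is_current(html: str, menu_items: list[str]) -> bool:
    """True if every current menu item appears in the noscript fallback."""
    text = noscript_text(html)
    return all(item in text for item in menu_items)

page = """
<div id="menu">
  <script src="//widget.example.com/menu.js"></script>
  <noscript>
    <p>Satur Farms Green Salad - 12</p>
    <p>Mahogany Style Catfish Fillet - 24</p>
  </noscript>
</div>
"""
print(fallback_is_current(page, ["Satur Farms Green Salad",
                                 "Mahogany Style Catfish Fillet"]))  # True
print(fallback_is_current(page, ["Vadouvan spiced squash seeds"]))   # False
```

A check like this, run whenever the menu data changes, would catch the fallback drifting out of date.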
-
This is what they responded with:
**Examples of working search quotes:**
- "Vadouvan spiced squash seeds" (1st hit)
- "Mahogany Style Catfish Fillet" (2nd hit)
Their menus do indeed show up, and the script is the same as in my previous example:
Is this proof enough?
-
If they claim it, ask them to back it up with a real example or two. Then copy what they did (i.e. a noscript link, perhaps?).
-
Googlebot can crawl JavaScript (which is how Locu displays its text), but it's not a guarantee. If it's an option, I would stick with plain HTML. There was a great case study done on YOUmoz a while back on this: Can Google Really Access Content in JavaScript? Really?