Why do some SERPs have a + map symbol?
-
Hi from sunny but freezing Wetherby, UK.
I've noticed that when you enter "York solicitors", some listings have a "+ Show map" symbol. Here's a screenshot to illustrate:
http://i216.photobucket.com/albums/cc53/zymurgy_bucket/plus-map-serps-langleyscopy.jpg
I'd like to know, please, what I would have to do to emulate this.
Thanks in advance,
David
-
Hi Again, David,
I've had a chance to talk this over with a few other local SEOs, and here is the consensus on the Plus Box.
1. It's algorithmically generated, so there's nothing you can do to force a Plus Box onto your listing.
2. That being said, following Google's general recommendations on this page might be useful:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=92319&from=52171&rd=1
3. Also, be advised that there appears to be some kind of new crackdown going on regarding Plus Boxes, with certain businesses that have lost Google's trust seeing their Plus Boxes disappear. This is a new issue and I don't have any solid data on it yet. But it leads me to ask: did you have a Plus Box before and see it go missing? Curious!
Hope this reply helps some.
Miriam
-
Hi David,
The Google Plus Box was first rolled out in 2006. Here is a Matt Cutts post on the topic from way back when: http://www.mattcutts.com/blog/new-google-ui-feature-plus-box/
Here is an excellent sleuthing post from Mike Blumenthal in 2008 about figuring out the sources for bad data in the Plus Box: http://blumenthals.com/blog/2008/03/06/google-plus-box-where-does-the-wrong-data-come-from/
The trouble with looking for documentation on this right now is that anything you search for related to 'google' and 'plus' is bringing up studies of Google+ and not the old Google Plus Box.
I'm going to dig a little deeper, David. I'll return to this thread when I have some more info to share with you.
-
At least one other site has a Google map embedded, but that one is using an iframe. The site in question is using JavaScript and has a Google Maps API key in the header.
Both have claimed Google Places listings with the location indicated.
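For anyone comparing the two approaches mentioned above, here is a rough sketch of what each looks like. This is a hypothetical example, not the markup from either site: the coordinates are approximate placeholders for York, UK, and YOUR_API_KEY stands in for a real Maps API key.

```html
<!-- Option 1: simple iframe embed (no API key required) -->
<iframe width="425" height="350" frameborder="0"
  src="https://maps.google.co.uk/maps?q=York+solicitors&output=embed">
</iframe>

<!-- Option 2: Maps JavaScript API embed (key goes in the script URL) -->
<div id="map" style="width: 425px; height: 350px;"></div>
<script src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY"></script>
<script>
  // Centre the map on the business location (placeholder coordinates)
  var map = new google.maps.Map(document.getElementById('map'), {
    center: new google.maps.LatLng(53.9591, -1.0815),
    zoom: 15,
    mapTypeId: google.maps.MapTypeId.ROADMAP
  });
  // Drop a marker at the same point
  new google.maps.Marker({ position: map.getCenter(), map: map });
</script>
```

Either way the map renders on the page; whether Google treats one embedding method differently from the other for the "+ Show map" link is, as noted in this thread, not documented anywhere.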
-
Hi David
Had a quick look in the SERPs, and my guess is that they seem to be the only page-one result with a map embedded on their site: http://www.guestwalker.co.uk/find-us/google-map/
It's just a guess from me, but I'd give that a try and hopefully Google will add that map link for your client's site.
Edit: Have a look at the client's Google Places page, and also look into rich snippets if you've not done so already. I ran the URL you mentioned through http://www.google.com/webmasters/tools/richsnippets?view=&url=http%3A%2F%2Fwww.guestwalker.co.uk but they are not using rich snippets, btw.
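If you do go the rich snippets route, marking up the business address with schema.org microdata is the usual starting point, and the testing tool linked above will show whether Google can parse it. A minimal sketch, with the business name, address, and phone number all placeholders:

```html
<!-- Hypothetical LocalBusiness markup; swap in the client's real details -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Solicitors</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 Example Street</span>,
    <span itemprop="addressLocality">York</span>,
    <span itemprop="postalCode">YO1 1AA</span>
  </div>
  <span itemprop="telephone">01904 000000</span>
</div>
```

To be clear, there's no guarantee this triggers a Plus Box; it just gives Google an unambiguous, machine-readable statement of the location to match against the Places listing.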