Organic listing & map listing on 1st page of Google
-
Hi. Previously, a company could get multiple listings in the SERP: one in the Google Maps area plus the homepage or internal pages in the organic search results.
But lately, I've noticed that Google is now merging the maps and organic listings.
This observation has been confirmed by a couple of SEO people, and I thought it made sense. But one day I stumbled upon the keyword phrase "bmw dealership phoenix" and saw that www.bmwnorthscottsdale.com has separate listings for Google Places and the organic results.
Any idea how this company did this?
Please see the attached image
-
Thanks man!
-
If a site can build up enough trust signals through its on-site SEO and off-site references, and also optimize for local listings, it is possible to be found in both. There is no single formula or threshold because every competitive niche is unique; however, I've seen it happen and have had clients reach that point over time.
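One concrete piece of "optimizing for local listings" is embedding schema.org LocalBusiness markup on the site so search engines can tie the web pages to the physical business. As a rough sketch only (the business details below are made up for illustration, and structured data is just one signal among many, not a guaranteed path to dual listings), the JSON-LD for a `<script type="application/ld+json">` tag could be generated like this:

```python
import json

def local_business_jsonld(name, street, city, region, postal, phone, url):
    """Build a schema.org LocalBusiness dict suitable for a JSON-LD script tag."""
    return {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "url": url,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": region,
            "postalCode": postal,
        },
    }

# Hypothetical dealership details, for illustration only.
markup = local_business_jsonld(
    name="Example BMW Dealership",
    street="123 Main St",
    city="Phoenix",
    region="AZ",
    postal="85001",
    phone="+1-555-555-0100",
    url="https://www.example.com",
)
print(json.dumps(markup, indent=2))
```

The same name, address, and phone details should match the Google Places (now Google Business Profile) listing exactly, since consistency between the on-site markup and the local listing is part of building those trust signals.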