Duplicate Listings on Google Maps
-
About three weeks ago, Google created a duplicate listing for our law firm on Google Maps.
In building links, I have tried very hard to ensure that our address and company name were always listed identically.
Our correct firm name and address is:
Feldman Feldman & Associates, PC, 2221 Camino Del Rio South, Suite 201
Yet somehow the new listing read "Camino Del Rio S, Ste 201."
All of our reviews moved over to this new profile. I claimed it, edited it to match our original listing, and reported it to Google, and Google merged the two.
Now Google has created another profile. This time the firm name and address match ours exactly (South and Suite both spelled out), and all of the reviews have moved over except for the most recent one(s).
I have claimed it again, corrected the address, and reported it to Google. Google then created yet another listing.
Our rankings for our keywords have been hurt by this. Any idea why this keeps happening? Any suggestions?
Here are the two pages. This is our original listing:
http://maps.google.com/maps/place?hl=en&cid=468564492130231259
This is the new one Google created on its own. It took all our reviews, but it ranks very poorly for our keyword searches.
-
Your two listings:
Feldman Feldman & Associates PC
12 reviews
2221 Camino Del Rio South, Suite 201
San Diego, CA 92108-3609
(619) 299-9600

Other: Feldman Feldman & Associates PC
2 reviews
2221 Camino Del Rio South, Suite 201
San Diego, CA 92108
(619) 299-9600
immigrateme.com
The only difference I see is the zip code, "92108-3609". Not looking good. The best suggestion I have is to try again: click on the "more" link on the profile page, report a problem, and select "Place has another listing," then include a link to both profiles. Luckily your reviews will probably not get lost, as they are not hosted on Google's site; otherwise you might lose them.

The only other explanation, beyond the conflicting zip code, could be that another account had claimed the listing earlier. A word of caution: don't move too quickly or too anxiously with Google Places. They scare easily.

If you want to read the long answer:
http://www.searchenginepeople.com/blog/why-google-local-listings-merge-and-how-to-unmerge-listings.html

Additional thoughts:
It looks like both listings were edited about 11 hours ago (edit log: "Changed 11 hours ago - Phone | Added: 619-299-9600").

And on May 3rd, two different moderators made changes:
http://maps.google.com/maps/user?uid=217460202084778845705
and
http://maps.google.com/maps/user?uid=216814767163749414179

I would suggest that you leave the listing you have access to alone and work to merge the one you don't have access to. Don't submit tickets for both at the same time.
Related Questions
-
Is my content being fully read by Google?
Hi mozzers, I wanted to ask you a quick question regarding Google's crawlability of webpages. We just launched a series of content pieces, but I believe there's an issue. Based on what I am seeing when I inspect the URL, it looks like Google is only able to see a few titles and internal links. For instance, when I inspect one of the URLs in GSC, this is the screenshot I am seeing: [image] When I perform a "cache:" search, I barely see any content: [image] vs. one of our blog posts: [image] Would you agree with me that there's a problem here? Is this related to the heavy use of JS? If so, why wasn't I able to detect this with any of the crawling tools? Thanks!
Intermediate & Advanced SEO | TyEl
-
Duplicate content question
Hi there, I work for a theater news site. We have an issue where our system creates a chunk of duplicate content in Google's eyes, and we're not sure how best to solve it. When an editor produces a video, it simultaneously 1) creates a page with its own static URL (e.g. http://www.theatermania.com/video/mary-louise-parker-tommy-tune-laura-osnes-and-more_668.html); and 2) displays said video on a public index page (http://www.theatermania.com/videos/). Since the content is very similar, Google sees them as duplicates. What should we do about this? We were thinking that one solution would be to dynamically canonicalize the index page to the static page whenever a new video is posted, but would Google frown on this? Alternatively, should we simply nofollow the index page? Lastly, are there any solutions we may have missed entirely?
Intermediate & Advanced SEO | TheaterMania
-
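For reference, the two options under discussion would be implemented as head tags along these lines (a sketch only; the canonical URL shown is the static video page from the question, and the noindex variant is an alternative to the nofollow idea):

```html
<!-- Option 1: on a page that duplicates the video, point Google at the
     canonical static URL (URL taken from the question above) -->
<link rel="canonical" href="http://www.theatermania.com/video/mary-louise-parker-tommy-tune-laura-osnes-and-more_668.html" />

<!-- Option 2: on the index page, keep it out of the index entirely but
     still let Google follow its links to the individual video pages -->
<meta name="robots" content="noindex, follow" />
```

One caveat: a canonical is meant to point at a page with substantially the same content, so canonicalizing an index page that lists many videos to a single video page may simply be ignored by Google.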
Homepage not ranking in Google AU, but ranking in Google UK?
Hey everyone, My homepage has not been ranking for its primary keyword in Google Australia for many months now. Yesterday, when I was using a UK proxy and searching via Google UK, I found my homepage/primary keyword ranked on page 8 in the UK. In Australia my website ranks on page 6, but for other pages on my website (and it always changes from page to page). Previously my page was popping up at the bottom of page 1 and on page 2. I've been trying many things and waiting weeks to see if they had any impact for over 4 months, but I'm pretty lost for ideas now, especially after what I saw yesterday in Google UK. I'd be very grateful if someone has had the same experience or has suggestions for what I should try. I did a small audit on my page, and because the site is focused on one product and features the primary keyword, I took the following steps to try to fix the issue:
- I noticed the developer had added H1 tags in many places on the homepage, so I removed them all to make sure I wasn't getting an over-optimization penalty.
- Cleaned up some of my links because I was not sure if this was the issue (I've never had a warning within Google Webmaster Tools).
- Changed the title tags/H tags on secondary pages to feature the primary keyword less.
- Made some pages 'noindex' to see if this would take away the emphasis on the secondary pages.
- Resubmitted my XML sitemaps to Google.
- Just recently claimed a local listing in Google (still need to verify) and fixed up citations of my address/phone numbers, etc. (however, it's not a local business; it sells Australia-wide).
- Added some new backlinks from AU sites (only a handful, though).
The only other option I can think of is to replace the name of the product on secondary pages with a different abbreviation to make sure that the keyword isn't featured there.
Some other notes on the site: When I do a 'site:url' search, my homepage comes up at the top. The site sometimes ranked for a secondary keyword on the front page in specific locations in Australia (but it goes to a localised city page); I've noindexed these as a test to see if something with localisation is messing it around. I have links from AU, but I also have links from .com and elsewhere. Any tips or advice would be fantastic. Thanks
Intermediate & Advanced SEO | AdaptDigital
-
Duplicate content within sections of a page but not full page duplicate content
Hi, I am working on a website redesign. The client offers several services, and some elements of those services cross over with one another. For example, they offer a service called Modelling, and when you click onto that page, several elements that make up that service are featured, in this case 'mentoring'. Now, mentoring is common to other services and will therefore feature on other service pages. Each page will feature a mixture of content unique to that service and small sections of duplicate content, and I'm not sure how to treat this. One idea we have come up with is to take the user through to a unique page hosting all the shared content; however, some features do not warrant a page being created for them. Another idea is to have the feature pop up with inline content. Any thoughts/experience on this would be much appreciated.
Intermediate & Advanced SEO | J_Sinclair
-
How to get the 'show map of' tag/link in Google search results
I have two clients with apparently random examples of the 'show map of' link in Google search results. The maps/addresses are accurate and are for airports. Both clients are aggregators; they service the airports, e.g. "lax airport shuttle" (not an actual example), BUT they DO NOT have Google Places listings for these pages, either manually created OR auto-populated by Google, and they DO NOT have the map or address info on the pages that are returned in the search results with the map link. Does anyone know how this is the case? It's great that this happens for them, but I'd like to know how/why so I can replicate it across all their appropriate pages. My understanding was that for this to happen you HAD to have Google Places pages for the appropriate pages (which they can't do, as they are aggregators). Thanks in advance, Andy
Intermediate & Advanced SEO | AndyMacLean
-
Ranking Factors for Google
Yesterday a blog post appeared on SEOmoz titled 'A Tale of Two Studies': http://www.seomoz.org/blog/a-tale-of-two-studies-google-vs-bing-clickthrough-rate It suggested some of the ranking factors Google and Bing take into account when ranking. A few of them I want to talk about: social signals, age of domain, and the H1 HTML tag. So I thought age of domain and H1 both had some weight in Google? I guess not! And social signals: now I know they carry some weight, but they're right up there in the list for both search engines, so should getting likes, tweets, and +1's now be part of my everyday link building?
Intermediate & Advanced SEO | activitysuper
-
Google Places Duplicate Listings
Hey Mozzers, I know the basic process for handling duplicate listings, but I just want to make sure and ask, because this one is a little sensitive. I have a client with a claimed and verified listings page, which is here: http://maps.google.com/maps/place?q=chambers+and+associates&hl=en&cid=9065936543314453461 There is also another listing (which I have not claimed yet) here: http://maps.google.com/maps/place?q=dr.+george+chambers&hl=en&cid=14758636806656154330 The first listing has 0 reviews, whereas the second, unverified listing has 12 fantastic 5-star reviews. We can all agree that if I can get these two listings to merge, his general listing will perform much better than it already does (the first listing gets about 200 actions per month). So, what is the best way to merge these two without losing any reviews and without suspending my Places account? Thanks in advance! Ian
Intermediate & Advanced SEO | itrogers
-
Block Google Sitelinks for DSEO?
I am trying to manage DSEO for a client. The question is: would blocking a page from my client's Google sitelinks cause that blocked sitelink page to be listed independently in the rankings, and therefore potentially push a negative listing further down? Or would the blocked sitelink not show up at all in the SERPs?
Intermediate & Advanced SEO | bcmull