Geo content and where Googlebot crawls from.
-
Does anyone have experience with geo-specific content on their homepage, and with how Googlebot's location affects rankings and/or traffic?
I ask because in Search Console today I noticed that the thumbnail image of our site was different from usual: it was pulling in content for a specific geo-location. Is there any value in, or concern about, how Google sees our site from different locations, and could it impact SERPs?
-
Googlebot itself is largely location-agnostic (it crawls mostly from US-based IP addresses), but Google will act as if it cares about location depending on where the search is performed. If it pulled the wrong thumbnail for you, it got there via a link (internal or external) and judged it an appropriate result. What you do now depends on your goal (changing the thumbnail, for example).
It's actually good that you appear with a geo-targeted piece of content: it means you're responding to local searches. Google shows different SERPs for every person and location, so there isn't much value in, or concern about, how it sees your site from any one location. Effectively, it sees it from "all" locations.
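If you want to sanity-check what a crawler receives versus a regular visitor, one quick approach is to fetch your homepage twice with different user-agent strings and diff the responses. This is a minimal standard-library sketch; the URL and user-agent strings are placeholders, and a real Googlebot verification would also require checking the crawler's IP via reverse DNS, since anyone can send a Googlebot user-agent.

```python
import difflib

def content_diff(html_a: str, html_b: str, context: int = 0) -> list[str]:
    """Return only the changed lines between two page variants."""
    return [
        line for line in difflib.unified_diff(
            html_a.splitlines(), html_b.splitlines(),
            fromfile="default-ua", tofile="googlebot-ua", n=context,
        )
        if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))
    ]

if __name__ == "__main__":
    import urllib.request

    URL = "https://www.example.com/"  # placeholder: your homepage

    def fetch(user_agent: str) -> str:
        req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode("utf-8", "replace")

    plain = fetch("Mozilla/5.0")
    bot = fetch("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
    for line in content_diff(plain, bot):
        print(line)
```

An empty diff suggests you are serving crawlers the same markup as users; a large diff is worth investigating, since it may explain unexpected thumbnails or snippets.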
Related Questions
-
What Should We Do to Fix Crawled but Not Indexed Pages for Multi-location Service Pages?
Hey guys! I work as a content creator for Zavza Seal, a contractor out of New York, and we're targeting 36+ cities in the Brooklyn and Queens areas with several home-improvement services. We were about 340 pages into our multi-location strategy, targeting each service in each city, when we noticed that 200+ of our pages were "Crawled but not indexed" in Google Search Console. Here's what I think we may have done wrong; let me know what you think:
- We used the same page template for all pages (we changed the content, sections, formatting, targeted keywords, and overall page strategy for areas with unique problems, trying to keep the user experience as unique as possible to avoid duplicate content or looking like we didn't care about our visitors).
- We used the same featured image for all pages (I know this is bad and wouldn't have done it myself, but hey, I'm not the publisher).
- We didn't use rel canonicals to tell search engines these pages were made specifically for each area.
- We didn't use alt tags until about halfway through.
- A lot of the URLs don't use the target keyword exactly.
- The NAP info and Google Maps embed are in the footer, so we didn't repeat them on the pages.
- We didn't include content about the history of the city or anything like that (some pages did mention historic buildings, low water tables, flood-prone areas, etc., where the area is known for them).
We were thinking of redoing the pages from scratch, building a unique experience around each city with testimonials, case studies, and content about problems common to property owners in the area, but I think they may be fixable with a rel canonical, city-specific content added, and unique featured images on each page. What do you think is causing the problem, and what would be the easiest way to fix it?
I knew the pages had to be unique, so I switched up the page strategy every 5-10 pages out of fear of duplicate content, because you can only say so much about, for example, "basement crack repair". Please let me know your thoughts. Here is one of the pages that is indexed as an example: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/ Here is one like it that is crawled but not indexed: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/ I appreciate your time and concern. Have a great weekend!
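One way to put a number on the "same template, swapped city name" worry is to measure word-shingle overlap between the body text of two city pages. This is a rough sketch, not anything Google publishes; but pages whose scores sit near 1.0 are the most likely candidates to be treated as duplicates and left unindexed.

```python
def jaccard_similarity(text_a: str, text_b: str, k: int = 5) -> float:
    """Jaccard similarity over k-word shingles.

    ~1.0 means near-duplicate text; ~0.0 means entirely different.
    """
    def shingles(text: str) -> set[str]:
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

    a, b = shingles(text_a), shingles(text_b)
    return len(a & b) / len(a | b) if a | b else 1.0
```

Running this across each pair of city pages (using extracted body text, not the shared header/footer) gives a quick triage list: rewrite the pairs scoring highest first.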
Local SEO | everysecond0
-
Best Practice For Multisite Targeting Different States With Same Content
I am auditing a Joomla website that uses the MightySites component to create multiple versions of the same site for different state/province areas. For example, the site structure looks something like:
example.com/fl/
example.com/mn/
example.com/ny/
example.com/wa/
etc. Each of the state home pages is largely identical, and much of the content within each state sub-folder is a copy of the original content on the main example.com site, with minor changes here and there. The client is a national organization and needs to keep this structure so that each state can edit and change its own content, though as far as I can see the content doesn't actually vary much. What's best practice here for reducing duplicate content issues? We can't use hreflang, as it is all within one country (although the site does also provide two different language versions of content, for which I will use hreflang). Should we just canonical everything back to the corresponding pages on the example.com site? Any thoughts or recommendations much appreciated.
Local SEO | MatShepSEO0
-
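If the answer for the state-subfolder setup above is "canonicalize back to the main site," the mapping can be generated rather than hand-maintained. A minimal sketch, assuming the /fl/-style prefixes shown in the question and that each state page has a matching page on the main domain (both assumptions you would need to verify on the real site):

```python
from urllib.parse import urlsplit

# Hypothetical set of state subfolders, per the example URLs in the question.
STATE_CODES = {"fl", "mn", "ny", "wa"}

def canonical_for(url: str) -> str:
    """Map a state-subfolder URL to its canonical on the main site."""
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    if segments and segments[0].lower() in STATE_CODES:
        segments = segments[1:]  # drop the state prefix
    return f"{parts.scheme}://{parts.netloc}/" + "/".join(segments)

def canonical_link_tag(url: str) -> str:
    """Emit the <link> element for the page's <head>."""
    return f'<link rel="canonical" href="{canonical_for(url)}" />'
```

Note the trade-off: canonicalizing this way tells Google to index only the national version, so the state pages would generally stop ranking on their own. That is usually acceptable when, as here, the copies barely differ.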
SEO and IP based content
Hello! We are building a guide/directory that will serve multiple cities across Canada. Currently, our home page detects your IP and displays local content. Although we feel this is incredibly useful to the end user, we are worried about how search engines will interpret our home page. In addition to the home page, should we have landing pages for each city we are in, and should we follow a site structure like this: www.thesite.com/vancouver? A user from Vancouver would see Vancouver-related content on the home page, but how would a search engine see it? We would like to know the best approach to ranking well for searches in different Canadian cities. Most of our searches will be city-specific: Calgary widgets, Vancouver widgets, etc. Thanks!
Local SEO | ebk0
-
Building Great Content
When writing content: let's say I write fantastic, useful content that most home buyers (since I'm a realtor) would benefit from, but they don't have websites, so they aren't going to link back to me anywhere. What's the best way to get your content seen? Do you recommend putting it on Facebook and promoting it? It's just tough in my business because it's such a commodity, but I know there has to be a way. I'm trying to find the best approach before I spend tons and tons of time writing genuinely useful, great content. Until now it's been a risk-vs.-reward thing and I haven't done it, but I feel like now is the time. Thanks!
Local SEO | Veebs0
-
Google's Geo Search Setting Gone Cuckoo!
Hey everybody! I thought I'd post about this because pretty much all of our members who do local SEO are bound to run into it. Last week, in the middle of training someone, I ran into something bizarre. Using Google's search settings to set my location to a remote locale, the local packs were returning results for the correct city, but the accompanying organic results appeared to be based on my own IP address instead. In other words, Google was overriding my designated geolocation in favor of where it knows I'm actually located. I was relieved to see Mike Blumenthal post on this (it helped me realize I wasn't going crazy, haha), and I recommend that everyone who does local for a living take a look: http://blumenthals.com/blog/2015/05/24/google-location-results-still-screwy/ I also recommend checking out this G+ conversation between John Mueller and others: https://plus.google.com/u/0/+TerrySimmonds/posts/1BZ6guvy9mE John's initial thought was that nothing has changed, but something has definitely changed. Do some of your own searches and see what you come up with. The main takeaway is that if you are trying to approximate clients' rankings in cities not your own, the results you are seeing may be very weird right now. Not sure if this is a temporary glitch or the forerunner to some coming change; this is a story to stay on top of, for sure. What do you all see?
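For approximating searches from another city when the location setting misbehaves, many local SEOs fall back to the uule query parameter. Its format is community reverse-engineered rather than documented by Google and could change or stop working at any time, but the commonly cited construction looks like this:

```python
from urllib.parse import quote

# One character from this alphabet encodes the length of the location
# string; this covers canonical names shorter than 64 characters.
UULE_KEY = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_"

def build_uule(canonical_location: str) -> str:
    """Build the (unofficial, reverse-engineered) uule value for a
    canonical location name such as 'Vancouver,British Columbia,Canada'."""
    if len(canonical_location) >= len(UULE_KEY):
        raise ValueError("location name too long for this simple scheme")
    return "w+CAIQICI" + UULE_KEY[len(canonical_location)] + quote(canonical_location)
```

The resulting value is appended to a search URL as &uule=..., using one of Google's canonical location names. Treat anything it shows you as an approximation, for exactly the reasons discussed in this thread.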
Local SEO | Moz.HelpTeam0
-
Content Rewriting and Page ranking
Let's say a prior writer did a horrible job with more than a few pages on your site, and you want to rewrite the content for each landing page. A few of these landing pages are actually ranking pretty decently. Would it be OK to rewrite them, as long as you kept the keywords and the keyword density somewhat equal?
Local SEO | Spartan222
-
Is it necessary to implement hreflang for translated content on different ccTLDs?
Hello there, new Moz member here. I hope some of the international SEO Mozzers can share their opinion on a question I have. I've been reading a lot about hreflang, and I understand its importance for subdomains and subfolders, not only for targeting the same language in different countries (.com, .co.uk, .ca, etc.) but also for websites partially or fully translated into other languages. However, I've only ever seen examples using hreflang with subdomains or folders, e.g. ru.example.com or example.com/ru. What if my translated websites are on different ccTLDs, i.e. example.com, example.ru, example.br, example.fr? Do I still need to implement hreflang, or is it not necessary in this case?
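hreflang works across ccTLDs just as it does across subfolders or subdomains; what matters is that every version lists all the others (including itself) reciprocally. A sketch generating the tag set for hypothetical domains like those in the question (the URL-to-language mapping here is purely illustrative):

```python
# Hypothetical ccTLD -> hreflang mapping; substitute your real URLs.
ALTERNATES = {
    "https://www.example.com/": "en",
    "https://www.example.ru/": "ru",
    "https://www.example.br/": "pt-br",
    "https://www.example.fr/": "fr",
}

def hreflang_tags(default_url: str) -> list[str]:
    """Emit the full reciprocal tag set; the same set goes in the
    <head> of every language version, plus an x-default fallback."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for url, lang in ALTERNATES.items()
    ]
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{default_url}" />')
    return tags
```

The same annotations can alternatively be delivered via an XML sitemap, which is often easier to keep in sync when the versions live on separate domains managed by different teams.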
Local SEO | selectitaly0
-
Franchise Space: How to handle Duplicate Content?
We have a client, http://www.certapro.com/, with 330+ individual franchises. The individual franchisees all share the same content; if you perform a series of searches by zip code, you'll see the different regions all share the same copy blocks. How would you handle this situation? New content for all 330+? Canonicalize them to a single source? Keep in mind we need to scale and would have to work with local partners who may not be web savvy. We're also thinking about iframing the same content as an alternative.
Local SEO | Aviatech0