Changing the city of operation: what is the best way to inform Google?
-
We have a business operating out of three cities, A, B and C, with A being the primary address; the business provides its services in B and C as well. The business has decided to shut up shop in C and add D as another city. Currently the URLs are like:
www.domainname.com/A/products
www.domainname.com/B/products
www.domainname.com/C/products
Please help us understand the best way to inform Google that city C is non-operational now. Do we need to do redirects, and if yes, should we redirect to the home page? Or can we just remove the C city URLs from Webmaster Tools and inform Google that way?
-
Hi Sukhbir,
Currently, the best way I know of to report that a location has closed is to go to this page:
https://support.google.com/places/
Click the red 'Contact Us' button.
Go through the wizard, choosing the 'my listing has incorrect information' and then the 'this business no longer exists' options.
Link to the URL of the Google+ Local page for the closed location, and in the additional notes section, explain that this branch of the business has closed, though the others remain open.
My understanding is that this does not completely delete the location from Google's system - there is currently no way to do so - but it will prevent the listing from appearing for your service-related terms. It may still appear for people searching specifically for that location, but it will carry a label stating that the business is closed. Not a perfect solution, but the best I know of that Google currently offers.
Beyond this, I would recommend that you manually remove as many third-party citations of the business as you can from directories such as YP.com, Yelp, CitySearch, etc., so that you get rid of as much data as possible that supports the existence of the closed location.
Your website should be edited to remove absolutely all references to the closed location. I'm not sure about redirecting the pages; my main goal would simply be to get rid of any references to the closed location.
Hope this helps!
-
Hi Sukhbir,
What you could do is 301 redirect the C city URLs to the closest remaining location, so at least your users will know that location C is closed. You could also add the location C URLs to your robots.txt so Google won't crawl them anymore, though bear in mind that robots.txt only blocks crawling - it doesn't guarantee the pages will drop out of the index.
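For the redirect part, a minimal sketch, assuming an Apache server and the /C/… URL structure from the question (/B/ here is just a stand-in for whichever remaining location is closest):

```apache
# Hypothetical .htaccess rules for the retired city section.
# Permanently (301) redirect every /C/ page to its /B/ counterpart:
#   /C/products        -> /B/products
#   /C/products/item-1 -> /B/products/item-1
RewriteEngine On
RewriteRule ^C/(.*)$ /B/$1 [R=301,L]
```

One caveat: if you also block /C/ in robots.txt, Googlebot can never recrawl those URLs and so never sees the 301, so pick one signal or the other - the redirect is usually preferable, since it helps users and passes the old URLs' equity to the new location pages.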
Related Questions
-
Bigcommerce only allows us to have https for our store only, not the other pages on our site, so we have a mix of https and http, how is this hurting us and what's the best way to fix?
So we aren't interested in paying a thousand dollars a month just to have https when we feel it's the only selling point of that package, so we have https for our store while the rest of the site, blogs and all, is http. I'm wondering if this would count as duplicate content or give us some other unforeseen penalty due to the halfway approach to implementing https. If this is hurting us, what would you recommend as a solution?
Technical SEO | | Deacyde0 -
If content is at the bottom of the page but the code is at the top, does Google know that the content is at the bottom?
I'm working on creating content for top category pages for an ecommerce site. I can put them under the left hand navigation bar, and that content would be near the top in the code. I can also put the content at the bottom center, where it would look nicer but be at the bottom of the code. What's the better approach? Thanks for reading!
Technical SEO | | DA20130 -
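One way to have it both ways, as a purely illustrative sketch (the class names are my own): keep the copy early in the HTML source while displaying it below the product grid with CSS, for example via the flexbox `order` property:

```html
<!-- The category copy comes first in the source, but the flexbox
     `order` property displays it below the product grid. -->
<div style="display: flex; flex-direction: column;">
  <div class="category-copy" style="order: 2;">
    Category description and keyword-rich copy for the top category page.
  </div>
  <div class="product-grid" style="order: 1;">
    <!-- product listings -->
  </div>
</div>
```

Google does render CSS, so treat source order as one modest signal rather than a guaranteed win.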
Best way to get SEO-friendly URLs on a huge old website
Hi folks, hope someone may be able to help with this conundrum. A client site runs on old tech (IIS6) and has circa 300,000 pages indexed in Google. Most pages are dynamic with a horrible URL structure such as http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888, and I have been trying to implement rewrites + redirects to get clean URLs and remove some of the duplication that exists, using the IIRF ISAPI filter: http://iirf.codeplex.com/ I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl, and implementing all URLs would take 10x the volume of config. I am starting to wonder if there is a better way: 1) upgrade to Win 2008 / IIS 7 and use the better URL rewrite functionality included; 2) rebuild the site entirely (preferably on PHP with a decent URL structure); 3) accept that the URLs can't be made friendly on a site this size and focus on other aspects; 4) persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live. None of the options are great, as they either involve lots of work/cost or they involve keeping a site which performs well but could do so much better, with poor URLs. Any thoughts from the great minds in the SEOmoz community appreciated! Cheers, Simon
Technical SEO | | SCL-SEO1 -
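Whatever rewrite engine ends up doing the work, the mapping itself is simple; here is a minimal sketch in Python of the kind of translation involved, where the `/products/...` scheme is a made-up placeholder built from the example URL in the question (a real site would look the IDs up and use keyword slugs instead):

```python
from urllib.parse import urlparse, parse_qs

def clean_url(dynamic_url):
    """Map a dynamic search URL to a clean path.

    Hypothetical scheme: /products/<ida>/<idb>/<idc> -- in practice you
    would resolve the IDs to product and category names.
    """
    query = parse_qs(urlparse(dynamic_url).query)
    ida = query["ida"][0]
    idb = query["idb"][0]
    idc = query["idc"][0]
    return f"/products/{ida}/{idb}/{idc}"

print(clean_url("http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888"))
# -> /products/19191/56/2888
```

Expressing the rule once as a pattern like this (rather than one config entry per URL) is what keeps the rewrite table from growing with the page count.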
Change of domain name?
Hello, we are currently developing a new site for an existing online clothing retailer. The existing site is on a .co.uk domain; however, we are targeting a global market and wondered whether we could/should launch the new site under a .com address, and whether this would be beneficial. Most of our backlinks come from affiliate blogs and we could quite easily change these to the new URL. Thanks, Bilal
Technical SEO | | PLP1 -
The best way to organize a gallery for SEO?
I need to redo the following gallery http://goo.gl/PFvjE because, besides the fact that it looks ugly, it's an SEO mess. Since all the pages are comprised of images, and the only text is the navigation, I'm getting duplicate content issues. I tried adding a little paragraph of text on some of the pages, but this thing needs a total revamp. My main question is this: is that menu being repeated on all the pages really a good thing? What good is it, say, on the fire patches page, to have a menu that includes all these keywords for sports patches? Would it be better to just have a main gallery page that lists the main patch types (applique, motorcycle, Scouting, etc.), and then once you get to that page, list all the different subcategories?
Technical SEO | | UnderRugSwept0 -
Google & Separators
This is not a question but something to share. If you click on all of these links and compare the results, you will see why _ is not a good thing to have in your URLs:
http://www.google.com/search?q=blue
http://www.google.com/search?q=b.l.u.e
http://www.google.com/search?q=b-l-u-e
http://www.google.com/search?q=b_l_u_e
http://www.google.com/search?q=b%20l%20u%20e
If you have any other examples of working separators, please comment.
Technical SEO | | Dan-Petrovic3 -
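The takeaway above fits in a few lines: when generating URL slugs, use hyphens as the separator, since Google treats them as word boundaries while underscores join the words together. A minimal sketch (the function name is my own):

```python
import re

def slugify(text):
    """Build a URL slug using hyphens, which Google treats as word
    separators (unlike underscores, which glue words together)."""
    text = text.lower()
    text = re.sub(r"[^a-z0-9]+", "-", text)  # any non-alphanumeric run -> hyphen
    return text.strip("-")

print(slugify("Blue Widgets & Gadgets"))  # -> blue-widgets-gadgets
print(slugify("b_l_u_e"))                 # -> b-l-u-e
```

Note that the rule also converts existing underscores to hyphens, which matches the comparison in the search links above.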
Best way to create page titles for a product catalog
Hi guys, I'm having problems with duplicated page titles, and I want to know the best way to avoid this problem. The example is like this: Title page (A): Product name A - category - section; Title page (B): Product name B - category - section. How do you think I can resolve this problem? Thank you so much for your help.
Technical SEO | | NorbertoMM0
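The pattern in the question can be sketched like this (the field order and " - " separator are simply the convention from the example titles): as long as each title leads with the unique product name, two products sharing a category and section still get distinct titles.

```python
def page_title(product, category, section):
    """Compose a page title that stays unique per product, because it
    leads with the product name (order/separator are illustrative)."""
    return f"{product} - {category} - {section}"

titles = [
    page_title("Product name A", "Category", "Section"),
    page_title("Product name B", "Category", "Section"),
]
# No duplicates: every title differs in its leading product name.
assert len(set(titles)) == len(titles)
```

If two products can share the exact same name, you would need a further disambiguator (SKU, colour, size) appended to the title.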