P.O. Box vs. Actual Address
-
We have a website (http://www.delivertech.ca) that lists a P.O. Box number rather than an actual street address as its "location". Does this affect SEO? Is it better to use an actual address? Thanks.
-
Hi Anton,
If yours is a local business serving local customers in person, then yes, a P.O. box will severely limit your local search marketing. Because P.O. boxes are not considered legitimate physical addresses, you'll be unable to build a full set of citations for the business, and that will likely hamper your local search ranking efforts.
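To illustrate the citation point: most local listings and structured data expect a real street address. A minimal schema.org LocalBusiness sketch might look like the following (the business name, address, and phone number here are placeholders, not Anton's actual details):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Courier Co.",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Anytown",
    "addressRegion": "SK",
    "postalCode": "S0A 0A0",
    "addressCountry": "CA"
  },
  "telephone": "+1-306-555-0100"
}
```

With only a P.O. box, there's nothing legitimate to put in `streetAddress`, which is part of why citations become hard to build.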
Definitely better to use an actual address! Hope this helps.
-
Hi Anton,
You really only need to worry about having an actual address if your customers come to you, e.g. retail stores, restaurants, mechanics, etc.
I can't access the site you referenced, but it looks like it's probably a courier service or something along those lines. In that case, you're serving customers at their location, so they don't really need to know your actual address, just which areas you service.
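For a service-area business like that, one common approach is to mark up the areas served rather than a street address. A hedged sketch using schema.org's `areaServed` property (the city names and business details are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Courier Co.",
  "areaServed": [
    { "@type": "City", "name": "Anytown" },
    { "@type": "City", "name": "Otherville" }
  ],
  "telephone": "+1-306-555-0100"
}
```

This communicates coverage to search engines without requiring a physical storefront address.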
So no, you're unlikely to see any effect on your SEO from using a P.O. Box instead of an actual address.