Site Wide Link Situation
-
Hi-
We have clients who are using an e-commerce cart that sits on a separate domain, and it appears to be creating sitewide links to our clients' websites. Would you recommend disallowing bots from crawling/indexing these via a robots.txt file, adding a nofollow meta tag on the specific pages where the shopping cart links are implemented, or adding nofollow to every individual shopping cart link? Thanks!
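For reference, the two nofollow variants mentioned above would look roughly like this (the page markup and cart domain are placeholders, not the client's actual setup):

```html
<!-- Option A: page-level meta tag - tells bots not to follow ANY link on this page -->
<meta name="robots" content="nofollow">

<!-- Option B: link-level attribute - applies only to the cart link itself -->
<a href="https://cart.example-cart.com/checkout" rel="nofollow">View cart</a>
```

Option A is a blunt instrument (it affects every outbound and internal link on the page), while Option B is scoped to the single third-party link.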
-
Hi! Thanks! I completely understand. We would never want to prevent URLs on the client's own domain from being crawled; that could clearly put the client's online presence at risk. However, we're more concerned with Google noticing that the shopping cart's domain is pointing to every page of the client's website, which could appear unnatural and potentially put the client's site at risk. What we're hoping to achieve is to keep Google from crawling the third-party URL that appears on every page, to avoid any penalization.
-
Rez, you've got to consider a few things.
When looking at the site structure and IA (information architecture) of your site, you have to think about link juice flow as a funnel: more juice at the top, with less distributed to the bottom. So for shopping cart pages or product pages (depending on how deep they are), I usually target long-tail, specific keywords (e.g. "Mimi Juie baby sippy cups") where the search volume isn't much, but it's targeted enough that even with a limited juice flow you can rank.
My initial suggestion to you was to contact the person or company that built the shopping cart and have the link removed (THAT IS MY FIRST OPTION). I would not apply a nofollow to the product page (don't do anything crazy like that), especially if you have share bar options, reviews, etc. on your products (you would lose all that).
Your LAST OPTION should be a robots.txt rule scoped to ONLY that link's URL, NOT the whole page.
Again, please understand that you should not devalue your page like that.
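As a rough sketch of that last option: a robots.txt rule can be scoped to a single path rather than a whole site section. One caveat worth noting: robots.txt only controls crawling of URLs on the domain it is served from, so the rule would have to live on whichever domain actually serves the URL being blocked. The path below is a made-up placeholder:

```text
User-agent: *
Disallow: /cart/sitewide-cart-link.html
```

Everything else on the domain stays crawlable; only the one listed path is excluded.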
Hope this helps. Let me know how it turns out.
Hampig M
BizDetox
-
Hi-
Thanks for the feedback! So robots.txt is the best way?
The shopping cart's URL does not have much authority, so getting link juice from the separate domain isn't important to us, which is why we're debating how to implement a nofollow. Do you see any harm in doing so?
Thanks,
Rez
-
Rez.
You should be able to remove that sitewide link from your shopping cart. I had a similar situation with a Joomla site I built that had a sitewide link on the JoomShopping product pages, and you can pay to have it removed. Unfortunately, that's the way it is. Take a look at the help files or forums for the shopping cart software. Which shopping cart is it?
If you cannot remove it, then robots.txt is the best way. I would NOT apply a nofollow to that page, unless you don't care about the data or about getting those pages ranked. But you're saying it's sitewide, so I'm a little confused on that point.
Hope it helps.
Best Wishes,
Hampig M
BizDetox