Different URLs for signed in and signed out users
-
Hello,
I have a client who plans to use different URLs for signed-in and signed-out customers.
My concern is that signed-in and signed-out customers will create backlinks to different URLs for the same page and thus split PageRank. I'm assuming the URL for signed-in customers won't be fetched by Google, which would rule out canonicalizing the signed-in URL to the signed-out version.
My solution would be to ensure there is only one URL for each content page, and to instead use cookies to prompt visitors who aren't already customers to sign up for the service.
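The single-URL approach described above could be sketched as follows: keep every page at one address and use a cookie to decide whether to show the sign-up prompt, instead of redirecting signed-in users to different URLs. This is a minimal sketch, and the `session_token` cookie name is a hypothetical placeholder, not something from the original post:

```typescript
// Minimal sketch: one canonical URL per page; a cookie (hypothetical
// name "session_token") decides whether to show a sign-up prompt.
function hasCookie(cookieHeader: string, name: string): boolean {
  return cookieHeader
    .split(";")
    .map((part) => part.trim().split("=")[0])
    .includes(name);
}

function shouldShowSignupPrompt(cookieHeader: string): boolean {
  // Signed-in customers carry the session cookie; everyone else
  // sees the prompt. The URL itself never changes.
  return !hasCookie(cookieHeader, "session_token");
}
```

Because Googlebot never carries the session cookie, it always sees the signed-out version of the page at the one shared URL.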
However, please correct me if I’m wrong in my assumptions.
Thanks
-
How do signed-in and signed-out users get different URLs for the same page? I've never seen this practice in any CMS I've used. The client shouldn't be deciding these things: you, as the developer or the SEO, should provide the right approach and explain why. At the very least you're dealing with a duplicate-content risk, and in that case you can use a canonical tag to specify the primary page.
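For reference, the canonical tag mentioned here is a `<link rel="canonical">` element in the page's `<head>`. A minimal sketch of a helper that emits it (the URL is a placeholder example, not from the thread):

```typescript
// Minimal sketch: render a rel="canonical" link tag that points every
// URL variant of a page at the one primary (signed-out) version.
function canonicalTag(primaryUrl: string): string {
  return `<link rel="canonical" href="${primaryUrl}" />`;
}

// Every variant of the page would include the same tag, pointing at
// the signed-out URL (placeholder domain):
const tag = canonicalTag("https://www.example.com/products/widget");
```

Note this only helps if Google can actually fetch both variants; a URL that exists only behind a login will never be crawled, which is the questioner's point.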
But I wouldn't simply do what the client demands. A lot of the time clients have no clue about this stuff and ask their vendors to shoot them in the foot; good vendors counter with proper reasoning and explanation. If you're the SEO in charge here, know that the decisions they insist on will have major implications for the results you get, and for the treatment and longevity of the contract you get from these clients. That's just my two cents from a few years of doing this as a freelancer. Now that I run my own team and agency, "the client says so" no longer means anything to me when it comes to technical decisions; I ask them about their end goal and find the right way of achieving it.
-
JM123, you can be assured that Google and its bots don't create accounts, so they will never be able to sign in and sign out.
That said, you could technically deliver unique content to all signed-in users, because Google will never sign in and see that content.
What's important is the content that you give to users who are NOT signed in. Concentrate on making that content unique so that it doesn't duplicate other pages (and create a duplicate-content issue).