What effect does previous page visits have in SERP?
-
We've all seen it: right before a result, you see "You visited this page on ____."
What effect does a single visit have? Multiple visits?
-
This is a really interesting question about user engagement metrics, and we don't have a clear answer, but the engines have hinted that they track this sort of thing through toolbars, logged-in searches, and other methods.
Bill Slawski recently wrote a post on a Google patent that would adjust rankings based on exactly this type of behavior. Quoting directly from his article, the patent describes user signals such as:
- The percentage of searches in which the user selected the first result (or one of the top results) in the list of search results
- The average first click position (i.e., the numerical position within the list of results)
- The percentage of searches that had long clicks (i.e., the percentage of times that a user selects a link to go to a result page and stays on that page for a long time, such as more than 3 minutes)
- The percentage of searches that did not have another search within a short period of time
- The percentage of searches that did not have a reformulated search (i.e., a search where one or more search terms in the original search are added, deleted, or changed) within a short period of time
- A combination of different metrics, and/or the like
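As a rough illustration only (not anything the engines have published as an implementation), the aggregate nature of these signals can be sketched in a few lines of Python; the click-log field names here are hypothetical, and the 3-minute "long click" threshold comes from the patent description quoted above:

```python
# Illustrative sketch: aggregating per-result engagement signals
# over a hypothetical click log. Not a real ranking implementation.
from collections import defaultdict

LONG_CLICK_SECONDS = 180  # "more than 3 minutes", per the patent description

def aggregate_signals(click_log):
    """click_log: list of dicts with 'result_url', 'position',
    and 'dwell_seconds' (all field names assumed for illustration)."""
    stats = defaultdict(lambda: {"clicks": 0, "long_clicks": 0, "positions": []})
    for event in click_log:
        s = stats[event["result_url"]]
        s["clicks"] += 1
        s["positions"].append(event["position"])
        if event["dwell_seconds"] > LONG_CLICK_SECONDS:
            s["long_clicks"] += 1
    # Only aggregates across many users are meaningful; a single event
    # barely moves these ratios.
    return {
        url: {
            "long_click_rate": s["long_clicks"] / s["clicks"],
            "avg_click_position": sum(s["positions"]) / len(s["positions"]),
        }
        for url, s in stats.items()
    }
```

The point the sketch makes is the one in the answer below: any individual event is just one row in the log, and only the ratios computed over a large population of users become a usable signal.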
So a single click, in the ocean of web results, will probably never make a large difference; nor will multiple visits by a single user.
But if a large number of users click on a result and then hit the back button to choose another search result, that is almost certainly going to have an impact on rankings. Or, as another example, if a large number of searchers consistently choose the URL in the 3rd position of a given SERP and stay on that site, that domain has a good chance of rising.
This past April, Google announced "we are beginning to incorporate data about the sites that users block into our algorithms." Again, a single person blocking a site from their search results probably isn't going to have an impact, but a large number of such actions probably will.
Engagement metrics are becoming increasingly important, but all indications are that they must be taken in aggregate.
Related Questions
-
Need only tens of pages to be indexed out of hundreds: Robots.txt is Okay for Google to proceed with?
Hi all, We have 2 subdomains with hundreds of pages, of which only 50 important pages need to be indexed. Unfortunately, the CMS of these subdomains is very old and does not support deploying a "noindex" tag at the page level. So we are planning to block the entire sites in robots.txt and allow only the 50 pages we need. But we are not sure if this is the right approach, as Google has been suggesting relying mostly on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file. Thanks
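A minimal sketch of the approach being described, with hypothetical paths standing in for the 50 important pages (for Google, the more specific matching rule wins, so the Allow entries override the blanket Disallow):

```
# robots.txt -- paths are hypothetical, for illustration only
User-agent: *
Disallow: /
Allow: /important-page-1/
Allow: /important-page-2/
# ...one Allow line per page that should remain crawlable
```

One caveat worth keeping in mind: robots.txt controls crawling, not indexing, so a blocked URL can still appear in the index if other sites link to it; that is exactly why Google recommends "noindex" where the CMS supports it.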
Algorithm Updates | vtmoz
Google: What factors contribute to rank a landing page in a specific country?
Hi community, I would like to know what the criteria are to rank a landing page in a specific country's Google search. For example, if we want to rank our landing page for "GDPR Australia", what factors will have an impact besides writing content related to the above-mentioned keyword? Thanks
Algorithm Updates | vtmoz
Do pages with canonicals need meta data?
Page A has a canonical to Page B. Should Page A have meta data values such as description, keywords, dublin core values, etc.? If yes, should the meta data values be different on Page A and Page B?
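As a sketch of the situation the question describes (URLs are hypothetical), Page A's head might look like this while canonicalizing to Page B:

```html
<!-- Page A: hypothetical head, canonicalizing to Page B -->
<head>
  <title>Page A</title>
  <link rel="canonical" href="https://example.com/page-b/">
  <!-- Meta data can still describe Page A itself; search engines
       generally consolidate ranking signals onto the canonical target. -->
  <meta name="description" content="Description of Page A's own content.">
</head>
```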
Algorithm Updates | Shirley.Fenlason
Google SERPs showing blog comments in Answer Box?
I was recently researching Schema markup for local businesses and I was presented with an Answer Box that used blog comments as answers (at least that's what they appeared to be). This is what it currently says when I search for "schema markup hours" (screenshot also attached): 12 thoughts on "How to Use Schema Markup for Local SEO" Lauren says: March 11, 2013 at 2:22 pm. ... souleye says: March 11, 2013 at 3:29 pm. ... Daniel Bennett says: March 11, 2013 at 8:51 pm. ... sammy. says: ... Nathan says: March 11, 2013 at 11:53 pm. ... Rishav says: March 12, 2013 at 5:51 am. ... Paul Sherland says: ... keyword removed says: Right now it shows the time and date of the comment, but is this something new, or has it been around? Thanks in advance!
Algorithm Updates | TomBinga1125
Can you be both penalised and uplifted in SERPS?
Hello everyone, We've literally had dozens of high-ranking, location-specific pages wiped out of the Google.co.uk SERPs. I can't imagine it is anything other than a manual penalty, but no message has been sent by Google. For example, "campervan hire surrey" would produce our Surrey page at the top of the SERP; now this page has completely disappeared. On the other hand, we have been promoted on national keywords like "vw campervan hire" and "campervan hire", where we are second and third. Does anyone agree that this is a penalty?
Algorithm Updates | swimwithfishes
Google indexing my website's Search Results pages. Should I block this?
After running the SEOmoz crawl test, I have a spreadsheet of 11,000 URLs, of which 6,381 are search results pages from our website that have been indexed. I know I've read that /search should be blocked from the engines, but I can't seem to find that information at this point. Does anyone have facts behind why they should be blocked? Or not blocked?
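The conventional pattern for keeping internal search results out of the crawl is a robots.txt Disallow; the /search path here is assumed from the question, so adjust it to the site's actual search URL structure:

```
User-agent: *
Disallow: /search
```

The usual rationale is that internal search result pages are near-duplicate, low-value pages that can consume crawl budget better spent on real content.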
Algorithm Updates | Jenny1
How Do I Make My Google SERP "SiteLinks" more relevant?
I have a shopping website with thousands of products, and the sitelinks that Google has chosen for me (for a long time) are random product pages, which makes no sense to me. I do not emphasize those products on the home page, and I have a sitemap that clearly lists the directory of all the categories. I also added a "nofollow" attribute to almost every unimportant link on the home page. The products in the sitelinks seem completely random, and there isn't even a sitelink for "about" or any of the footer content! What gives? Also, my sitelinks never updated to the new, better version. Any suggestions?
Algorithm Updates | cDNAInteractive
What do you think of Google SERP encryption?
Really interesting post by Search Engine Land about this "issue" for tracking conversion, especially for long tail keyword research. I suppose this change will be also applied on all google search pages (.ca, .fr etc.). I Really don't think Webmaster tools is a serious compensation in Analytics for this.
Algorithm Updates | Olivier_Lambert