Is it possible to Spoof Analytics to give false Unique Visitor Data for Site A to Site B
-
Hi,
We are working as a middleman between our client (website A) and another website (website B), where website B is going to host a section around website A's products.
The deal is that website A (our client) will pay website B based on the number of unique visitors website B sends them.
As the middleman, we are in charge of monitoring the number of unique visitors sent through, and we plan to do this by checking the unique visitor count in website A's analytics account.
The deal is worth quite a lot of money, and as the middleman we are responsible for making sure that no funny business goes on (i.e. false visitors). So, to make sure we have things covered, what I would like to know is:
1. Is it actually possible to fool analytics into reporting falsely high unique visitors sent from website B to website A? (And if so, how could they do it?)
2. What could we do to spot any potential abuse? (i.e. is there an easy way to spot that these are spoofed visitors?)
Many thanks in advance
-
You might be better with a server side tracker like http://awstats.sourceforge.net/
The answer from Mat probably has the best logic, but the only problem is: are you legally responsible for mitigating the possibility of fraud?
I would make sure you add this to the contract, as I am not sure you are going to be able to defeat a proxy or spoofer if the referrer gets smart and decides to work the system.
An anti-fraud system can be put into place, but LOL, I am not sure you will have access to the multi-million-dollar fraud monitoring tools that Google does, which are constantly updated, monitor both algorithmically and systematically, and are backed by auditors who manually do random checks...
-
Hi - well, we are really just acting on behalf of the client - that's what they want.
Also, it's only visitors from that specific website (a very close niche) - not just any site.
-
Google Analytics doesn't report IP addresses, though - which is another reason to take a different route. Not knocking GA, I love it; it just isn't the right tool for this.
I suspect that the Fiverr gigs use ping or something similar to create the mass of "unique visits". Very easy to spot. Even without fairly sophisticated tools to hand, I'd imagine that any method that can deliver 5,000 visits for $5 is going to be pretty easy to spot.
Might try it now though. I love Fiverr for testing stuff.
-
If you must use Analytics, I would drill down to the source of referral within Analytics. This will give you the URL, page, or whatever. I think you can also drill down to the referring IP, etc.
You need to log where they come from. Export your results every month and look for patterns.
If you get 500 referrals from website B's IP or URL, then it's a sure way of knowing they are throwing people at you.
But Mat's answer is best: it will give you times, not just dates, and will also give you more detailed info.
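A rough sketch of the monthly-export check described above. The CSV column names (`ip`, `referrer`) and the threshold are illustrative assumptions; adjust them to match whatever your actual export produces:

```python
import csv
import io
from collections import Counter

def suspicious_ips(csv_text, referrer_domain, threshold=100):
    """Count referred visits per IP and flag any IP at or over the threshold.

    Assumes a hypothetical export with 'ip' and 'referrer' columns.
    """
    counts = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        if referrer_domain in row["referrer"]:
            counts[row["ip"]] += 1
    return {ip: n for ip, n in counts.items() if n >= threshold}

# Tiny example: two visits from one IP, threshold of 2 flags that IP.
export = (
    "ip,referrer\n"
    "1.2.3.4,http://websiteb.example/page\n"
    "1.2.3.4,http://websiteb.example/other\n"
    "5.6.7.8,http://websiteb.example/page\n"
)
print(suspicious_ips(export, "websiteb.example", threshold=2))  # {'1.2.3.4': 2}
```

In practice you would run this against a month of exported rows and eyeball anything the threshold catches before accusing anyone of anything.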
-
My question is: is unique visitors the right metric that you should be measuring? On Fiverr.com I can get 2000 to 10,000 unique visitors for $5. http://fiverr.com/gigs/search?query=unique+visitors&x=0&y=0
Can you tie your metrics to something else that might have more value for you, such as purchases, newsletter signups (still easy to fake, but at least takes a little more time), etc?
-
Google Analytics isn't designed to pull data in the way you really want for something like this. It can be done, I suppose, but it'd be hard work.
There are only so many metrics you can measure, and all are pretty easy to fake. However, having the data in an easy-to-access form means that you can spot patterns and behaviour, which are much harder to fake.
A starting point would probably be to measure the distribution of the various metrics in the referred traffic vs. the general trend. If one particular class C block (or user agent, or resolution, or operating system, or whatever) appeared at a different frequency in the paid traffic, that would be a good place to look deeper.
Thinking less technically for a moment, though, I bet you could just implement one of the many anti-click-fraud systems to do most of this for you. Same idea, but someone else has already done the coding. Googling for click fraud brings up a stack of ads (tempting to click them loads and set off their alarms!).
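The distribution comparison above can be sketched very simply: count how often each value (user agent, class C block, resolution, whatever) appears in the referred traffic versus your general traffic, and flag values that are over-represented. The smoothing rule for unseen values is my own assumption, not a standard:

```python
from collections import Counter

def distribution_skew(baseline, referred):
    """Return {value: ratio} where ratio > 1 means the value shows up
    more often in referred traffic than in the general trend."""
    base = Counter(baseline)
    ref = Counter(referred)
    n_base = sum(base.values())
    n_ref = sum(ref.values())
    skew = {}
    for value, count in ref.items():
        ref_share = count / n_ref
        # Smooth values never seen in the baseline so we don't divide by zero.
        base_share = max(base.get(value, 0), 1) / n_base
        skew[value] = ref_share / base_share
    return skew

# A user agent that is 10% of general traffic but 90% of the paid
# traffic gets a skew of ~9 and deserves a closer look.
general = ["Chrome"] * 90 + ["Firefox"] * 10
paid = ["Chrome"] * 10 + ["Firefox"] * 90
print(distribution_skew(general, paid))
```

Real fraud detection does far more than this, but even a crude ratio like this surfaces the "5,000 visitors, one user agent" gigs immediately.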
-
Hi Mat,
A very informative answer.
If someone is going to try and spoof analytics, would they not equally be able to fool the script?
If someone were to try this, do you know how they would likely do it? Essentially, if I know what is likely to be tried, then I can work out something to counteract it. Are there certain things that can't be fooled, or are very difficult to fool - e.g. browser resolution, location, etc. - or are these just as easy to spoof as anything else?
Many thanks
-
It isn't hard to fake this at all, I'm afraid. Spotting it will depend on how sophisticated the person doing it is.
My personal preference would be not to use Analytics as the means of counting. With Analytics you are going to be slightly limited in the metrics you have available, and you will always be "correcting" data and looking for problems rather than measuring more accurately in the first place and having problems surface on their own.
I'd have a script on the page that checks for a referrer and, if it matches the pattern for website B, creates a log record.
You then have the ability to set your own rules. For instance, if you get two referrals from the same IP a second apart, would you count them? What about 10 per hour, 24 hours a day? You can also log the exact timestamp with whatever variables you want to collect, so each click from the referring site might be recorded as:
- Time stamp
- Exact referring URL
- User agent
- IP
- Last visit (based on cookie)
- Total visits (based on cookie)
- # pages viewed (updating the cookie on subsequent page views)
- and so on
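The log-and-count idea above might look something like this. The field names and the "same IP within one second doesn't count" rule are illustrative assumptions; you would set whatever rules you and the client agree on:

```python
import time

class ReferralLog:
    """Record every referred hit, but only *count* hits that pass a rule."""

    def __init__(self, min_gap_seconds=1.0):
        self.records = []   # every hit kept verbatim, for later auditing
        self.last_seen = {} # ip -> timestamp of the last counted hit
        self.min_gap = min_gap_seconds

    def record(self, ip, referrer, user_agent, ts=None):
        ts = time.time() if ts is None else ts
        self.records.append(
            {"ts": ts, "ip": ip, "referrer": referrer, "ua": user_agent}
        )
        # Counting rule: a repeat hit from the same IP inside the
        # minimum gap is logged but not counted as a unique visit.
        last = self.last_seen.get(ip)
        counted = last is None or ts - last >= self.min_gap
        if counted:
            self.last_seen[ip] = ts
        return counted

log = ReferralLog(min_gap_seconds=1.0)
print(log.record("1.2.3.4", "http://websiteb.example/", "UA", ts=100.0))  # True
print(log.record("1.2.3.4", "http://websiteb.example/", "UA", ts=100.5))  # False
print(log.record("1.2.3.4", "http://websiteb.example/", "UA", ts=102.0))  # True
```

Because every hit is logged even when it isn't counted, you keep the raw evidence you'd need if the numbers were ever disputed.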
Analytics doesn't give you access to the data in quite the same way. I'd definitely want to be logging it myself if the money involved is reasonable.