Moderated by Jeff Rohrs from ExactTarget. He welcomes people back from lunch and introduces the panel. Having covered this panel in the past, I'm hoping for a few fireworks between Shuman from Google and Tom Cuthbert from Click Forensics, as happened last year. We shall see…
First panelist is Jon Myers, Head of Search at MediaVest. He explains that everyone will be tight on time and the panel will be fast-paced. Not exactly great news for us live bloggers. Jon reminds us that it has actually been three years since the Lane's Gifts and Collectibles AdSense click fraud (CF) case. He claims that the refund was never really a great deal of cash for the plaintiffs.
He describes how people started making “MFA” (Made-for-AdSense) sites: fake blogs that scrape bits and pieces from real blogs and then host AdSense ads on them for income. In 2006, Eric Schmidt said “let it happen.” Shuman quickly corrected Eric on the Google blog and attempted to change the perceived message of his statement.
In 2007…botnet activity is on the rise, and click fraud is the new spam. A significant part of the click fraud traffic comes from botnets and MFAs. He shows a couple of illustrative slides and talks about how it can be framed as Google vs. Yahoo in their efforts to combat fraud, or “them” collectively versus us (the internet marketers). At this point, we all agree that the problem exists and that advertisers should not have to pay for CF.
However, the search engines for years told us that it is a negligible problem (paraphrased), and only more recently have they begun to admit it still exists. This is where the media comes in…is it “scaremongering?” He shows a slide of a Search Engine Journal article on the topic with a somewhat sensational title.
The global picture: was CF up 15% in 2007? This is information from Tom at Click Forensics. He shows a heat map on which China and eastern Europe are the hot spots for this kind of fraudulent activity, then a couple more percentage figures from Click Forensics. US online ad spend in 2007 was about $21.1 billion; with search at 40% of that, or $8.44 billion, we will see roughly $240K in CF occurring just during this session.
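For those checking the math at home, the $240K works out as a back-of-envelope calculation. A minimal sketch, assuming a click-fraud rate in the mid-teens and a roughly 90-minute session (those two inputs are my assumptions, not the slide's):

```python
# Rough reconstruction of the "~$240K of CF during this session" claim.
# The fraud rate and session length are assumed, not from the slide.
us_online_spend = 21.1e9      # 2007 US online ad spend, per the slide
search_share = 0.40           # search's share of that spend, per the slide
cf_rate = 0.16                # assumed click-fraud rate (mid-teens)
session_hours = 1.5           # assumed session length

search_spend = us_online_spend * search_share        # ~$8.44B
cf_per_hour = search_spend * cf_rate / (365 * 24)    # ~$154K per hour
print(f"CF during this session: ~${cf_per_hour * session_hours:,.0f}")
# -> roughly $231,000, in the ballpark of the $240K quoted on stage
```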
He talks about the Google detection tools available within the system, and then lists a bunch of companies involved in CF detection and prevention. To sum up…it is very important that you record the right data. Get the clicker's actual IP address, the time and date, the keyword, and the referring website, and then report all this to the search engine. You also need to record performance. What to watch for: second-tier URLs; repeat visits from IPs or domains; where visitors are coming from; click spikes; spend anomalies; conversion drop-off…etc. Maybe Jon will give us the rest of the list in the comments.
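If you want to capture exactly the fields Jon lists, even an append-only CSV will do. A minimal sketch (the field names and file layout are my own, not from the panel):

```python
# Append one record per paid click so there is hard data to hand the
# engines when filing a click-fraud report. Illustrative only.
import csv
from datetime import datetime, timezone

FIELDS = ["ip", "timestamp", "keyword", "referrer", "converted"]

def log_click(path, ip, keyword, referrer, converted=False):
    """Append one paid click to a CSV audit log."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # brand-new file: write the header row first
            writer.writeheader()
        writer.writerow({
            "ip": ip,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "keyword": keyword,
            "referrer": referrer,
            "converted": converted,
        })

log_click("clicks.csv", "203.0.113.7", "blue widgets", "http://example.com/landing")
```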
Is mobile search going to be the next CF area? Maybe. Free mobile phones for clicks? Google supplying Yahoo search…will this lead to less CF? (Sorry, I did not quite catch this point.)
Next up is Reggie Davis, who runs the Marketplace Quality team at Yahoo! He states that Jon raised some good points. In terms of the CF litigation, Google paid $90M and Yahoo only $5M… He will talk about value, mitigation, quality improvement, and business enhancement. They are clearly committed to driving value to advertisers. They spend a lot of time focused not just on CF, but also on how to drive better-quality clicks, which will in turn deliver better ROI for advertisers.
Network quality = ad quality + traffic quality. Ad quality is more in the users' realm, and traffic quality is on the publishers' side; he will focus on the publisher side. He states that there was great improvement in 2007, listing key events from each month from June to December. He highlights the problem in Korea, and how Yahoo spoke with the government to help modify legislation that had actually made things easier for fraudsters. The government complied.
In 2008, the commitment to quality continues. They have automated systems up front, as well as proactive review and reactive review. They feel that the amount of CF has been “grossly overstated” in the press. They illustrate this with an inverted pyramid: low-quality traffic on top, unwanted traffic in the middle, and CF at the bottom.
They admit that content match is where the majority of the CF occurs. They have seen a significant spike as a percentage of total discounted clicks since Q3 2007. He reminds us that this is on a percentage basis and that content-match traffic overall is much lower in volume than search traffic.
One problematic area is web-log entries that appear to be clicks but are not actually counted as clicks by Yahoo. They are very concerned about delivering good filtering systems for incoming traffic. They will launch a new Click Filter Report next month, and he shows a mockup of the interface, which looks pretty impressive.
Yahoo! has teams dedicated to helping advertisers with this problem. They urge advertisers to sign up for analytics tools that can help them manage it. They stand first in line to be certified against the forthcoming IAB guidelines. They also want to work with Click Forensics and partner with them; they signed a contract with them 6–8 weeks ago to leverage the data Click Forensics gets from advertisers in order to combat the problem. They welcome the opportunity to work with anybody who will help reach the common goal of driving value to advertisers. He stated at the onset that they took a different stance and will not be swinging a bat at Click Forensics. He lastly shows a couple of cartoons, available at the “Traffic Quality Center,” which light-heartedly poke fun at CF.
Next up is Tom Cuthbert from Click Forensics. His deck is titled “Under the Iceberg.” He thanks Reggie for not hitting them with a bat. He will not be throwing out any numbers this time, since we all now agree that this is a problem; instead, he wants to make us aware of some of the hidden issues behind it. CF is only the tip of the iceberg. Other major issues: botnet scams are exploding (a headline from today's USA Today); out-of-country clicks; click farms; DNS pharming; low-quality traffic from social networking sites; and low-quality traffic from parked domains and made-for-AdSense sites.
What does this mean? Like spam, CF continues to get worse. Traffic quality on the content networks is deteriorating, and advertisers are hesitant to spend there given the lack of controls. Diminishing trust is negatively affecting growth, and demand for higher-quality traffic is driving up traffic acquisition costs. What is ahead? Traffic management tools are improving. Advertisers are demanding higher-quality traffic. The “black box approach” will not work. Smart publishers are blocking low-quality traffic and taking steps to manage this on their own, which is good. Everyone has the same goal in mind: improving ROI.
This means that everyone is going to team up to fight CF. Click Forensics will provide additional feedback to Yahoo in a protected way (for those advertisers worried about Yahoo seeing their conversion data, etc.). What we have learned: the problem is getting worse, not better. Standards will help, but will not stop fraudulent clicks. No one company can solve the problem alone. We need to empower advertisers with better tools and controls, and we need to create accountability for publishers, ad networks, and search engines.
Next up is Shuman Ghosemajumder from Google. He will give an overview of Google's approach to fighting CF from an engineering perspective, plus some other information. There are two main incentives for attempting CF: attacking advertisers (clicking a competitor's ads to drain their budget) or inflating a publisher's own ad revenue. Numerous methods are used to do this, ranging in complexity from something as simple as manually clicking on ads to more sophisticated schemes using botnets and clickbots. He describes botnets as being like pyramid schemes.
The way they attack the problem of CF is by preventing the advertisers from having to pay for the clicks. The basic idea, he states, is “cast the net of invalid clicks wide enough that we have a high degree of confidence that we are catching the actual malicious clicks.” Of course, this produces a large number of false positives. This is good from the advertisers’ perspective in that they are getting some real clicks for free. It is also good for Google because it acts like a “sale on clicks.” This will drive higher ROI, and thus spur advertisers to actually spend more than before.
The proactive system consists of real-time filters plus offline analysis that looks at a longer time period of, “let's say,” sixty days. The reactive side is driven by actual investigations spurred by advertiser inquiries. Reactively detected invalid clicks are actually relatively rare and have remained at a lower level than those caught in the proactive phase.
“Reality at Google” from 2002 to the present: less than 10%, but still a significant number, of clicks are being filtered. Nearly all invalid clicks are detected proactively; reactively detected invalid clicks are a negligible proportion (<0.02%). The two main detection methods are simple rules and statistical anomaly detection. An example of a simple rule would be “N clicks from the same IP within a short period of time”; these are easy for fraudsters to avoid, while statistical anomaly detection is much harder to game. He lists a number of features that help mitigate CF, from “smart pricing” in 2002 through auto-tagging, site targeting, site exclusion, invalid click reports, placement reports, IP exclusion, and the AdSense click-area change, all the way up to additional category exclusions in 2008.
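To make the “simple rule” concrete, here is a toy version of the same-IP rate check Shuman describes. The window and threshold are illustrative assumptions; Google's real filters are obviously far more elaborate:

```python
# Toy "simple rule": flag an IP that produces N or more clicks inside a
# short time window. Both thresholds are assumptions for illustration.
from collections import defaultdict, deque

WINDOW_SECS = 60   # the "short period of time" (assumed)
MAX_CLICKS = 5     # "N" (assumed)

recent = defaultdict(deque)  # ip -> timestamps of that IP's recent clicks

def is_invalid(ip, ts):
    """Return True if this click pushes the IP over the rate threshold."""
    clicks = recent[ip]
    clicks.append(ts)
    while clicks and ts - clicks[0] > WINDOW_SECS:  # expire old clicks
        clicks.popleft()
    return len(clicks) >= MAX_CLICKS

# Five clicks from one IP in eight seconds trips the filter on the fifth:
print([is_invalid("198.51.100.9", t) for t in (0, 2, 4, 6, 8)])
# -> [False, False, False, False, True]
```

As Shuman notes, a fraudster who knows a rule like this can simply stay under the threshold, which is exactly why the statistical anomaly detection layer matters.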
Overall, the best advice he can give an advertiser is to use the tools needed to manage campaigns effectively. CF consulting firms have now evolved to offer advertisers better reasons to use them than simply identifying CF. He suggests that if you see anything suspicious, you report it to the Ad Traffic Quality Center at google.com/adtrafficquality (there are a couple of video seminars from one of their engineers there, which he recommends).
Last up is Richard Zwicky from Enquisite. He says an easy way to solve the overseas problem is to use geo-targeting so you only get clicks from the US. He agrees that Google and Yahoo are working hard to minimize exposure to CF; they are not the problem, however. You have to audit paid listings. You have to ask questions: did your campaigns execute properly? Radio has Arbitron, TV has Nielsen, businesses have accountants…what does paid search have?
You need to know what your exposure is and what you can do about it. What is PPC Assurance? A third-party campaign verification system. He shows some cool charts from the system that identify strange spikes in traffic. (Unfortunately my laptop battery is about to die, so I'll get down as much more as possible.) When you look at all your campaigns and a few spike somewhat, you can't be sure, but if one really spikes then you know you have been hit.
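The underlying idea is plain outlier detection against each campaign's own baseline. A rough sketch using a z-score (Enquisite's actual method is not public, so this is just the general shape of it; the threshold is my assumption):

```python
# Flag days where a campaign's clicks sit far above its own baseline.
# The threshold is an assumption; real systems use richer signals.
from statistics import mean, stdev

def spike_days(daily_clicks, threshold=2.0):
    """Return indexes of days more than `threshold` std devs above the mean."""
    mu, sigma = mean(daily_clicks), stdev(daily_clicks)
    return [i for i, clicks in enumerate(daily_clicks)
            if sigma and (clicks - mu) / sigma > threshold]

history = [102, 98, 110, 95, 105, 99, 460]  # one very suspicious day
print(spike_days(history))  # -> [6]
```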
What to audit? Set lots of campaign parameters; every one you set reduces the chance of being hit by CF. They do create a higher likelihood of mistakes by Google, Yahoo, and others, but that is OK. He congratulates the engines for actually taking responsibility, standing up, and owning the problem.
This is live blogging coverage of SES New York 2008, so some typos or grammatical errors may exist. Panelists and other attendees are encouraged to comment below to point out any inaccuracies and to help fill out the rest of the story.