Multichannel Metrics

Feb 27, 2006 - 12:05 pm

Moderated by Rebecca Lieb, ClickZ Network. Smaller room, very full. She welcomes everyone and says they decided to try something new this year: dedicating a few sessions on Day 1 to looking at search in a broader advertising context. She's interested in feedback. One speaker (Eric Peterson) is in a cab coming from JFK.

Neil Mason – ClickZ. Multichannel Metrics – or Metrics Mayhem? “A way of thinking about measuring e-business performance.” He helps businesses understand analytics strategy. The challenge: how to drive business from e-channels, because this is not a data-poor environment but a data-rich one. There sometimes seems to be too much going on…huge amounts of data coming in at all times, including ad-serving data, customer data, transaction data, etc., plus performance data collected in a variety of forms. No wonder someone on the receiving end of all of this may feel like a “rabbit in the headlights.” He wants to outline thoughts about how to deal with this: how to think about the way you are going to measure your business.

So, what to do? Have a strategy: determine what is important. Second, have a plan: how are you going to measure the things you want to measure? It's about “counting the things that count” – a journey to avoid metrics mayhem, a simple process to determine where you are now and where you need to go in order to drive increasing value to the online business. It starts with performance-tracking metrics. You need clarity of goals, definition of KPIs and key metrics, and development of key reports and deliverables. A challenge is not to be too tech-driven by these things, which can blur what you are trying to do. Use an analyst who knows what they need to do. Performance tracking is about having faith in the numbers, and then moving forward to analysis and optimization. Everyone has a wheel: you start somewhere, do something, measure the response, and then do analysis and make decisions. Another level is user centricity…very much about being customer-centric instead of site-centric. It is about optimizing customer opportunities.

It is clear to Neil when looking at conversion rates: you need a holistic approach built on four major areas: 1. Have a strong grip on what is happening with your site. 2. Understand the site's performance. 3. Profile your users and use market intelligence. 4. Understand your position in the marketplace. Use benchmarking surveys…user profiling answers the question of why, not just what. It's about understanding the “who behind the who.” What do they actually think about what they see? Lastly: how is the site performing in terms of technical excellence? Does the site do what it is supposed to do, and is the user experience good?

Case study background: working with a UK high-street retailer with a well-established presence online. It was naturally experiencing significant growth, but the company began to realize it needed a better strategic understanding of what was successful and unsuccessful. They recognized that analytics data could only take them so far in understanding the efficiency of their site. By taking it further, they recognized that different audience segments were doing different things and trying to buy different products. Are they actually coming to buy? Are they just doing research? What combinations of products and categories might they be looking at? By deploying a wide range of tools and methodologies, they could better understand what they call the customer journey, and once they understood the relative behavior, they could design appropriate usability tests to dig deeper.

In this example, 80% of customers are female, so they wanted the site to appeal more to women. ClickZ's response: well, that's interesting, because 50% of the traffic is actually male. So while 80% of customers were women, maybe they were missing something. The second question was why people visit the site. For a specific product? To look for a new outfit? There are many different end goals, and to understand them they ran entry and exit surveys: find out visitors' intentions at the beginning, and whether they achieved what they wanted to achieve. They found that only one third of people who came with the intention to buy actually ended up buying something. They continued the research to look for common patterns and identified nine different “core journeys.” Each had a different share of visits, length of visit, number of categories visited, number of products viewed, conversion ratio, and general demographic profile. By identifying the “mode” people were in, they could better predict the outcome of the visit. They found the most important group was the ladies coming in on “Journey #2,” which he describes as an “inspire me” mode, so they determined the site needed to beef up the “inspiration” factor. How to appeal to “less engaged” customers? This raises a number of challenges, essentially based around data integration: how do you go about compiling “messy” data in all its different formats, levels of granularity, etc.?
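To make the “core journeys” idea concrete, here is a minimal sketch, in Python, of how entry-survey intent and visit-level records might be rolled up into per-journey metrics such as share of visits, products viewed, and conversion rate. The field names, journey labels, and assignment rule are illustrative assumptions, not the retailer's actual methodology.

```python
# Illustrative only: roll visit records up into "core journey" segments.
from collections import defaultdict

def assign_journey(visit):
    """Toy rule: bucket a visit by its stated intent from the entry survey."""
    return {
        "specific_product": "Journey 1 - buy a known item",
        "browse_ideas": "Journey 2 - inspire me",
        "research_only": "Journey 3 - research",
    }.get(visit.get("entry_intent"), "Other")

def summarize_journeys(visits):
    """Per-journey share of visits, average products viewed, and conversion rate."""
    buckets = defaultdict(list)
    for v in visits:
        buckets[assign_journey(v)].append(v)
    total = len(visits)
    return {
        journey: {
            "share_of_visits": len(vs) / total,
            "avg_products_viewed": sum(v["products_viewed"] for v in vs) / len(vs),
            "conversion_rate": sum(1 for v in vs if v["purchased"]) / len(vs),
        }
        for journey, vs in buckets.items()
    }

# Hypothetical sample data.
visits = [
    {"entry_intent": "browse_ideas", "products_viewed": 12, "purchased": False},
    {"entry_intent": "specific_product", "products_viewed": 3, "purchased": True},
    {"entry_intent": "research_only", "products_viewed": 7, "purchased": False},
]
print(summarize_journeys(visits))
```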

“Hard integration” – combining data with different formats, granularity, and periodicity; the technical challenges here are high. “Soft integration” – a holistic approach to measuring channel performance, with a different set of skills and challenges. Final thought, an AC Nielsen quote: “The price of light is less than the cost of darkness.” So can you afford not to spend the time and money on analytics and a wide variety of analysis and testing?

Rebecca asks Eric to discuss the difference between a persona scenario and a journey scenario like the one just described. Eric: journeys are about segments and modes, which can then be linked to descriptive persona data. Persona plus behavior = journey.

Jason Burby – from ZAAZ (zaaz.com) and also ClickZ.

Their focus is on data analytics. He will speak about different examples of multichannel measurement: what are the behaviors they are trying to drive? There is now a greater focus on analytics, but usually people are bogged down because there is so much data. Source, MarketingSherpa 2006: 61% of US marketers want better analytics software for their clients; 56% want paid search and management tools; 51% want A/B landing page comparison tests; 50% want web analytics integrated with search and email. In short: marketers are increasing their investment in analysis and optimization of campaigns based on such data.

They look at three types of core data: 1. Attitudinal data. 2. Behavioral data. 3. Competitive data (from Hitwise, comScore, and Nielsen, for example). They found that when these three are combined, they can tell a great story. Case: they were tasked with improving conversions on a large telephone site. Analytics told them there was an issue with conversions, and they found the top two best-converting competitor sites. They went through the process above, using attitudinal data, behavioral data, and competitor data. They found that the best-performing page in the industry was the Nextel page, but ironically, when Nextel was recently bought out, they “killed it.” (laughs) They created a KPI scorecard specific to the particular site to measure organization, website, site section, web team/agency, and individual performance in relation to business goals. Then they shared the scorecard throughout the organization.
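A rough sketch of what a scorecard mixing the three data types could look like, with each metric checked against a goal. The metric names, sources, and targets are invented for illustration; they are not the actual scorecard described in the session.

```python
# Hypothetical KPI scorecard rows: (metric, data type, actual, goal).
scorecard = [
    ("Order conversion rate", "behavioral", 0.021, 0.025),
    ("Task-completion satisfaction", "attitudinal", 0.78, 0.80),
    ("Share of category visits vs. competitors", "competitive", 0.14, 0.15),
]

for metric, source, actual, goal in scorecard:
    status = "on target" if actual >= goal else "below target"
    print(f"{metric:45s} [{source:11s}] actual={actual:.3f} goal={goal:.3f} -> {status}")
```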

They do not want to look solely at the number of leads; they want to follow the leads throughout the buying cycle. They can then try to measure offline converting online, online converting offline, etc. For example, if someone fills out a mortgage application online, this isn't really a conversion, because the bank hasn't made the loan yet. They must measure overall performance by counting the total number of applications versus actual loans closed. Back to scorecards: he asks how many people use individual scorecards to track their site…a few hands are raised. It is important to hold people accountable when evaluating the site. He briefly mentioned some of the software used for each area: behavioral: Omniture, WebTrends, etc. Optimization: Offermatica, etc. Competitive: Hitwise, etc. Attitudinal: (missed it…he said the slide would be online as he went on to the next slide, as if laughing at us trying to blog this :p). Keep your “izes” on the prize: 1. Identify opportunities. 2. Monetize them. 3. Prioritize them. 4. Realize them. Determine the actual value of an inquiry, and optimize campaigns to capitalize on the more valuable ones. This approach measures future conversion rates and lets you set goals based on past performance; recommended.
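As a worked illustration of the applications-versus-loans-closed point, here is a small sketch that discounts an online application by a historical close rate to get an expected value per application. All figures and names are hypothetical.

```python
# Illustrative only: an online application is not a conversion until the loan
# closes offline, so its value is the downstream close rate times the revenue
# of a closed loan.
def value_per_application(close_rate, revenue_per_closed_loan):
    """Expected value of one submitted application."""
    return close_rate * revenue_per_closed_loan

apps_submitted = 1_000           # measured online
loans_closed = 180               # reported back from the offline system
revenue_per_closed_loan = 2_500  # made-up figure

close_rate = loans_closed / apps_submitted
app_value = value_per_application(close_rate, revenue_per_closed_loan)
print(f"Close rate: {close_rate:.1%}, expected value per application: ${app_value:,.0f}")
```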

Once opportunities are monetized, it's time to prioritize. Examples include tuning the landing page, improving on-site help, etc. Measure the “potential lift through optimization” for differently structured opportunities. Now they know there are a number of different tactics that may lead to success: a couple based on optimizing for revenue, one based on business lead generation. They then prioritize them accordingly. Case study: a large travel brand was struggling to gather and compare all the data from its different channels and portals. They built a tool to overlay data from Omniture, Google, tracking systems, etc., which allowed them to better see the big picture.
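One way to picture “potential lift through optimization” as a prioritization tool is to estimate incremental conversions times value per conversion for each candidate tactic and rank by the result. The opportunity names, lift assumptions, and values below are made up for the sketch.

```python
# Hypothetical opportunities: (name, monthly visits affected, current conversion
# rate, assumed conversion rate after the fix, value per conversion).
opportunities = [
    ("Tune search landing pages", 50_000, 0.020, 0.024, 120.0),
    ("Improve on-site help",      20_000, 0.015, 0.020, 120.0),
    ("Streamline lead form",      30_000, 0.030, 0.033,  60.0),
]

def monthly_lift(visits, conv_now, conv_after, value):
    """Incremental monthly value if the conversion rate improves as assumed."""
    return visits * (conv_after - conv_now) * value

ranked = sorted(opportunities, key=lambda o: monthly_lift(*o[1:]), reverse=True)
for name, *rest in ranked:
    print(f"{name:30s} potential lift = ${monthly_lift(*rest):,.0f}/month")
```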

Avoid common issues with analyzing metrics:

1. Lack of process or methodology.
2. Not establishing proper KPIs.
3. Data overload.
4. Failure to identify and prioritize opportunities.
5. Failure to monetize the impact of changes.
6. Limited access to data.
7. Lack of data integration.
8. Individual and group goals not tied to KPIs.
9. Starting too big: too often people bite off more than they can chew; he recommends starting small.
10. Being overly data-driven: do not fall into the trap of thinking that simply measuring the data will answer your questions. Interpretation and testing are very important.
11. Lack of commitment and executive support.

Jason suggests reading his article from last year at ClickZ that discusses ways to integrate data. ClickZ is working on a white paper on measuring attitudinal data, which you can get from him by asking.

Q&A: Rebecca asks what she prefaces as a rather “confrontational question”: what about people with smaller businesses that can't afford to hire a large company like yours? Jason: start small. Identify one opportunity at a time, and begin slowly measuring what people are doing on the site. Neil agrees; it is “about counting the things that count.” What is the first thing that needs to be done to improve the business, and then what do I need to measure for that?

Neil mentions that when you are dealing with analytics, you should remember that it is “garbage in, garbage out.”

Eric Peterson – Visual Sciences. He got in very late from the airport, but he made it. He will speak about KPIs, key performance indicators: what decision makers can use to effectively run the business. It is important to “translate” web data into KPIs. KPIs are not owned by IT; they are owned by the “business folks.” Things such as conversion rates, etc.…you only need to understand your business in order to understand the data. Reminder: HITS stands for How Idiots Track Statistics. Make sure you are not just looking at hits but, more important, at actual KPIs. Use worksheets to list KPIs and how they are performing in one time frame versus another; you need to put these into context versus where you were yesterday, last week, or last year. The essence of good KPIs: 1. Definition – summarize relationships among meaningfully compared data. 2. Expectation – establish targets for improvement. 3. Presentation – highlight changes (literally) for easy identification. 4. Action – direct additional study and identify areas of the website that need work. The core of good KPIs: they always drive action. KPIs should be presented in the following quantities: only 3–4 for senior executives, 6–8 for the next level down, 10–15 for analysts, etc. Sample: a senior strategist is given only two main KPIs, including average email response time and percentage of low- and high-satisfaction customers. Think about KPIs hierarchically, and use data simple enough to drive action.
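To illustrate the definition/expectation/presentation/action idea, here is a toy KPI worksheet row that compares the current period to the prior one against a target and flags changes large enough to warrant a closer look. The threshold and numbers are invented, not Eric's worksheet.

```python
# Illustrative KPI row: definition (the ratio itself), expectation (target),
# presentation (highlight big changes), action (the flag prompts investigation).
def kpi_row(name, current, previous, target, alert_pct=0.10):
    change = (current - previous) / previous if previous else 0.0
    flag = "** INVESTIGATE **" if abs(change) >= alert_pct else ""
    vs_target = "above" if current >= target else "below"
    return (f"{name}: {current:.3f} ({change:+.1%} vs. prior period, "
            f"{vs_target} target of {target:.3f}) {flag}")

print(kpi_row("Order conversion rate", current=0.019, previous=0.024, target=0.025))
```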

Good KPIs are clearly defined, have easy-to-determine expectations, and clearly present changes in the data. He discusses a sample KPI worksheet with metrics such as order conversion rate, buyer conversion rate, average order value, average revenue per visit, average cost per conversion, etc. A great KPI for customer support that isn't often measured is “average time before a customer support issue is addressed.” Check out webanalyticsdemystified.com for more of his thoughts.
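For reference, the sample worksheet metrics mentioned above reduce to simple ratios over raw counts, as in this sketch with made-up figures.

```python
# Illustrative inputs.
visits, orders, buyers = 200_000, 4_200, 3_900
revenue, marketing_cost = 310_000.0, 42_000.0

order_conversion_rate   = orders / visits          # orders per visit
buyer_conversion_rate   = buyers / visits          # unique buyers per visit
average_order_value     = revenue / orders
avg_revenue_per_visit   = revenue / visits
avg_cost_per_conversion = marketing_cost / orders

print(f"Order conv. {order_conversion_rate:.2%}, AOV ${average_order_value:.2f}, "
      f"rev/visit ${avg_revenue_per_visit:.2f}, cost/conv ${avg_cost_per_conversion:.2f}")
```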

This is part of the Search Engine Roundtable Blog coverage of the New York Search Engine Strategies Conference and Expo 2006. For other SES topics covered, please visit the Roundtable SES NYC 2006 category archives.
