Stop Blaming Marketing Problems on Software
I often hear statements like “Our client has a Tableau problem.” Or, it is something about Hadoop or data platforms, as in “We have an issue with Hadoop.” What did it do, use offensive language? I wonder what the real issue is.
In any case, such general statements don’t help much. I guess a medical doctor feels the same way when she hears that her patient has a headache. What does that even mean, headache? What kind of headache? Prolonged or sporadic? Throbbing or sharp pain? All over, or one-sided? Or, do you just want to avoid conversations with your spouse?
Symptoms are not always related to root causes. Why would marketers think they have a problem with Tableau? Isn’t that a reporting and display tool? Unless one doesn’t like the way a bubble chart comes out, nothing really is a Tableau problem.
More often than not, reporting issues trace back to the data. What could be the major issues with a report? Inaccuracy, inconsistency or just plain suckiness? If the data on the report don’t make any sense, we must dig deeper. And let’s not forget that reporting tools are not even designed to handle heavy-duty data manipulation. But if the report doesn’t make any sense or is hard to understand — well, then — let’s blame the designer of the report, not the toolset.
For the record, I do not represent analytical toolset companies like SAS, SPSS or Tableau. Maybe they should share some blame, because they must have sold the toolsets as almighty data mining tools that just do it all. But I am addressing the issue this way because, at least for now, forming proper questions, defining problem statements, data modeling (for analytics), report design and, most importantly, deriving insights out of the reports remain solidly human functions.
Let’s break it down further. When faced with a large amount of unrefined, unstructured and uncategorized data, we must indeed fix the data first. Let’s not even think about blaming the data storage platforms like Hadoop, MongoDB or Teradata here. That would be like blaming rice storage facilities for not being able to refine rice for human consumption. In other words, we should not put too much of a burden on the data collection and storage systems when it comes to data refinement.
Data refinement should be dealt with as a separate entity altogether, sitting between data collection (such as Hadoop) and data delivery (such as Tableau), with each stage requiring different skillsets and expertise. Such data refinement work includes:
- Data Hygiene and Editing: As no data source is immaculate. In fact, many analysts waste their valuable time fixing dirty data (and following the steps listed below).
- Data Categorization and Tagging: As uncategorized freeform data must be put into buckets and properly tagged for advanced analytics (refer to “Free Form Data Are Not Exactly Free”).
- Data Consolidation: As disparate data sources must be “merged” (to create a “360-degree view of the customer” around a person, for example), or “concatenated” (to increase coverage by adding similar types of data).
- Data Summarization and Variable Creation: To transform data to describe different levels (transaction, emails, customers, companies, etc.), as in converting transaction or event-level data into “descriptors of individual customers” (refer to “Beyond RFM Data”).
- Missing Value Treatment: As no data will ever be fully complete, we need to fill in the gaps with either statistical models or business rules (refer to “Missing Data Can Be Meaningful”).
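To make these refinement steps concrete, here is a minimal sketch of such a pipeline. All field names, tagging rules and records are hypothetical illustrations, not from any particular toolset:

```python
from collections import defaultdict
from statistics import median

# Hypothetical raw transaction records: dirty, uncategorized, with gaps.
raw = [
    {"customer": "C1", "item": " Running Shoes ", "amount": 120.0},
    {"customer": "C1", "item": "yoga mat",        "amount": 35.0},
    {"customer": "C2", "item": "RUNNING SHOES",   "amount": None},  # missing value
    {"customer": "C3", "item": "novel",           "amount": 15.0},
]

# 1. Data hygiene: trim whitespace, normalize case.
for r in raw:
    r["item"] = r["item"].strip().lower()

# 2. Categorization and tagging: bucket free-form item names (illustrative rules).
def tag(item):
    return "fitness" if ("shoes" in item or "mat" in item) else "other"

for r in raw:
    r["category"] = tag(r["item"])

# 3. Missing value treatment: fill gaps with the observed median (a simple business rule).
fill = median(r["amount"] for r in raw if r["amount"] is not None)
for r in raw:
    if r["amount"] is None:
        r["amount"] = fill

# 4. Summarization: roll transaction-level data up to customer-level descriptors.
summary = defaultdict(lambda: {"transactions": 0, "total_spend": 0.0})
for r in raw:
    s = summary[r["customer"]]
    s["transactions"] += 1
    s["total_spend"] += r["amount"]

print(dict(summary))
```

Each numbered step maps to one of the refinement tasks above; in practice, each would be a separate process (or team) with its own rules and quality checks.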
If the salesperson who sold you the reporting toolset promised that the product would do all of these things, well, just ignore him. Even in the age of AI, these steps must be performed by separate machines (or teams) trained for specific tasks. Simply, machines are not that smart yet; AI trained for “recognition” won’t be able to “predict” and fill in the blanks for you. Nor should human analysts be expected to do all of this by themselves.
Nonetheless, the steps listed here must be completed before the reporting or any other analytical work even begins. We can even say that the reporting step is the simplest one of all. But only if the reports are designed properly first. And that is the catch.
No amount of pretty charts can be meaningful if there is no story behind them. That would be like watching a movie filled with so-called state-of-the-art special effects but no character development or viable storyline. That may work as a trailer, but that’s about it. Now, if you are an analyst having to present findings to a client or your boss, you don’t want to be the one who loses steam five minutes after the meeting begins. A 40-page PowerPoint deck? So what? What does all of that mean? What are we supposed to do about it?
Unbearable reports often happen for the following reasons:
- Reports Without Clear Goals: What is the purpose of it all? Or are the readers supposed to draw their own conclusions?
- No Storyline: Without a viable storyline, a series of charts often looks like a data dump, or worse, data puke. Be selective and have a story to tell.
- Too Much Information on a Page: Get to the point fast. Don’t make them dizzy or distracted. Don’t rely on your audience to figure things out in real time at the presentation.
- Lie With Numbers: Unfortunately, this happens all of the time. Charts without consistent scales, labels or legends; trend lines when there is no trend; 3-D effects that blur judgment, etc. (I recommend the classic book “The Visual Display of Quantitative Information” by Edward R. Tufte.)
- Too Little or Too Much Narrative: Charts without any summary of findings are not beneficial, but writing a dissertation on a PowerPoint page is even worse. Stating the obvious can be really annoying.
I’m sure there are more reasons, and readers have surely seen plenty of meaningless charts throughout their careers. The bottom line is that none of these things happen because of the toolset, though it is easy to wrap it all up as a “toolset problem.” Just as the toolsets are not to be blamed for imperfect data, a bad storyline can’t be blamed on them, either. Simply: Don’t blame the piano when you don’t know how to play it.
So, how should an analyst go about it? There are many ways, but here is one example of storyline development.
- Thoroughly Understand What Is at Stake. Why does the consumer of information lose sleep at night? What can we do to help her? Is it about low conversion rate, lack of response, decreasing customer value or skyrocketing cost? How can any consultant or analyst come up with good advice if he doesn’t know what’s at stake?
- Based on What Matters, come up with a handful of success metrics — not too many — such as opens, clicks, responses, conversions, renewals, customers and dollars.
- Break Up the Metrics “by” levels of information that matter. Examples are: brand, division, country, region, store/branch, channel, product category, product line, year, month, date, daypart, etc. The list goes on, but what matters most to this particular audience?
- Create Ratios: Based on key metrics, develop percentages (e.g., conversion rate = # customers / # exposed to campaign × 100), averages (e.g., average dollars per transaction and/or per customer), ratios (e.g., transactions per customer and items per transaction), etc. Again, the list goes on, but what factors would tell the most compelling story?
- Develop Index Values: Define a baseline for comparisons (e.g., Total customers active in past 24 months) and calculate index values against it (e.g., conversion rate of the comparison group divided by that of the baseline). Highlight index values that are too high (e.g., over 120) or too low (e.g., less than 80) in different colors. Even untrained eyes can see patterns that way.
- Get the Story Out of the Reports. Let the numbers speak for themselves; don’t force it, as predetermining the storyline never ends well (that’s not science).
- Once the Storyline Emerges, develop graphical representation of it. Do not overdo it, unless it is imperative to emphasize multiple storylines. This will be the most fun part, if you have a story to tell.
- Create an Executive Summary. This is the most difficult part of the reporting work, requiring some experience and hard discipline. But, a good story can be covered in a few minutes, and that may be all the time that you have. The rules are:
- Never more than one to two slides
- Not more than five bullet points per slide
- Not more than 10 words per bullet point
- Finally, the Next Steps. What are we going to do about it? What should be the immediate action? Does it deviate from the long-term goals? This is where the difference between a consultant and a contractor emerges. Simply, contractors take orders, but consultants tell clients what to do. If an analyst sees things in mounds of data, the credibility comes from the data, anyway. And if he saw things that no one else did before? Ah, that is the moment for a geeky analyst to shine, proudly providing recommendations based on findings.
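The ratio and index steps above can be sketched in a few lines. This is a hypothetical example — the regions, counts and the 80/120 thresholds are illustrative, following the baseline-and-index approach described earlier:

```python
# Hypothetical campaign results by region (illustrative numbers only).
results = {
    "East":  {"exposed": 5000, "converted": 250},
    "West":  {"exposed": 4000, "converted": 120},
    "South": {"exposed": 6000, "converted": 390},
}

# Key ratio: conversion rate = # converted / # exposed x 100.
rates = {k: v["converted"] / v["exposed"] * 100 for k, v in results.items()}

# Baseline for index values: the overall conversion rate across all regions.
total_exposed = sum(v["exposed"] for v in results.values())
total_converted = sum(v["converted"] for v in results.values())
baseline = total_converted / total_exposed * 100

# Index = group rate / baseline rate x 100; flag values outside the 80-120 band.
indexes = {region: rate / baseline * 100 for region, rate in rates.items()}
for region, index in indexes.items():
    flag = "HIGH" if index > 120 else "LOW" if index < 80 else ""
    print(f"{region}: rate={rates[region]:.2f}% index={index:.0f} {flag}")
```

Even without color coding, the flagged index values make the outliers jump out — which is exactly the point of building the baseline comparison before drawing any charts.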
Obviously, the last part of the article is for current and future analysts. But I think the consumers of analytical services must be aware of these steps, as well. For one, marketers will be able to put the blame in the right places when things go wrong.
Not that this is about starting a blame game, but knowing what can go wrong is the first step toward the right kind of investment in this data and analytics game. Marketers will also be able to manage their analysts and vendors more effectively. Because, like I said in the beginning, nothing really is a toolset problem.
Stephen H. Yu is a world-class database marketer. He has a proven track record in comprehensive strategic planning and tactical execution, effectively bridging the gap between the marketing and technology world with a balanced view obtained from more than 30 years of experience in best practices of database marketing. Currently, Yu is president and chief consultant at Willow Data Strategy. Previously, he was the head of analytics and insights at eClerx, and VP, Data Strategy & Analytics at Infogroup. Prior to that, Yu was the founding CTO of I-Behavior Inc., which pioneered the use of SKU-level behavioral data. “As a long-time data player with plenty of battle experiences, I would like to share my thoughts and knowledge that I obtained from being a bridge person between the marketing world and the technology world. In the end, data and analytics are just tools for decision-makers; let’s think about what we should be (or shouldn’t be) doing with them first. And the tools must be wielded properly to meet the goals, so let me share some useful tricks in database design, data refinement process and analytics.” Reach him at email@example.com.