Five companies turn data dilemmas into marketing solutions
Too often, companies think of their databases as … databases: collections of names and numbers, of dates and dollar amounts. And technically, that’s what they are. But if you think of your database as just names and numbers, that’s all you’ll get out of it.
Your data is your record of customer interaction; as such, it is your most valuable commodity. Like the story of the jeweler who locked up his customer list—not his diamonds—in his safe each night, marketers must revere their databases.
The sad fact, however, is that much data is in disarray. Many companies’ data usage is a shadow of what it could be, perhaps limping along on outdated systems or lurching to and fro thanks to cutting-edge technology suboptimally employed.
It’s high time for many companies to make over their data practices, to turn over a new leaf where data use is concerned. For this month’s cover story, we talked to experts in the field of database architecture about what needs to be done to ensure a database overhaul provides you with a database that does more than simply track transactions: one that can increase your efforts’ speed to market, allow you a higher level of targeting and segmenting sophistication, help you reduce waste and deliver any number of other benefits. We also picked the brains of marketers and consultants willing to share their own database makeover success stories, which you’ll read about in the five case studies later in this story.
If You Build It … Build It Right
The problem for many companies looking to revamp their databases, says Bernice Grossman, president of the DMRS Group Inc. in New York City, a firm that consults on the design and development of marketing database systems, is that they don’t think through the details.
A database, explains Grossman, is like a house, and how it’s built depends on how you intend to use it. “Pretend that you’re going to buy a house, and there’s a kitchen and a dining room,” she explains. “Most people would say, ‘Well, I really like easy access between those two rooms because so much of what happens in those rooms is related.’ Things like the bedroom and the den, those could be elsewhere.”
“Think of what the house would be like,” suggests Grossman, “if the bedrooms and the bathrooms were next to the kitchen, and the dining room was far away. … Someone would say, ‘Who was the architect of this house?’”
Just like designing a house, designing or redesigning your database requires a good deal of forethought about needs.
For example, offers Grossman, the marketing database of a pharmaceutical company will never have a field that has anything to do with money. “Pharmaceutical companies don’t sell prescription drugs,” she explains. “You can’t go to a pharmaceutical company and buy a prescription … so when you look at the variables and relationships between data, dollars and cents don’t have to be there.”
On the flip side, catalog companies need to be able to track the amount of product sold, as well as when certain purchases were made and how many purchases were made.
Getting your database to reflect your business rules is the most important, and most arduous, part of the overhaul process.
“We’re going to talk about what you want to do with this tool,” says Grossman of the process she goes through when consulting with companies. “We’re going to look at the data; we’re going to see whether you guys even know what you have. [I’ll ask,] ‘Have you ever really looked at your data?’ It might not be the most interesting book you’ve ever read, but we’re all going to sit down and read it.”
Grossman finds, not surprisingly, many companies—and often these are companies that complain the solution tools they purchased to build their databases are “broken”—haven’t put the work in up front. “Sometimes they’ve got some of the [data] definitions,” says Grossman. “Sometimes they don’t even know where all their data is. Sometimes they don’t even know how they got the data. Sometimes these are the largest companies in America.”
This process is called a needs assessment or discovery and, says Grossman, it’s the least expensive part of the whole process. “It’s also very boring. It’s not sexy,” she says. “But if you don’t do it, you usually end up with a marketing database you’re dissatisfied with. It has nothing to do with the tool or the software, it has to do with [the fact that] nobody was willing to sit down and really go through all the design and specification requirements so that the kitchen ends up next to the dining room.”
Don’t Go It Alone
No company should undertake a database makeover alone. “Get[ting] a marketing database in shape requires people with training and specialized software,” says Arthur Middleton Hughes, vice president/solutions architect at Richardson, Texas-based KnowledgeBase Marketing. “Few regular data-processing shops have ever done this before. You need someone with experience in building a marketing database: someone at an outside service bureau who has built a number of them. They’ve made all the mistakes already and learned from them.”
Hughes recalls a West Coast bank he was brought in to work with that had started the process of overhauling its data on its own. He explains that the company had purchased the D&B file for the state of California and was matching its business customers with only 8 percent of the records, “which is ridiculous,” he says. The problem is that it’s almost impossible to append data accurately when your own data isn’t standardized. After standardizing the address fields of the database, Hughes, working at his outside service bureau, was able to get a 44-percent match rate. This was done after the company had expended a great deal of time and effort trying to do it itself.
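Standardization pays off because exact-match appends fail on trivial formatting differences. Here is a minimal Python sketch of the idea, using a toy abbreviation table as an assumption for illustration; real CASS-certified standardization software handles vastly more variants:

```python
import re

# A tiny, illustrative abbreviation table. Commercial standardizers
# cover thousands of USPS-recognized variants; this is not one of them.
ABBREVIATIONS = {
    "north": "n",
    "street": "st",
    "avenue": "ave",
    "suite": "ste",
}

def standardize(address: str) -> str:
    """Normalize case, strip punctuation, and apply common abbreviations."""
    tokens = re.sub(r"[^\w\s]", "", address.lower()).split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

# Before standardizing, these two records would fail an exact match;
# afterward, both reduce to the same string and the append succeeds.
a = standardize("123 North Main Street, Suite 4")
b = standardize("123 N Main St Ste 4")
print(a == b)  # True
```

Even this crude normalization shows why the bank's raw 8-percent match rate was an artifact of formatting, not of genuinely unmatched customers.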
Further, while “dirty” data can be a troublesome and, at times, painfully obvious problem, it shouldn’t be first on your to-do list, as it’s a situation that’s more easily rectified once the rest of the process has been undertaken.
“Cleaning up data can sometimes be made into a big job,” says Hughes. “There is no company that is so huge that it couldn’t be done in three months or less. And it should be done rapidly. You don’t make money while you’re cleaning data.”
Which is to say that you should first design your database, then quickly clean and standardize the data, and finally perform merge/purge, NCOA and other appends.
It’s About the Users
You can do all the deep thinking you want about your database design, but you can’t get a true sense of what your database should be able to do without talking to the people who’ll be using it. Users tend to be a company’s marketing department, but a portion of the discovery process should be devoted to identifying all the users in a company.
“Part of what folks like me do is walk around and talk to the folks who’ll be using the database,” says Grossman.
From these types of discussions, the real value of your eventual database will emerge. For example, as you’ll read in the RadioShack case study that follows, the retailer interviewed 70 of its internal data users to find out their big needs, pains and concerns before moving ahead with its database rebuild.
Just as important as the users’ input is their access to the refurbished database.
“People should be accessing the database online,” says Hughes. “Everybody in the company with a need to know should have access to the database.”
A data overhaul can increase mail efficiency, as in the case of LexisNexis. It can help you, as in the case of Aspen Skiing Company, more easily combine your house data with outside sources. And it can allow you to disperse accurate intelligence to far-flung branches, as in the cases of RadioShack and The Parable Group.
In the cases that follow, the upfront work was far from easy. The results, however, were more than worth the effort.
Case Study: Retail giant moves toward customer-centricity
Situation: Retailer RadioShack, a company that had long collected customer information at the point of sale, decided to change its strategy to become more customer-centric, incorporating targeted offers into its marketing mix.
Problem: The company’s existing database was a mainframe system designed as an operational data store. Its purpose was to get the right merchandise to the right retail locations. It was not designed for customer contact. The data that was available was difficult to access.
Goal: The biggest opportunities the company saw, says Tom DeNapoli, vice president of marketing communications for the Fort Worth, Texas-based company, were to “change our media to target our customers more effectively; integrate our e-mail and direct mail data in one place so we have a better picture of who our customers are and their channel preferences for working with RadioShack; and understand what the different segments purchase and what they need so we can create and market new technology/products and services more effectively.”
The Process: Working with SwatTeam Partners, a marketing consultancy based in Horseheads, N.Y., RadioShack decided to outsource the building of the database. After a thorough RFI and RFP process, RadioShack chose San Antonio, Texas-based direct marketing and targeted media company Harte-Hanks to build the database.
“We then went through a detailed process called discovery where we interviewed about 70 [internal data users] who gave us feedback about their needs,” explains DeNapoli.
Following discovery, RadioShack sent sample data to Harte-Hanks “to better understand the data, ranges, quality, etc.,” says DeNapoli. After building the database based on the users’ needs, RadioShack created sample reports and internal training programs, as well as an “executive leadership team to review the work at each step” and “an internal database committee [comprised] of IT and marketing folks who worked together to identify what data was needed, where it was, how it would be used and the business rules around it,” says DeNapoli.
RadioShack, SwatTeam and Harte-Hanks came up with a relational database with direct access. It is updated weekly and includes, says DeNapoli, “all transaction data, e-mail data, campaign management data, customer name and contact data, consumer preferences, store of preference, [presence of a] RadioShack credit card, proximity to the local store, customer segment and demographics.”
Since RadioShack stopped collecting customer information at the point of sale two years ago, the customer information in the database is culled from servicing records, wireless accounts, installations and returns.
Outcome: RadioShack is using the new database for a variety of purposes, from honing its retail operations to expanding its direct marketing efforts.
“We’re using the new data to help our store personnel understand the segments that most likely are shopping their stores,” explains DeNapoli. “We’re using the data to purchase media more effectively. … We’re also doing more direct mail and testing offers, creative, etc. [And we] intend to do some cross-media promotion using direct mail and e-mail together to understand the synergy.”
With the marketing database in place, RadioShack looks to understand its customers’ behavior in multiple channels, get a quicker read on campaign results, test models, answer executive questions about customers quickly, and employ “what if” scenarios before making business decisions.
Tools Used: Unica for campaign management, MicroStrategy for reporting and SAS for data mining.
Case Study: Skiing company becomes database mogul
Company: Aspen Skiing Co.
Situation: Aspen Skiing Co. (ASC), the company that owns and oversees the ski resort areas of Aspen, Colo., and sells items such as lift tickets and season passes, was using its operational database for marketing purposes.
“They had one source of data—operational data,” says Paul Vannett, CEO of Dovetail, the Highlands Ranch, Colo.-based database consulting company ASC teamed up with for this project, “but they had other sources of data they wanted to integrate.”
Problem: Because the company was using operational data, the database was rife with duplicates. “As is common with operational databases, when customers buy again, they’re added again,” explains Vannett. “Names and addresses are dirty. Suppressions are difficult.”
Further, marketing had no direct access to the data. Requests were made through IT, which was a burden on both departments.
Goal: Clean up and combine ASC’s operational data with an array of other primary and secondary data sources in a centralized location that allows marketing direct, one-click access.
The Process: Dovetail conducted an extensive discovery process to find out “‘What are the business objectives?’ and ‘Where are the pains?’” explains Vannett.
With this information, Dovetail designed a database architecture that puts the information through a three-phase process.
Phase one is a staging area in which ASC pools its operational data, as well as data from 37 other primary sources, such as ski shows and on-mountain surveys. Here, data is cleaned, validated and de-duped.
In the second phase, the cleaned data is loaded into a central database. “It contains data from all 37 sources,” says Vannett. “It’s a never-throw-any-data-away kind of database. There’s more data in there than what marketers need day-to-day.”
The third phase is the creation of a data mart. Using a proprietary application Dovetail calls, simply, the Dovetail Application, marketers access the data mart, which contains information necessary for daily use. “If they need something additional,” says Vannett, “they have the infrastructure to get it quickly from the big database.”
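The three-phase flow Vannett describes can be sketched in miniature. The field names, the naive de-dupe key and the in-memory lists below are assumptions for illustration, not Dovetail’s actual design:

```python
def stage(raw_records):
    """Phase 1: pool all sources, clean and de-dupe.
    Here the de-dupe key is simply normalized name + ZIP."""
    seen, cleaned = set(), []
    for rec in raw_records:
        key = (rec["name"].strip().lower(), rec["zip"])
        if key not in seen:
            seen.add(key)
            cleaned.append(rec)
    return cleaned

def load_warehouse(warehouse, staged):
    """Phase 2: the never-throw-any-data-away central database."""
    warehouse.extend(staged)
    return warehouse

def build_mart(warehouse, fields=("name", "zip", "segment")):
    """Phase 3: a slim data mart holding only day-to-day fields;
    anything else stays in the big database, fetchable on demand."""
    return [{f: rec.get(f) for f in fields} for rec in warehouse]
```

The design choice worth noting is the split between phases two and three: the warehouse optimizes for completeness, the mart for the marketer’s daily queries.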
Outcome: ASC’s database now incorporates data from 37 primary sources and more than 200 secondary sources to supplement its operational data. For example, within the primary source called information requests and inquiries are secondary sources such as Ski magazine inquiries, Powder magazine inquiries and Aspen chamber of commerce inquiries.
Before, ASC couldn’t really suppress addresses for the purposes of, say, only sending one offer per household, and now it can do much more. “If they want to send a direct mail piece to everyone who purchased a season ticket, or if they want to send out an offer, but they want to exclude people who made a lot of purchases last year [for the purposes of making a different offer], they can do that,” explains Vannett. “Because the data is integrated, they can suppress data easily.”
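Household-level suppression of this kind amounts to keying each record on its household and keeping one record per key. A toy sketch, with assumed field names:

```python
def one_per_household(records):
    """Keep a single mail piece per household, keyed on a normalized
    address plus ZIP. Field names are assumptions for illustration;
    a production system would key on standardized, CASS-verified
    addresses rather than raw strings."""
    by_household = {}
    for rec in records:
        key = (rec["address"].strip().lower(), rec["zip"])
        by_household.setdefault(key, rec)  # first record wins
    return list(by_household.values())
```

The same keyed lookup also supports the exclusion case Vannett mentions: instead of keeping the first record per key, you would drop any key that appears on the heavy-purchaser list.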
The database also has helped ASC’s cross-channel marketing by incorporating e-mail addresses. “If they have 10,000 people who say they want more information, and 5,000 of them provided e-mail addresses,” explains Vannett, “they can save money by sending e-mail to the first 5,000.”
The marketing department can pull lists every day and do analysis 24 hours a day, seven days a week thanks to the Web-enabled application.
Tools Used: Dovetail’s Dovetail 2.0 application
Case Study: Information provider suffered from bad data
Situation: LexisNexis, known for providing reliable legal and corporate information, was experiencing a data breakdown in its marketing efforts. For a company whose some 2,000 mailing campaigns generate roughly 12 million mail pieces per year, sending multiple copies of mailings to the same person, mailing to the deceased and hitting recipients with rapid-fire offers from different departments were simply inefficient.
Problem: “Our direct marketing efforts were coming out of our business intelligence system, essentially a billing and usage system,” says Bill Welch, marketing systems manager for LexisNexis. The company employs many complicated billing and pricing schemes, and due to the nature of law firms, one firm might pay the bills for 30 offices. “Our back office system wasn’t good at housing information to send stuff to individual offices. It’s perfectly good as a billing database, but wasn’t very good for doing contact work.”
Because law firm names change frequently (partners are added, partners leave) LexisNexis’ data was overrun with duplicate entries. “We had one person call in saying they had received 18 identical [mail] pieces,” laments Welch, noting an extreme example.
Furthermore, the company wasn’t segmenting. “If we had an offer for a new tax publication, we’d send it to all attorneys, not just tax attorneys,” he says.
Goal: Create a marketing database that would allow LexisNexis to contact the people actually using the service, not just those paying for it. Incorporate outside sources of data, standardize and clean existing data, and establish protocols for keeping firm names updated.
The Process: LexisNexis took its existing data and combined it with information from infoUSA and CourtLink. Using technology from data quality and cleansing solution provider DataFlux, LexisNexis then was able to take the data and “cleanse them, CASS verify them [and] put match codes in,” explains Welch. “We’re grouping records together that are really the same person and appending information from other data sources to find out [lawyers’] practice area, year they passed the bar, year they were born, etc.”
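Match codes of the kind Welch describes are compact keys built from name and address fragments, so that records for the same person land in the same group. A simplified illustration with assumed field names; DataFlux’s actual keys are fuzzier and phonetic-aware:

```python
def match_code(record):
    """Build a crude match key: full last name, first initial, 5-digit ZIP.
    Field names here are assumptions for illustration."""
    first_initial = record["first_name"].strip().lower()[:1]
    last = record["last_name"].strip().lower()
    return f"{last}|{first_initial}|{record['zip'][:5]}"

def group_duplicates(records):
    """Bucket records by match code; each bucket is one presumed person."""
    groups = {}
    for rec in records:
        groups.setdefault(match_code(rec), []).append(rec)
    return groups
```

Once records are grouped this way, the 18-identical-pieces problem reduces to mailing each group once, and appends (practice area, bar year) attach to the group rather than to each stray variant.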
LexisNexis also instituted a plan for verifying law firm names, employing a full-time data steward.
Outcome: LexisNexis now has much greater control of its data. “We can target our mailings to the appropriate individuals, and we can greatly eliminate duplicate mailings,” says Welch. “We’re avoiding sending mailings too often to the same people, and we’re no longer sending mailings to our competitors.”
With its data standardized, LexisNexis now can append data, such as firm size, with confidence. This capability allows the company to mail smarter. “We’re now able to produce compact mailings,” says Welch. “Instead of sending 20,000 mailings to everybody, we can send 5,000 or 10,000 targeted pieces.”
Another area where this process has helped is in company acquisition. “We just acquired a company, and they’re looking to us to do advanced integration of their customers,” explains Welch. While the acquired company’s IT department estimated it would take more than a year to fully integrate the customer data, LexisNexis expects to be able to send out mailings to the company’s customers—and recognize those that have a pre-existing relationship with LexisNexis—within weeks.
Tools Used: DataFlux dfPower Studio 6.0; SAS 8.2
Case Study: Network of independent Christian booksellers solves data tangle
Company: The Parable Group
Situation: In April 2000, The Parable Group, the umbrella organization for an association of more than 250 independent Christian book stores in North America, contacted Wheaton Group, a direct marketing and data consultancy in Chapel Hill, N.C., about performing some CRM work. However, the data—culled from Parable’s network of retailers, which had no standard point-of-sale system—was in bad shape. “We were overwhelmed at its poor condition,” says Parable Group CEO Tim Blair. Before any sort of CRM work could be done, The Parable Group needed to perform major reconstructive work on its marketing database.
Problem: “The Parable Group data was perhaps the most problematic that I had ever seen,” says Jim Wheaton, principal, Wheaton Group.
“In April 2000, the marketing database was basically useless,” concurs Melissa Lundie, manager of business intelligence, The Parable Group. “Major problems included, but were not limited to, bad data, bad business rules and inconsistent data processing.”
The Parable Group’s mission is to provide marketing heft to independent book stores that are competing with large chains, yet its marketing database was in shambles. Because The Parable Group does not own its member stores, standardization is difficult. Further, deciding how to treat customers who shopped at multiple stores—whether as one entity or multiple entities—was a challenge.
Goal: “The Parable Group decided that if CRM was going to be one of our core competencies that we offered to our partner stores, we needed to think about bringing the data work in-house,” explains Blair. “This was a difficult undertaking, because a single [data] warehouse had to reflect hundreds of different stores and small store chains. Each of these is a separate profit center and each has different levels of sophistication.”
“The Parable Group realized that the long-term success of its data-driven vision was predicated on having rock-solid credibility with the store owners,” says Wheaton. “[They] also understood that a handful of [poorly] executed promotional selections would do great damage to this credibility.”
The Process: According to Lundie, the marketing database, as it existed, was completely scrapped. She explains, “We undertook a seven-step, ground-floor effort:
1. We developed an understanding of the data issues.
2. We understood the business rules from scratch.
3. We built database tables and corresponding rules to support improved customer selection.
4. We tested, tried and then retried the business rules.
5. We developed a statistics-based predictive model of customer behavior, and employed it for mailings and tests.
6. We integrated list hygiene services.
7. We developed national inventory reports.”
Part of the process, however, was to standardize the partner stores’ data-reporting. “You’ve got to roll your sleeves up and do every one of those stores individually,” explains Wheaton. “You’ve got to go field by field and look at where the issues are.”
Outcome: “To get the whole thing straightened out,” says Wheaton, “they made the commitment to take the whole thing in house. They developed service bureau capabilities. They went in and did the staffing. That’s not an easy thing.”
Finally confident in the accuracy of its data, “we [now] spend more time using the data than determining if it’s accurate,” says Lundie.
This allows The Parable Group to fulfill the promises it made to its member stores.
Says Lundie: “First, we can predict with confidence a customer’s future purchase behavior in response to a promotion and then accurately measure that behavior. Second, we can process mailing lists through all stages of hygiene, de-duplication and change-of-address processing. Third, we can use national data to help local stores improve product selection.”
With the transactional data standardized and flowing to Parable weekly, “client stores now know the ROI of advertising dollars spent with Parable,” says Blair.
Furthermore, Parable’s mailing team has processed millions of records, resolving many long-standing data issues.
Tools Used: Microsoft Access for database analysis, Microsoft SQL relational database engine, Microsoft Visual Basic, SAS and FirstLogic Postalsoft.
Case Study: Publisher grapples with its legacy system
Company: A medium-sized company combining publishing and merchandising.
Situation: The company, which prefers not to be named, was using “an antiquated legacy system that managed operations as well as transactional systems,” says Shamez Dharamsi, manager of implementation and senior consultant for Quaero, the Charlotte, N.C.-based marketing performance company that worked with the publisher.
Problem: Over the system’s 20-year life “there had been a lot of patchwork in terms of home-grown processes; a lot of fitting data into fields where it didn’t belong,” explains Dharamsi. “You couldn’t change the length of a field without changing the code. There was a lot of force fitting.”
The company was dissatisfied with the accuracy of the data as well. There were fields whose contents nobody but the original programmer, who was no longer with the company, understood. “They may have force-fed elements into a field and now, 10 years later, that field is being populated and nobody knows how,” says Dharamsi.
Further, the company had limited access to the data through the IT department, and was not able to access multiyear data at all. Because of the way the database was designed, at the end of each publication year the company would make a tape copy of the prior year’s data and essentially hit reset on the data-capture program. Lifetime value and year-to-year sales were almost impossible to track.
Goal: The company wanted easy access to its data and it wanted to be able to contact its customers in a more sophisticated way. Dharamsi hoped to salvage and clean as much of the company’s year-to-year data as possible and incorporate it into a relational database.
The Process: First, Dharamsi needed to go into the database and, field by field, figure out what was in it. “What we decided to do was get the ex-employee [who’d designed the original database] on the phone to try to figure out what fields were needed,” says Dharamsi. The hope was to save as much valuable data as possible. “There’s the terminology ‘garbage in equals garbage out’,” says Dharamsi, “but [data] is usually not garbage; you just need to tweak it and pull out the elements that are good.”
Once the publisher had determined which data was still usable, Quaero and the company went through an extensive discovery process to determine not only what the data was, but, for the purpose of constructing the relational database, how it planned to use it. It needed “to understand what it wanted to do with the system not just two years from now, but 10 or 15 years from now,” says Dharamsi.
From there, the existing data needed to be cleansed and standardized—an essential step in building a relational database, so that transactions belonging to the same customer record can be matched accordingly.
Outcome: Quaero had the company up and running on its new database in 90 days. The company is now able to access multiyear data and contact history; now it can base campaigns on behavior.
“Something I always preach,” says Dharamsi, “is that behavioral data can help you move to the next level.”
The company also has experienced improved reporting, as the previous reports from the legacy system had been far from accurate.
“[The database] is not just a black box anymore,” says Dharamsi. “We have a data dictionary and documentation so that if someone started tomorrow, they could read it and walk through it and get up and running right away instead of having to learn some antiquated language.”
Tools Used: Quaero’s MarketReady, a solution that combines a datamart build, installation of Unica’s Affinium 5.01 campaign management solution and a reporting tool.