Use Metrics to Strengthen Your E-mail Campaigns, Part 2
Continuing our coverage of eM+C's All About eMail Virtual Conference & Expo, held a few months ago, this second installment of a three-part series recaps the session "Making Sense of E-mail Metrics," presented by Ernie Vickroy, marketing operations director at Time. Check back for the final part of this series in the May 8 issue of All About eMail, when I'll review the presentation of Jeff Mills, vice president of products at eROI, an interactive and e-mail marketing agency.
(To register for on-demand access to the conference, which is available until May 17, click here. And for part one of this series, which looks at the presentation given by Al DiGuido, CEO of Zeta Interactive, an interactive marketing agency, click here.)
The standard e-mail metrics ― delivery, open, clicks ― don't tell you the whole story about the success of a campaign, said Vickroy. Each metric is slightly flawed, he added, citing the following:
- Delivery: You don't actually know whether each e-mail reached the inbox, Vickroy noted. This metric really measures bounces (i.e., your list quality and sender reputation).
- Opens: With so many ISPs blocking images by default, this metric is often inaccurate. Even with certified delivery, no service covers all ISPs, Vickroy said.
- Clicks: This is a truer measurement of engagement, but it lumps together all clicks and doesn't really measure response, Vickroy said.
A case study
How do you evaluate e-mail campaign success? For Time, which sends 30-40 different weekly campaigns across 14 different brands, the objective varies by campaign: for one it might be new magazine subscriptions, for another renewals, for another bill payment, and so on. Vickroy and his staff want a quick snapshot of how a campaign is doing, followed by a more detailed analysis down the line.
Vickroy cited a recent e-mail campaign's objective to drive traffic to Time's online customer service site. The e-mail contained 10 URLs ― four supporting the objective, two administrative, two secondary objectives and two “negative.”
"The day after a campaign goes out, I'm only interested in the four URLs connected to the primary objective,” Vickroy said. “The other URLs are important in detailed campaign analysis, but not as much for a quick snapshot picture.”
Working with its ESP, Time came up with a solution: to get revised snapshot reporting on each campaign, it assigns each URL as positive, negative or other. The positive clicks are those central to the campaign's objective; negative clicks are opt-out and privacy clicks; and other clicks are administrative and secondary objective clicks. The ESP provides Time with an immediate “snapshot” report with clicks broken down into the three categories, helping it save time and providing it with more accurate reporting, Vickroy said.
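The tagging scheme Vickroy describes can be sketched in a few lines of code. This is a hypothetical illustration only: the URL names, categories and click data below are invented, and Time's actual ESP reporting would work differently under the hood. The idea is simply that each tracked URL is assigned a category up front, so a snapshot report is just a tally of clicks per category.

```python
from collections import Counter

# Assumed mapping of tracked URLs to categories (invented for illustration):
# "positive" = clicks central to the campaign objective,
# "negative" = opt-out and privacy clicks,
# "other"    = administrative and secondary-objective clicks.
URL_CATEGORIES = {
    "https://example.com/customer-service": "positive",
    "https://example.com/account":          "positive",
    "https://example.com/unsubscribe":      "negative",
    "https://example.com/privacy":          "negative",
    "https://example.com/faq":              "other",
}

def snapshot_report(clicked_urls):
    """Tally raw click events into positive/negative/other buckets."""
    counts = Counter(URL_CATEGORIES.get(url, "other") for url in clicked_urls)
    return {cat: counts.get(cat, 0) for cat in ("positive", "negative", "other")}

# Example: four click events recorded the day after a send
clicks = [
    "https://example.com/customer-service",
    "https://example.com/customer-service",
    "https://example.com/unsubscribe",
    "https://example.com/faq",
]
print(snapshot_report(clicks))  # {'positive': 2, 'negative': 1, 'other': 1}
```

The payoff of categorizing up front is that the day-after snapshot reduces to a single aggregate per bucket, rather than a line-by-line review of every URL in the message.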