Email marketing lies, lies, lies
Lie: "If you switch to 'our' email technology solution, we will guarantee you better deliverability."
Truth: Simply switching email service providers won't, by itself, get you better deliverability if you keep following the same bad practices.
If a salesperson from an ESP or technology provider promises you better deliverability this way, call them on it.
Deliverability is a function of many factors, primarily your sender reputation, which is shaped by your list-hygiene practices, permission approach, spam complaint rates and more. The fact is that about 98 percent of deliverability is driven by the practices of you, the sender.
There is no magic potion that one of the major ESPs possesses and others don't. Yes, an ESP or email technology provider can give you advice and strategies to improve your deliverability. In some cases, their infrastructure and adoption of authentication protocols can also help you move the needle, but it's not a sure thing and still relies on your sending practices.
One exception: If you move from a shared IP address to a dedicated IP as part of the move, you might see an improvement, because the bad actors you might have shared your IP address with previously won't be there to affect your delivery. But this is not apples to apples, as you would likely see an improvement if you made the switch to a dedicated IP with your existing provider. If, however, you don't improve your practices, your delivery rates may in fact decline when moving to a dedicated IP on a new provider.
Lie: "My campaign got a 107% (or any number above 100) open rate."
Truth: People who claim that are really reporting "total" open rates, rather than the more accepted "unique" calculation.
I hear people bragging at industry conferences or in case studies that they got open rates over 100%. Sorry, can't happen!
An open rate calculated on "total" opens counts every recorded open, including repeat opens of the same email by the same recipient. In fact, many recipients open the same email two or three times.
For example, if you deliver 100 emails, 60 recipients open the email, and 50 of those 60 open it twice, you log (50 × 2) + (10 × 1) = 110 total opens against 100 delivered emails: a 110% total open rate, even though the unique open rate is only 60%.
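To make the distinction concrete, here is a minimal Python sketch using hypothetical data that mirrors the example above; it assumes you can export one row per recorded open, keyed by recipient ID (the event list and user IDs here are illustrative, not from any real ESP report):

```python
# Total vs. unique open rate from a hypothetical open-event log.
# Each entry in open_events is the recipient ID for one recorded open;
# repeated IDs represent the same person opening the email again.

delivered = 100

# Hypothetical log matching the example: 50 recipients open twice, 10 open once.
open_events = [f"user{i}" for i in range(50)] * 2 + [f"user{i}" for i in range(50, 60)]

total_open_rate = len(open_events) / delivered        # counts every open event
unique_open_rate = len(set(open_events)) / delivered  # counts each recipient once

print(f"Total open rate:  {total_open_rate:.0%}")   # 110%
print(f"Unique open rate: {unique_open_rate:.0%}")  # 60%
```

The unique figure is the one worth reporting and benchmarking; the total figure mostly tells you how often people re-open the same message.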
I call it a lie because this scenario has contributed to the continuing confusion in the industry when it comes to benchmarking email rates. The use of total opens is rather obvious when that rate is above 100%. But when I read in some publication that a company reports a 70% open rate, I'm rather dubious, and I have in fact uncovered a few times that it was actually a total, rather than unique, open rate.
This issue is one of the main reasons the EEC Measurement Accuracy Roundtable exists, because, without further clarification, when someone says his campaign had a 40% open rate, you don't know if he means total opens or unique opens.
And if you've been reading my columns for a while, you know how I feel about open rates, anyway.
Lie: "The average (insert metric here) rate is X."
Truth: Any company that implies an industry-wide metric is a specific "X" is actually only presenting the average across its own client base (and often a subset of it) or some sample group from a survey.
In my mind there is no such thing as an average open, click-through or other rate. Unless someone has surveyed every known sender, or at least a statistically valid sample of all senders, the averages you read about are not averages at all.
These "averages" can certainly help you determine if your own performance is in some ballpark, but unless you know the exact sampling of companies, it is not a reliable benchmark. Don't get me wrong, I'm not against these industry studies, as I have been involved in many myself.
My point is this: don't hold up these numbers as fact; rather, use them as a starting point to develop your own goals and benchmarks. These statistics are averages, and so, if available, you are better served by benchmarking against the top performers. Why would you want to see how you compare to others who are getting a "C" grade?
I'm sure I've missed several common industry lies. What deliberate email untruths have you heard propagated? Post your comments on the Email Insider blog.
Article courtesy of MediaPost