Who is the “Current Resident”, anyway?

Our CEO received an extremely heavy, expensive-looking catalog in the mail the other day, from an upmarket retailer, addressed to a previous occupant of his house “Or Current Resident”. When you receive catalogs addressed to the previous homeowner or the “current resident”, do you read them or toss them? Obviously the company hopes that anyone who receives a catalog at this address will take a gander at what’s being offered.

But is this a cost-effective supposition? When you consider the resources wasted on shipping a catalog to anyone who happens to live at a particular address, you have to wonder whether this is a smart strategy or just a cop-out that avoids cleansing a database.

Using address verification software that includes the National Change of Address (NCOA) service would help catalog senders increase their return on investment by updating their databases as frequently as needed. The NCOA service ensures that databases carry each customer’s current address, and warns of customers who are deceased or who moved without leaving a forwarding address.

NCOA relies on customers filing a Change of Address form with the USPS, whose internal databases track that information and relay it to the NCOA service. Rather than make use of NCOA data, many companies simply add “Or Current Resident” to the name in their databases, as the most timely and least expensive way of allowing for the possibility that the addressee may no longer be there.
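To make the workflow concrete, here is a minimal sketch in Python of how a mailer might apply NCOA-style results to its list. The record format, the `ncoa_moves` lookup, and every name and address below are hypothetical illustrations; in practice the matching is performed by a USPS-licensed NCOA service, not by code like this.

```python
# Illustrative sketch: applying NCOA-style move data to a mailing list.
# All data here is made up; a real NCOA match comes from a licensed service.

mailing_list = [
    {"name": "Jane Doe", "address": "12 Oak St, Springfield, IL 62701"},
    {"name": "John Roe", "address": "9 Elm Ave, Dayton, OH 45402"},
]

# Simulated NCOA responses, keyed by (name, old address).
ncoa_moves = {
    ("Jane Doe", "12 Oak St, Springfield, IL 62701"): {
        "status": "moved", "new_address": "44 Birch Rd, Peoria, IL 61602",
    },
    ("John Roe", "9 Elm Ave, Dayton, OH 45402"): {"status": "no_change"},
}

def apply_ncoa(records, moves):
    """Return (updated records, records flagged for suppression/review)."""
    updated, flagged = [], []
    for rec in records:
        result = moves.get((rec["name"], rec["address"]), {"status": "no_match"})
        if result["status"] == "moved":
            # Mail goes to the customer's new address, not "Current Resident".
            updated.append({**rec, "address": result["new_address"]})
        elif result["status"] in ("deceased", "moved_no_forwarding"):
            # Suppress rather than ship an expensive catalog into the void.
            flagged.append(rec)
        else:
            updated.append(rec)
    return updated, flagged

clean_list, review_list = apply_ncoa(mailing_list, ncoa_moves)
```

The key design point is the split between records that get a corrected address and records that should simply be suppressed, which is where the shipping savings come from.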

Set against the convenience of this tactic, these factors should also be considered:

  • The expense of shipping items to an old address
  • The much-reduced chance that the new resident will make a purchase
  • Losing track of a past customer
  • Alienating the new mail recipient

But does the NCOA process take that much time, or add to the expense of the mailing? Well, the answer is “no” on both counts! A whole spectrum of NCOA options is available, from desktop software that can be used by marketers, and software integrated into the corporate database (both contacting an NCOA service under the hood), to online bureaus that take your data and return the updated file a few hours later. The cost depends on data volumes, but even if you only have a few thousand records in your mailing file, you can always find an option that saves you money compared with print and mail costs – especially if your catalog is bulky.
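As a back-of-the-envelope illustration of that cost comparison, the break-even point can be computed like this. Every figure below is made up purely for the sake of the arithmetic; real per-record and per-catalog costs vary by vendor and volume.

```python
# Illustrative break-even calculation (all figures are assumptions):
# how many undeliverable catalogs must NCOA processing prevent to pay
# for itself?

records = 5_000              # assumed size of the mailing file
cost_per_catalog = 4.50      # assumed print + postage for one bulky catalog
ncoa_cost_per_record = 0.01  # assumed bureau rate per record

ncoa_total = records * ncoa_cost_per_record          # total NCOA cost
break_even_catalogs = ncoa_total / cost_per_catalog  # catalogs to avoid

print(f"NCOA cost: ${ncoa_total:.2f}")
print(f"Break-even: {break_even_catalogs:.1f} avoided catalogs")
```

Under these assumed numbers, preventing only about a dozen wasted catalogs out of 5,000 records already covers the processing cost, which is the sense in which the text says you “can always find an option that saves you money.”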

Sending catalogs to the “current resident” might sound like easy advertising, but it rarely delivers a return on the printing and mailing investment, and it doesn’t help your brand. It really is easy, and much smarter, to keep track of customers with NCOA services, stop shipments to non-existent customers and save money to reinvest in other, more effective marketing efforts.

6 Reasons Companies Ignore Data Quality Issues

When lean businesses encounter data quality issues, managers may be tempted to leverage existing CRM platforms or similar tools to try to meet the perceived data cleansing needs. They might also default to reinforcing existing business processes and educating users in support of good data. While these approaches might be a piece of the data quality puzzle, it would be naive to think that they will resolve the problem. In fact, ignoring the problem while trying half-hearted approaches can actually amplify the problem you’ll eventually have to deal with. So why do they do it? Here are some reasons we have heard for why businesses have stuck their heads in the proverbial data quality sand:

1. “We don’t need it. We just need to reinforce the business rules.”

Even in companies that run the tightest of ships, reinforcing business rules and standards won’t prevent all your problems. First, not all data quality errors are attributable to lazy or untrained employees. Consider nicknames, multiple legitimate addresses and variations on foreign spellings, to name just a few. Plus, while getting your process and team in line is always a good habit, it still leaves the challenge of cleaning up what you’ve got.

2. “We already have it. We just need to use it.”

Stakeholders often mistakenly think that data quality tools are inherent in existing applications or are a modular function that can be added on. Managers with sophisticated CRM or ERP tools in place may find it particularly hard to believe that their expensive investment doesn’t account for data quality. While customizing or extending existing ERP applications may take you part of the way, we are constantly talking to companies that have used up valuable time, funds and resources trying to squeeze a sufficient data quality solution out of one of their other software tools, and it rarely goes well.

3. “We have no resources.”

When human, IT and financial resources are maxed out, the thought of adding a major initiative such as data quality can seem foolhardy. Even defining business requirements is challenging unless a knowledgeable data steward is on board. With no clear approach, some businesses tread water instead of mounting a formal assault. It’s important to keep in mind, though, that procrastinating on a data quality issue can cost more resources in the long run, because the time it takes staff to navigate problem-ridden data takes a serious toll on efficiency.

4. “Nobody cares about data quality.”

Unfortunately, when it comes to advocating for data quality, there is often only one lone voice on the team, championing something that no one else really seems to care about. The key is to find the people who get it. They are there; the problem is that they are rarely asked. They are usually in the trenches, trying to work with the data or struggling to keep up with the maintenance. They are not empowered to change any systems to resolve the data quality issues, and may not even realize the extent of the issues, but they definitely care, because it impacts their ability to do their job.

5. “It’s in the queue.”

Businesses may recognize the importance of data quality but just can’t think about it until after some other major implementation, such as a data migration, integration or warehousing project. It’s hard to know where data quality fits into the equation, and when and how that tool should be implemented, but it’s a safe bet that the time for data quality is before records move to a new environment. Put another way: garbage in = garbage out. Unfortunately for these companies, the unfamiliarity of a new system or process compounds the challenge of cleansing data errors that have migrated from the old system.

6. “I can’t justify the cost.”

One of the biggest challenges we hear about in our industry is the struggle to justify a data quality initiative with an ROI that is difficult to quantify. However, just because you can’t capture the cost of bad data in a single number doesn’t mean that it’s not affecting your bottom line. If you are faced with the dilemma of ‘justifying’ a major purchase but can’t find the figures to back it up, try to justify doing nothing. It may be easier to argue against sticking your head in the sand than to fight ‘for’ the solution you know you need.

Is your company currently sticking its head in the sand when it comes to data quality? What other reasons have you heard?

Remember, bad data triumphs when good managers do nothing.

Data Quality Makes the Best Neighbor

So this week’s #dataqualityblunder is brought to you by the insurance industry, and it demonstrates that data quality issues can manifest themselves in a variety of ways and have unexpected impacts on the business.

Case in point – State Farm. Big company. Tons of agents. Working hard at a new, bold advertising campaign. It’s kind of common knowledge that they have regional agents (you see the billboards throughout the NY Tri-State area) and it’s common to get repeated promotional materials from your regional agent.

But, what happens when agents start competing for the same territory? That appears to be the situation for a recent set of mailings I received. On the same day, I got the same letter from two different agents in neighboring regions.

Same offer. Same address. So, who do I call? And how long will it take for me to get annoyed by getting two sets of the same marketing material? Although it may be obvious, there are a few impacts from this kind of blunder:

  • First of all – wasted dollars. It’s not clear who foots the bill here – State Farm or the agents themselves – but either way, someone is spending more money than they need to.
  • Brand equity suffers. When one local agent promotes himself to me, I get a warm fuzzy feeling that he is somehow reaching out to his ‘neighbor’. He lives in this community and will understand my concerns and needs. This is his livelihood and it matters to him. But when I get the same exact mailing from two agents in different offices, I realize there is a machine behind this initiative. The warm feelings are gone, and the brand State Farm has worked so hard to develop loses its luster.
  • Painful inefficiency. I am just one person who got stuck on two mailing lists. How many more are there? And how much more successful would each agent be if they focused their time, money and energy on a unique territory instead of overlapping ones?

There are lots of lessons in this one, and there are a variety of possible reasons for this kind of blunder. A quick call to one of the agents revealed that most of the lists come from the parent organization, though some agents do supplement with additional lists; either way, I was assured this kind of overlap was not expected or planned. That means there is a step (or tool) missing from the process. It could require a change in business rules for agent marketing. It’s possible they have the rules in place but that they require greater enforcement. It could just be a matter of implementing the right deduplication tools across their multiple data sources. There are plenty of ways to insure against this kind of #dataqualityblunder once the issue is highlighted and data quality becomes a priority.
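To illustrate the deduplication idea in the simplest possible terms, here is a minimal Python sketch that collapses overlapping mailing lists by normalizing name and address into a match key. The names and addresses are made up, and real deduplication tools use far more sophisticated matching (standardized addresses, phonetic name comparison, household-level linking); this shows only the core idea.

```python
import re

def normalize(record):
    """Build a crude dedup key: lowercase, collapse punctuation and spacing."""
    key = f"{record['name']} {record['address']}".lower()
    return re.sub(r"[^a-z0-9]+", " ", key).strip()

def dedupe(*lists):
    """Merge mailing lists, keeping the first copy of each matching record."""
    seen, unique = set(), []
    for lst in lists:
        for rec in lst:
            k = normalize(rec)
            if k not in seen:
                seen.add(k)
                unique.append(rec)
    return unique

# Two agents' lists containing the same household, formatted differently.
agent_a = [{"name": "Pat Smith", "address": "7 Main St., Nutley NJ"}]
agent_b = [{"name": "Pat  Smith", "address": "7 Main St, Nutley, NJ"}]

combined = dedupe(agent_a, agent_b)  # only one record survives
```

Running a step like this across the parent organization's list and each agent's supplemental lists, before anything is printed, is exactly the kind of missing process step the story points to.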