
Marriott’s 500m Record Breach – what they should have done

The massive 500m record breach of Marriott’s Starwood customer database is just the latest in a very long line of high profile, reputation-threatening data breaches.

“Marriott has not finished identifying duplicate information in the database, but believes it contains information on up to approximately 500 million guests who made a reservation at a Starwood property.”

In 2017 alone, over 40 organizations, including Equifax, Verizon, eBay and Uber, were in the news having suffered costly and/or embarrassing data breaches. That seemed bad enough, but according to personal information security specialist IdentityForce, there have already been three times as many in 2018, including Facebook, British Airways and the US Postal Service!

The even worse news for Marriott is that (unlike the companies hacked last year) they now face a potential billion-dollar fine under GDPR (up to 4% of worldwide revenue) if they can’t demonstrate prompt, effective action to notify the relevant data protection authorities and affected customers – and that’s in addition to the probable loss of business and the immediate 6% decline in their share price.

Let’s remind ourselves of the main requirements of GDPR compliance in respect of customer data:

• Keeping customer data accurate, up-to-date and secure
• Proving consent for all use of customer data
• Responding to Subject Access Requests quickly
• Processing the “Right To Be Forgotten”
• Maintaining a complete audit trail of access and updates to customer data
• Notifying affected customers promptly in case of a breach

Obviously, Marriott are in breach of the first duty, and their ability to meet the other responsibilities is about to be put under intense scrutiny by authorities, courts and the media. Note that Marriott “has not finished identifying duplicate information in the database” – they are clearly finding it difficult to assess (and notify) the actual scale of the problem. That will also make it a huge challenge to respond quickly to what are likely to be large numbers of Subject Access Requests, to prove consent for customers who wonder why Marriott hung on to their data, or to reliably erase records for the (probably) hordes who will demand the “Right To Be Forgotten”. With the volumes of data involved, this requires highly accurate, automatable matching – for example, if Marriott remove one or two instances of a customer but other occurrences remain undetected, they will not have fulfilled the deletion request properly. The situation might then be aggravated by marketing to the undetected duplicates, leading to further scrutiny and potentially more fines.
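To see why accurate matching matters for erasure, here is a deliberately naive Python sketch – the field names, keys and records are invented for illustration, and real matching engines use far more sophisticated fuzzy and phonetic keys. The point it demonstrates is that an erasure request must remove every occurrence of the subject, not just the first record found:

```python
import re
from collections import defaultdict

def normalize(value):
    """Lower-case and strip punctuation/whitespace for crude comparison."""
    return re.sub(r"[^a-z0-9]", "", value.lower())

def match_key(record):
    """Build a simple match key from surname + postcode.
    (Illustrative only: a real engine would use phonetic and fuzzy keys.)"""
    return normalize(record["surname"]) + "|" + normalize(record["postcode"])

def erase_subject(records, request):
    """Remove *every* record matching the erasure request,
    not just the first occurrence found."""
    groups = defaultdict(list)
    for r in records:
        groups[match_key(r)].append(r)
    doomed = {id(r) for r in groups[match_key(request)]}
    return [r for r in records if id(r) not in doomed]

records = [
    {"surname": "Smith", "postcode": "SW1A 1AA", "email": "j.smith@example.com"},
    {"surname": "SMITH", "postcode": "sw1a1aa", "email": "jsmith@example.com"},
    {"surname": "Jones", "postcode": "M1 1AE", "email": "a.jones@example.com"},
]
remaining = erase_subject(records, {"surname": "Smith", "postcode": "SW1A 1AA"})
# Both Smith variants are removed; only Jones remains.
```

If the two Smith records had been compared only on exact strings, the second variant would survive the erasure – exactly the failure mode described above.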

Let’s think about what might have been – earlier this year, another very large international hotel group acquired a worldwide licence for our contact data matching engine. Their motivation was primarily twofold: they wanted to improve the quality of matching behind their Single Customer View using best-of-breed matching and to bring it under the control of their corporate database system. From the available cross-platform, on-premise or cloud deployments, our client chose to integrate the matching engine into their Amazon Web Services Linux platform. They recognised that using a discrete system for customer data matching which involves exporting data from one system to another, perhaps via a flat file, makes it difficult to ensure absolute security while the data is in flight – and any security system is only as good as its weakest link. Other significant benefits of integrating matching within the corporate database are the access control and the auditability that this provides.

But let’s imagine that the worst happens and despite their customer data residing only in the most secure place it can be, inside their main database, our client is also hacked. The first difference is that they will be alerted quickly by the monitoring tools within their database, so they can react fast. Next, they can use their accurate, up-to-date Single Customer View (enabled by our uniquely effective customer data matching) to check how many and which customers were affected – this means that they can notify the authorities immediately with concrete information about the hack, as well as the affected customers. Then, our client would be well placed to handle the expected surge in customer demands for information, erasure etc.

The bottom line is that any CTO, CEO and board that is not doing their utmost to keep access to high volumes of customer data secure, or not making sure that the organisation can react effectively in the event of a breach, is betting the farm on thinking that “it wouldn’t happen to us”!

matchIT Updates

What’s new in matchIT Data Quality Solutions? Autumn 2018


matchIT customers will know well that here at helpIT we are constantly taking our customers’ needs and feedback on board to shape the regular updates and enhancements we offer on all matchIT Data Quality Solutions. Our software updates don’t just keep up with the changing landscape of the UK direct marketing industry, they include enhancements to usability – helping you to make the most of your time and get even more out of your data. We’ll be posting separately about GDPR changes to matchIT Web (fuzzy inquiry, duplicate prevention and address verification) and matchIT On Demand (our new cloud matching software-as-a-service), so here we’re updating you on the on-premise batch deployments of matchIT Data Quality Solutions.

matchIT SQL

We’ve been hard at work making further ease of use and functionality improvements to our SQL Server deployment since our last major update earlier this year. Here are some of the key changes to look out for:

  • Suppression:
    • Setting up jobs which include standard suppression (deceased, gone away and MPS) and change of address is now much easier, requiring less experience with SSIS to accomplish.
    • You can now run suppression tasks in parallel for simultaneous processing of a client file against multiple suppression files. On typical multi-core hardware, this will greatly increase your job throughput.
    • We’ve added further standard suppression files including NDR and Remover from the Ark.
  • Matching:
    • Performance improvements made to Group Matches when processing very large groups of matching pairs.
    • Improved performance when overlap matching a small set of data against a much larger universe.
  • Mailing:
    • A customisable pre-built package to make the automation of your mailing jobs significantly easier. Simply tell the package what settings you want to use and the package will do the rest (including matching, address verification, suppression and mail sortation).
  • Address verification:
    • Improved handling of foreign addresses when using UK address verification.
    • International address verification now offers the Canada Post SERP standard.
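The parallel suppression processing mentioned above can be pictured with a minimal Python sketch – this is not matchIT SQL’s actual implementation (which runs inside SSIS), and the file names, keys and records are invented for illustration. Each suppression file is screened by its own worker, so a client file can be checked against all of them simultaneously:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical suppression files: each is a set of normalized keys to suppress.
suppression_files = {
    "deceased": {"smith|sw1a1aa"},
    "gone_away": {"jones|m11ae"},
    "mps": set(),
}

def normalize_key(record):
    return f"{record['surname']}|{record['postcode']}".replace(" ", "").lower()

def screen(client_records, suppression_set):
    """Return the keys of client records found in one suppression file."""
    keys = {normalize_key(r) for r in client_records}
    return keys & suppression_set

client = [
    {"surname": "Smith", "postcode": "SW1A 1AA"},
    {"surname": "Brown", "postcode": "B1 1BB"},
]

# Screen against all suppression files at once, one worker per file.
with ThreadPoolExecutor(max_workers=len(suppression_files)) as pool:
    futures = {name: pool.submit(screen, client, s)
               for name, s in suppression_files.items()}
    hits = {name: f.result() for name, f in futures.items()}
# hits["deceased"] flags the Smith record; Brown survives every list.
```

On multi-core hardware, running each suppression screen concurrently like this is what lets job throughput scale with the number of suppression files.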

matchIT Hub

Our incredibly fast, cross-platform, database-independent API has come on in leaps and bounds over the last year. Significant enhancements include:

  • Smart Matching allows users to simply specify the data layout and let matchIT Hub automatically work out the appropriate matching configuration. It will automatically match on name, company, address/postcode, phone number, email and date of birth if present, and can match on multiple levels simultaneously e.g. individual, family and household, or contact and company. The user can, of course, steer Smart Matching by specifying match levels, matching tightness and matching methods.
  • Dynamic Keys – in Lookup & Overlap modes, this option ensures that matchIT Hub only uses keys that are supported by the data supplied for matching. This greatly improves performance when performing lookups with limited data.
  • Handling of multiple names, addresses, emails and phone numbers etc. per record, as are commonly found in CRM and ERP systems – this includes custom fields e.g. credit card numbers.
  • Sophisticated, easily-configurable bridging prevention and post-processing of matches – this prevents records being matched or combined into groups when the use case dictates that they shouldn’t be.
  • Integration into the Alteryx data preparation tool – this is live for our US clients and will soon be available off-the-shelf for UK customers too. matchIT Hub is also available integrated into Talend and Pentaho – please contact us if you use any of these tools.
  • The Hub Service allows you to match multiple incoming files against a universe of data held in memory and also supports single record lookup.
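As a rough illustration of matching at multiple levels simultaneously – not the Smart Matching algorithm itself, whose key construction is far more sophisticated – the following Python sketch derives an individual-level and a household-level key for each record and groups on each. All field names and key formulas are invented:

```python
from collections import defaultdict

def keys(record):
    """Derive match keys at two levels for one record (illustrative only)."""
    surname = record["surname"].lower()
    postcode = record["postcode"].replace(" ", "").lower()
    return {
        # Individual level: forename initial + surname + postcode.
        "individual": f"{record['forename'][0].lower()}|{surname}|{postcode}",
        # Household level: surname + postcode only.
        "household": f"{surname}|{postcode}",
    }

def group_by_level(records, level):
    """Group records sharing the same key at the given level."""
    groups = defaultdict(list)
    for r in records:
        groups[keys(r)[level]].append(r)
    return [g for g in groups.values() if len(g) > 1]

people = [
    {"forename": "John", "surname": "Smith", "postcode": "SW1A 1AA"},
    {"forename": "J",    "surname": "Smith", "postcode": "SW1A 1AA"},
    {"forename": "Mary", "surname": "Smith", "postcode": "SW1A 1AA"},
]

# Individual level: John and J Smith group together (same initial);
# household level: all three Smiths at the address form one group.
individual_groups = group_by_level(people, "individual")
household_groups = group_by_level(people, "household")
```

The same records thus yield different groupings depending on the level requested – which is why the user can steer matching by specifying match levels and tightness.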

matchIT Desktop

The latest version of the Desktop solution, version 6.1, includes a number of enhancements and updates that further refine the ease of use and capability it is known for:

  • Improvements to make Filter and Index Order boxes in the Output Options screen more user friendly.
  • Provision for an option to include all Mailmark fields that Sort and Save produces.
  • Sort and Save Seeds functionality is now available.
  • General improvements to the Hosted Service suppression and enrichment interface.

Seventy years since the first software program ran

Today is the 70th anniversary of the first successful execution of the world’s first software program – an occasion I wrote about on this blog at the 65th anniversary, which was also commemorated with a specially commissioned video on Google’s official blog. The Manchester Small-Scale Experimental Machine (SSEM), nicknamed Baby, was the world’s first stored-program computer, i.e. the first computer that could be programmed for different tasks without rewiring or physical reconfiguration. The Baby was designed by Frederic C. Williams, Tom Kilburn and my father Geoff Tootill, and ran its first program on 21st June 1948.

It is the first anniversary of the Baby which my father hasn’t celebrated, as he passed away last October at the grand old age of 95. His contribution to the development of the modern day computer was recognised with obituaries in the Daily Telegraph, Guardian and the Times – the last of these was the most detailed and is reproduced here by kind permission of The Times. It reminds me of how gratified my father was to successfully debug a program that Alan Turing gave him to run on the Baby in the autumn of 1948! Apart from the Baby, Geoff Tootill worked on many, very varied scientific developments, including airborne radar during the Second World War, the first commercial computer at Ferranti, satellite communications, packet switching networks which foreshadowed the web, air traffic control and collision avoidance at sea. His final computing legacy was the phonetic algorithm that we use today in matchIT.

I’m in Manchester where there is a working replica of the Baby on display at the Museum of Science and Industry – my brother Peter will be handing over a test device that our father used when building the Baby, to be exhibited alongside the replica. Volunteers from the Computer Conservation Society are rerunning the first program. It will be great to catch up with Chris Burton who led the team that built the reconstruction of the Baby for the 50th anniversary in 1998 and to meet again Professor Dai Edwards who worked with my father in 1948. There are very few of the pioneers left now from those early days, so it is wonderful that the Computer Conservation Society has kept alive their legacy by lovingly reconstructing the machines that they built, including the Colossus at Bletchley Park. We owe them a heartfelt “Thank you!”


GDPR is live, what’s next?

In the run-up to May 25th, it seems to me that most companies focused all their efforts on ensuring GDPR compliance on the consent and contracts fronts. Assuming these are now sorted, you now need to make sure that the personal data you hold about your customers and prospects is accurate and up to date. After all, although the Information Commissioner’s Office doesn’t expect everyone to be perfect by now, they have stressed the importance of demonstrating continuing efforts to achieve full compliance. As Richard Sisson, senior policy officer at the ICO, says: “You can’t forget about GDPR and it’s done. It’s an ongoing thing.”

He expanded on this by saying “We are trying to reassure people that if you are trying to do the work that you can to comply, if you are working towards the accountability principle and ensuring you have records of what you’re doing, and you can show that you are working towards compliance – we may not be entirely happy all the time, but we will take those things into consideration. We understand that. We’re not going to be issuing huge fines on 25th May.”

But if you aren’t sure how accurate and up to date your data is, it won’t be! And you need to start doing something about it now. As the Chair of the EU Article 29 Working Party Isabelle Falque-Pierrotin said, “This is a learning curve and we will take into account, of course, that this is a learning curve… but it’s important that you start today, not tomorrow. Today.”

There are two key things that you need to focus on to start with:

  • Making sure that you only communicate with your customers using accurate and up-to-date data. This will minimise the numbers that are prompted to contact you to question what data you have on them and maybe lodge a Subject Access Request.
  • Being able to respond promptly and fully to Subject Access and erasure requests (Right To Be Forgotten).

An accurate and current Single Customer View is essential to have full confidence that you’re meeting your data compliance obligations – but this can involve not only implementing suitable software to create and maintain this Single Customer View, but also admin work in human review of “grey area” matches – records that might be for the same person but are sufficiently different to need someone to check and maybe dig deeper.

So how do you reduce the chances of data inaccuracies being drawn to the attention of your customers, while showing solid steps taken and scheduled if someone lodges a complaint with the ICO?

  1. Consider a comprehensive, effective audit of your personal data, checking for duplication, out-of-date and incorrect addresses, people who have moved or died, phone numbers on the Telephone Preference Service etc.
  2. Make sure that data for any mass campaign or mailing that you undertake is run through an effective data cleansing solution to fix any problems before it is sent to print or the telemarketing agency.
  3. Take steps to implement a Single Customer View. The best matching software such as matchIT Data Quality Solutions will intelligently grade matches so that the vast majority can be automatically processed: combining duplicate records and linking matching records etc. Then the chances of your customers being aware of a problem are greatly reduced.
  4. While your admin team is reviewing the low-scoring matches to make manual decisions – or even before you’ve done the automatic processing – matchIT Web provides a real-time Single Customer View that interrogates all your databases as part of your inquiry function: it allows your users to see all potential matches on screen when a customer calls in. It also provides a quick, effective way of handling Subject Access Requests and the Right To Be Forgotten.
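The intelligent grading described in step 3 can be pictured with a minimal Python sketch – the scores, thresholds and routing names below are invented for illustration, not matchIT’s actual scoring model. High-confidence pairs are processed automatically, a middle band goes to human review, and the rest are left alone:

```python
def grade(score, auto_threshold=85, review_threshold=60):
    """Route a candidate match pair by its score (thresholds illustrative)."""
    if score >= auto_threshold:
        return "auto-merge"       # safe to combine without human review
    if score >= review_threshold:
        return "manual review"    # a "grey area" match for the admin team
    return "no match"             # too different to treat as a duplicate

# Hypothetical scored pairs of record IDs.
pairs = [("A1", "A2", 92), ("B1", "B2", 71), ("C1", "C2", 40)]
routed = {(a, b): grade(s) for a, b, s in pairs}
# {("A1","A2"): "auto-merge", ("B1","B2"): "manual review", ("C1","C2"): "no match"}
```

The narrower you can make the middle band without sacrificing accuracy, the less admin work the “grey area” review generates – which is the practical benefit of better matching.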

One more thing to keep in mind of course, is that you need to make sure that your customer data is kept secure at all times while you’re on your journey towards GDPR data compliance. Maybe I should add a 5th item to those above: make sure you’ve got a plan for if the worst happens and you have to notify the ICO of a data breach… which could in turn require notifying all your customers. The sooner you have that accurate Single Customer View in place, integrated into the security of your database, the sooner you can be confident that you’re doing everything you can to minimise the chances of a breach – and the easier it will be to notify your customers should one happen.


GDPR – What you need to know about Consent

Who should read this?

This post is written for Data Controllers and anyone who needs to understand the Data Controller’s obligations under GDPR for obtaining consent. We’re talking about customers here, but the same obligations apply to any other personal data that you hold.

What do you need to know?

Over the last several years, data-driven marketing has been increasingly adopted as companies strive to make more effective use of their customer data. However, the advent of GDPR has led many to question whether this trend can continue. The main problem is the increased burden that GDPR places on you, if you are the data controller, for obtaining consent from the customer. Before we look at this in more detail though, let’s examine when consent must be explicit and when it need not be.

Although GDPR makes several mentions of “explicit consent” rather than just “consent”, it does not define what it means by “explicit”. Explicit consent is required for automated decision-making including profiling e.g. for the purpose of deciding whether to extend an offer. If you’re relying on consent (rather than one of the other provisions of the GDPR) for processing of sensitive personal data or transferring personal data outside of the EEA, then it must be explicit.

You don’t need explicit consent if, for example:

  • You need to use the customer data to fulfil your obligations under a contract
  • It’s required by law
  • It’s in the customer’s vital interests
  • It’s in your legitimate interests, except where such interests are overridden by the interests, rights or freedoms of the customer.

Under the current Data Protection Act, consent must be freely given, specific and informed but it can be implied. As now, under the GDPR, sensitive personal data requires explicit consent. Under the GDPR, consent must also be unambiguous and you must be able to demonstrate that consent was given. It must be clearly distinguishable, intelligible and easily accessible and the customer must be informed of their right to withdraw consent – it needs to be as easy to withdraw consent as it is to give consent. There is also (under GDPR) the much discussed “Right to be forgotten” – the right to request the deletion or removal of personal data where there is no compelling reason for its continued processing. When a customer withdraws consent, they may be more likely to exercise the right to be forgotten, so systems need to be designed with this in mind.

Under GDPR it’s no longer acceptable to bundle permission into the terms and conditions governing supply; consent requests must be unbundled from the contract. Each aspect for which consent is being sought also has to be explained in granular detail. To be “freely given”, consent should not be required where it is not necessary for the performance of the contract, which could affect some providers of free web services – as discussed in more detail in this post.

Do I need consent for Direct Marketing?

The legitimate interest test needs some clarification. GDPR includes an explicit mention of Direct Marketing as a legitimate interest, but as Steve Henderson points out in this interesting post on the DMA website, this must be seen in the context of how you came to obtain that data in the first place, as well as how you’re going to use it. For example, if you have obtained the contact details electronically then you continue to be bound by the Privacy and Electronic Communications Regulations (PECR) and the EU E-Privacy Directive, irrespective of the legitimate interest test. Conversely, if the personal data is used for direct mail and was obtained in store or via a coupon, the legitimate interest test is applicable.

How is it going to work in the UK?

Now I want to look at how GDPR may be enshrined in UK regulations. There has been a lot of reaction from the industry to the Information Commissioner’s Office consultation on its GDPR consent guidance; it’s good to see TechUK representing these views to the Department for Culture, Media and Sport (DCMS) robustly. The draft guidance states that consent requires a positive opt-in and that pre-ticked boxes or any other method of consent by default should not be used. However, the guidance also recognises that other opt-in methods might include signing a consent statement, oral confirmation, a binary choice presented with equal prominence, or switching a setting away from the default.

In the draft guidance there is a tendency to treat opt-in consent and “explicit consent” as the same, which GDPR itself does not. One potential issue which could arise is the effect that requiring opt-in consent for non-sensitive personal data could have on suppression processing: our comment to TechUK (which it included in its response to DCMS) was:

“One example of where an over-reliance on opt-in consent for non-sensitive personal data could have unintended consequences is where customers have moved to a new address. Currently, there are a handful of providers of lists of people who have moved which are used by many companies, either to suppress correspondence to that person at the old address, or (if the new address is available and it’s for a permitted purpose) to update correspondence records to the new address. The information for these lists is typically provided by financial services companies, who will not provide the data if they believe that it may be contrary to GDPR. Without the new address, suppliers would not know that their customer has moved and would be unable to contact them for permission to send correspondence – it would rely on the customer notifying each company that they have done business with of their new address. An inevitable result therefore of requiring an opt-in consent for non-sensitive personal data would be more mail for the old occupant sent to the new occupant at the customer’s old address.”

The bottom line

Although it’s easy to get bogged down in the detail of what you need to do regarding consent and to look on its tightening up by GDPR as a negative, DQM GRC research this year shows that an increasing proportion of consumers are ready to share their personal data if it’s properly explained why they should do so and the benefit to them: “two-thirds of consumers said they are willing to share their personal information in the right circumstances – a positive shift since 2016 when only half said this was the case”. The fundamental truth is still the same: an engaged customer who feels that they are in charge of the relationship is more likely to be a valuable customer.

Further reading

Some more information, including tips on how to handle the subject of consent, is in this article from Kemp Little.

In the next post, we’ll look at the differing obligations of Data Controllers and Data Processors.


Suppression screening, terrorism and your vote

The election today and the aftermath of the appalling events at London Bridge and Borough Market have disrupted my momentum for writing about EU GDPR and what you need to know to get ready for next May 25th. I’m a frequent visitor to Borough Market and often walk across London Bridge, so like many others, this is the first time that terrorism has seemed so close to me.

Top of my mind this week is the news that the third mass murderer at London Bridge was an Italian/Moroccan whose name is apparently on the Schengen Information System – according to the BBC, “An Italian police source has confirmed to the BBC that Zaghba had been placed on a watch list, which is shared with many countries, including the UK.” Both the Westminster Bridge and London Bridge attacks were conducted using hired vehicles, the first a car and the second a van. Last month, the U.S. Transportation Security Administration announced that it wants truck rental agencies to be more vigilant in efforts to prevent these attacks and according to the same article, Penske (a nationwide truck leasing company in the US) screens customers using a watch list.

So, the first question that springs to my mind after London is “Should vehicle rental companies in Europe be screening customers against the Schengen list?” Obviously, not all such attacks are committed using hired vehicles, but many (if not most) are committed using hired or stolen vehicles – and stolen vehicles are likely to be on a police database with an active watch being kept out for them. The larger the vehicle, the more dangerous it is, the more likely it is to be able to crash through barriers and kill and maim people – and the more likely it is to be hired or stolen rather than owned.

The next question that rose to my mind was “Will the UK still have access to the Schengen list after Brexit?” Hopefully, however “hard” Brexit turns out to be, UK and EU negotiators will have cooperation on terrorism at the top of their list and such information will continue to be shared, so increasing systematic use of this data should be top of many people’s agendas.

Last, I worried whether the increased responsibilities for protection of personal data (and vastly increased fines) being introduced with GDPR next May will lead to companies putting their own interests first when it comes to (not) sharing information about suspicious persons with the authorities – or whether exemptions need to be written into the guidance to ensure that individuals and organisations aren’t fined for breaches of GDPR while trying to do the right thing to protect the public. I can ask this at next week’s techUK Data Protection Group, where one of the people developing the legislation and guidance at the Department for Culture, Media & Sport will be in attendance.

One other thought concerning data about people seems particularly relevant today – last Tuesday’s Telegraph reported “fears that thousands of postal ballots could have been sent out to voters who have died, putting their vote at risk of being used by somebody else”. Of course, speaking from personal experience, a potentially much bigger fraud could involve all the residents of care homes, especially those with Alzheimer’s, being sent postal votes. Are additional precautions taken to check that these votes are being filled in by the residents themselves? I know that in at least some cases, the postal vote addressee is not screened against the Power of Attorney registers. Given that GDPR obliges organisations to make sure the personal data they keep is accurate and up to date, I wonder how the formula for fining an organisation 2-4% of global gross revenue under GDPR applies to a taxpayer-funded body such as a local authority!?

GDPR – what is changing next May?

This is the second in a series of posts about the EU GDPR – now less than 12 months away! If you are a marketer, a business/systems analyst or a processor of third party data, this series of posts is written with you in mind – I hope you will be able to use it and the links that we provide to save you time and point you in the right direction as you grapple with the GDPR challenge. You can of course download the 130-page guide from the Information Commissioner’s website or browse through it during your lunch break, but if you want something targeted at you and split into manageable chunks, read on! In the first post, we mentioned how a genuine Single Customer View helps to keep data accurate and up to date. In this post we cover what will change from the current Data Protection Act.

At another very informative TechUK meeting last week on this topic, Rob Luke from the ICO described GDPR as an evolution of the existing rules and not a revolution. Essentially the GDPR is tightening up and clarifying existing rules more than introducing new ones, but there are two big differences:

  • Most obviously there are now huge penalties that can be imposed as discussed last week.
  • There are now responsibilities for data processors as well as data controllers, which is particularly significant for our industry.

Other key changes include mandatory notification of breaches, stricter rules on what constitutes sensitive personal data, making it harder to obtain consent and the introduction of mandatory data protection officers for some types of usage.

We will look at the new obligations for data processors in a later post aimed solely at our professional service provider audience, as well as looking at the new obligations for data controllers in a post specific to them. The key aspect of the GDPR which bears on the relationship between data controller and data processor is the much tighter control of data transfer and the need for written agreements between the two parties detailing their respective responsibilities. We will also look at when you need to obtain explicit consent and what has changed in this respect in a later post – whether you can adopt opt-out or have to settle for opt-in is now a more complex question.

Finally, if you are designing new systems, GDPR obliges you to undertake a Privacy Impact Assessment and incorporate Privacy by Design into your system – privacy and security should not be an afterthought. You must also incorporate privacy by default into collection of personal data: Fieldfisher’s blog summarises it as “businesses should only process personal data to the extent necessary for their intended purposes and should not store it for longer than is necessary for these purposes”. They state as an example that systems should “allow suppression of data of customers who have objected to receiving direct marketing”.

In the next post we will look at the key definitions in the GDPR so you can decide whether some of the obligations do indeed apply to your business. If you can’t wait, you can get a head start by reading this table in White & Case’s excellent handbook on GDPR. To be prompted about the next instalment in our series of posts, follow us on Twitter.


EU GDPR now just 12 months to go – where do you start?

As you should know, the EU General Data Protection Regulation (GDPR) comes into force one year from today, 25th May 2018. As we will still be in the EU then, whatever kind of Brexit we are in for, you only have 12 months to make sure that all your systems support compliance. If you need any incentive to start taking this seriously, you only have to consider that the maximum fine for breach of data protection regulations is increasing from £500,000 to €10 million or 2% of global gross revenue (whichever is higher) – that’s just for a level 1 breach, with double these amounts for a level 2 breach!

To help you on your journey to GDPR compliance, we will be publishing a series of posts about aspects of GDPR over the next few weeks. Initially, we will cover:

After that, we will look at how matchIT Data Quality Solutions can help you be compliant and avoid breaches of the new law – especially how a genuine Single Customer View helps to keep data accurate and up to date and ensures that you can respond to Subject Access Requests promptly, fully and efficiently.

The implications of GDPR are far reaching and HMG guidance is still being developed by the Department for Culture, Media and Sport, in consultation with industry bodies such as TechUK. Some companies may find that with only one year to go, they may not be able to become completely compliant by then – in which case, it is vital to mitigate potential costs of non-compliance by demonstrating effective progress, with a realistic timetable for full compliance. Look out for our next few posts to help you navigate towards that goal!


Bridging the skills gap

TechMarketView’s UKHotViews© post Are you hiding from your skills crisis? last week really struck a chord. Kate Hanaghan gave some interesting feedback about Microsoft’s Cloud Skills Report (which surveyed 250 mid-sized to large UK organisations) but in our experience, many of the same issues apply to moving from proprietary in house systems or legacy packaged software to industry-standard data platforms such as SQL Server.

According to Kate, “individuals themselves are not always keen to move away from the technologies they have spent years working with” and suppliers need to “convince technologists (who tend to be middle aged and highly experienced) they must re-skill in certain areas to support the business as it attempts to grow in digital areas”.

Although, as Kate says, many legacy technologies will be around for years to come, I think that with the increasing pace of technological change, individuals are unwise to ignore opportunities to embrace new technologies. Movement to the cloud is now so rapid that cost and competitive pressures will force many organisations that are currently steadfastly “on premise” to start moving across sooner rather than later – particularly marketing services companies, where demand is elastic. Companies and individuals who try to move from 20-year-old, non-standard technology straight to the cloud will struggle, whereas companies with more modern infrastructure, and techies with more modern skills, will have more of an evolutionary, beaten path.

Apart from competitive pressures, there are many other sound reasons for moving from such ageing systems to industry-standard data platforms, as we wrote in Data cleansing – inside the database or on your desktop? One of the key reasons is that a platform like SQL Server is much more extensible – for example, in the marketing services sector, our matchIT SQL package can connect directly with MIS products upstream and document composition products downstream using shared data, so all the data is processed within SQL Server. For the company, data is more secure and both errors and turnaround time are greatly reduced. For IT staff, it means they can enhance their CVs with sought-after skills and be ready to embrace the next opportunity a rapidly changing world gives them – such as using Microsoft Azure or Apache Spark for cloud deployment.

I’ll leave the last word to Kate, who wrote to me about her post: “In some ways I just find it so hard to understand. Who wouldn’t want to future-proof their career?! I mean, we’re going to be working till we’re 80!!”

 

matchIT SQL – finalist at the DBM Awards 2016

We were thrilled to be selected as finalists for this prestigious industry award, and to join our peers, who included some of the industry’s finest, at the 2016 DBM awards. The event was hosted at an impressive and historic location in London, The Brewery.

Whilst helpIT systems offers a range of deployments for its class-leading matchIT Data Quality Solutions, it was our SQL Server-integrated solution, matchIT SQL, that was recognised as standing out from the crowd in the fiercely competitive “Innovation in database marketing-related software” category.

matchIT SQL’s shortlisting was in part due to our innovative and seamless integration with Microsoft SQL Server, and its unique ability to bring together a comprehensive array of contact data quality functions, all available and accessed natively within Microsoft SQL Server. From highly accurate, intelligent data matching, UK and international address validation, suppression screening and integrated email validation, through to Power BI reporting and a choice of local, Microsoft Azure or web service deployment, matchIT SQL is obviously hitting the mark.

Perhaps it is because marketing agencies and data professionals alike have realised how matchIT SQL helps them build and maintain highly accurate single customer views and comply with the new GDPR. Or that it enables the seamless preparation of targeted and accurate marketing and campaign data, as well as intelligent management and monitoring of data workflows and data feeds coming into a business.

Whatever it was that resulted in matchIT SQL being shortlisted, we are very proud of our team here at helpIT systems, who are behind this innovative technology and who have been providing such class-leading software in the Contact Data Quality space for over 25 years.

On the night, the award went to Purple Agency, who were behind a bespoke solution created for the Financial Times, involving a system to capture and hold every nugget of customer data in pursuit of complex goals. While matchIT SQL may not have been chosen as the winner on this occasion, we’re very proud of our selection as finalists and of the recognition that helpIT systems is still punching well above its weight and remains a pioneer in developing innovative Database Marketing-related solutions.

Thank you to the team at the DBM Awards, and in particular to our fantastic hosts Stuart Cal, Anthony Begley and the team, who made the evening the great success it was.