A guest column by Alex Mohelsky, EY Canada National Analytics Leader, and Roobi Alam, EY Canada Senior Risk Manager
For some time now, the promise of big data and advanced analytics has been obvious. Customers get a better experience tailored to their needs, wants and preferences, and companies get actionable insights that help them approach customers with more relevant offers they’re more likely to accept. Win-win, right?
If only it were that simple.
The reality has been that while companies have eagerly jumped on the opportunity, too many demand instant consent to broad terms and conditions, granting themselves vast power over the customer data in question.
Have you ever browsed for new sneakers on your search engine, only to suddenly get bombarded by various shoe ads across your social media platforms? That’s pushing the creep factor.
Most of the time this data is used harmlessly. But there have been times when it has been seriously misused, as recent scandals have illustrated most plainly. And, unsurprisingly, people are taking notice – and being quite vocal with their criticism.
Governments are facing pressure to step in and take up the customer-protection mantle, with Europe’s tough data protection rules the most recent example.
As regulations toughen and expectations rise, companies that use customer data as part of their business now stand at a crossroads. On the one hand, they can do the bare minimum required to comply with the data protection laws in their markets.
The alternative – what we believe truly forward-thinking organizations need to embrace – is the development of a robust data privacy strategy that ensures customers own their data and control how it’s used.
So, how does a company turn its data privacy approach from creepy to cool?
Be transparent from the get-go
To build trust, companies should tell customers why they’re asking for a piece of information at the time that they request it, in plain language. For example, customers are often asked to provide their date of birth when they visit a company website, without any purpose or explanation whatsoever.
Offering a brief explanation alongside the data request would go a long way to empowering customers and allowing them to make more informed decisions about what to provide. Walk-through videos of privacy settings are another good idea, as they explain, in plain and easy-to-understand terms, what customers can control when handing over their data.
Give customers the right tools
Customers should be equipped with tools that let them understand just how much data they generate, how it’s an asset to the organization to which they’re lending it, and how they can control its use. A personal data cloud or vault can also let customers organize their data around various uses, like medical, social or banking, and dictate what use is appropriate and what isn’t.
Ideally, such tools should be easy to find and use, and also make it easy for people to remove their data if they so desire. Europe’s General Data Protection Regulation (GDPR) requires that customer data be erased without undue delay. This approach, dubbed the “right to be forgotten,” may not be the law in Canada (yet), but we believe it’s a best practice worth embracing.
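For readers on the engineering side, here is a minimal sketch of what a personal data vault with per-category consent and a “right to be forgotten” erasure path could look like. All names and structures here are hypothetical illustrations, not any particular vendor’s or regulator’s implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class PersonalDataVault:
    """Hypothetical customer-held vault: records grouped by use (medical,
    social, banking, ...) plus the customer's approval for each use."""
    customer_id: str
    records: dict[str, list[dict]] = field(default_factory=dict)  # category -> records
    consents: dict[str, bool] = field(default_factory=dict)       # category -> approved?

    def grant(self, category: str, approved: bool) -> None:
        # The customer, not the company, flips this switch.
        self.consents[category] = approved

    def add_record(self, category: str, record: dict) -> None:
        self.records.setdefault(category, []).append(record)

    def readable_by_company(self, category: str) -> list[dict]:
        # A company only ever sees categories the customer has approved.
        return self.records.get(category, []) if self.consents.get(category) else []

    def erase_all(self) -> dict:
        # "Right to be forgotten": wipe everything and return a small receipt
        # so the deletion can be evidenced without retaining the data itself.
        erased = {cat: len(recs) for cat, recs in self.records.items()}
        self.records.clear()
        self.consents.clear()
        return {
            "customer_id": self.customer_id,
            "erased_counts": erased,
            "erased_at": datetime.now(timezone.utc).isoformat(),
        }


if __name__ == "__main__":
    vault = PersonalDataVault("cust-001")
    vault.add_record("banking", {"account": "chequing", "balance_tier": "low"})
    vault.grant("banking", approved=True)
    print(vault.readable_by_company("banking"))  # visible: consent given
    print(vault.readable_by_company("medical"))  # empty: no consent
    print(vault.erase_all())                     # erasure receipt
```

The point of the sketch is the shape of the control, not the storage details: consent lives with the customer, access is checked against it, and erasure is a first-class, auditable operation rather than an afterthought.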
Put privacy at the heart of design
The simple principle here is that customer privacy is a core component whenever developers design new services, and that the highest level of privacy settings is the default for customers who use the service in question. There’s a wide variety of things developers should be thinking about when designing for privacy. For example, users should be able to direct the service to automatically deactivate their accounts when they have been inactive for a specified period. Developers should also build in a means for users to easily export their data, records, files and information on demand, as some organizations already do.
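As an illustration of those three ideas – private-by-default settings, idle-account deactivation and on-demand export – a short Python sketch follows. The field names and the one-year idle window are assumptions for the example, not recommendations.

```python
import json
from dataclasses import dataclass, asdict, field
from datetime import datetime, timedelta, timezone


@dataclass
class PrivacySettings:
    # Defaults are the most private options; users opt in, never out.
    profile_visible: bool = False
    ad_personalization: bool = False
    share_with_partners: bool = False


@dataclass
class Account:
    user_id: str
    email: str
    last_active: datetime
    active: bool = True
    settings: PrivacySettings = field(default_factory=PrivacySettings)


def deactivate_if_idle(account: Account,
                       max_idle: timedelta = timedelta(days=365)) -> Account:
    """Deactivate an account that has been inactive past the chosen window."""
    if datetime.now(timezone.utc) - account.last_active > max_idle:
        account.active = False
    return account


def export_account_data(account: Account) -> str:
    """Return everything held about the user as portable JSON, on demand."""
    payload = asdict(account)
    payload["last_active"] = account.last_active.isoformat()
    return json.dumps(payload, indent=2)


if __name__ == "__main__":
    acct = Account("u-42", "user@example.com",
                   last_active=datetime.now(timezone.utc) - timedelta(days=400))
    deactivate_if_idle(acct)
    print(acct.active)                # False: idle past the window
    print(export_account_data(acct))  # user-triggered export
```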
Developers can also work to encrypt personal, sensitive information so that if a breach occurs, the data is incomplete and rendered useless to the attacker. As well, if data is only required for a limited time, automatic “time bomb” technology can be used to purge the information in question. This helps avoid the creation of mountains of data that no longer serve any legitimate customer purpose, but could still pose an attractive target for hackers.
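A minimal sketch of both techniques is below, assuming field-level encryption with the open-source `cryptography` package and a simple expiry timestamp on each record; real deployments would keep the key in a key-management service and enforce retention in the data store itself.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from datetime import datetime, timedelta, timezone

from cryptography.fernet import Fernet


def encrypt_field(value: str, key: bytes) -> bytes:
    """Encrypt one sensitive field; leaked ciphertext is useless without the key."""
    return Fernet(key).encrypt(value.encode("utf-8"))


def decrypt_field(token: bytes, key: bytes) -> str:
    """Decrypt a field for legitimate, authorized use."""
    return Fernet(key).decrypt(token).decode("utf-8")


def purge_expired(records: list[dict], now: datetime | None = None) -> list[dict]:
    """'Time bomb' retention: keep only records whose expires_at is in the future."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if r["expires_at"] > now]


if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, held in a key-management service
    token = encrypt_field("1977-05-25", key)
    print(decrypt_field(token, key))

    records = [
        {"id": 1, "expires_at": datetime.now(timezone.utc) - timedelta(days=1)},
        {"id": 2, "expires_at": datetime.now(timezone.utc) + timedelta(days=30)},
    ]
    print([r["id"] for r in purge_expired(records)])  # only record 2 survives
```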
What’s clear is that as companies invest aggressively to make the most of customer data, maintaining the status quo isn’t good enough. A fundamental rethinking of the pact between company and customer is required – shifting privacy from a compliance measure to a strategic practice.
If organizations don’t get this right, they risk alienating their existing customers and creating lasting and significant mistrust and reputational damage.