With consumers and businesses alike becoming ever more dependent on digital products, services and partnerships, our digital footprints are growing exponentially. By 2025, the World Economic Forum expects that the amount of data generated daily will reach 463 exabytes – an astronomical figure. Increasingly, with the help of smart technology such as automation, data analytics and Artificial Intelligence (AI), much of this data is being harnessed to personalise the products and services that we interact with daily.

When browsing on social media sites, for example, our browsing history and online behaviour (such as previous purchases and brand interactions) will dictate our news feeds and the online advertising presented to us – with clever algorithms serving up the curated content deemed most relevant for our digital profiles.

For brands and businesses hungry for growth, the data trail left by consumers provides almost infinite opportunities to personalise their advertising and marketing messages – and to provide content and offerings that are directly relevant (and highly customised) for individuals. For example, if a financial services provider has insight into your age, earnings bracket, financial ambitions and risk profile, it can provide financial products that are directly tailored to your profile.
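As a toy illustration of this kind of profile-driven tailoring, the sketch below matches products to a customer profile with simple rules. All field names, products and thresholds here are hypothetical; a real provider’s logic would be far more sophisticated (and typically model-driven):

```python
# Toy sketch of rule-based product matching from a customer profile.
# Field names, products and thresholds are all hypothetical.

def recommend_products(profile: dict) -> list[str]:
    """Return product names matched to a customer's profile."""
    recommendations = []

    # Match savings/investment products to the stated risk profile.
    if profile["risk_tolerance"] == "low":
        recommendations.append("fixed-term savings account")
    else:
        recommendations.append("equity index fund")

    # Match lending products to age and financial ambitions.
    if profile["age"] < 35 and "home ownership" in profile["goals"]:
        recommendations.append("first-time buyer mortgage plan")

    return recommendations

print(recommend_products({
    "age": 29,
    "risk_tolerance": "low",
    "goals": ["home ownership"],
}))
# ['fixed-term savings account', 'first-time buyer mortgage plan']
```

The point is not the rules themselves but the principle: the more profile data the provider can legitimately access, the more closely the offer can be tailored.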

Yet in light of massive data-abuse scandals (which ignited the so-called Great Privacy Awakening) and intrusive brand communications, consumers are growing increasingly wary of tech providers, brands and businesses that overstep delicate boundaries – and infringe on inherent rights to privacy.


Understanding what consumers really want

The push and pull between gaining access to personalised services and protecting one’s private information puts brands and marketers in a difficult position, to put it mildly.

Arguably, defining the boundaries – and finding a fair balance between consumer privacy and personalisation – is one of the most critical issues shaping businesses and technology ecosystems today. As business leaders are discovering, conflicting messages emerging from market research only deepen the debate instead of answering the questions.

For example, according to Accenture research, 69% of consumers wouldn’t do business with a brand if its data usage was invasive, while 87% say it’s important to buy from a brand or retailer that ‘understands the real me’.

Similarly, a recent Gartner survey found that nearly 40% of customers would stop doing business with companies if they found their personalisation “creepy”; while a study by SmarterHQ, a behavioural marketing company, found that 90% of customers would willingly share behavioural data for an easier and cheaper shopping experience.

These conflicting reports illuminate what analysts have termed ‘the Privacy Paradox’: while consumers want value and customised offers and experiences, they do not want them at any cost. This paradox is made even murkier by the fact that many consumers don’t fully understand the trade-offs that they are making (either intentionally or unintentionally), and that many of the services they currently enjoy are made possible by algorithms that leverage their personal data.

Risk-averse privacy strategies won’t work, says Gartner

For brands and businesses, the current challenge is to create a symbiotic relationship between personalisation and privacy. Without doubt, personalisation yields major value for the customer, who can receive offers and deals specific to their circumstances and profile (eliminating the need to sift through content and offers that have no immediate relevance). This type of ease, speed and convenience is fast becoming a hallmark of retail success, both online and offline.

According to Gartner, personalisation is ‘the key to survival in the cut-throat world of digital retail,’ noting that with conversions to sale frequently in the low single digit percentages for many retailers, they have to optimise conversion to survive and thrive.

With this in mind, Gartner asserts that risk-averse privacy ideas ‘often prevent organisations from creating great customer experiences’, precisely at a time when customers expect to be recognised – and do want their experiences personalised.

“Despite having less trust in brands to use their data ethically, millennials are more willing to provide companies with information in exchange for convenience and personalised experiences,” noted a recent Gartner survey.

Importantly, it appears that many brands operate with the belief that personalisation and privacy are conflicting efforts, not ‘symbiotic opportunities’. This limits their potential to use data in a transparent, value-creating manner.

“Organisations are losing their best chances to create great customer experiences due to needlessly risk-averse privacy ideas that limit the use of personal data,” says Penny Gillespie, VP Analyst, Gartner. “The key is to bring value to customers and keep data use in context.”

Embracing responsible personalisation

As brands and marketers try to navigate this increasingly complex digital landscape, their primary task is to develop trust with the consumer through the responsible and transparent use of their data (responsible personalisation). This approach arguably creates value for customers, and also generates increased profits and innovation opportunities for businesses.

To illustrate this point, it’s critical to remember that owning data is NOT the new oil – having access to consented data is. Put another way, the issue isn’t necessarily about what data is collected, but rather about how it is used, how transparently such mechanisms are disclosed, the trust between data subject and brand, and faith in the confidentiality of the data held.

While technology providers are working furiously to provide technical solutions to this challenge (for example, through zero-knowledge data exchange), marketers and brands have a major role to play in building trust and transparency through their campaigns. By leveraging personalisation and automation as an extension of the marketing team (thereby providing more value to customers while freeing up marketing resources for strategy and innovation), brands can use data intelligently – in such a way that consumers are willing to share key data without feeling exposed.
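One simple pattern for keeping data use ‘in context’ is to gate every personalisation feature on an explicit, purpose-specific consent record, falling back to generic content when consent is absent. A minimal sketch (the data model and purpose names are hypothetical, not a reference to any particular platform):

```python
# Minimal sketch of purpose-specific consent gating.
# The data model and purpose names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class CustomerRecord:
    email: str
    consents: set[str] = field(default_factory=set)  # granted purposes

def personalised_subject(customer: CustomerRecord, name: str) -> str:
    """Use personal data only for purposes the customer has agreed to."""
    if "email_personalisation" in customer.consents:
        return f"{name}, your weekly picks are ready"
    # No consent for this purpose: fall back to generic content.
    return "Your weekly picks are ready"

opted_in = CustomerRecord("a@example.com", {"email_personalisation"})
opted_out = CustomerRecord("b@example.com")
print(personalised_subject(opted_in, "Sam"))    # Sam, your weekly picks are ready
print(personalised_subject(opted_out, "Alex"))  # Your weekly picks are ready
```

Tying each use of personal data to a recorded, revocable purpose is one concrete way to make the ‘responsible personalisation’ described above auditable rather than aspirational.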

So how is this trust created and protected?

In our view, to build legitimate trust (while harnessing the many benefits of automated marketing and communications), the forward-looking marketer’s role should include key actions, such as:

  • Setting clear expectations and being consistent
  • Educating and informing to minimise fear
  • Being invisible but transparent
  • Holding yourself accountable
  • Being competent
  • Understanding boundaries and being fair

In both the short and long term, nurturing trust in these ways makes consumers more open to brand and marketing interactions (many of which will be triggered by smart, automated communications). Indeed, if recipients know that you have genuinely taken the time and effort to understand them – and have received permission to communicate with them – they will be far more receptive to personalised content that creates value in their day-to-day lives.

By JD Engelbrecht, Managing Director, Everlytic.

