In the second of my trilogy based on the Brand Jam event at Brunel University, I’m dissecting Derek Macaulay’s talk on consumer data privacy – particularly relevant this week given LinkedIn’s recent misfortunes.
Derek talks privacy
Derek is Director of Horizon Digital Research, a university-funded project that aims to investigate the technical developments needed if electronic information is to be properly controlled, managed and harnessed. His talk focussed on the concept of privacy in the digital age – the idea that each of us leaves a lifelong digital footprint that we ultimately struggle to keep private until death, when all information and rights to privacy cease (possibly a massive generalisation; more my take on it).
On a contextual level, he made the point that we unknowingly (but increasingly, knowingly) leave behind digital traces; inferences that, welcome or not, provide a record of our personal activities for exploitation through systems such as behavioural marketing platforms and targeted advertising. The flipside is the provision of better services, content and experiences (again based on our unique digital footprint), but increasingly this new age of enlightenment brings with it serious privacy concerns.
The point where my head was about to explode came when he referred to social media channels as artificial social constructs that should focus more upon transitive social interactions, but really serve as a platform for individuals’ self-actualisation – which, in his view, is why tangible product marketing opportunities are rarely afforded by behavioural analysis.
While I wholeheartedly agree that behavioural analysis is flawed, the idea that social platforms are merely artificial social constructs where 90% of content isn’t marketable is something of a fallacy to me. We live in a materialistic world. The idea that people rarely talk about their latest purchase, or intention to purchase, seems a little thin – especially when, in our 13 years of experience, we’ve seen thousands of daily examples where a certain personal arrogance becomes more pronounced in social, as seemingly meaningless disclosures become valuable social currency.
Moreover, I believe that online privacy is not necessarily a concern most people recognise in their online lives; it is instead very much a product of the evolution of the digital marketplace. Isn’t trust generally earned by a brand through its reputation and actions, whether an individual has read its terms and conditions or not?
Marketing professionals could easily perceive Horizon, and other privacy-compliance research organisations, as latter-day Luddites – their over-pronounced privacy concerns standing in the way of progress by limiting the extent to which the marketplace can be invigorated by technological innovation.
Yet back in reality, these organisations and action groups play a fundamentally important role, clearing up the grey areas of privacy and providing better protection for the consumer.
Would progress be quicker, smoother and more inventive if data privacy concerns were put to one side? Is that really the solution?
And do cases of data loss present a significant risk to the use of any personal digital data in the future?