Behavioural black magic
In 1957, the US market researcher James Vicary claimed he could get moviegoers to “drink Coca-Cola” and “eat popcorn” by flashing messages onscreen for such a short time that viewers were unaware they had seen them.
The term “subliminal advertising” was coined to describe this unnerving practice, and it was subsequently banned in many countries, including the UK, for fear it could be used by governments and cults to manipulate citizens.
Vicary later admitted that he had fabricated his results, but subsequent research has shown that messages we are not consciously aware of do leave an impression, as the illusionist Derren Brown regularly demonstrates to good effect.
Fast-forward 60 years from those early subliminal experiments and we are now witnessing the rapid emergence of behavioural science as a powerful force in our industry. This behavioural science is altogether more highbrow than flashing a picture of a refreshing drink on a billboard in hot weather, and it will underpin the mass customisation of financial products.
We are, I believe, at the beginning of a period in which specific data about each of us is becoming more available and increasingly detailed. With a few service calls and the relevant permissions, your social media data could quickly profile you as a friendly risk-taker prone to impulse purchases. Behavioural scientists could easily use that data to manipulate consumers and direct behaviour. Bolt on some artificial intelligence (AI) to learn the best way to sell, and you get a powerful tool that any organisation could use as a black box to drive sales.
These techniques can be a powerful force for good. In the US, new insurance entrant Slice uses a detailed understanding of behaviour to ensure consumers get precisely the insurance cover they need.
Similar techniques are being used in the investment sector to make arduous customer journeys shorter, and ensure that investments are appropriate to the customer’s circumstances.
But the same techniques, used in isolation, can be dangerous too. When I log into my bank and see that I could have a pre-approved loan “now”, is that a nudge? And is it a nudge in the wrong direction without a whole lot more data about me?
When the combination of behavioural science, big data and AI becomes as pervasive as I expect it to be, will the Financial Conduct Authority (FCA) even be able to see the subliminal messages any more, much less regulate them?
By Michael James, head of technical architecture at Altus Consulting