I wonder whether Netflix's intricate algorithm could be the next iteration of Amazon’s ‘Customers also purchased…’ feature, potentially signalling the future of targeted advertising. We’ve grown to accept that companies are analysing our search history to sell us products. But what happens when they are targeting customers based on the colour of their skin?
Diversity in advertising is improving, but what if digital billboards could analyse and adjust their messaging in line with who's approaching? In fact, it's already happening, as advertisers turn to technology to target passers-by with growing sophistication. Translate this to TV and streaming, and future advertisers could deliver campaigns tailored to each gender, age group or ethnicity, maximising the amount of attention they can win.
But the Netflix debacle also forewarns of a darker future, in which our individual preferences are trumped by those of our collective demographic. English professor Chris Gilliard believes that using algorithms based on our viewing or purchase history is dangerous territory for brands, which are inadvertently encouraging racial profiling. ‘Once products and, more importantly, people, are coded as having certain preferences and tendencies, the feedback loops of algorithmic systems will work to reinforce these often flawed and discriminatory assumptions,’ he wrote in a recent article for Real Life magazine. ‘The presupposed problem of difference will become even more entrenched, the chasms between people will widen.’
Brands must acknowledge the biases that some algorithms are forcing on society, and address and adjust them accordingly. For more, read our macrotrend Morality Recoded.