AI isn’t just another tool
It’s official: the Cannes Lions Festival is in full swing! The crowds are bustling down the Croisette, vibrant branded experiences are around every (beach) corner, and the energy is high.
I got right down to business and popped over to the Palais to hear a panel on ‘The Creative Ethics of Cognitive Intelligence’, which ended up being the most prominent theme of the day. To provide the audience with varying viewpoints, the session featured panelists whose backgrounds were rooted in both Eastern and Western culture: the Venerable Tenzin Priyadarshi, founding director of The Dalai Lama Center for Ethics and Transformative Values at MIT, and Astrid Boutaud, global head of advertising at Sanofi. They explained that many AI systems are built on Judeo-Christian values and aren’t made to incorporate the diverse views that could ultimately provide more contextually relevant experiences for consumers. Machine learning isn’t black and white – it listens, grows, adapts, and responds to the world around it. That world consists of democracies and dictatorships, different cultural priorities, and places where privacy laws are drastically different from those of neighboring states. So, much like human decision-making, AI must be built to operate in gray areas.
When consumer safety is concerned, however, who takes the blame: man or machine? We’ve seen this issue arise in the US around self-driving car incidents because, as we all know too well, algorithms can make mistakes. AI is heavily influenced by its creators, and Boutaud argued that we must weave in regulations – not to curb technological or creative growth, but to foster the relationship between people and machines. There needs to be a push to market products responsibly (e.g. fast food, smoking, and drinking) and to use ethics as an optimization framework for regulating advertising campaigns. It’s interesting because, with all the industry talk around machine learning, I haven’t heard nearly enough discussion of these systems having tailored ethical restraints. Priyadarshi was sure to note that agency lies with consumers as well, but at the end of the day, “AI isn’t just another tool,” he continued, “it’s the tool and mother to rule them all,” so brands need to take responsibility for using algorithms that promote certain human behaviors.
A shortlisted Nike campaign in Singapore had runners competing against themselves on a custom LED track. Runners had the opportunity to train alongside their own personal LED silhouette, which used uploaded exercise data so they could compete to beat their best time – a truly incredible use of data and tech in a brand campaign.
Samsung in Australia executed a campaign using augmented reality and a bespoke algorithm that combined GPS, compass, and gyroscope data to make beachgoers more easily aware of safety concerns. When opened at the beach, the Pocket Patrol app placed 3-D markers in virtual space over the ocean to inform visitors of dangerous ocean conditions and riptide warnings. Such an innovative way to keep people safe!
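Pocket Patrol’s actual algorithm isn’t public, but the GPS-plus-compass idea behind placing a geo-anchored marker can be sketched roughly. This minimal Python example (all function names are my own, hypothetical ones) computes the distance and bearing from the phone’s GPS fix to a hazard, then the signed angle between the compass heading and that bearing – which is what tells an AR view whether to draw the marker left or right of center:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two GPS fixes.
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    # Initial compass bearing from the device to the hazard, 0-360 degrees.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def marker_offset_deg(device_heading, hazard_bearing):
    # Signed angle between where the camera points and the hazard:
    # negative means render the marker left of center, positive means right.
    return ((hazard_bearing - device_heading + 180) % 360) - 180
```

In a real app, the gyroscope would smooth the noisy compass reading frame to frame, and the distance would scale the marker’s size – but the geometry above is the core of anchoring a warning to a fixed spot in the ocean.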
Written by Amanda Hechinger