
The death of authenticity


by Dr Rajat Roy, Associate Professor of Marketing at Bond University

It was social media that first sparked the move towards authenticity as consumers learned to distrust the glossy perfection that once defined advertising.

Now another shift is emerging as the rise of artificial intelligence makes it easier than ever to fake authenticity.

Over the past two years, AI systems have made it possible for anyone to generate flawless images, confident voices and convincing storylines in seconds.

What once required training, experience, skill and effort can now be produced by anyone with a laptop.

This explosion of “synthetic” content has quietly stripped away many of the cues people use to judge who and what they can trust.

Signals that matter

In marketing theory we call these cues signals – the little things we use to make sense of something we can’t see directly.

A brand’s reputation, a creator’s consistency, a company’s design choices, or even the quality of a photograph all told us something real about the care and competence behind them.

These signals matter because most of the time we cannot inspect the underlying truth for ourselves – we trust the cues instead.

AI has weakened many of these signals because the time and financial costs of creating them have collapsed.

If a perfect image with perfect lighting can be made in seconds, then a perfect image no longer shows that real effort or real expertise sits behind it.


When everyone can produce the same shiny output, the output itself starts to lose meaning.

This is something many people are starting to sense, but may not have the words to describe.

It’s not that the world suddenly became fake; it’s that the old markers of what was “real” are fading.

Feeling something is missing

There’s another shift happening too – people don’t just react to the content they see, they react to how it was made.

When persuasion is “synthetic” – delivered through algorithms rather than an identifiable human – people often feel something deeper than simple scepticism. They feel uneasy.

Research already shows that AI-assisted selling can trigger a sense that someone is planning to manipulate you, which leads to negative attitudes and disengagement.

Consumers aren’t rejecting technology itself; they’re rejecting the feeling that the persuader is missing.

As these patterns become more visible, people start to recalibrate their trust. Not necessarily away from digital life, but away from anything that feels unreal.

Proof of human effort

In practical terms, this means a renewed appetite for places and products where real human effort is obvious.

This might look like a move to more in-person events, seeking out small makers and artisan producers, or attending live classes.

Recent research from Eventbrite found 74 per cent of young people surveyed in the US and UK thought in-person experiences were more important than digital ones and nearly 90 per cent wanted events that connected them to their community.

Industry data is also showing that a majority of people prefer in-person events when given a choice.

This doesn’t mean technology is in danger of fading into the background. What it means is that consumers will adapt and learn new cues.

Things that cannot be simulated will rise in value – presence, imperfection and the willingness to take social risks in front of others.

Rebuilding trust

For brands and creators, the message is that people want proof that a human made the thing they are seeing.

That might require a less formal style of communication, a more conversational voice, visible signs of labour or even allowing small mistakes to stay in the final product.

These are the signals that will matter more as AI becomes an unavoidable part of daily life.

Consumers already know something has changed; they can feel the difference.

The challenge now is to rebuild trust by restoring the signals that show something real still sits behind the message.
