AI-driven hyper-personalisation and the trust deficit in media and entertainment

The personalisation opportunity in media has never been greater. But neither has the risk of getting it wrong.

POSSIBLE Miami this year was, in many ways, a room wrestling with the same question from different angles. How do you get closer to your audience in a time when getting closer has never been more achievable – or more complicated?

For me, the sharper question is how media organisations turn AI capability into measurable commercial impact without weakening the trust their audience relationships depend on.

AI capability today enables the delivery of hyper-personalised experiences in media and entertainment at a scale that was not possible even two years ago, bringing the right content to the right audience, at the right moment, and in the right format.

But the real opportunity is not just operational efficiency; it is better decisions, faster insight, stronger retention, and experiences audiences actively choose.

The commercial case for personalisation is compelling enough to be reshaping how every media organisation considers its audience relationships. Organisations that treat personalisation as a strategic lever, not just a technical capability, are the ones capturing measurable ROI, stronger retention, and audience loyalty. For the most competitive media businesses, personalisation infrastructure has already become deeply embedded into business models, and it is increasingly difficult to compete without it.

But alongside the personalisation opportunity, another conversation kept surfacing with equal consistency: the question of what AI-driven personalisation actually costs. Not financially, but in trust.

Below, I have unpacked the dynamics driving these competing pressures, and what they mean for the media organisations trying to get this right.

The personalisation opportunity in media has never been greater

Hyper-personalisation in media and entertainment has moved from a competitive advantage to a baseline expectation. Audiences no longer marvel at a well-timed recommendation, but instead notice when it’s absent. That raises the bar for media teams: personalisation now has to improve the audience experience and prove its value commercially.

Streaming platforms have spent years building and refining the infrastructure making this possible, and the results are evident in better discovery, lower churn, and stronger engagement. The personalisation engine has become central to how media businesses retain and grow their audiences today. The next stage is not simply more automation, but more intelligent decision-making, using AI to identify what matters, act faster, and create experiences that feel genuinely relevant. Increasingly, the standard is anticipation: removing friction, identifying need earlier, and creating value before the audience has to ask for it.

But the pressure to personalise has accelerated considerably faster than the frameworks for doing so responsibly. And the data at stake in the media industry isn’t equivalent to data in other industries. Knowing someone’s purchasing habits is one thing. Understanding their viewing behaviour, their emotional responses to content, and the patterns of how and when they engage is a different category of exposure entirely. This is why personalisation cannot be treated only as a data or marketing function; it is now a trust, governance, and brand relationship issue.

Audiences are beginning to understand this. And the way they respond to feeling over-exposed will have direct consequences for the platforms and brands that push hyper-personalised experiences too far, too fast.

Audience trust is eroding in real time

The erosion of trust is showing up in audience behaviour across creative industries. As AI-generated content proliferates across media environments, audiences are becoming more discerning about the provenance of what they consume and more conscious of the data relationships that underpin how it reaches them. In media and entertainment specifically, when the value of an experience is directly tied to its authenticity, the erosion of trust is tightly coupled to commercial consequences.

And audience engagement is the commodity at risk. When personalisation starts to feel like surveillance, when the recommendations and targeting feel too knowing and precise, audiences disengage. Trust is built in everyday moments: through consistency, transparency, reliability, and performance when it matters most. In real-time media environments, the brand is judged not by what it promises, but by whether the experience works when attention is highest and expectations are most intense. We can see this dynamic in subscription churn, in audiences gravitating toward creators with owned, trusted relationships over platform-served recommendations, and in the growing resistance to experiences that prioritise data extraction over genuine value. Audiences will share attention, data, and loyalty when the value exchange feels fair.

This recalibration of the terms on which audiences are willing to be known demands that AI-enabled hyper-personalisation be built around trust, not capability alone.

What getting this right actually looks like

At POSSIBLE 2026, a keynote conversation between Michael Kassan, Founder and CEO of 3C Ventures, and Kellyn Kenny, Chief Marketing and Growth Officer at AT&T, crystallised this thinking. The strategic position was clear: customer trust is not for sale. That is a commercial choice with genuine consequences – it determines how AI gets deployed across the customer experience, and it reflects a maturity in enterprise thinking whereby AI is treated not as a cost-cutting mechanism but as a growth driver that creates value for audiences rather than extracting it from them.

Equally notable was the emphasis on human oversight embedded into AI systems as a design principle, rather than as a compliance exercise. Bolting governance onto AI after the fact may produce the appearance of responsible deployment. Durable AI advantage comes only from building governance in from the start – with full visibility into how AI makes decisions, where data goes, and who has access to what – producing capabilities that organisations and their audiences can genuinely trust.

Stelia AI OS was built to meet these demands, providing full observability and auditability across every AI interaction and giving teams the oversight required for responsible, compliant AI governance. Enterprise-grade access controls and sovereign deployment options mean that the sensitive audience data critical to enabling hyper-personalised experiences is able to remain within the organisation’s control, rather than passing through third parties.

The real work sits behind the experience: aligning systems, data, models, and operations so that trust and anticipation can be delivered consistently, not just promised. In an industry where the data relationship is uniquely sensitive, this foundation allows media teams to personalise responsibly, protect audience relationships, and turn AI capability into measurable commercial advantage.

What this means for media teams

For media organisations, navigating the tension between personalisation and privacy requires both sides of the equation. Secure, governed infrastructure with the ability to personalise at scale is essential – but so is the relationship it operates within. The brands and platforms that will compound advantage over time are those that pair sophisticated AI capability with genuinely trusted audience relationships.

That means consistent creative integrity, transparency about how AI is shaping the experience, and storytelling that earns audience permission rather than assuming it. Individual creators are already demonstrating what this looks like in practice – building portable, owned audience relationships that follow them across platforms precisely because the trust is genuine. For media organisations, the lesson is the same: the most durable personalisation is personalisation built on a foundation audiences have actively opted into.

Personalisation without trust doesn’t scale

The personalisation opportunity in media and entertainment is not going away – and neither is the privacy awareness of the audiences it depends on. Brands must treat these not as opposing forces to be managed, but as conditions to be designed for, together.

The organisations that treat this as a design problem – building trust into the architecture of how they use data, and how they deploy AI from the start – are the ones that will compound advantage over time. As POSSIBLE surfaced, responsible personalisation at scale demands that trust, capability and commercial impact are designed together from the beginning, not reconciled after the fact.