ChangeLAB: Writing for AI and humans

by Stratton Craig


On 26 November 2025, we held a roundtable event as part of our ChangeLAB series, where attendees from a range of businesses came together to discuss how AI is changing the ways audiences find and consume brand content. Here is a summary of our discussion from the day.

How big of a change is AI search?

It’s clear that our content no longer just speaks directly to our audiences. With ChatGPT, Gemini and a growing number of other AI tools ingesting, indexing and interpreting huge amounts of content at scale, the stories we share are increasingly filtered through the lens of AI models.

Increasingly, AI is our first audience, and the summaries, snippets and rewrites it generates are what real people see first. When SEO was king, the rules of the game were defined almost exclusively by Google, and the businesses that played well were rewarded. Today, ChatGPT, Perplexity and Bing might all summarise the same webpage differently. These variances in how models ‘hear’ the same story show how easily meaning can be distorted or nuance lost.

This is a significant shift in the way that brands communicate with their audiences. And it creates a real tension: how do we write so AI interprets our messages accurately, without losing the emotional nuance people rely on to make decisions?

Is AI changing the way you write for your audience?

As a group, we agreed it was helpful to view AI as another step in the customer journey, acting as an intermediary that qualifies or shortlists options before humans dive deeper. To cater for this, it was suggested that we might see content strategy split into AI-facing content optimised for clarity and accuracy, and human-facing content that assumes higher intent and leans into emotional resonance.

Deep dive content, for example, is likely to remain important for people looking to do their own research into a product or service. Brands could pair this in-depth analysis with an AI-friendly structured summary to meet both audiences where they are. Alternatively, in-person experiences, high-quality print, and varied content formats may all play a more prominent role in communicating nuance and creating connection after the AI-intermediated part of the customer journey.

This makes it especially important for organisations to understand the specific job each piece of content is designed to do, and to measure whether it’s truly delivering on that job.

Should you be prioritising AI or human audiences in content development?

Despite AI changing how people might find our content, we all agreed that people still buy from people. Leaders listen to leaders. Communities form around individuals, not algorithms. AI still lacks cultural nuance and cannot replicate lived experience, humour, or the emotional undercurrents that drive real decisions.

Understanding what drives people’s decisions remains critical but could also become harder. Not everyone who engages with an AI-generated overview is a potential buyer, so organisations need to judge how much effort to invest in serving that early, low-intent stage. But you can’t ignore it either. The challenge is recognising intent, tailoring content accordingly and ensuring your message reaches the right audience at the right moment.

Does AI make storytelling for brands more or less important?

A key concern for everyone was how AI-generated writing often sounds very similar. With audience sensitivity to generic AI content growing, distinctive brand personality is becoming a strategic asset. It builds trust, memorability and emotional connection.

Effective storytelling signals that real people sit behind the brand, helping audiences connect with it on both an emotional and a rational level and make more confident choices. One attendee even admitted to intentionally adding obviously human flourishes to internal communications to make it clear they were not AI-generated.

Is AI changing what we define as ‘good content’?

AI is already becoming embedded in content creation processes, but it risks producing poor-quality content and eroding people’s ability to recognise good content. This suggests organisations need clearer, more robust governance around content production than ever.

Solid foundations like tone of voice, messaging frameworks and style guides can help anchor what good looks and sounds like. But brands also need to make sure they’ve got one (or more) experts in the right roles to keep a keen eye on quality, consistency and strategic alignment.

That means defining who signs off AI-generated content, approving AI tools before they’re used, setting clear usage and brand-safety guidelines, providing teams with practical training, and continuously monitoring and refining the framework to make sure it’s working.

One attendee described the process of creating content with the help of an AI tool as a game of table tennis. It’s not a matter of entering one prompt and accepting the output unquestioningly. It works best when used to refine or optimise human-generated content and hone its effectiveness.

It’s time to be proactive about quality, authenticity and accuracy

AI is a useful tool for filtering information. But it is there to help guide decisions, not make them for us. As more people rely on AI to sift through the noise, brands risk blending into machine-generated sameness. Having a compelling and authentic brand voice matters more than ever, so that when your message does reach a human, it lands and connects.

Our roundtable session closed with an invitation: keep experimenting, keep learning, keep sharing insight with each other. And keep storytelling human.
