
Balancing the Human and Algorithmic: the Future of Audience Insight


When Wavelength was founded six years ago, the intention was to bring some integrity and authenticity back to the qualitative research process. At the time, the proliferation of data-driven tools and trends had led to an emotional detachment from real people, with empathy in short supply. Others in the industry voiced frustration with pre-determined research frameworks that served more as branding tools for agencies than as genuine avenues for audience discovery.

 

The solution was simple: place "people", rather than "process" or tools, at the centre of the research approach. By asking what the most authentic way to find, understand, and tell the story of a given audience might be, we could then design research that prioritised genuine human connection.

 

However, the rise of AI poses a new challenge. AI isn’t simply putting the “process” before “people” – it’s threatening to omit people altogether from the research equation. This raises profound questions about the future of audience insight and the risk of losing the very essence of what makes research meaningful.

 

The pace of this change is particularly striking. Just four years ago, during the first COVID-19 lockdown, I wrote in the AQR magazine about the potential for a reflective resurgence of research with people in the flesh, a welcome respite from the ubiquity of Zoom-only conversations. Yet, the conversation has since shifted from people offline to people online, and now, to no people at all – a remarkable and concerning transformation.

 

While AI undoubtedly offers useful tools - to assist with mundane admin tasks, ideas, inspiration, real-time qual analysis, and the summarisation of data - the emergence of synthetic respondents powered by large language models (LLMs) poses a more profound challenge. These AI-driven "research" offerings, many of which claim to provide "better than human" services, risk severing the connection between researchers and the very people they seek to understand.

 

Used carefully in partnership with qualitative research, however, synthetic data is showing a tremendous amount of promise: the ability of AI to inspire and shape audience personas, for instance, which can then be qualified and enriched through in-person interviews, is clearly a very efficient use of the technology. And I’m sure this is only the start of how AI can begin to help us.

 

However, I would question whether using synthetic data alone - i.e. market research without any real people - can truly be considered audience research at all. Even if you could overcome the encoding biases present in AI-driven content, and force it to understand context and nuance, there is still the question of who exactly is being heard, and who is being overlooked. Will the importance of a carefully crafted sample simply be disregarded in the pursuit of speed, convenience, and the analysis of vast amounts of data?

 

The research industry has long emphasised the importance of developing approaches that help researchers "get closer" to their audiences. So, the prospect of "human-like" conversations that lack any sense of emotional depth, empathy, or trust-building necessary for authentic relationships seems counter-intuitive and even antithetical to this core principle.

 

I imagine these new synthetic research offerings as the modern equivalent of the stereotypical 1960s housewives in detergent commercials: those awkward, artificial portrayals of domestic life before the rise of qualitative research. Well, those veneers of human beings gave way to more nuanced, authentic "slice of life" ads as insights and strategy became more sophisticated (see Sterling Cooper’s use of focus groups).

 

But imagine a future where even that creative process is reduced to algorithms. The creative brief not generated by a spark of human understanding, but by cold, impersonal code. The result would be a world of advertising devoid of creativity - bland, superficial commercials that struggle to resonate with anyone on an emotional level.

 

As such, it’s challenging to envisage how “human-free” authentic connections can exist any time soon. No amount of digital digging or algorithmic prowess can replace the autobiographical nature of qual and its capacity to view people in 3D - through their thoughts, feelings, and behaviours - and to uncover those unexpected nuggets of truth that emerge from the real world.

 

Of course, AI will only become more sophisticated, which is what makes this topic so compelling. No doubt there will be ‘AI-powered capabilities’ that - in some way - offer extraordinary insight in the future. For now, however, we should remain vigilant to the pitfalls of over-automation and the potential homogenisation of insights, or we’ll continue to lose any sense of empathy.

 

While I don’t have all the answers to how we navigate this transition, I hope the future of audience insight doesn’t come down to AI versus humans, but to AI complementing and enhancing human-centred research. The right balance, I’m sure, will lead to extraordinary possibilities - but we must never lose sight of the need to keep “people”, not just processes, at the heart of the endeavour.

 

I’ll end with one of my favourite quotes, from the research practitioner Roy Langmaid:

 

“The core of research has always been about authentic contact. No matter how well informed you might be, if you haven’t met people then what you think about them has an element of fantasy about it.”
