On The Agenda


Synthetic Testing: A Race To The Bottom?

Should we be replacing human research with AI research? We ask strategists to weigh in

By Creative Salon

After a year of breathless hype, generative AI’s impact on marketing and advertising is coming into clearer focus. The biggest players in advertising are placing bold bets on AI-driven synthetic testing.

Publicis Groupe will invest around $325 million over the next three years to better position its business around generative AI and to support an in-house platform called CoreAI that draws on troves of Publicis data, including 2.3 billion consumer profiles.

In March, Accenture Song announced its investment in Aaru, a tool that uses language models to simulate and predict human behavior at scale. Aaru’s AI agents can model entire populations and forecast events within minutes—offering a glimpse into the future of marketing research without the usual delays. Meanwhile, WPP is investing £300 million this year to expand WPP Open, its AI-powered platform designed to let agencies test creative ideas in real time against synthetic consumers. The promise? Faster, sharper insights without the logistical drag of traditional methods.

This breakneck rise of synthetic research has sparked heated debate in the industry, dividing those who see it as a shortcut to creativity from those who fear it flattens the messy magic of human culture. There are obvious cost efficiencies in synthetic research, but are there risks? And do the efficiencies make those risks palatable? Does synthetic data, by its very nature, ignore the random fluctuations in behaviour and sentiment? How does that data hold up when brands and businesses keep talking about "playing in culture", or when the synthetic audience becomes homogenised by biased data? Yet perhaps the real opportunity lies not in choosing sides but in learning to play with the tension.

The following perspectives from leading strategists explore the promise and pitfalls of synthetic research, offering a view of how it can shape the future of creative strategy.

The consensus seems to be that this is not about replacing human insight but amplifying it. Used well, synthetic research can accelerate learning, spark more daring questions, and create a space where ideas can stretch beyond the limits of conventional wisdom. But it takes care, curiosity, and a willingness to embrace the unpredictable to avoid falling into the trap of algorithmic homogeneity.

Matthew Waksman, head of strategy, advertising - Ogilvy

When people ask me what planning is, I come back to one simple thing. Our job is to bring the voice of the consumer into the room. So you might expect me to take a provocative 'over my dead qualitative body' stance on this one. But beyond the divisive terms 'synthetic testing/research', there are many ways AI can help elevate consumer perspectives in the creative process. I'm going to get tangible and share a couple of examples, because I find the way AI debates tend to stay in the land of theory and outrage a bit unhelpful.

One of the things I adore about qual is that you get very close to your audience. I kind of see it as getting into character. It's why on every pitch and brief you'll find me getting into the field and doing at least some of the moderating myself. But what I hate about qual is the pile of static notes that I then go back to time and time again, trying to work out what Ian or Gemma would think about where we are now. For me, this is a powerful use case for synthetic testing. Do your own qual, create your own bank of enormously rich and individual data in the form of transcripts, then use that data to create personas you can interact with. It's the same as what I've always done, going back through my qual notes, but more insightful and less tea stain-ey.
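The workflow Waksman describes can be sketched in a few lines: fold a participant's verbatim transcript excerpts into a system prompt that any chat-style LLM could role-play from. This is a minimal illustrative sketch, not Ogilvy's actual method; the participant name, quotes, and prompt wording are all invented placeholders.

```python
# Minimal sketch: turn qual transcript excerpts into a reusable "persona"
# system prompt for a chat-style LLM. All data here is a made-up example.

def build_persona_prompt(name, transcript_excerpts):
    """Assemble a system prompt asking an LLM to answer in the voice of a
    research participant, grounded only in that participant's own quotes."""
    quotes = "\n".join(f'- "{q}"' for q in transcript_excerpts)
    return (
        f"You are role-playing {name}, a research participant.\n"
        f"Base every answer only on these verbatim quotes:\n{quotes}\n"
        "If the quotes do not cover a question, say you don't know."
    )

# Hypothetical transcript fragments for a participant called 'Gemma'
gemma = build_persona_prompt("Gemma", [
    "I only switch brands when the price gap gets silly.",
    "Adverts that try too hard to be funny put me off.",
])
print(gemma)
```

The resulting string would be passed as the system message to whichever chat model the team uses; grounding the persona in verbatim quotes, and telling it to admit gaps, is one way to keep it from drifting into generic invented opinions.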

Synthetic testing largely focusses on AI's role in generating insight for research, but the thing about data is this: it's not how big your data set is, it's how you use it. We've all been there with a whopping big quant or qual debrief or academic review that is full of insight but gets abandoned because nobody can be bothered to read it properly. A synthetic approach can add so much value here. For example, I really enjoy using AI to create podcasts featuring personas I can build out of different types of research data. It's a way of packaging insights that's easy, and even fun, for the team to engage with.

Synthetic testing and synthetic research are somewhat misleading terms, because when they work best they don't replace the richness of human and other forms of 'organic' research. I find they simply make insight more malleable, agile, and easier to take in. And if our job is to bring the voice of the consumer into the room, then they are tools I'm happy to embrace.

Ben Worden, chief strategy officer, VML

In almost 20 years working in agencies, the phrase I’ve heard used most often is “it doesn’t work”, closely followed by its near relative “it works”. When something “doesn’t work” the assumption is made that any ideas like that must be locked in a box and can never be tried again. Ever.

And when something "works", it tends to mean that agencies, brands, and indeed entire industries get stuck in the pattern of doing incremental variants of the exact same thing over and over again.

Synthetic Research represents a great way for us to cast aside our tendency to be so binary about what works and what doesn’t and embrace a more experimental way of being.

My view is that, far from being a race to the bottom, in the right hands Synthetic Research can help us ascend to new heights in terms of the ideas we create and the things we can make and do.

Three things to remember:

  • Don’t expect it to give you all the answers. Use Synthetic Research to sharpen your understanding, and to stress test your thinking, not do the thinking for you. Human value creation remains the lifeblood of strategy. What Synthetic Research can help you do is get to the value creation bit sooner.

  • Dull questions lead to dull answers. Use Synthetic Research to ask daring, provocative questions. Use it to experiment and explore. If you’ve got wild ideas, testing them out with an agent that has been trained on the right data set could be the perfect way to prove that there’s method in the madness.

  • Don’t imagine that the arrival of Synthetic Research means that other forms of research are no longer necessary. To do so is the equivalent of me telling my Mum that her grandchildren no longer want or need to visit her because we have a WhatsApp group and FaceTime.

Ruairi Curren, executive strategy director, Gravity Road

Marketers crave predictability and have to operate in a world of budget cuts and measurement obsession, so it's not hard to see why synthetic testing appeals. But marketers also know that the best marketing is unpredictable, that you can't "test" your way into originality, and that if you optimise creative for what's expected, you never get the unexpected.

Here I find David Golding's 'Culture vs. Collateral' framing useful: synthetic testing can work brilliantly for collateral, the performance-driven work nudging audiences along a customer journey. But for culture, optimising against AI-generated synthetic audiences risks creating the Spotify effect, where algorithms optimised pop music for instant gratification: hooks in the first five seconds, predictable chord progressions, 'proven' formulas. Technically perfect, but missing the soul.

So if you want creative that performs like an efficient machine, synthetic testing can help. But if you want creative that changes culture, it won't serve you better than the intuition and creative instincts of your agency team.

Dan Hulse, chief strategy officer, St Luke's

If there’s one thing that scares agencies more than the thought of consumers having the ultimate say over their ideas, it’s the prospect that a synthetic consumer might. I get it—there’s a real possibility that every piece of work will be run through substandard synthetic testing, like a glorified version of “we showed it to some folk around the office,” but with all the authority of a quant pre-test. It’s easy to imagine how that leads to a worse outcome for everyone.

But I’ve got a more positive view. Like any digital innovation that lowers costs, the real magic comes from using it to do something new. There’s no question that if you’re swapping a black cab for an Uber, the experience is a step down in quality. But Ubers are so much cheaper that you use them when you wouldn’t have taken a cab—maybe even to make trips you wouldn’t have otherwise made.

When research is expensive, you do it only when necessary, and you have to listen to what it says. That can make agencies and clients conservative: if there’s only time and money for one round of research, you’d better have something that ticks all the boxes. But synthetic testing is so much faster and cheaper that it can be incredibly liberating. Conventional research is like working on a campaign idea for months and then putting your best bet into testing while the agency and client wait nervously for a result. Synthetic testing is like having the audience sitting in the next room while you’re working on your ideas. You can pop in, ask them a few questions, and run some early thoughts by them—even the batshit ones.

I’m not for a second saying that synthetic data can unlock the same deep, unexpected insights that a skilled researcher can. But maybe that’s not its real value. I’m more excited about using it as a sounding board to push our ideas to more interesting places.

Peter Mckenzie Jones, technology and CX director, Grey

Synthetic data holds great promise for creative agencies, offering the ability to craft campaigns pre-tested on simulated, diverse, and nuanced "audiences". It is already transforming how we develop our work, enabling us to explore bold ideas, refine concepts, personalise messaging, and understand niche segments in innovative ways. Moreover, it allows for the exploration of a wide range of strategic possibilities and creative directions that might be impossible or impractical with traditional data and research—all before real-world ad budgets are spent.

But does synthetic data, by its very nature, risk ignoring the messy, random fluctuations in human behaviour and sentiment that make culture what it is? Culture isn’t static; it’s fluid, chaotic, and often irrational. It thrives on anomalies and moments of collective spontaneity—qualities that algorithms may struggle to predict. Synthetic data, if not handled with care, risks flattening this complexity into something overly sanitised—a neat, predictable version of humanity that doesn’t exist. Worse still, poorly constructed models could amplify existing biases, creating caricatures rather than true reflections of real people.

Building trust in the outputs of this technology requires a commitment to ethical model-building, rigorous testing, and a critical examination of the source data. Like traditional research methods, we must acknowledge and address its limitations, ensuring it complements rather than constrains our creative process.

Ultimately, synthetic data should serve as a tool to amplify creativity—not replace the intuition, empathy, and cultural awareness that lie at the heart of great work. After all, culture isn’t something you simulate; it’s something you live, breathe, and participate in. As creative agencies, our job is to play in that space—not just observe it from a synthetic distance.
