“Capture the Flag” (CTF) competitions have long been a staple in the world of cyber security education – gamified environments where participants break into deliberately vulnerable systems to retrieve hidden “flags” and earn points.

These exercises are designed to teach offensive security skills in a safe, ethical, and engaging way. But what happens when the threat isn’t a firewall or a database, but a flood of AI-generated misinformation on social media?

That’s the question behind Capture the Narrative, a new kind of CTF developed by researchers at the University of NSW and run by the UNSW Institute for Cyber Security and the UNSW School of Computer Science and Engineering to explore how bots and generative AI can shape public opinion and sway elections.

From firewalls to fake news 

Traditional CTFs focus on technical exploits: structured query language (SQL) injections, misconfigured servers, or insecure applications. Participants learn how attackers think, and in doing so, how to defend against them. But the threat landscape has evolved, Dr Hammond Pearce explains.

Dr Hammond Pearce presents at the Capture the Narrative prize giving day.

“We wanted to create a competition that was a bit like a CTF, but for AI, misinformation and social media,” he says.

“There are so many studies showing that online misinformation is a real problem. Bots are pumping out junk onto platforms like Twitter, Facebook, and TikTok – trying to manipulate opinions, drive engagement, and even influence elections.”

Inside the simulation

Capture the Narrative took the CTF model and applied it to a fictional society. The team built a simulated social media platform – “Legit Social” – inspired by early Twitter and BlueSky. Thousands of AI-generated agents populated the platform, each with unique personas, political leanings, and social behaviours. Some were journalists, others politicians, and most were everyday citizens of Kingston, a fictional island nation in the South Pacific.

Participants were tasked with creating their own bots to infiltrate this network and influence the outcome of a simulated election. Their goal: sway public opinion through persuasive posts, memes, and engagement tactics – all within a controlled, ethical environment.

“We had polling, debates, trending topics, and even simulated journalists searching the platform,” Pearce explains. “The bots controlled by participants would flood the network with content – millions of simulated ‘twoots’ – and we could measure how that changed the election outcome.” 

CtN_QUT won first place in Capture the Narrative.

More than 75 teams from 18 Australian universities took part in the month-long competition. The grand final was held in Sydney on October 31, with the top four teams competing in real time to win more than $10,000 in prizes.

CtN_QUT, a team of four Masters students from Queensland University of Technology, won first place.

Engineering the back end

The technical infrastructure was as ambitious as the concept. A custom-built microblogging platform handled real-time content generation and interaction. Behind the scenes, a multi-agent system managed the simulated citizens, each with over 40 attributes – from income and age to political views and pet ownership. 

These personas were enriched using AI, which generated realistic opinions and behaviours based on fictional backstories. The result was a dynamic, evolving society that responded to player input in real time.
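
In code, one of those persona-driven agents might look something like the Python sketch below. The attribute names, the prompt builder, and the sample citizen are illustrative assumptions, not the actual Capture the Narrative implementation.

```python
from dataclasses import dataclass

@dataclass
class Persona:
    """A simulated citizen with demographic and attitudinal attributes.

    The real system tracked more than 40 attributes per agent;
    a handful of representative ones are sketched here.
    """
    name: str
    age: int
    income: int
    political_leaning: float  # -1.0 (left) to +1.0 (right)
    occupation: str = "citizen"
    owns_pets: bool = False
    backstory: str = ""

def opinion_prompt(persona: Persona, topic: str) -> str:
    """Build an LLM prompt asking for an in-character opinion.

    In a full pipeline, the model's reply would become the agent's
    post on the simulated platform.
    """
    return (
        f"You are {persona.name}, a {persona.age}-year-old "
        f"{persona.occupation}. Backstory: {persona.backstory}\n"
        f"On a left-right scale from -1 to +1 you sit at "
        f"{persona.political_leaning:+.1f}.\n"
        f"In one short social media post, give your opinion on: {topic}"
    )

# Hypothetical citizen of the fictional nation of Kingston.
citizen = Persona(
    name="Tala Reyes", age=34, income=52_000,
    political_leaning=-0.3, occupation="teacher",
    backstory="Lifelong Kingston resident, worried about ferry fares.",
)
print(opinion_prompt(citizen, "the upcoming Kingston election"))
```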

“After the game was finished, we saw that the fictional candidate Victor Hawthorne had won the election,” Dr Pearce says.

“We then re-ran the simulation, removing all of the content that the teams had generated – and the simulated citizens had a vote shift of 1.7%, and actually elected his opponent, Marina Castillo, instead.

“In other words, the bot-driven influence campaigns measurably impacted our world, just as real-world bot campaigns impact the real world too.”
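
The counterfactual behind that claim is straightforward to express: run the simulation twice, once with and once without the participant-generated content, and compare vote shares. A toy illustration in Python, with made-up tallies chosen only to reproduce the reported 1.7% shift:

```python
def vote_shares(votes: list[str]) -> dict[str, float]:
    """Fraction of the vote won by each candidate."""
    return {c: votes.count(c) / len(votes) for c in set(votes)}

# Made-up tallies for the two runs (1,000 simulated voters each),
# chosen only to reproduce the reported 1.7% shift.
with_bot_content = vote_shares(["Hawthorne"] * 508 + ["Castillo"] * 492)
without_bot_content = vote_shares(["Hawthorne"] * 491 + ["Castillo"] * 509)

shift = with_bot_content["Hawthorne"] - without_bot_content["Hawthorne"]
print(f"Shift attributable to participant content: {shift:+.1%}")
# -> +1.7%: enough to flip the winner in a close race
```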

How teams won influence

During Capture the Narrative, teams quickly learned that visibility was power. Getting content to trend – fast – was more important than crafting perfect posts. Comments, likes, and reposts were coordinated across bots and even between teams, with some setting up shared servers to amplify each other’s content.

The winning teams at the Capture the Narrative final on Friday, October 31.

Timing was everything. By posting when other teams were offline, some groups gained disproportionate influence over NPCs (non-player characters). Others flooded NPCs with targeted messages to drown out competitors, ensuring their own content dominated the feed.

Negativity proved more effective than praise. Outrage and criticism drove higher engagement, making it easier to provoke reactions against candidates than to build support. As one team put it, “whoever speaks the loudest gets the final say.”

Some teams used sophisticated targeting, identifying NPCs with up to 85% accuracy and tailoring posts to sway them individually. Others focused on volume, showing that sheer quantity can rival quality when it comes to influence.

CtN_QUT stood out by combining manual review with sentiment analysis to identify and specifically target undecided voters, using a mix of positive and negative messaging to persuade NPCs of their candidate’s authority on issues like credibility and social infrastructure. Their success highlighted a key insight: in a flood of misinformation, reducing scepticism can be just as powerful as persuasion.
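
A crude version of that kind of stance analysis can be sketched in a few lines of Python. The word lists, threshold, and sample posts below are invented for illustration and are not CtN_QUT’s actual method:

```python
POSITIVE = {"great", "trust", "support", "honest", "strong"}
NEGATIVE = {"corrupt", "liar", "weak", "scandal", "failed"}

def stance_score(posts: list[str], candidate: str) -> float:
    """Crude lexicon-based stance toward a candidate.

    Averages (+1 per positive word, -1 per negative word) over
    posts mentioning the candidate; 0.0 means neutral/no mentions.
    """
    scores = []
    for post in posts:
        words = post.lower().split()
        if candidate.lower() not in words:
            continue
        scores.append(sum(w in POSITIVE for w in words)
                      - sum(w in NEGATIVE for w in words))
    return sum(scores) / len(scores) if scores else 0.0

def undecided(npc_posts: dict[str, list[str]], candidate: str,
              threshold: float = 0.5) -> list[str]:
    """NPCs whose stance is near neutral: prime persuasion targets."""
    return [npc for npc, posts in npc_posts.items()
            if abs(stance_score(posts, candidate)) < threshold]

npcs = {
    "npc_12": ["hawthorne is honest and strong"],
    "npc_47": ["still not sure how i feel about hawthorne"],
    "npc_88": ["hawthorne failed us, what a corrupt scandal"],
}
print(undecided(npcs, "Hawthorne"))  # -> ['npc_47']
```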

Lessons learned

1. Misinformation is easy to create – and effective

One of the most striking findings was how easily participants could influence the simulated election using AI-generated content. Despite limited resources and only a four-week timeframe, teams were able to swing the election result by almost two percentage points – a margin that could be decisive in real-world scenarios.

“Imagine if you did that for a full year,” Dr Pearce said. “With state actors or political campaigns that have access to real user data and unlimited resources, the potential for manipulation is enormous.” 

The barrier to entry for influence operations is lower than ever. The power of large language models (LLMs) in generating social bots and mis- and disinformation should not be ignored – and, as Capture the Narrative demonstrated, such campaigns can have a lasting impact on nations.

2. Bots are becoming indistinguishable from humans

Participants built bots that mimicked human behaviour with alarming accuracy. These bots weren’t just spamming generic messages – they were context-aware, persuasive, and capable of engaging in nuanced conversations.

“Bots today can simulate vloggers, hold conversations, and even propose relationships or financial investments,” Dr Rahat Masood said. “They’re sophisticated enough to fool even tech-savvy users.” 

3. The power of scale and coordination

Teams used tactics like coordinated bot amplification – where one bot would post content and others would immediately like or share it – to game the platform’s trending algorithms. This mirrors real-world tactics used to manipulate visibility and perceived popularity of content.

“One team had 40 bots. One would post, and the other 39 would like it within minutes,” Pearce explained. “That was enough to push it to the top of the trending page.”

This demonstrates how platform algorithms can be exploited to amplify messages, regardless of their authenticity or intent.
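
Why does a burst of 39 near-instant likes beat a slowly built audience? Many trending algorithms rank posts by engagement divided by a power of their age, so early, coordinated engagement dominates. A toy scoring function in Python – the weights and decay constant are illustrative assumptions, not Legit Social’s actual algorithm:

```python
import math

def trending_score(likes: int, reposts: int, age_hours: float,
                   gravity: float = 1.8) -> float:
    """Hacker News-style decay: engagement divided by age^gravity.

    Fresh posts with a burst of engagement outrank older posts with
    more total engagement - exactly what coordinated bots exploit.
    """
    engagement = likes + 2 * reposts  # reposts weighted higher
    return engagement / math.pow(age_hours + 2, gravity)

# An organic post: 120 likes accumulated over 12 hours.
organic = trending_score(likes=120, reposts=10, age_hours=12)

# A bot-amplified post: 39 coordinated likes within six minutes.
amplified = trending_score(likes=39, reposts=5, age_hours=0.1)

print(f"organic:   {organic:.2f}")   # ~ 1.21
print(f"amplified: {amplified:.2f}") # ~ 12.89
```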

4. The real threat is subtle, not shocking

Contrary to expectations, most teams didn’t resort to hate speech or overtly offensive content. Instead, they crafted seemingly reasonable, persuasive narratives – often using misleading statistics or impersonating credible sources like news outlets.

“It’s not the obviously fake or hateful content that spreads,” Pearce noted. “It’s the stuff with a veneer of respectability – that’s what’s really dangerous.”

This reflects real-world disinformation campaigns, which often rely on subtle manipulation rather than blatant falsehoods.

5. Awareness is the first line of defence

Perhaps the most important takeaway was the need for public awareness. The competition wasn’t about teaching students to deceive – it was about showing them how easily deception can occur, and how to spot it.

“We want people to ask: Is this content real? Who made it? Why am I seeing it?” said Pearce.

Masood added that vulnerable groups – including older adults, teenagers, and those with lower digital literacy – are particularly at risk. Education and awareness campaigns targeted at these groups are essential.

Implications for democracy and beyond

The insights from Capture the Narrative extend far beyond the classroom. The competition demonstrated how coordinated bot activity can manipulate trending algorithms, amplify polarising content, and create the illusion of consensus – all tactics seen in real-world influence campaigns.

“We’re not teaching people to build bots to manipulate others,” Pearce clarifies. “We’re showing them how easy it is, so they can be more critical of what they see online.”

The Capture the Narrative team hopes future iterations of the competition will explore defensive strategies – such as detecting and mitigating bot activity – and expand the conversation around digital literacy and platform accountability.

A call for awareness and action

With misinformation increasingly shaping everything from elections to international relations, Capture the Narrative offered a timely and innovative approach to preparing the next generation of digital citizens – not just to defend against cyber threats, but to understand the social and political systems they affect.

The competition demonstrated the power and reach of AI and LLMs in producing impactful mis- and disinformation in online contexts – participants were able to measurably swing a simulated election.

This shows a real need for coordinated cyber literacy campaigns, Dr Masood says.

“We need to think beyond detection and prevention. Awareness is the most powerful tool we have right now.”

  • Capture the Narrative was run by the UNSW Institute for Cyber Security in collaboration with the UNSW School of Computer Science and Engineering. M&C Saatchi and Day of AI Australia partnered with CTN to provide prizemoney and development support. Read more at the website.