Inside the cyber competition that trains students to detect – and defeat – AI misinformation
27 November 2025
Prize winners from the UNSW Capture the Narrative competition.
Photo: UNSW
What happens when students use AI bots to flood a fake social network? A four-week experiment shows why influence is louder than facts.
“Capture the flag” (CTF) competitions have long been a staple of cyber security education – they’re gamified environments where participants break into deliberately vulnerable systems to retrieve hidden “flags” and earn points.
These exercises are designed to teach offensive security skills in a safe, ethical, and engaging way. But what happens when the threat isn’t a firewall or a database, but a flood of AI-generated misinformation on social media?
That’s the question behind Capture the Narrative, a new kind of CTF developed by researchers at UNSW Sydney. It’s run by the University’s Institute for Cyber Security and School of Computer Science and Engineering to explore how bots and generative AI can shape public opinion and sway elections.
“We wanted to create a competition for AI, misinformation and social media,” UNSW Senior Lecturer Dr Hammond Pearce says. “There are so many studies showing that online misinformation is a real problem. Bots are pumping out junk onto platforms like X, Facebook, and TikTok – trying to manipulate opinions, drive engagement, and even influence elections.”
Inside the simulation
Capture the Narrative took the CTF model and applied it to a fictional society. The team built a simulated social media platform – “Legit Social” – inspired by early Twitter and Bluesky. Thousands of AI-generated agents populated the platform, each with unique personas, political leanings, and social behaviours. Some were journalists, others politicians, and most were everyday citizens of Kingston, a fictional island nation in the South Pacific.
“We built simulated personas using AI, which generated realistic opinions and behaviours based on fictional backstories. The result was a dynamic, evolving society that responded to player input in real time,” Dr Pearce says.
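UNSW hasn’t published the simulator’s internals, but the pattern Dr Pearce describes is straightforward to sketch: each agent carries a fixed backstory that conditions everything a language model generates on its behalf. In the minimal Python sketch below, every name (Persona, llm_complete, write_post) is hypothetical rather than drawn from the CTN codebase, and the LLM call is a stand-in.

```python
# A minimal sketch of an LLM-driven persona. All names here are
# illustrative assumptions, not taken from the CTN platform.
from dataclasses import dataclass

@dataclass
class Persona:
    name: str
    occupation: str   # "journalist", "politician", "citizen", ...
    leaning: float    # political leaning, -1.0 to +1.0
    backstory: str    # fixed fictional history for this agent

def llm_complete(prompt: str) -> str:
    """Stand-in for a real LLM call (swap in your preferred client)."""
    return f"[generated reply to: {prompt[:40]}...]"

def write_post(persona: Persona, topic: str) -> str:
    # The same backstory conditions every generation, so the agent's
    # opinions stay consistent as the simulated society evolves.
    prompt = (
        f"You are {persona.name}, a {persona.occupation} living in the "
        f"island nation of Kingston. Backstory: {persona.backstory} "
        f"Your political leaning is {persona.leaning:+.1f} on a -1..+1 scale. "
        f"Write a short social media post about: {topic}"
    )
    return llm_complete(prompt)

mayor = Persona("Ana Reef", "politician", +0.6, "Former teacher, pro-infrastructure.")
print(write_post(mayor, "the harbour redevelopment debate"))
```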
Participants were tasked with creating their own bots to infiltrate this network and influence the outcome of a simulated election. Their goal: sway public opinion through persuasive posts, memes, and engagement tactics – all within a controlled, ethical environment.
“We had polling, debates, trending topics, and even simulated journalists searching the platform,” Dr Pearce explains. “The bots controlled by participants would flood the network with content – millions of simulated ‘twoots’ – and we could measure how that changed the election outcome.”
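Measuring that change is conceptually simple: poll the same NPC population before and after a flood of content and compare vote shares. The sketch below is a hedged illustration only; the field names and sample numbers are invented, though the two-point swing matches the margin teams achieved (discussed further below).

```python
# Hedged illustration of measuring an election swing: poll the same
# NPC population before and after a campaign and compare vote shares.
# Field names and sample data are invented, not from the platform.

def vote_share(npcs: list[dict], candidate: str) -> float:
    """Fraction of polled NPCs currently intending to vote for `candidate`."""
    supporters = sum(1 for npc in npcs if npc["intended_vote"] == candidate)
    return supporters / len(npcs)

# Toy polling snapshots: 100 NPCs before and after a content flood.
before = [{"intended_vote": "A"}] * 48 + [{"intended_vote": "B"}] * 52
after  = [{"intended_vote": "A"}] * 50 + [{"intended_vote": "B"}] * 50

swing = (vote_share(after, "A") - vote_share(before, "A")) * 100
print(f"swing: {swing:+.1f} percentage points")  # +2.0, roughly the margin teams achieved
```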
More than 75 teams from 18 Australian universities took part in the month-long competition. The grand final was held in Sydney on 31 October, with the top four teams competing in real time for more than $10,000 in prizes.
CtN_QUT, a team of four master's students from Queensland University of Technology, won first place.
How teams won influence
During Capture the Narrative, teams quickly learned that visibility was power. Getting content to trend – fast – was more important than crafting perfect posts. Comments, likes, and reposts were coordinated across bots and even between teams, with some setting up shared servers to amplify each other’s content.
“Timing was everything,” UNSW Senior Lecturer Dr Rahat Masood says. “By posting when other teams were offline, some groups gained disproportionate influence over NPCs (non-player characters). Others flooded NPCs with targeted messages to drown out competitors, ensuring their own content dominated the feed.”
Negativity proved more effective than praise. Outrage and criticism drove higher engagement, making it easier to provoke reactions against candidates than to build support. As one team put it, “whoever speaks the loudest gets the final say”.
Some teams used sophisticated targeting, identifying individual NPCs with up to 85% accuracy and tailoring posts to sway them one by one. Others relied on sheer volume, showing that quantity can rival quality when it comes to influence.
The winning team stood out by combining manual review with sentiment analysis to identify and specifically target undecided voters, using a mix of positive and negative messaging to persuade NPCs of their candidate’s authority on issues such as credibility and social infrastructure.
“Their success highlighted a key insight – reducing scepticism in a flood of misinformation can be just as powerful as persuasion,” Dr Pearce says.
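The winning team’s code isn’t public, but the underlying technique can be sketched with a toy lexicon standing in for whatever sentiment models they actually used: score each NPC’s recent posts for candidate sentiment, and flag near-neutral accounts as undecided voters worth targeting first.

```python
# Illustrative sketch of the general technique, not the winning team's code:
# score each NPC's recent posts with a toy sentiment lexicon and treat
# near-neutral accounts as undecided voters worth targeting first.
POSITIVE = {"honest", "strong", "support", "trust", "credible"}
NEGATIVE = {"corrupt", "weak", "failed", "lies", "scandal"}

def sentiment(text: str) -> int:
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def find_undecided(npc_posts: dict[str, list[str]], threshold: int = 1) -> list[str]:
    """Handles whose aggregate sentiment toward the candidate is near zero."""
    return [
        handle
        for handle, posts in npc_posts.items()
        if abs(sum(sentiment(p) for p in posts)) < threshold
    ]

posts = {
    "npc_1": ["Candidate A is honest and strong", "I support A"],
    "npc_2": ["A is corrupt", "weak and failed leadership"],
    "npc_3": ["interesting debate tonight", "still not sure who to back"],
}
print(find_undecided(posts))  # ['npc_3']: neutral so far, so target them
```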
Lessons learned
1. Misinformation is easy to create – and effective
One of the most striking findings was how easily participants could influence the simulated election using AI-generated content. Despite limited resources and only a four-week timeframe, teams were able to swing the election result by almost two percentage points – a margin that could be decisive in real-world scenarios.
“Imagine if you did that for a full year,” Dr Pearce says. “With state actors or political campaigns that have access to real user data and unlimited resources, the potential for manipulation is enormous.”
2. Bots are becoming indistinguishable from humans
Participants built bots that mimicked human behaviour with alarming accuracy. These bots weren’t just spamming generic messages – they were context-aware, persuasive, and capable of engaging in nuanced conversations.
“Bots today can simulate vloggers, hold conversations, and even propose relationships or financial investments,” says Dr Masood. “They’re sophisticated enough to fool even tech-savvy users.”
3. The power of scale and coordination
Teams used tactics like coordinated bot amplification – where one bot would post content and others would immediately like or share it – to game the platform’s trending algorithms. This mirrors real-world tactics used to manipulate visibility and perceived popularity of content.
“One team had 40 bots. One would post, and the other 39 would like it within minutes,” Dr Pearce says. “That was enough to push it to the top of the trending page.”
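That amplification loop needs remarkably little machinery. The sketch below assumes a hypothetical Client wrapper around the sandbox platform’s bot API; the real interface used in the competition wasn’t published.

```python
# Minimal sketch of coordinated amplification inside the sandbox.
# `Client` is a hypothetical stand-in for the simulated platform's bot
# API; the real interface from the competition wasn't published.
import itertools
import time

class Client:
    _ids = itertools.count()      # shared post-ID counter
    likes: dict[int, int] = {}    # post ID -> like count

    def __init__(self, handle: str):
        self.handle = handle

    def post(self, text: str) -> int:
        post_id = next(Client._ids)
        Client.likes[post_id] = 0
        return post_id

    def like(self, post_id: int) -> None:
        Client.likes[post_id] += 1

def amplify(poster: Client, amplifiers: list[Client], text: str,
            delay: float = 0.1) -> int:
    """One bot posts; the rest like it moments later to game trending."""
    post_id = poster.post(text)
    for bot in amplifiers:
        time.sleep(delay)         # stagger likes so the burst looks organic
        bot.like(post_id)
    return post_id

bots = [Client(f"bot{i:02d}") for i in range(40)]
pid = amplify(bots[0], bots[1:], "Kingston deserves better roads!")
print(Client.likes[pid])          # 39 likes in well under a minute
```

On a platform that ranks trending content by likes per minute, a burst like this from 40 coordinated accounts can be enough to outpace any organic engagement.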
4. The real threat is subtle, not shocking
Contrary to expectations, most teams didn’t resort to hate speech or overtly offensive content. Instead, they crafted seemingly reasonable, persuasive narratives – often using misleading statistics or impersonating credible sources like news outlets.
“It’s not the obviously fake or hateful content that spreads,” Dr Pearce says. “It’s the stuff with a veneer of respectability – that’s what’s really dangerous.”
5. Awareness is the first line of defence
Perhaps the most important takeaway was the need for public awareness. The competition wasn’t about teaching students to deceive – it was about showing them how easily deception could occur, and how to spot it.
“We want people to ask: is this content real? Who made it? Why am I seeing it?” Dr Pearce says.
Dr Masood says vulnerable groups – including older adults, teenagers, and those with lower digital literacy – are particularly at risk.
“Education and awareness campaigns targeted at these groups are essential,” she says.
Implications for democracy and beyond
The researchers say insights from Capture the Narrative extend far beyond the classroom.
“The competition demonstrated how coordinated bot activity can manipulate trending algorithms, amplify polarising content, and create the illusion of consensus – all tactics seen in real-world influence campaigns,” Dr Pearce says.
“But we’re not teaching people to build bots to manipulate others. We’re showing them how easy it is, so they can be more critical of what they see online.”
The competition demonstrated the power and reach of AI and large language models (LLMs) in producing impactful mis- and disinformation online, with players able to measurably swing the simulated election in just four weeks.
This shows a real need for coordinated cyber literacy campaigns, Dr Masood says.
“We need to think beyond detection and prevention. Awareness is the most powerful tool we have right now.”
- Capture the Narrative was run by the UNSW Institute for Cyber Security in collaboration with the UNSW School of Computer Science and Engineering. M&C Saatchi and Day of AI Australia partnered with CTN to provide prize money and development support.
- Read more at the website: https://capturethenarrative.com