Autonomous vehicles promise to revolutionise transport, but a single cyberattack could seriously erode public trust – perhaps even over the longer term (e.g. Marcinkiewicz, V., Zhang, Q., Inatani, T., Ueda, Y., Asada, M., Katsuno, H., and Morgan, P. L., under review).

Research from the UK and Japan, along with emerging evidence from Australia, shows that if self-driving cars were hacked today, the majority of people tested would never get in one again – potentially derailing one of the most transformative technologies of our era before it even takes off.

Whilst the research is based on interactions with video-generated materials and driving-simulation scenarios, the findings are concerning – and the Cardiff University School of Psychology team led by Professor Phil Morgan has put forward a range of recommendations in response.

Prof Morgan has more than 25 years’ experience in the behavioural sciences. The Senior Professor in Cardiff University’s School of Psychology warns that the biggest threat to autonomous vehicles is likely not the technology itself, but our psychological unpreparedness for when things go wrong.

Speaking exclusively to UNSW Institute for Cyber Security members next week, Prof Morgan will present findings that overturn common assumptions about cyber security and automation. 

Prof Morgan says the vulnerability isn’t just theoretical.

“At the moment, with autonomous vehicles, most manufacturers aren’t thinking deeply enough about the effects of potential cyberattacks on end users,” he says. 

“It’s more of a case of ‘off you go into the autonomous vehicle, type in your destination, and everything will be fine’.

“Now think about aviation – one of the safest forms of transport, where passengers receive safety briefings, cards, and videos preparing them for unlikely emergencies. Without similar preparation for other robotic and autonomous systems – as well as AI – users face scenarios where, for example, vehicles might stop and trap them, fail to stop at red lights, or – in a worst-case scenario – entire networks of hundreds of vehicles could be simultaneously compromised.

“It’s one of those very sad realities. If a cyberattack were to occur on autonomous vehicles right now – if there were 10,000 autonomous vehicles deployed in Australia – chances are a lot of people would never get in one again, and that could be very damaging for this potentially revolutionary change in transport – set to save lives, be more environmentally friendly, reduce congestion, and so on.”

Another revelation Prof Morgan will discuss at Kensington next week is that emotional attachment to our devices is a stronger predictor of cyber-safe behaviour than traditional risk factors such as impulsivity or personality type. This finding is linked to his ground-breaking research with the global aerospace giant Airbus.

Prof Morgan, who is developing new definitions and methods to measure trust in cyber security, AI and automation, suggests that the more psychological ownership we have of our devices, the more careful we are likely to be when it comes to cyber security.

“How do we feel about work devices?” he asks.

“Do we feel that we psychologically own them? Can we have, for example, a screensaver of our favourite holiday destination and put stickers on them, so that if we were to lose one or something went wrong, we’d miss it because it feels like part of us?” he says.

“Whereas if we’re given a work device, you might think ‘well, this is just a work device’. You can’t do anything with it. If it gets damaged or lost, that’s bad, but it’s not the end of the world.”

Rethinking security education

Traditional cyber security training is failing, Prof Morgan says bluntly. 

One-day awareness sessions might hold attention briefly, but within a month, people revert to old behaviours. His research shows that straightforward education and awareness “doesn’t work” as a standalone solution.

The alternative is immersive learning – scenarios where people experience simulated cyberattacks in safe environments. 

He says when someone personally witnesses their social media account being compromised in 18 seconds during a simulation – a true story – they internalise the risk in ways lectures cannot achieve. They also become advocates, sharing their experiences with friends and colleagues.

This approach reflects a fundamental shift in thinking. Rather than labelling humans as “the weakest link” – responsible for 85–95% of cyber breaches – Prof Morgan again draws parallels to aviation safety. Decades of applied psychological research transformed commercial flight into the safest form of travel despite inherent human vulnerabilities.

“We know what the problems are. This is rooted in psychological and other theory that’s been around for years. We’ve fixed it in other domains,” he says.

A multidisciplinary future

Solving cyber security challenges in an automated world requires collaboration across psychology, computer science, engineering, sociology, law, politics, and anthropology, Prof Morgan says.

No single discipline holds all the answers.

The work is supported by major industry partnerships, including Airbus, which employs 1,200 people in cyber security across the UK, France, and Germany. Prof Morgan directs Airbus’s Human-Centred Cyber Security Centre and leads a strategic partnership between Cardiff University and Airbus Global.

As autonomous systems proliferate – from vehicles to AI assistants to robotics – human factors become more critical, not less.

The choice isn’t between perfect security and acceptance of vulnerability. It’s between reactive crisis management and proactive preparation, grounded in understanding how humans actually interact with technology.

  • Professor Phil Morgan will hold an exclusive talk with IFCyber members in Kensington’s CSE Building Basement (K17) on Tuesday, 1 December at 1pm. He will shine a spotlight on how we can better understand and measure our cyber vulnerabilities in order to fight back and achieve a state of seamless security and privacy in symbiosis with the AI, robotic and autonomous systems with which we increasingly share the world.


Professor Phil Morgan. Supplied