Scientists at the University of Adelaide’s School of Computer Science will work on a new set of defence projects focused on advanced cyber capabilities.
The university has signed agreements with the federal government’s Defence Science and Technology Group (DSTG) for research led by professor Debi Ashenden, associate professor Hung Nguyen and associate professor Damith Ranasinghe.
“These projects are funded under the Next Generation Technologies Fund’s cyber program and are four of the 11 that have been given the go-ahead nationally,” said professor Michael Webb, Defence and Security Institute (DSI) director and the University of Adelaide’s Defence, Cyber and Space academic coordinator.
“Projects such as these help keep Australia safe and enable our researchers to collaborate with like-minded organisations at the highest levels.”
Ashenden holds the DST Group-University of Adelaide joint chair in Cyber Security. Her project is part of DSTG’s Next Generation Technologies Fund Cyber Call 2020 – Fusing Behavioural Science and Cyber Deception – Fighting Wars from Inside Machines.
“My project aims to fuse behavioural research on deception with cyber deception technology, artificial intelligence and machine learning (AI/ML),” she said.
“Our intention is to develop cyber deception threat models and effects that integrate behavioural science with technology, alongside a toolkit that will assist in delivering novel cyber deception effects. We will explore the limits of how AI and ML methods can improve and automate cyber deception.”
The research will build a sovereign capability with the aim of increasing the operational advantage of Australian defence.
Ashenden’s project will forge a new research partnership between DST Group, Australian-based cyber technology company Penten, Deakin University’s Applied Artificial Intelligence Institute (A2I2), and the UK’s National Cyber Deception Lab.
Nguyen leads the Defence, Cyber and Space theme in the Faculty of Engineering, Computer and Mathematical Sciences.
“Network configuration inconsistencies, such as policy conflicts, are a common occurrence in computer networks, and they can, and do, leave networks open to cyber-attacks,” he said.
“This three-year research program will develop methods that address the overwhelming complexity of managing network configurations and security.
“Our solutions will build on the formal metagraph abstraction that we developed at the University of Adelaide and will help significantly reduce the attack surface of Australian critical infrastructure.”
The research will provide Australian defence and industry with a unique cyber assurance capability through a new research partnership led by the University of Adelaide with DST Group and communications and networking technology provider Cisco Systems Australia.
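In a metagraph, edges connect whole sets of endpoints rather than individual nodes, which makes overlapping and contradictory policies easier to reason about. The sketch below is only a toy illustration of the kind of conflict such an analysis surfaces; the rule fields, names and conflict test are assumptions made for this example, not the University of Adelaide formalism or toolkit.

```python
# Illustrative sketch only: a toy check for conflicting network policy rules,
# loosely inspired by set-based (metagraph-style) policy representations.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    name: str
    sources: frozenset       # set of source hosts the rule covers (assumed field)
    destinations: frozenset  # set of destination hosts the rule covers (assumed field)
    action: str              # "allow" or "deny"

def conflicts(a: Rule, b: Rule) -> bool:
    """Two rules conflict if they cover overlapping traffic but disagree on the action."""
    overlapping = a.sources & b.sources and a.destinations & b.destinations
    return bool(overlapping) and a.action != b.action

rules = [
    Rule("r1", frozenset({"hr-1", "hr-2"}), frozenset({"db-1"}), "allow"),
    Rule("r2", frozenset({"hr-2"}), frozenset({"db-1"}), "deny"),
]

for i, a in enumerate(rules):
    for b in rules[i + 1:]:
        if conflicts(a, b):
            print(f"policy conflict: {a.name} disagrees with {b.name}")
```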
Ranasinghe works in the fields of computer and cyber security, pervasive computing and machine learning. He will lead two projects and build new partnerships with research leaders from the University of New South Wales, Deakin University’s Centre for Cyber Security Research & Innovation (CSRI), DST Group and CSIRO’s Data61.
“Deploying new technologies without understanding the risks is dangerous. Our first project will look at the challenging problem of how we build trustworthy AI for the autonomous systems of the future,” he said.
“Artificial intelligence technologies are becoming increasingly pervasive because of their potential to deliver superhuman performance on some tasks, but the flip side is that AI systems are also very fragile and can be easily fooled and manipulated.”
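That fragility is usually demonstrated with adversarial examples: tiny, targeted changes to an input that push a model toward a wrong answer. The sketch below applies the fast gradient sign method, a standard textbook technique, to a toy untrained classifier and a random input; it illustrates the general idea only and is not drawn from the project’s own methods. The label, model and perturbation budget are assumptions for the example.

```python
# Illustrative sketch only: the fast gradient sign method (FGSM) nudges an
# input in the direction that increases the model's loss.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-ins: an untrained linear classifier and a random 28x28 "image".
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
loss_fn = nn.CrossEntropyLoss()

x = torch.rand(1, 1, 28, 28, requires_grad=True)  # input we will perturb
y = torch.tensor([3])                             # assumed "true" label

# Gradient of the loss with respect to the input, not the model weights.
loss = loss_fn(model(x), y)
loss.backward()

epsilon = 0.1  # assumed perturbation budget
x_adv = (x + epsilon * x.grad.sign()).clamp(0, 1).detach()

print("clean prediction:    ", model(x).argmax(dim=1).item())
print("perturbed prediction:", model(x_adv).argmax(dim=1).item())
```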
In a second project, Ranasinghe will develop faster and more effective methods for discovering software vulnerabilities.
“Our efforts will address the challenges of large-scale automated software testing for discovering reliability, correctness and security vulnerabilities,” Ranasinghe said.
“Given the complexity of software being built today, it is often very hard to test it manually for bugs, especially deeply embedded ones.”
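Large-scale automated testing of this kind is commonly approached with fuzzing: feeding a program large numbers of generated inputs and watching for crashes or incorrect behaviour. The minimal random fuzzer below is an illustrative sketch of that general idea; the buggy parser, input alphabet and trial count are assumptions for the example, not details of the funded research.

```python
# Illustrative sketch only: a minimal random fuzzer that hunts for crashing inputs.
import random
import string

def parse_record(data: str) -> dict:
    """Contrived target with a hidden bug: it assumes exactly one '=' per record."""
    key, value = data.split("=")  # raises ValueError for 0 or 2+ '=' characters
    return {key: value}

def fuzz(target, trials: int = 10_000) -> None:
    """Throw short random strings at the target and report the first crash."""
    random.seed(0)
    alphabet = string.ascii_lowercase + "="
    for _ in range(trials):
        candidate = "".join(random.choice(alphabet) for _ in range(random.randint(0, 8)))
        try:
            target(candidate)
        except Exception as exc:  # a real fuzzer would also triage and minimise the input
            print(f"bug found: {candidate!r} raised {type(exc).__name__}: {exc}")
            return
    print("no crashing input found")

fuzz(parse_record)
```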