AESIN Security Conference

July 20 @ 8:45 am - 5:00 pm

At the AESIN Security Conferences in 2019, 2020 and 2021, industry and academia combined to set out the cyber resilience challenge facing the automotive industry as we transition to greater autonomy, ever-increasing levels of connectivity and vehicle system complexity. We outlined an approach and asked for your help in understanding how to operationalise cyber resilience, what research could be repurposed to kick-start a solution, and how to meet and, where necessary, influence our regulatory obligations.

Now in 2022 we’re back to explain and debate what, with your help, we have done to refine the approach and to demonstrate its technical and economic feasibility at scale.

Hosted by Peter Davies, Director Security Concepts, this Conference will provide insight into the latest thinking and experiences from leading cross-industry players, considering topics such as skills requirements, digital trust, secure architecture, cryptographic keys/authentication, robustness – resilient application interfaces and remote updates/access/firmware OTA (over-the-air).

In three sessions we will cover:

  • Legal responsibility in software tools
  • The indeterminate nature of Failure Mode analysis in AI and Machine Learning systems
  • Building a framework for trust in a regulatorily fragmented world

We will discuss embracing Complexity Science and Decision Making under Uncertainty, how you can apply them to your engineering process, and what we have learned about sustainability in insurance, court, forensics, evidence and significant difference.

Our work during the year has been enabled by collaboration between industrial and academic experts covering the key technologies for the next decade, ranging from the physics of chip design to approximate computing, AI and Data Science, and much more. As we stand on the threshold of a world where immeasurable benefits are achievable by doing things that matter at scale, please join us to understand our vision and debate how you might help enable the future.

When your children ask you where you were when the future was being decided do not be the one to say “I was not there”.


Rachel Craddock, Thales

With 30 years of experience in Artificial Intelligence (AI) and Machine Learning (ML), Dr Rachel Craddock (BSc (Hons), MSc, PhD, CEng, FIET) is a Thales Expert in this field. After working as a Research Fellow in academia, she joined Thales UK, Research Technology and Innovation in 1998. Her EPSRC PhD research developed a form of deep learning using Radial Basis Functions, and was followed by EPSRC funded research applying recurrent neural networks to inverse model control.

Rachel has used AI/ML for a broad range of applications in both the civil and defence domains. She applies AI to extract useful intelligence and system input from big data, and to develop AI-based decision support and control systems.

She is Thales’s leader in AI-based Pattern of Life analytics, winning external recognition through externally funded projects, IEEE conference papers, and partnering with external organisations such as Surrey Police. Over the past few years, Rachel has been applying her AI/ML knowledge to the domain of cyber and cyber-physical security and safety through the ResiCAV set of projects. In 2021, Rachel was one of 100 Highly Commended Finalists in the Women’s Engineering Society 2021 Top 50 Women in Engineering: Engineering Heroes.

Presentation: “The importance of failure mode analysis in systems using Artificial Intelligence and Machine Learning”

Artificial Intelligence (AI) and machine learning systems and components are becoming widespread in many areas of society, including automotive systems. Their uncertain nature, combined with complex, dynamic operational environments, makes system safety and failure difficult to analyse and manage using conventional safety methodologies. This talk will cover the safety problems associated with using AI and machine learning, and will present a methodology that can be used during operational activity to reduce the potential for harm and damage.

Jessica Uguccioni, Law Commission

Jessica is the lead lawyer for the Law Commissions’ automated vehicles review, which published a report in January 2022 with recommendations for a new regulatory framework for automated vehicles for use on GB roads. She continues to advise the Centre for Connected and Autonomous Vehicles on AV policy and has also started a new project on remote driving. Jessica’s work involves extensive public consultation in the UK and abroad. She is a member of the UNECE’s informal group of experts on automated driving (IGEAD) for Working Party 1. Before joining the civil service, Jessica was a lecturer in Law at Durham University and an associate at Cleary Gottlieb Steen & Hamilton in New York and London.

Paul Waller, NCSC

Paul has worked in cryptography and hardware security since graduating with a degree in mathematics in 2001. He has represented the NCSC and its predecessor organisation in various standards bodies, including the Trusted Computing Group, Global Platform and FIDO. His current role as Head of Capability Research allows him to spend time with academic and industry partners learning what the future holds for security technology, and also to help user communities take advantage of new features. Outside of work (when pandemic restrictions allow!) Paul likes to cycle up small hills in summer, and ski down bigger ones in winter.

Paul is one of our hosts for the event.

Chris Hankin, Imperial College London

Chris Hankin gained his PhD from the University of London in 1979. After 5 years as a Lecturer in Computer Science at Westfield College (University of London), he moved to Imperial College London as a Lecturer in Computing in 1984. He was appointed to a Chair in Computing Science in 1995. He has held various senior posts at Imperial College including being Vice Dean for Engineering (2006-2008) and Pro Rector for Research (2004-2006). From 2010-2019, he was Director of the Institute for Security Science and Technology at Imperial College London.

He has been involved in research in computer security since 2000. He led one of the founding projects in the NCSC/UKRI Research Institute in Socio-technical Cyber Security (RISCS); this project developed novel game-theoretic models to support investment decisions in cyber defence and is being further developed through follow-on funding. Since 2014, he has been Director of the NCSC/UKRI Research Institute in Trustworthy Inter-connected Cyber Physical Systems. His research has focused on modelling to identify effective defensive strategies, examining the role of software and hardware diversity in slowing attack propagation, using secure machine learning techniques in anomaly detection, developing metrics for identifying critical components, and designing cyber-physical attack graphs.