
Maths And MI5: The Calculations That Keep The Country Safe

Our director general, Ken McCallum, delivered the annual Bowman Lecture at the University of Glasgow last month.

Ken McCallum delivering the Bowman lecture on 1st June.

These lectures highlight the role of the mathematical sciences in addressing major issues of public importance.

The director general, who is a graduate in mathematics from the University of Glasgow, explored how MI5 uses maths and data science to detect and disrupt threats, and how we are working with partners to protect the UK’s resilience and economic security.

Thank you so much Sir Anton, Adrian. Good evening everyone.

Tonight is a special moment for me. It’s always a pleasure to be north of the border. It’s a joy to see old friends. It’s moving to be standing on the Bute Hall stage where I graduated twenty-seven years ago. And it’s a great privilege to be invited to give this year’s Bowman Lecture.

When I was appointed to this role three and a bit years ago, the Sun newspaper’s headline was “Maths boffin named as MI5 Director General”. I obviously took this as an extravagant compliment. But it wasn’t one I really deserved.

The UK is fortunate to have mathematicians of genuine distinction serving in our intelligence community. Most famously, the codebreakers and codemakers of GCHQ, following in the footsteps of Alan Turing, make a profound contribution to our security through world-leading work in their deep specialism. I salute them. In an alternative life I might have tried to be one of them; in my Honours year here, Professor Ken Brown pointed out that the number theory I was studying could take me in that direction.

I’ll never know whether I would have cut it as a codebreaker – or indeed as any other form of professional mathematician – as I took another path. So I stand before you tonight as, by some distance, the least academically distinguished Bowman lecturer so far. But I think that’s OK, because maths and statistics play such a crucial role in keeping our country safe that there’s plenty to say.

I’m going to take as my starting point the inaugural Bowman lecturer, Sir David Spiegelhalter, who is seated here tonight. I’m blessed to count David as a friend. And I think I’m right in saying that when he was invited to serve as the first Winton Professor at Cambridge, the advertised role was titled “Professor of the Public Understanding of Statistics”. David said he would be delighted to take the role – on the condition that it was renamed “Professor of the Public Understanding of Risk”. And managing risk is what MI5 is all about.

So, after a quick summary of what MI5 does, that’s where I will start tonight: our round-the-clock management of serious risks, where we rarely have certain knowledge but must make difficult decisions based on probabilities. I’ll then touch on some of the areas where we use maths in more specialist ways to build some of our tools, and that will in turn lead naturally on to the partnerships – including, crucially, with our universities – that we need to crack the challenges of our generation.

When that’s all done – I’d hope to come in just under 40 minutes – any member of the audience (apart from a couple of family members who know me too well) can ask me any question they like.

MI5’S ROLE

So first, a couple of minutes on MI5’s role. Just as maths, stats and science can be stereotyped – I of course speak as a “maths boffin” – so too is MI5. I can confirm mathematically that James Bond is 94.83% fiction. Whether it’s in popular culture or in some of the wilder conspiracy theories, the fact that we don’t – can’t – say too much about our work leads to others filling the gap. Some fairly accurately, others… less so. So here is what we actually do.

MI5 exists to deal with threats – mostly hidden threats – of the most serious sort: threats to people’s lives, and threats to our way of life.

Most of my career has been spent fighting terrorism. Week by week we detect and respond to potential threats of lethal violence from Islamist extremist terrorists, from right-wing extremist terrorists, and from rejectionist splinter groups in Northern Ireland. Terrorism is appalling and it’s deeply motivating to have the job of trying to stop such horror and devastation.

Countering terrorism was, and remains, more than enough to fill our days. But the job doesn’t end there I’m afraid. When I left this great university in the mid-90s and joined MI5, the Cold War was over and there was widespread hope that we might enter a more co-operative geopolitical era. I wish we had. But as the people of Ukraine will tell you, that’s not how things turned out.

And so alongside our still-intense work to tackle terrorism, MI5 works to safeguard our nation from rising State threats. Sometimes those are direct threats to life, as in Salisbury, or the more than fifteen occasions since the beginning of last year when we’ve had to deal with potentially deadly threats from the Iranian regime.

More often, though, it’s about threats to our way of life, with foreign powers – most profoundly the leadership of the Chinese Communist Party – using every tool at their disposal to bend the world in ways they think suit them. Some of that of course is legitimate international competition. But where States use methods descending into coercion or deception, interference in our democracy, or theft of critical information – including intellectual capital built by Scotland’s universities and businesses – MI5 has work to do. A particular focus is inevitably around technology, where we face systemic competition that will shape the lives of future generations.

When you’re dealing with risk at scale, it is not a good strategy simply to try to take on every individual actor presenting a threat. It’s smart also to think about how you make your business or your country – or your university – harder to attack in the first place. Helping reduce the UK’s vulnerability to threats is MI5’s second, less famous role. It doesn’t tend to attract filmmakers and thriller-writers, but it’s every bit as vital. I’ll come back to it later.

So, that is MI5’s job in a nutshell. We’re not short of things to do, and it really matters that we’re as effective as we can possibly be. So where does maths come in? I’ll start right at the heart of what we do: managing risk.

RISK, UNCERTAINTY, JUDGEMENTS

If you were to sneak into MI5 tomorrow morning – I strongly advise you not to try – you wouldn’t hear many people debating statistics. You would however find several thousand dedicated professionals discussing risk. Not in the abstract. In a practical, grounded, often urgent way.

Only a few specialists will ever make explicit reference to Bayes’ Theorem – but it underpins our day-to-day operations, just as it informed Alan Turing’s world-changing work at Bletchley. It’s there whenever we make judgements on the probability of X or Y being true, given we know A, believe B and can infer C about a suspected terrorist, and we know from experience the prevalence of various forms of risk. We have to spend our time forensically examining the reliability and credibility of the sources of what we think we know, and working out how much weight we can place on our always-incomplete knowledge.
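To make that idea concrete, here is a minimal sketch of the kind of Bayesian update being described: a prior rate at which leads of a given type turn out to be genuine, revised in the light of an indicator. The prior and likelihood figures are invented purely for illustration and bear no relation to real casework.

```python
# A toy Bayesian update, purely illustrative: the prevalence and likelihood
# figures below are invented for the example, not real operational data.

def posterior(prior: float, p_given_true: float, p_given_false: float) -> float:
    """Bayes' Theorem: P(threat | indicator) from a prior and two likelihoods."""
    numerator = p_given_true * prior
    evidence = numerator + p_given_false * (1.0 - prior)
    return numerator / evidence

# Suppose a lead of this kind turns out to be genuine 1 time in 100 (the prior),
# a genuine threat produces this indicator 80% of the time, and a benign
# explanation produces it 5% of the time.
p = posterior(prior=0.01, p_given_true=0.80, p_given_false=0.05)
print(f"P(genuine threat | indicator) = {p:.2%}")   # roughly 14%

# A second, independent indicator updates the picture again: yesterday's
# posterior becomes today's prior.
p2 = posterior(prior=p, p_given_true=0.60, p_given_false=0.10)
print(f"After a second indicator: {p2:.2%}")
```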

Such work is not just intellectually challenging (though it is). Dealing with risks which are often unlikely to manifest but horrifying in their potential impact requires not just fine judgement, but courage too. In MI5’s work, false positives are bad. We never want to intrude on someone’s privacy unnecessarily. We don’t want to use up any of our finite resources on a threat which turns out not to be real, when plenty of others are. But even worse than false positives are false negatives. Every day we have to take decisions not to allocate capacity to some of the potential leads which come in the door. This is hard, and I have the greatest respect for the dedicated people of MI5 who step forward to shoulder such weighty responsibility on behalf of their fellow citizens.

Every day MI5’s people take thousands of decisions based on their assessment of risk and the probability of threat materialising:

  • Is this piece of information credible? How worrying is it really? Is it a departure from what we would normally expect?
  • What do we know as fact? What do we assess to be likely, and how likely is it?
  • What are the chances we have found everything we need to? What other information could we usefully gather to ensure nothing material has been missed? Are these steps proportionate to the risk we think we might be facing?
  • What is the impact of acting in this case? And what is the impact of not acting?

We ask ourselves these questions in the particular and very serious context of national security risks. But as questions they apply to all sorts of situations of uncertainty and risk assessment, from medicine to insurance to investment markets. The Prime Minister recently spoke about wanting maths to be prized as a “key skill every bit as essential as reading”. I don’t know how many of MI5’s people even think of themselves as having mathematical or statistical skills. But they’ve got them.

In saying that, I’m not suggesting that our task can somehow be reduced to numbers calculated to three decimal places. The mathematician John von Neumann said that “If people do not realise that mathematics is simple, it is only because they do not realise how complicated life is.” Life is complicated. And so are MI5’s investigations. Terrorists and spies go out of their way to conceal their intentions and hide in crowds. There isn’t enough terrorism – thank heavens – to generate the kinds of massive datasets that some statistical methods need to come into play. So we have to rely hugely on nuanced human judgements to make sense of noisy, confusing, sometimes sparse information.

But even then we find that judgements made within a disciplined structure and system are better than judgements made more loosely. In striving every day to be as good as we can be at making sometimes near-impossible choices, we must give our people, in supportive teams, the training, technology, skills and tools to make the best decisions they can. And back them when it’s really difficult.

I could fill this whole lecture with examples of the risk judgements we have to make. But it’s probably best I don’t tell the world, and thus those who mean us harm, too much. So let’s instead take a tour of a few more specialised uses of mathematical and statistical methods that may have particular resonance in this audience.

HARNESSING THE POWER OF MATHS: AI

As my first example: it’s not a secret that MI5 needs to make sense of masses of audio data, as we strive to penetrate the intentions and plans of spies and of terrorists. I will leave to your imagination the gizmos we might use to collect audio data; but suffice to say that a good chunk of it is obtained in demanding circumstances – and is thus quite difficult to decipher.

When I joined the organisation, converting that raw audio into useful information was a painstaking task for hundreds of professionals sat wearing headphones. Today, we want to automate as much as possible of that foundational conversion of audio into searchable text – freeing up our analysts to focus on extracting the intelligence insights that count. But the challenging nature of our audio data means that commercial speech-to-text solutions often can’t do what we need – at least not with the precision that high-stakes work rightly demands.

So our data scientists build, train, and deploy our own machine learning models, continually improving them based on real feedback – giving our people a huge productivity boost, enabling them to apply their analytical skills to the true secrets and mysteries. This interplay between maths, computing science, engineering and human expertise is a critical dynamic for us. Building a model is just the first step; the real test is using it on live operations. Applied maths at its sharpest.
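The lecture does not describe the actual pipeline, but the general pattern – transcribe, compare against an analyst’s corrected transcript, and use the error rate to decide where further fine-tuning is needed – can be sketched with openly available tools. In the sketch below, the model name, file path and reference transcript are placeholders, and nothing here reflects a real MI5 system.

```python
# Illustrative only: an open-source speech-to-text feedback loop of the general
# shape described above, not MI5's actual system. Model name and file paths
# are placeholders.
from transformers import pipeline   # Hugging Face transformers
import jiwer                        # word-error-rate metric

# Load an openly available automatic-speech-recognition model.
asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")

def transcribe(audio_path: str) -> str:
    """Convert a raw audio file into searchable text."""
    return asr(audio_path)["text"]

def feedback_score(audio_path: str, analyst_transcript: str) -> float:
    """Word error rate of the model against an analyst's corrected transcript.

    High-error recordings are the candidates for further fine-tuning, so the
    model keeps improving on the hardest audio rather than the easy cases.
    """
    hypothesis = transcribe(audio_path)
    return jiwer.wer(analyst_transcript, hypothesis)

if __name__ == "__main__":
    wer = feedback_score("noisy_recording.wav", "the agreed meeting point is ...")
    print(f"word error rate: {wer:.2f}")
```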

A second example of applying AI is in detecting violence in images. Understanding whether, say, a prolific contributor to extreme right-wing online forums is also watching graphic beheading videos can help in assessing the level of risk they might pose. But we don’t need or want to view all the sport they’re also watching. So with another grateful nod to Alan Turing, we again turn to Machine Learning. We have put in place automated capabilities to detect violent material within large data streams.
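Purely as an illustration of the general technique rather than of any real capability: a pretrained image classifier can be re-purposed as a binary “flag for human review / ignore” filter over frames in a data stream. The labels, threshold and untrained classification head below are placeholders; in practice the head would be fine-tuned on labelled examples.

```python
# Illustrative sketch of binary image triage with a pretrained backbone.
# Labels, threshold and weights are placeholders, not a real deployed system.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet-style preprocessing.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Pretrained backbone with a new two-class head ("ignore" vs "flag").
# The head is left untrained here purely to show the structure.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)
model.eval()

FLAG_THRESHOLD = 0.9  # placeholder: tuned to trade false positives vs negatives

def should_flag(image_path: str) -> bool:
    """Return True if the frame should be queued for human review."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(batch), dim=1)
    return probs[0, 1].item() >= FLAG_THRESHOLD
```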

To take an example of a quite different sort: quite often, precious clues to terrorist or espionage activity will crop up in data which is not held by MI5, but by private companies. Breakthroughs can be made in the overlap between what MI5 knows about thousands of people of concern, and data held in bulk by businesses on financial transactions, say, or travel. There are compelling reasons to be interested in that overlap – but you don’t want either party having to share its own sensitive information at mass scale.

This is where so-called Privacy Enhancing Technologies can step in – allowing different organisations to collaborate on the sensitive data each holds, but without unnecessarily disclosing their data to each other. One of these technologies – Fully Homomorphic Encryption – is being developed by some of the most gifted mathematicians in universities and start-ups around the world, and has real potential. It involves insanely hard maths. Maths which will help keep our country safe and minimise intrusion into privacy. Win-win.
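Fully Homomorphic Encryption itself is too involved for a short example, but the core idea – computing on data while it stays encrypted – can be illustrated with the simpler, additively homomorphic Paillier scheme, here via the open-source python-paillier (phe) package. The scenario and figures below are invented for the example.

```python
# A toy illustration of computing on encrypted data, using the additively
# homomorphic Paillier scheme from the open-source `phe` package. Fully
# homomorphic encryption extends the same idea to arbitrary computation.
from phe import paillier

# One party generates a keypair and shares only the public key.
public_key, private_key = paillier.generate_paillier_keypair()

# A data holder encrypts sensitive values with the public key...
encrypted_amounts = [public_key.encrypt(x) for x in [250, 4800, 120]]

# ...and another party can sum them without ever seeing the plaintexts.
encrypted_total = sum(encrypted_amounts[1:], encrypted_amounts[0])

# Only the key holder can decrypt the result.
print(private_key.decrypt(encrypted_total))  # 5170
```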

We use many other areas of mathematics that I don’t have time to describe properly tonight: modelling, simulation and optimisation to support our own operational efficiency; applied maths in the design of sensors and of protective systems; discrete maths in analysing cyber-vulnerabilities; systems and control theory in understanding resilience; I could go on...

These examples are just a taste of the ways in which MI5 is seizing the opportunities presented by AI and by other ground-breaking technologies.

AI is really a subject for another lecture on another occasion, on which others – such as my old computer science professor Dame Muffy Calder [indicate towards DMC in audience] – can speak with more authority than me. So I’m not going to linger much longer on it. But this is huge. To quote the UK’s AI strategy, “AI may be one of the most important innovations in human history, and the government believes it is critical to both our economic and national security that the UK prepares for the opportunities AI brings, and that the country is at the forefront of solving the complex challenges posed by increased use of AI.”

MI5 is an organisation that needs to be resistant to hype, and even we would agree that recent advances in large language models and generative AI – supercharging neural network approaches using huge amounts of training data and ever-expanding processing power – are changing our world. Your world and my world alike.

SOCIETAL RESPONSE AND ETHICS

Hand-in-glove with all the opportunities are the hazards. Technology of this power demands a wise societal response; the Government’s recently published White Paper on the future of AI regulation is important. We need the mathematicians and computer scientists to keep challenging themselves on the accuracy and fairness of the models and systems they’re building. But safeguarding isn’t just on the technical experts; it needs a multi-disciplinary effort – from lawyers, through behavioural scientists, to end users – to build truly effective systems, and to deploy them in ways that are proportionate, ethical and safe.

Getting all this right matters as much to MI5 as to the rest of you. We make high stakes decisions; where AI can sharpen our judgements, our productivity or our prioritisation, we’d be negligent not to use it. But just as with all our other tools and methods, we have to use it responsibly. The whole point of MI5’s mission is to protect the core values of our democracy; if we were to use AI recklessly, we’d be undermining the very things we exist to protect. So as we adopt cutting-edge technologies, we give specific attention to embedding our ethical standards into our use of them, with our anchor the enduring principles of fairness, necessity and proportionality.

That’s why we’re glad to operate within the strict legal and oversight framework laid down by Parliament. The legislation that governs our work makes clear what we can – and cannot – do and for what purposes. The Investigatory Powers Commissioner, Sir Brian Leveson, and his team conduct robust inspections of what we’re doing with our data – and they call us out if we slip up. If anyone has concerns about how we’re operating, they can approach the Investigatory Powers Tribunal.

Even if such independent scrutiny wasn’t in place, our people would still want to act lawfully, responsibly and ethically; those are the values they joined MI5 to uphold. But we’re glad that we are held to account. It’s not something you see happening to intelligence services in nations that do not subscribe to our values. It is a hallmark of a healthy State.

HARNESSING THE BRILLIANCE OF OTHERS…

Implicit in a lot of what I’ve said, but I think worth making explicit, is the massive extent to which national security in the 2020s is a team effort. To stay effective across diversifying threats and evolving technologies, we need to work not just with traditional partners like MI6, GCHQ, the police and the military, but with a far wider range of partners than the MI5 I joined. That takes us in all sorts of directions – from asking online retailers to detect potentially suspicious purchases of chemicals, to engaging with safeguarding professionals to inform decision-making in cases where vulnerable young people are drawn towards extremism and terrorism, posing risks to their communities but also to themselves.

Another of the places it takes us, you won’t be surprised to hear, is tapping into the UK’s world-class expertise in maths, and in the sciences more broadly. We increasingly work in partnership with our universities, research institutes and cutting-edge businesses, to combine complementary expertise to solve hard problems and build the capabilities our nation needs. To give a few examples:

  • Last year we made public MI5’s collaboration with The Alan Turing Institute, which brings together experts to apply AI research to national security challenges. MI5 draws on the ATI’s world-leading research to ensure our use of AI keeps pace.
  • The intelligence agencies support the Heilbronn Institute for Mathematical Research, a strategic partnership with UK universities within which academic mathematicians contribute to problems of national security.
  • Likewise, mathematicians at the Royal Academy of Engineering, the Royal Society and the new National Academy for the Mathematical Sciences lend their expertise to addressing challenges with national security significance.
  • The National Security Strategic Investment Fund – NSSIF – brings together government funding and expertise with advanced tech firms and venture capital investment. By working with entrepreneurs across the UK, NSSIF is both developing next-generation security technology and boosting innovation and economic competitiveness.

It’s through collaborations like these that together we will harness the brilliance, ingenuity and creativity of the UK’s maths and science community to keep our country safe in the years ahead. MI5 has incredibly challenging and interesting real-world problems to solve.

Solving them, together, is hugely rewarding. It is our generation’s version of the game-changing work done on the WWII codebreaking challenges, which saw UK universities leading the way on the development of the modern computer.

…AND PROTECTING THAT BRILLIANCE

There is another rapidly-growing and equally vital dimension to MI5’s collaboration with academia and industry. And this is the one part of the evening where I need to say something challenging.

The UK’s great research universities – of which Scotland’s are such a proud part – rank as one of our foremost national strengths. They are one of the things the world admires most in the UK; up there with our creative industries, our internationally respected judiciary… Scotland’s distilleries.

The excellence of our universities is hugely good news for the nations of the UK, enriching our culture and supporting our economy. But it comes with a catch: just as your research excellence means organisations like mine are keen to partner with you, it also means that hostile actors working for other States make it their business to take your hard work, and use it for their gain. We see this happening with dispiriting regularity. Precisely because our great universities are so great and rightly prize openness, they are magnetic targets for espionage and manipulation.

It’s always been the case that scientific and technical excellence has been an important component of national power. The space race of the 50s and 60s was perhaps the most iconic scientific and technical contest of the Cold War; but probably even more decisive, through the 70s and 80s, was the way that the rise of Silicon Valley enabled US economic and military capability to pull far ahead of the Soviet Union.

Today's contest for scientific and technological advantage is not a re-run of what we had in the Cold War – but it is every bit as far-reaching. Systemic competition means just that. If your field of research is relevant to, say, advanced materials, or quantum computing, or AI, or biotech – to name but a few – your work will be of interest to people employed by States who do not share our values.

That interest can be subtle. The attractive conference invitation or collaboration proposal. Engagement from postgrad students who are connected back to, or can be pressured by, State organisations we wouldn’t want to assist. It can be institutional: the strategic partnerships, donations with strings, investment proposals or jointly-funded research that build dependency. Or through cyber security vulnerabilities that leave your most valuable information exposed.

These aren’t hypotheticals. They’re things MI5 sees in investigations week by week, and they happen in universities just like Glasgow.

This is where MI5’s protective role – the bit I mentioned earlier that doesn’t tend to attract filmmakers – comes in.

In March, government announced the creation of the National Protective Security Authority. The NPSA, a part of MI5, is there to offer expert, practical advice and training to UK businesses, institutions, universities – to help them protect themselves from growing risks.

This is not a new concept; NPSA builds on the work of a predecessor organisation, CPNI, which for decades provided advice principally to the operators of critical infrastructure. That mission still matters – but in the era we’re now in, the scope of what needs protecting is way broader.

And so NPSA is seeking to reach many more customers – at least ten times more over the next two years. Including, pivotally, deepening our engagement with academia. We’re not going from a standing start; the Trusted Research campaign, co-created with academia, helps universities make the most of international research collaborations, informed about the threats and armed with appropriate mitigations. If you do only one thing off the back of tonight’s lecture, take a look at the guidance on the NPSA website.

Take-up has been great – 84% of UK research organisations have begun to adapt their processes in response. A number of universities have developed their own Trusted Research teams. Glasgow hosted the most recent Trusted Research STEM universities forum in February.

Further campaigns will follow, and the learning will go both ways. It has to be a dialogue. Because there is a real conundrum in the middle of all this. We don’t want to pull up some imaginary drawbridge and cut ourselves off from the world. We can’t afford to smother the very excellence that we are trying to protect.

But neither can we afford to adopt an ostrich policy. This is not about being xenophobic; I earnestly hope that the amazing peoples of Russia, Iran and China enjoy greater freedom in decades to come. But if you look at what Putin’s military and mercenaries are doing in Ukraine; at the Iranian regime’s ongoing suppression of its own people; at the restrictions of freedoms in Hong Kong and human rights violations in Xinjiang, or China’s escalatory activity around Taiwan – I don’t think you want the fruits of your inspiration and perspiration to be turned to the advantage of the Russian, Iranian or Chinese governments.

I get that many researchers would rather not have to think about security; it can seem remote from what interests them, perhaps even against the grain of the openness they rightly value. I would say security is a requirement for research integrity and a prerequisite for openness rather than set in opposition to it. Whether we like it or not, universities are participants in the global contest I’m describing – and need to make conscious choices about the role they’re going to play.

So I really welcome invitations like this evening’s, which show the University of Glasgow, amongst many others, to be alive to these issues and up for the conversation. Please make the effort – with our support – to make those difficult choices consciously. Future generations will thank you.

MI5’S PEOPLE

Universities like Glasgow don’t just produce precious research. They also mould capable people. So let me make a direct appeal to anyone who fancies applying their talents to the challenges I’ve been describing.

The more we succeed in harnessing the talents of all sorts of people – either working for us as members of staff, or contributing to our wide network of partnerships – the stronger our ability to protect you. It’s vital that MI5 reflects the whole of the population we serve. It’s the right thing to do and it is the smart thing to do, because it gives us access to a greater range of skills, perspectives, backgrounds and experience. We are already more diverse than most people think – 48% of MI5’s people are women, and last year was our best-ever year for recruiting joiners of minority ethnicity.

But we still have more to do. Too many people still rule themselves out because they just don’t picture themselves in MI5. So we keep pushing ourselves to reach different kinds of audiences, who might otherwise never think of us as an option. That’s why I’m delighted, for example, that this summer we are running a data science and machine learning summer experience, delivered jointly with the Alan Turing Institute. This will allow school-age students – particularly those from lower socioeconomic backgrounds – to learn about data science and its application to national security. This is all in the hope that these young people and their peers consider a future career in defence and security and in AI.

I’m sure we’ll get strong take-up. But you don’t have to be a lucky school-age student to find a way into MI5. We recruit people at the start of their careers, and people further along with experience to bring. And you don’t have to be a graduate – as just one example of another route in, we’ve been training apprentices since 2012, including on our software engineer and cyber schemes.

So please don’t wait for the legendary tap on the shoulder. All of our jobs are available on our website. We recruit pretty much like everyone else, with a bit of extra vetting as you would hope. We’re a modern, caring employer, where wellbeing is a priority and you often have to leave your work behind when you pack up at the end of the day.

The reason I’ve stayed at MI5 for so long – the reason I’m so proud to lead MI5 – is that, in our quiet way, we make a huge difference. The mission and the sense of team is unparalleled. You won’t be a tech billionaire. And you certainly won’t be famous. But if you’re up for the adventure, we offer one-of-a-kind careers, where you get to do some unique things, working in teams alongside other committed, selfless people who share your determination to make that difference for the sake of our fellow citizens. Not every day is an easy day. But every day counts.

Click on MI5’s website. It might change your life. It might save someone else’s.

CONCLUSION

Thank you all so much for your attention. I won’t attempt to draw together a summary of the ground I’ve covered. But as I start to wind up, let me share another quotation, this one a favourite of mine from Albert Einstein, who said “One must divide one’s time between politics and equations. But equations are much more important to me, because politics is for the present, but an equation is something for eternity”.

You can kind of see why this quote has stuck with me. ‘Politics’ as such is not my thing – MI5 is a strictly apolitical organisation – but clearly I have devoted much of my life to equations and to current affairs. And I suppose what tonight’s talk maybe illustrates is that the path I’ve followed hasn’t felt like dividing my time between security and equations; more that they are pretty strongly connected.

Which takes me to the thought I’d like to leave you with. Since I graduated on this stage more than 25 years ago, the world has changed enormously. It will change even more in the next twenty five. Technological advances, underpinned by maths, are driving big shifts in how life is lived on our planet.

For MI5, keeping our country safe across the coming decades means continuing to develop and adapt the unique spycraft we’ve been honing for more than a century – and combining that tradecraft with vital skills, perspectives and data held in other places. Including in our universities. Including specifically by our mathematicians and statisticians. We need you. Given the adversaries out there, you need us. And that is why I was delighted to be asked to give this year’s Bowman Lecture.

Thank you.

 

Channel website: https://www.mi5.gov.uk/

Original article link: https://www.mi5.gov.uk/news/director-general-speech-at-university-of-glasgow
