Transcript: Iain Armstrong podcast

Iain Armstrong is Regulatory Affairs Practice Lead at ComplyAdvantage.

The following is a transcript of the podcast episode “Iain Armstrong on the state of financial crime in 2024” between GRIP senior reporter Carmen Cracknell and Iain Armstrong, Regulatory Affairs Practice Lead at ComplyAdvantage.


Carmen Cracknell: Iain, it’s great to have you here. Thank you for joining us on the GRIP Podcast. Can you please start by talking a bit about your background and what you do at ComplyAdvantage?

Iain Armstrong: Yeah, absolutely. I’d be very happy to. And thank you for inviting me on the podcast today.

My background, if we start with that, I cut my teeth at the regulator, which at the time I joined was still called the Financial Services Authority. And I worked there in the enforcement division. Over the course of getting on for 10 years, I worked on various different things. I started on a big misconduct investigation.

But once that was being wound down, I moved into what they call unauthorized business, which is effectively the criminal investigations end of regulatory enforcement. And that’s really where I got my entrée into the world of financial crime, if you like, because it was very much focused on going after what they used to call “boiler room fraudsters”. They don’t really use that phrase very much anymore. They tend now to talk about investment fraud, but yeah, it was very fraud focused.

I worked a lot with law enforcement agencies, both here and overseas, as well as with agencies like the FBI, Interpol, etc. From there, I moved into banking. I was at a number of the large UK banks working in various different financial crime risk and compliance roles, both in the second line of defense and the first line of defense. And I’ve been at ComplyAdvantage for just coming up to a year and a half now. And I’m really, really loving working in such a different environment.

My role broadly is kind of focused across the whole organization, actually, so we’ll do a bit of everything. I sometimes refer to us as part of the discovery engine of ComplyAdvantage, in that we really try to help, first and foremost, our product owners and product managers to sort of understand the needs of our end users, but also to work with our commercial teams to make sure existing customers are kept happy.

And again, that partly involves sort of a little bit of interpretation of what the underlying drivers and needs of clients are. And obviously, where we have prospects, particularly where we have larger enterprise prospects, I’m part of the process of winning new business there.

Carmen Cracknell: Having that insider background at the regulator, how has that helped in your current position in particular?

Iain Armstrong: Well, it helps a lot, actually, because I understand what it is to launch an enforcement investigation, apart from anything else. So I keep my ear fairly close to the ground on regulatory developments. I mean, it’s been close to 14 years, I think, since I left. So obviously, things have moved on, but I do try and keep myself apprised of what’s going on in regulatory enforcement, both here and across the globe, because obviously, we have quite a large geographical footprint. We have clients in over 70 countries. And I think my kind of grounding in regulation and in enforcement is very useful in that. I’ve sort of been on both sides of that table.

So I kind of understand the things that tend to drive enforcement investigations. And I do a lot of horizon scanning. So I’m constantly trying to anticipate what the regulators might be looking at sort of next year or in three years time, and what they’ll be looking at from a supervision perspective, which will then kind of feed into the things which they choose to investigate in enforcement. And then obviously, I have been at banks that have been subject to regulatory scrutiny.

So it’s not necessarily just my grounding at the regulator. I think it’s the fact that I’ve been on both sides of the table that I can hopefully bring to bear in helping to shape our products, working with our very talented product managers and product owners, to be something that is going to be useful in the industry and is really going to serve the needs of clients, by which I mean not just the compliance departments of financial institutions, but also the operations departments who often have to do the heavy lifting in terms of the outputs of solutions like ours.

Carmen Cracknell: So ComplyAdvantage’s latest report, which you contributed to, The State of Financial Crime 2024. As we said before we started, I looked at this pretty much exactly this time last year with your colleague, Alia. And the focus then seemed to be fraud, the metaverse, crowdfunding came up a lot in relation to extremism, the challenges of open finance. It looks like there was a slightly different focus this year, a bit of a shift. What were the biggest trends this year, and the biggest differences in your view compared with last year?

Iain Armstrong: Yeah, sure. And I would certainly say all of those things that you spoke to Alia about a year ago are still massively relevant. Obviously, crowdfunding: we were talking just the other week on the subject of crowdfunding and its overlap with terrorist financing.

But you’re right, things have changed. There have been some sort of notable differences, possibly unsurprisingly, one of the biggest I think is around perceptions of artificial intelligence and what that means. So we’ve seen that out of the financial industry respondents who we surveyed, two thirds, so around 66% think that the use of artificial intelligence by fraudsters to enhance the scale and sophistication of their attacks is now an increasing threat, a growing threat.

On the other hand, we’ve seen some really interesting divergences in how consumers think about the subject of artificial intelligence. One thing we did differently this year, I should say, is we surveyed consumers as well as industry professionals, and that revealed some quite interesting divergences. We found that a quarter of consumers were not comfortable with their banks using artificial intelligence to help prevent and detect crime.

The banks themselves are, as you might imagine, very enthusiastic about using it. So 86% of financial institutions said that their company was going to be investing in new technologies in the coming year, including AI.

So we found it quite interesting that there’s that sort of divergence that banks are clearly looking to make more and better use of these technologies, whereas consumers are slightly reticent about their use. And in fact, we asked the financial industry respondents directly whether they felt it was necessary to prioritise explaining to their customers how they use artificial intelligence. And only 53% of them said it was.

So just to unpack that a little bit, that is a situation where you have a quarter of everyday consumers not comfortable with the use of artificial intelligence. Most banks are very comfortable with using it and in fact planning to use it more. But those same banks don’t really feel it necessary to explain to their customers how they’re using it.

So I think that we’re probably in for some further debate on that. And my theory is, in the long run, the consumers are definitely going to win out there. And so banks will over time have to get better at explaining to their customer bases exactly how AI is being used. So the analogy I draw is probably all of us have received one of those emails from our bank warning us about fraud and telling us a bit about fraud typologies. I suspect we’ll see that type of campaign being used in future. But instead of warning about fraud, it will be designed to educate the average consumer about how artificial intelligence is used and how it can be a force for good in protecting them and their money.

Carmen Cracknell: Yeah, so how does this fit in with the explosion in open banking? Real time payments could grow threefold in the next few years, with 60% of the population expected to use it. So how’s this going to work? What sort of trust levels are consumers going to have in open banking?

Iain Armstrong: Yeah, I love this question because I’ve been saying for many, many years that open banking’s going to change everything. I remember when I was still at HSBC back in 2016 giving a talk about it. And it hasn’t quite happened, I think, is the reality of the situation. It keeps threatening to happen. But actually, if you look at the uptake of usage of open banking services, we’re still fairly near the bottom of the hill, I think.

And particularly, I mean, in Europe, we’re a little bit ahead, I think, compared with places like America where there’s really very little use of open banking still. But I suppose the point I’m making is, I think it remains to be seen how widely it’s going to be taken up. I still firmly believe that one day it will be very, very widely used technology. But we haven’t quite got there yet. And so to your question of how does this brave new world of artificial intelligence and applications of machine learning, etc. tie in with open banking? I’d say it’s probably still a little bit too early to say.

I use open banking. I expect you do. But I think, you know, by doing podcasts like this, we’re perhaps self-selecting participants there. And I think the reality is a lot of the general public don’t really know about it or make much use of it. I think when we see that number start to tick up, then I can pretty much guarantee there will be people waiting in the wings, looking at, you know, how you can apply more effectively something like, say, a machine learning technology to that space. But I just don’t think we’re there yet, to be honest.

Carmen Cracknell: Another thing that came up in your report, and I hear a lot in the industry in general, is the challenge of finding adequately trained compliance staff, especially with kind of the explosion in technology. So what’s, in your view, more important? Having the skilled teams or the best technology and how can you have both?

Iain Armstrong: Sure. Yeah, I mean, I think in this case, it’s probably a mistake to think of it as an either or binary type question, but rather a question about balance. I think it is absolutely critical that banks and other parts of the financial ecosystem stay abreast of what technology can do for them. In fact, I mean, I was literally an hour or two ago at a meeting with a number of banks, where it is very clear that a lot of time and energy is spent on keeping up to date on technological developments, particularly in the bigger banks. As I say, you can’t get away from it. And apart from anything else, you know, I can tell you one group of people that are really adept at keeping their finger on the pulse in terms of technological developments, and that’s criminals.

You know, criminals are particularly good at identifying new use cases for the abuse of technology. So you can’t really get away from it. However, I think it is equally important that we don’t lose sight of the importance of that human element. And that’s why we’re seeing the same numbers actually this year, when compared with last year, when we asked our industry survey respondents where they’re planning to invest. So I think we had almost identical numbers. So 87% said they were going to invest in staffing, and 87% also said they were going to invest in technology.

I think, you know, technology helps to process information faster than a team of people ever could. So, you know, we know that we’re generating more and more data and more different kinds of data all of the time. So you do need that. But for AI to really work effectively, to use AI as an example, you obviously need the data to be good, good quality, you need it to be accurate, you need it to be accessible in some format, which is actually going to be useful to you.

And the models that you’re using that data with need this sort of constant scrutiny and iteration to make sure that they remain accurate. And again, you do need that human element to review whatever it is that your AI system is outputting. So a couple of years ago, the German regulator, BaFin, issued a guidance paper on, I think what they called supervisory principles for the use of big data and AI.

And one of the central tenets of that paper was the concept of putting the human in the loop, which is a phrase that you now hear quite a lot, I certainly use a lot. And it was essentially this idea that employees should be sufficiently involved in the interpretation of algorithmic results and the decisions around that. And we’ve seen examples of what can go wrong when that doesn’t happen.

So in the US, you might remember a couple of years ago, the Consumer Financial Protection Bureau ordered one large bank to pay out, I think it was around $160 million in compensation to over a million customers, I think, whose accounts had been closed over a period of five years going back to, I think, 2011. And the finding there was that there had been too much reliance on automated fraud detection, and not enough human validation of the output of those detection systems.

Carmen Cracknell: And what do you make of this push to automate processes? I mean, is it moving too quickly, in your view?

Iain Armstrong: Well, so if we go back to that Consumer Financial Protection Bureau case I was just referring to, that was about actions being taken between 2011 and 2016. So, you know, even the most recent of those was almost 10 years ago at this point. So I wouldn’t say it’s moving too quickly; I think cases like that actually get watched very closely by the rest of the financial services industry.

And I’m fundamentally an optimist, I would like to think that people see enforcement actions like that and actually really pay close attention to them and say, well, let’s not make that mistake. So I think, if anything, the way that we’re using automation is actually maturing rather than growing too quickly. And so what 10 years ago might have been seen as acceptable today isn’t, because people are a little bit wiser and a little bit smarter about what can go wrong. Because unfortunately, there have been cases of things going wrong. But every time there is one, that’s a teachable moment, as we say, you know, that’s an opportunity to actually learn and do better.

Carmen Cracknell: There are quite a few elections coming up this year. What are the main risks associated with these, and in particular with politically exposed persons? Because I know PEPs came up quite a bit in your report. The big one is obviously the US, but if there are any smaller, sort of lesser-known elections to watch out for as well, could you talk a bit about that?

Iain Armstrong: Yeah, look, for sure, I can. And you’re right, it did come up quite a lot in the report. I think there’s a couple of ways to look at this. I mean, you’re absolutely right, you know, 2024 is going to see more people voting in democratic elections than I think has ever happened before in human history in a single year. So that’s probably why it featured so prominently in some of the responses to the survey and in the report itself. I think there’s two ways, two really distinct but important ways of thinking about the issue of political exposure. So one is straightforwardly about your ability as a financial institution to identify a PEP in the first place. And the other is a little bit more esoteric around why we treat PEPs differently.

So if I just tackle the first one, I mean, at a very high level, obviously, a PEP is an individual with a high profile political role, or someone who’s entrusted with a prominent public function. And it also captures the relatives and close associates of those people, so that’s their immediate family, but also business associates, etc. And if you are a PEP, or one of those relatives or close associates, I’ll just call them RCAs from now on, then typically, your financial institution will subject you to enhanced due diligence. So they’ll scrutinize you a little bit more closely. And in fact, in many countries, local regulations require financial institutions to be able to identify every single one of their customers who is either a PEP or an RCA.

Now I’ve said that as a kind of preface: this is the world that we operate in, it’s the world that financial services have to operate in. So you’re looking down the course of 2024 ahead of us. And the issue with all of these elections that are happening, and I will talk about some specifics in a minute, but, you know, just looking at the globe as a whole, the issue from the perspective of a financial institution is how do I ensure what is sometimes referred to as election readiness.

So election readiness is this: if the overall population of PEPs goes through a significant shift, where you see completely new PEPs appearing, as well as a lot of current PEPs leaving politics altogether, which usually happens after an election, how can you make sure that you’re tracking that? So how do you make sure that, you know, your customer, Joe Bloggs, who yesterday was not technically a PEP, but who as a result of standing in an election has suddenly become a PEP tomorrow, is on your radar?

You need to have a handle on that information. And the fact of a person’s political exposure is quite often used as a factor in how financial institutions do risk scoring on their customers. So looking at a client’s overall score allows an institution to manage their risk with PEPs appropriately. So that’s just broadly, you know, the question of identification.

That brings me to the second aspect of PEPs: why do we treat them differently in the first place? So the Financial Action Task Force has designated PEPs as people who should be more closely screened and monitored, hence the enhanced due diligence. And, you know, the designation of PEP is not clearly defined. This is another really key thing to understand: depending on what country you’re in, a PEP could cover quite a broad range of different roles and positions.

So my point here is that there isn’t a sort of simple approach to screening. It’s actually quite a complex job to look across every country in the world and determine who in that country is a PEP and who isn’t. But the idea is that these people are more exposed to the potential for being bribed or the potential for engaging in corruption.

That’s not to say all PEPs are corrupt, but their positions of power and influence make them more susceptible to that. And by the way, it’s definitely a myth that the risks associated with political exposure are only relevant when we’re talking about people in third world countries or developing countries. There’s actually an ongoing case right now against a US senator who allegedly accepted hundreds of thousands of dollars in cash, not to mention gold bars, not to mention a sports car and various other things, in exchange for protecting the business interests of some foreign nationals, essentially.

And I mean, I can’t really talk about this topic without mentioning that, you know, on the 15th of April, we’re about to see the first ever criminal prosecution of a former US president, you know, for alleged hush money payments and the falsification of records. So I think, you know, this is the environment, if you like. So to come back to your original question, those are the things, and I can guarantee this, having worked in big global financial institutions, that can keep you up at night when you’re in a year like the one we’re in now.

Carmen Cracknell: So are sanctions an effective tool to combat cross-border crime and financial crime in particular?

Iain Armstrong: Well, again, it’s really interesting because I was actually talking to, you know, the group I referred to earlier just a couple of hours ago about this exact question. And I know there’s been a lot said in various think pieces about, you know, or raising the question of whether sanctions have sort of had their day. But one of the people in the room that I was just speaking to was actually saying, but when you go to countries like Ukraine, you will often find that the general public are very heavily invested in sanctions against Russia, for example.

And there you get a lot of questions about, you know, when are the next tranche of sanctions going to be coming? You know, could they be focused on this particular industry? That would be really useful to us. So actually on the ground, if you like, you do see a lot of support for them. I think they do make a difference.

It’s important to remember they’re not a silver bullet, obviously, and also not an instantly effective thing. They never have been really. Sanctions are a slow puncture, you know, that over a long period of time, hopefully not too long, will have that intended effect of changing behaviours by a state. But it does take time for the impacts to really be felt. Could we do more? Absolutely.

I think it’s really important. And we’re seeing this sort of mood music, if you like, from some of the key Western governments around the expectation on financial institutions: it’s not enough to just be looking at a list of sanctioned individuals and saying, well, let’s make sure none of our customers are on that list. You really do need to look at the network of companies and individuals around a sanctioned individual and make sure that you don’t have unnecessary exposure there, or concerning exposure, I should say.

Carmen Cracknell: My last question is quite generic and we’ve probably touched on it already, but what were the key AI threats over the course of this year and going forward, how can firms stay prepared?

Iain Armstrong: Yeah. So I think this is interesting because actually part of my answer here does refer back to 2024 being a year of great political change. And what I mean by that is the use of deep fakes has been written about quite a lot, but what hasn’t been written about quite so much is deep fake audio. So this is the faking of people’s voices. You don’t even have to actually fake video footage of them.

We’ve seen instances of this recently with the London mayor. There’ve been lots of other instances all around the world where prominent people are apparently caught on tape, if you like, making these controversial statements. And then it later turns out that it was deep fake audio. I think it is seen by many as the number one source of political disinformation in this day and age. So I think that is certainly an AI application, which you didn’t really hear about three or four years ago.

So that’s one thing that everyone needs to be aware of, not just financial institutions, to be honest, but just the general public as well. From a more specifically financial crime perspective, I think it’s the possibilities which AI affords around the kind of mass industrialization of scams.

So in other words, you can use social engineering, which has been around for a very, very long time. But now you can use a chatbot, a large language model-powered chatbot, a bit like ChatGPT, to engage in these really targeted conversations with victims and to build their trust.

So rather than having a very generic blunt force email message saying, please send me some money, for want of a better expression, you can now actually engage potential victims with these bots. And because you can target millions and millions of people in one go, you only need a small handful of those to fall for it in order for that to be quite financially lucrative from the perspective of a criminal. And then I suppose the last thing I’d mention is the use of synthetic ID.

So this is taking a combination of real data, data on you or me that perhaps has been obtained through theft or social engineering, and combining that with data faked through artificial intelligence, whether it’s a fake document or a piece of ID, etc. And you combine those two things and use them to open accounts, to apply for loans, etc. So I think those three things have been really, really prominent.

Carmen Cracknell: Great. Well, that brings me to the end of my questions.

Iain Armstrong: Well, thanks very much for the opportunity.

Carmen Cracknell: Thanks very much for joining.

Iain Armstrong: Thank you.
