Nigel's Eyes

20220416 Back to the Future: understanding RegTech.

One of the fascinating aspects of having the longevity that I have had in relation to financial crime risk and compliance, coupled with my previous career as a lawyer, is that things come around again and again.

What is now called "RegTech" has been around, in one form or another, since the 1990s. But underpinning all KYC and transaction monitoring decisions, be they by people or machines, is the Risk Matrix.

Looking back over the past 20 or so years of seminar tours, one of the most popular was Building a Risk Matrix. Time, methinks, to produce a new version of that course with specific reference to understanding more about RegTech.

Let's start with the premise that regulators are pushing all forms of regulated business to invest in RegTech of one form or another. Everything from on-line approval of new accounts to transaction monitoring, with all the various sub-sets that such areas of activity include, falls within the broad category we are talking about.

But while regulators are hot to trot on RegTech, they are also issuing stern warnings.

Regulators are increasingly taking the view that where a tech-based KYC / risk assessment system fails, companies cannot blame the tech. It is not a defence to say "we bought a black box and we expected it to work." The very regulators that are strongly encouraging companies to adopt RegTech are saying that those companies must understand and control what it does.

As an aside, it's interesting to note that the Law Commission in the UK has proposed that, in the case of autonomous cars, the driver should not be liable in the case of an accident. There is a clear conflict of ethos which, if it were applied in the RegTech world, would create a liability risk that many RegTech companies may not be able to afford. But such a change of mind by regulators is unlikely for the foreseeable future. It is foreseeable, however - indeed it has already happened - that customers of RegTech companies claim an indemnity for penalties or damages.

That the regulated business bears responsibility makes perfect sense: it has long been established that function can be outsourced but responsibility cannot. It would be to stretch common sense beyond breaking point to imagine that responsibility can be outsourced to a computer program or to those who design and/or implement it. The simple fact is this: if your business is using it, your business is liable for its faults and failures.

So, if a company is considering buying RegTech - or already has it and needs to review it - the company (which of course means its staff) must understand what happens in that black box. And to understand that, you need to know how to build a Risk Matrix.

I'm relaunching the Building a Financial Crime Risk Matrix series with a tour of the UK in May 2022. The focus is on understanding three things:

1. What information do you need in order to make informed decisions;
2. How is that information used; and
3. Can that information be used in a RegTech-based environment?
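To make those three questions concrete, here is a purely illustrative sketch - my own invention, not the course material and not any regulator's methodology. The factors are the information you need, the weights are how that information is used, and the resulting score and band are what a RegTech system could consume:

```python
# Illustrative only: factor names, weights and scoring rules are invented
# for this sketch, not drawn from any guidance or real system.

RISK_FACTORS = {
    # factor: (weight, scoring function)
    "country": (3, lambda c: {"low": 1, "medium": 2, "high": 3}[c]),
    "product": (2, lambda p: {"savings": 1, "private_banking": 3}.get(p, 2)),
    "channel": (1, lambda ch: 3 if ch == "non_face_to_face" else 1),
}

def risk_score(customer: dict) -> int:
    """Weighted sum of factor scores - the 'how is the information used' step."""
    return sum(weight * score(customer[name])
               for name, (weight, score) in RISK_FACTORS.items())

def risk_band(score: int) -> str:
    """Map the numeric score to a band a downstream system could act on."""
    if score <= 8:
        return "standard"
    if score <= 13:
        return "enhanced"
    return "refer_for_review"

applicant = {"country": "high",
             "product": "private_banking",
             "channel": "non_face_to_face"}
s = risk_score(applicant)
print(s, risk_band(s))
```

The point of even a toy version like this is that every number in it is a business decision: someone in the company chose those factors, weights and thresholds, and must be able to explain them to a regulator.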

The aim is to create an environment in which delegates can return to their office and build a risk matrix for themselves, from live data, and therefore be prepared to answer questions from regulators and to focus discussions with RegTech salesmen.

That is important because it is apparent that RegTech systems must be tailored to the business. Of course, there can be standardised modules for external information, but it is when it comes to analysing your own company's customer base, or assessing a new applicant for business, that the information within the company becomes so important.

That's where it all gets messy from the providers' point of view. And, ultimately, it comes down to what information you have within your own records so as to determine what is usual, or unusual, about the customer and his transactions.
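To make the "usual or unusual" point concrete, here is a minimal sketch - my illustration, not any vendor's method - of flagging a transaction against a customer's own history, which is exactly the information that only your own records can supply:

```python
from statistics import mean, stdev

def is_unusual(history, amount, z_threshold=3.0):
    """Flag a transaction that deviates strongly from this customer's own
    baseline. The threshold is illustrative, not a recommendation."""
    if len(history) < 2:
        return True  # no baseline yet: treat as unusual, i.e. for review
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu  # perfectly regular history: any change stands out
    return abs(amount - mu) / sigma > z_threshold

history = [120, 95, 130, 110, 105]
print(is_unusual(history, 115))   # within the customer's normal range
print(is_unusual(history, 5000))  # far outside it
```

Note that the same 5,000 payment would be routine for a different customer with a different history: the "black box" is only as good as the baseline data the business feeds it.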

That's why you need to know how to build your own risk matrix before you buy RegTech, and even after you have it, when you need to assess its performance.

For more information, see here:…

Over-compensation rather than risk assessment

Far too many systems implemented within the past couple of years, often as a response to the restrictions placed on everyone during the COVID-19 pandemic, have been poorly thought through. That has to be corrected if viable business is not to be lost. I see, for example, extreme parochialism which takes no account of the way many people live their lives in the real, non-lockdown, world.

This means, simply, that no adequate risk assessment was done - or that a business decision was taken to do business only with those who fit narrow criteria. How ironic that the worst offenders are those that claim to be "disrupters".

So that's the kind of thing we will be looking at. We'll provide a pen and paper: you just need to bring your AI - Actual Intelligence.

Details of the course and booking are here:…

Notes on a compliance system that might not seem like one

On an entirely separate compliance note, I have been enjoying using London's buses. My father was a bus man and one of the things that he introduced and was later adopted by London Transport, as it then was, was the protective screen for drivers of what were, then, called "One Man Operated" buses. One of the things I have noticed is that the screen, even on apparently similar buses, has several designs, none of which, these days, have the capability to accept cash and issue a ticket. Instead, there is a card-reader touch point. But the touch-point is in any one of a number of different locations: standardisation would be helpful and would speed boarding.

Something else that is interesting is the presumption that everyone knows what to do. I didn't. Yes, I knew to touch my Oyster card when I got on, but I couldn't find out where to touch it when I got off. How does it know how much to charge? It turns out that you don't touch out and it doesn't need to know: there is a flat fare. That's interesting because it's very cheap for most trips but very expensive for a short one: there is a clear incentive to walk if the weather is nice and the walk is, say, up to a kilometre.

The Oyster card itself was a bit of a challenge. At London's Heathrow, I asked at the information desk and was told that I could only recharge it with a payment card. But that's not true: notes and coins can be used in many shops and even in machines in stations, bus stations, etc. Because I didn't know what balance was on my card (the last time I used it was, I think, more than five years ago), I was confused. I was in a compliance system (information... find bus... get on bus... pay) that was actually extremely simple but, without information, I had no way of knowing that. I felt a bit stupid saying to staff "please ignore my accent - I don't live here and I have no idea how anything works. How do I..."

It is strange that an absolutely superb system doesn't work because the people that design and work it assume that everyone knows what they know. It's a lesson that all compliance people should learn.

Also, learn what happens when one link in that compliance system outperforms expectations. A timetable is part of a compliance system. It's really frustrating to be left standing in a gale, with a temperature of 3 degrees even without wind chill, because the bus was two minutes early. I know - it happened. And it was too far to walk in that cold when, 30 hours earlier, I'd left home in Kuala Lumpur where the temperature was 32 degrees. Hehe.

It's fascinating what lessons we can all learn from things unrelated to what we do for a living.

We just have to keep our wits about us, take those lessons back and ask ourselves "do the systems we have put in place display any of those failures and, if so, how can we resolve them?"

Thanks for reading.