Cook’s 1971 paper says that complexity theorists believe that P cannot equal NP. But if P can be shown to equal NP then, amongst other things, cryptography would be pretty much unsustainable.
Given that the paper was written 40 years ago this year, some of it is delightfully archaic. It also shows that not only has no one proved that P equals NP, but no one has proved that it doesn't.
In the context of financial crime risk and compliance, this extract strikes a chord:
"A fundamental question in complexity theory is whether a source of random bits can be used to speed up substantially the recognition of some languages, provided one is willing to accept a small probability of error. The class BPP consists of all languages L that can be recognised by a randomised polynomial-time algorithm, with at most an exponentially small error probability on every input."
Let’s, for a moment, stick with the simple part of customer acquisition: is this applicant for business listed on any of the databases that we check against?
If yes, refer for verification; if no, pass to stage two.
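That yes/no screening step can be sketched in a few lines of code. This is a minimal, hypothetical illustration only: the list names, entries and exact-match rule are my own assumptions, not a real compliance system (which would use fuzzy matching and far larger datasets).

```python
# Hypothetical stage-one screening sketch: is the applicant on any list we check?
# List names and entries are illustrative assumptions, not real data.
WATCHLISTS = {
    "sanctions": {"jane roe", "acme shell co"},
    "peps": {"john doe"},
}

def screen(applicant_name):
    """Return the lists the applicant appears on (empty list = pass to stage two)."""
    name = applicant_name.strip().lower()
    return [list_name for list_name, entries in WATCHLISTS.items()
            if name in entries]

hits = screen("John Doe")
if hits:
    print("refer for verification:", hits)  # listed somewhere: refer
else:
    print("pass to stage two")              # no match: continue onboarding
```

The essence really is that simple: a lookup returning "match" or "no match", with all the real-world complexity hidden in how the matching sub-routines are done.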
Obviously, that initial checking, the P part of Cook's problem, is a pure data-crunching, data-matching task. The dataset may be absolutely enormous, but the algorithm is simple. (Well, it is on the face of it; in reality it's quite complex in the sense that it has multiple sub-routines within it, but the essence of what it does is simple.)
So, let’s get a bit more into the techy stuff.
In Cook's paper, P stands for "polynomial time." This is not a measure of hours, minutes and seconds. It's a kind of play on words. It refers to the number of steps required to perform a set task: an algorithm runs in polynomial time if that number of steps grows no faster than some polynomial in the size of its input. The task is whatever the computer is told to do. And no, I'm not talking about how many steps a computer will do if you tell it to take the dog for a walk.
But you can test this for yourself.