UK Lord Chancellor Alex Chalk asked to rule on use of computer evidence

Law chief is pressed for opinion on controversial presumption that computer evidence is correct.

The minister in charge of the UK’s courts has been asked to look into the rule on the use of computer evidence in court in an effort to break the stalemate over the issue. The move comes alongside news of a landmark judgment in the US on the use of AI in recruitment.

Darren Jones MP, who chairs the House of Commons Business and Trade Select Committee, has written to Lord Chancellor Alex Chalk asking him to step in personally after the Government brushed aside calls for change.

Since 1999, following a recommendation by the UK Law Commission, the country’s courts have operated on the presumption that a computer is working correctly unless there is explicit evidence to the contrary. This replaced a section of the Police and Criminal Evidence Act 1984 which required anyone introducing computer evidence in court to prove that it was operating properly.

Post Office Horizon scandal

That presumption has been called into question by the UK Post Office Horizon IT scandal, and it lies at the heart of what is now recognised as the biggest miscarriage of justice in UK history. Over a 15-year period, 736 subpostmasters were prosecuted by the Post Office for fraud, with convictions based on evidence from the Horizon retail and accounting system supplied by Fujitsu. That ‘evidence’ has since been found to be riddled with errors, and the figures the system produced should never have been used in court.

Hundreds of convictions have been declared unsafe, and 86 have been overturned. But the consequences for those prosecuted have included imprisonment, bankruptcy, defamation, loss of livelihood, divorce and, in some cases, suicide. The case has been extensively reported by UK trade publication Computer Weekly, which uncovered the scandal in 2009.

Jones’s letter says: “The law, in short, says that the computer evidence must be deemed true unless a defendant can prove otherwise … this is clearly out of date and needs review”. He has asked the Lord Chancellor to “personally look at the issue, not least given the much wider implications today, to accelerate an appropriate review”.

The UK government has previously said it had no plans to review the presumption, with under-secretary of state for justice James Cartlidge saying “it has wide application and is rebuttable if there is evidence to the contrary”.

But there is growing opinion within legal and IT circles that guidance on the use of computer evidence needs to change. The Horizon scandal has been the main driver behind calls for reform in the UK, but the widespread deployment of AI is also highlighting a new area of legal and compliance risk for businesses.

Legal developments in the US

In the US, the Equal Employment Opportunity Commission has secured a ruling that imposes a fine on a tutoring company which programmed its recruitment software to filter out older applicants. And legislators in New York are drafting legislation that would ban AI-only decision-making in employment practices.

There’s much discussion in legal circles about how AI can be deployed to assist with research, the detection of wrongdoing, and the drawing up of contracts. But the importance of maintaining the primacy of human decision-making is becoming clearer all the time, something Dr Lukas Hambel of CMS pointed out in detail on GRIP recently.

Paul Marshall, the barrister who represented the UK subpostmasters in successfully overturning convictions that rested on the presumption that computer evidence is correct, told Computer Weekly that the judgment in their case exposed “how strikingly unfairly the presumption that a computer is working properly may operate in legal proceedings”.

And he concluded: “No computer scientist or expert would support or endorse the position in English law because it is simply wrong.”