AI is already being tested in Malaysian courts, but this is why it might not be a good idea yet

In January, it was reported that Chief Judge of Sabah and Sarawak Tan Sri Abang Iskandar Abang Hashim said the courts in Sabah and Sarawak had implemented Artificial Intelligence (AI) through a pilot project. With the AI now being tested and used in courts, some are concerned that there has been “no proper consultation on the technology’s use”, and that it is “not contemplated in the country’s penal code”.

“Our Criminal Procedure Code does not provide for use of AI in the courts … I think it’s unconstitutional,” said lawyer Hamid Ismail.

Hamid said he was “uneasy” that the technology was being used before lawyers, judges, and the public fully understood it. The artificial intelligence tool, which was being tested as part of the two states’ pilot, was used when a man he defended was sentenced.

The piloted AI software was developed by Sarawak Information Systems, a state government firm. According to authorities, AI-based systems make sentencing “more consistent and can clear case backlogs quickly and cheaply”. The technology also “helps all parties in legal proceedings to avoid lengthy, expensive and stressful litigation”.

However, critics have warned that AI risks bias against minorities and marginalised groups, saying the technology lacks a judge’s ability to weigh up individual circumstances or adapt to changing social mores. For example, there are reported cases of AI favouring men over women or penalising members of ethnic minorities.

This isn’t a new finding, as newly developed AI systems can show bias pretty quickly. In one notable case, a low-resolution picture of Barack Obama uploaded to an AI face depixeliser produced the image of a white man.

“In sentencing, judges don’t just look at the facts of the case—they also consider mitigating factors, and use their discretion. But AI cannot use discretion,” Hamid noted.

To try to keep its AI software from producing biased sentencing recommendations, Sarawak Information Systems said it had removed the “race” variable from the algorithm. It was also noted that the company used only five years of data, from 2014 to 2019, to train the algorithm, “which seems somewhat limited in comparison with the extensive databases used in global efforts”. However, there is no information on whether the database has since been expanded.

An analysis by Khazanah Research Institute showed that court judges followed the AI sentencing recommendation in a third of the cases, all of which involved rape or drug possession under the terms of the two states’ pilot. In the other two-thirds, some judges reduced the suggested sentences, while others toughened them on the basis that they would not serve as a “strong enough deterrent”.

“Many decisions might properly be handed over to the machines. (But) a judge should not outsource discretion to an opaque algorithm,” said Simon Chesterman, a senior director at AI Singapore.

