
Statistical Data-Driven Tools & Algorithms are the New Deciders of Justice

Disha Ganguli

An attempt to ease the overwhelm with the help of an algorithm

Algorithms hold the promise of changing a system for good.

Judges and the other drivers of law and jurisdiction are, after all, human, and the rising number of crimes and case files puts them in an overwhelming position when it comes to making sound decisions. Crime rates rose sharply worldwide amid the 2020 pandemic. There were 750 murders reported in Chicago, 322 homicides in Los Angeles and 437 in New York. Mayor de Blasio blames the surge on the pandemic, which kept people cooped up in their houses in a routine that was monotonous and lethal. He also points to the criminal justice system, whose work pace was slowed by the virus.

To cope with this overwhelm, law courts are now relying on algorithms that read the records of the accused and recommend decisions, which are believed to be accurate and trustworthy.

The objective of incorporating algorithms into the legal system

Artificial intelligence is ubiquitous, and its influence is now being felt in the legal system as well. Here are a few reasons why AI-driven decision-making algorithms are trusted by the criminal justice system.

1. Breaking the bias

Unlike humans, machines and algorithms are not thought to be prone to bias. Algorithms are employed in the hope that they will generate results from a neutral perspective, leaving no room for confusion. An unbiased result is beneficial for judges.

2. Accuracy

Algorithms are incorporated to yield accurate results. Whether it is generating statistics from the records of the accused or making predictions, accuracy is of utmost importance, as one wrong result can prove dangerous for everyone involved in the process.

3. Objectivity

Results and interpretations generated by the algorithms are meant to be entirely objective. This eliminates complications in interpretation, making the judges' work easier.

Despite the promise algorithms hold for transforming the criminal justice system, or more precisely the justice delivery system, they have not escaped controversy.

Data scientists, and especially media psychologists, frown upon the criminal justice system's absolute dependence on algorithms. They point to the uncertainties that prevent the algorithms from being trustworthy through and through.

Why are these algorithms not the right choice for decision-making?

Media psychologists are breaking the myth that algorithms are accurate and unbiased. They are of the opinion that machines and AI-driven algorithms are, after all, programmed, operated and monitored by humans. Savvy data scientists with ill intentions can tamper with the algorithms and manipulate them to yield unfair results. This phenomenon is termed the "black box", where algorithms are doctored using computational tricks in ways that cannot be detected or brought under suspicion.

Merely delivering decisions about who will be incarcerated is not enough. Results delivered in an objective manner have to be backed by a logical explanation that validates them.

Initially, the algorithms for deciding the future of criminals were designed for risk assessment. This included detecting violent streaks in the accused and judging whether the defendant is likely to commit a new crime or harm a cellmate.

However, case studies of the well-known crimes committed by Armstrong and Borden, who were kept behind bars on charges of murder, misdemeanours, and theft respectively, show that these AI-driven algorithms are also used for bail and parole decisions. This is done with the help of a newer tool named the PSA, or Public Safety Assessment, which ranks the accused on a scale of ten, and decisions are made accordingly.
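For illustration only, here is a minimal sketch of how a record-based risk score of this kind could be computed. The factor names, weights, and thresholds below are assumptions invented for the example; they are not the actual PSA methodology.

```python
# Illustrative sketch of a record-based risk score; NOT the actual PSA formula.
# All factor names, weights, and thresholds are assumptions for this example.

from dataclasses import dataclass


@dataclass
class DefendantRecord:
    prior_convictions: int         # prior convictions on file
    prior_failures_to_appear: int  # missed court dates in the past
    pending_charges: int           # charges pending at the time of arrest
    age: int                       # age at current arrest


def risk_score(record: DefendantRecord) -> int:
    """Combine record attributes into a score from 1 (low risk) to 10 (high risk)."""
    score = 1
    score += min(record.prior_convictions, 3)         # cap the contribution at 3 points
    score += min(record.prior_failures_to_appear, 3)  # missed hearings weigh heavily
    score += min(record.pending_charges, 2)
    if record.age < 23:                                # youth treated as a risk factor here
        score += 1
    return min(score, 10)


if __name__ == "__main__":
    example = DefendantRecord(prior_convictions=2, prior_failures_to_appear=1,
                              pending_charges=0, age=30)
    print(risk_score(example))  # -> 4 on this toy scale
```

A judge or pretrial officer would then map such a score to a release, supervision, or detention recommendation; the controversy described in this article is precisely about how much weight such a number should carry.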

This is where uncertainty stepped in: Armstrong was granted bail based on the algorithm's decision, which read and interpreted his past record and asserted that he would show up for court hearings and was unlikely to commit a new crime, yet Armstrong did not appear on the day of his court hearing.

The algorithms only promise risk assessment

Extending the use of algorithms to decide bail for the accused is an unhealthy idea. Studies say that AI-driven bots and algorithms have shown substantial and significant results when used for risk assessment of criminals, not for deciding their release.
