However, despite all of the optimism about AI integration into the court system, AI could still pose problems such as racial bias. Studies found that an algorithm used by Kentucky to predict the "risk level" of defendants was heavily skewed against one group of people.
With the rapid normalization and integration of artificial intelligence in the court system, how do we guarantee that no bias affects the AI's decision-making in a way that favors or disadvantages one group over another?
In this subsection, we will explain the factors and components involved in creating an AI assessment tool, such as the COMPAS dataset, the machine-learning (ML) steps, and the training data.
Correctional Offender Management Profiling for Alternative Sanctions, better known as COMPAS, is a risk-assessment tool powered by an algorithm that takes in a plethora of data about a defendant and interprets it. COMPAS attempts to predict the likelihood of a defendant recidivating, that is, committing another crime in the future.
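To make this concrete, here is a minimal sketch of how such a risk tool and a bias audit might look. Everything below is hypothetical: the synthetic records, the features (`priors`, `age`), the threshold rule standing in for a trained model, and the group labels are all invented for illustration and are not drawn from the real COMPAS data. The audit compares false-positive rates across groups, the same style of check researchers have applied to COMPAS scores.

```python
import random

random.seed(0)

def make_record():
    """Generate one synthetic, made-up defendant record (not real COMPAS data)."""
    group = random.choice(["A", "B"])
    priors = random.randint(0, 10)
    age = random.randint(18, 60)
    # Synthetic "ground truth": reoffending made more likely by more priors
    # and younger age, purely for demonstration.
    p = min(0.9, 0.1 + 0.06 * priors + (0.2 if age < 30 else 0.0))
    recid = 1 if random.random() < p else 0
    return {"group": group, "priors": priors, "age": age, "recid": recid}

data = [make_record() for _ in range(2000)]

def predict(rec, threshold=4):
    """A deliberately simple risk rule standing in for a trained model:
    flag 'high risk' when the prior count reaches the threshold."""
    return 1 if rec["priors"] >= threshold else 0

def false_positive_rate(records, group):
    """Among people in `group` who did NOT reoffend, what fraction
    were still flagged as high risk? A large gap between groups is
    one common signal of a biased tool."""
    negatives = [r for r in records if r["group"] == group and r["recid"] == 0]
    if not negatives:
        return 0.0
    flagged = [r for r in negatives if predict(r) == 1]
    return len(flagged) / len(negatives)

for g in ("A", "B"):
    print(f"group {g}: false-positive rate = {false_positive_rate(data, g):.2f}")
```

In this synthetic setup the two groups are generated identically, so their false-positive rates should be similar; with real-world data, historical patterns baked into the features can push one group's rate far above the other's even when the group label is never used by the model.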