IN AMERICA, computers have been used to assist bail and sentencing decisions for years. Their proponents argue that the rigorous logic of an algorithm, trained with a vast amount of data, can make judgments about whether a convict will reoffend that are unclouded by human bias. Two researchers have now put one such program, COMPAS, to the test. According to their study, published in Science Advances, it performed neither better nor worse than people with no special expertise.

Julia Dressel and Hany Farid of Dartmouth College in New Hampshire selected 1,000 defendants at random from a database of 7,214 people arrested in Broward County, Florida, between 2013 and 2014, all of whom had been subject to COMPAS analysis. They split their sample into 20 groups of 50. For each defendant they created a short description that included sex, age and prior convictions, as well as the criminal charge faced.
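The sampling design described above can be sketched in a few lines of Python. The records and field names here are illustrative placeholders, not the study's actual dataset or schema:

```python
import random

# Stand-in records for the Broward County database of 7,214 arrestees
# (field names and values are hypothetical, for illustration only).
defendants = [
    {"id": i, "sex": "M", "age": 30, "priors": 2, "charge": "theft"}
    for i in range(7214)
]

random.seed(0)  # fixed seed so the sketch is reproducible
sample = random.sample(defendants, 1000)  # 1,000 defendants at random

# Partition the sample into 20 groups of 50, as in the study's design.
groups = [sample[i * 50:(i + 1) * 50] for i in range(20)]

# A short description per defendant: sex, age, priors and the charge faced.
def describe(d):
    return (f"Defendant {d['id']}: sex {d['sex']}, age {d['age']}, "
            f"{d['priors']} prior convictions, charged with {d['charge']}")
```

Each of the 20 groups could then be shown to a different set of human judges, so every defendant's description is evaluated without any one person rating all 1,000 cases.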



Published by Mike Rawson

