Bernard Harcourt | September 8 | Vice

If California’s current effort to abolish cash bail succeeds, it would establish a system that keeps people in jail based on the recommendations of “crime prediction” algorithms—technology that experts criticize for enabling discrimination.

Under Proposition 25, a ballot measure before state voters this November, judges would be required to use an algorithm, known as a “risk-assessment tool,” that predicts whether a defendant will be re-arrested or skip their trial if released. With a high enough “risk” score, a defendant would be jailed, with no chance to purchase their freedom, since the measure abolishes cash bail.
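
To make the mechanics concrete, here is a minimal sketch of how a threshold-based risk score drives a detain-or-release recommendation. The factors, weights, and cutoff below are invented for illustration; they are not the inputs or scoring of any actual tool used in California courts.

```python
# Illustrative sketch only: the factors, weights, and cutoff are
# hypothetical, not those of any real risk-assessment tool.
from dataclasses import dataclass

@dataclass
class Defendant:
    prior_arrests: int
    prior_failures_to_appear: int
    has_pending_charge: bool

DETAIN_THRESHOLD = 7  # hypothetical cutoff

def risk_score(d: Defendant) -> int:
    """Combine weighted factors into a single 'risk' number."""
    score = 2 * d.prior_arrests
    score += 3 * d.prior_failures_to_appear
    score += 2 if d.has_pending_charge else 0
    return score

def recommendation(d: Defendant) -> str:
    """At or above the cutoff, the tool recommends detention.
    With cash bail abolished, there is no pay-to-leave middle ground."""
    return "detain" if risk_score(d) >= DETAIN_THRESHOLD else "release"

print(recommendation(Defendant(3, 1, True)))   # score 11 -> "detain"
print(recommendation(Defendant(1, 0, False)))  # score 2  -> "release"
```

The structure, not the specific numbers, is the point: once a defendant’s score crosses the cutoff, detention becomes the default recommendation that a judge must actively override.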

A judge can override the algorithm’s recommendation, and sometimes they do, experts say, but usually to lock people up. “Judges rarely override ‘detain’ recommendations from risk assessment tools and often override ‘release’ recommendations,” John Raphling, a senior researcher at Human Rights Watch, told Motherboard.

Between ending cash bail and embracing big-data analytics, the legislation has won plaudits from many reformers, who call it a scientific solution to the racism and classism of money bail, one that also addresses worries about re-arrests and missed hearings. But predictive algorithms reproduce systematic discrimination, scholars and advocates say: because the tools are trained on historical arrest records, which reflect unequal policing, their “risk” predictions carry those same disparities forward.
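
A toy simulation illustrates the critics’ point. Assume two groups with an identical rate of the underlying behavior but different levels of police enforcement; all rates below are invented for illustration.

```python
# Toy model: equal behavior, unequal policing.
# All rates are invented for illustration.
import random

random.seed(0)

OFFENSE_RATE = 0.10                   # same true rate in both groups
ARREST_RATE = {"A": 0.20, "B": 0.60}  # group B is policed 3x as heavily

def arrest_record_rate(group: str, n: int = 100_000) -> float:
    """Fraction of group members who end up with an arrest record."""
    arrests = sum(
        1 for _ in range(n)
        if random.random() < OFFENSE_RATE and random.random() < ARREST_RATE[group]
    )
    return arrests / n

for group in ("A", "B"):
    print(group, round(arrest_record_rate(group), 3))
# Roughly: A 0.02, B 0.06. A tool that scores "risk" from arrest
# history will rate group B about three times riskier, despite
# identical underlying behavior.
```

Because the training signal is arrests rather than behavior, a disparity in enforcement becomes a disparity in predicted “risk.”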


Originally published by Vice.