Algorithmic Accountability is the next step for data journalism

Algorithmic Accountability is a sub-genre of data journalism in the making. The term was established by the American journalism researcher Nicholas Diakopoulos. His report "Algorithmic Accountability Reporting: On the Investigation of Black Boxes" was published at the beginning of 2014. It outlines a new task for journalists: they must understand software systems as objects of investigation. Here, mere transparency cannot be the goal: often it does not help to look at the source code of so-called artificial intelligence or machine learning systems. Without the data these systems are trained on, their workings cannot be understood. Thus Algorithmic Accountability strives for intelligibility.

Unlike "traditional" data journalism, which operates with sets of manually or automatically gathered data, Algorithmic Accountability cares about how data is processed and/or generated. A fine example is the work by ProPublica in its series "Machine Bias" from 2016. Among other things, the newsroom investigated software widely used in courtrooms across the US to determine whether an offender should get parole. It found that the software was reproducing racism. The private software company responsible was not willing to shed light on how its product operates in detail. ProPublica acquired data on offenders from one county through a Freedom of Information request and effectively reverse-engineered the software system in question.
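What might such a reverse-engineering audit look like in practice? Below is a minimal sketch, not ProPublica's actual analysis; the file name and column names are hypothetical. The idea is to pair the risk label the software assigned to each offender with what actually happened afterwards, and then compare the error rates the system makes for different groups of people.

```python
import pandas as pd

# Hypothetical input: one row per offender, with the label assigned by the
# risk-assessment software (high risk: 1/0), the offender's demographic group,
# and whether the person actually reoffended within the follow-up period.
df = pd.read_csv("offenders.csv")  # columns: group, predicted_high_risk, reoffended

def error_rates(subset):
    # False positive rate: labelled high risk, but did not reoffend.
    fp = ((subset.predicted_high_risk == 1) & (subset.reoffended == 0)).sum()
    negatives = (subset.reoffended == 0).sum()
    # False negative rate: labelled low risk, but did reoffend.
    fn = ((subset.predicted_high_risk == 0) & (subset.reoffended == 1)).sum()
    positives = (subset.reoffended == 1).sum()
    return fp / negatives, fn / positives

# Compare the mistakes the software makes for each group of offenders.
for group, subset in df.groupby("group"):
    fpr, fnr = error_rates(subset)
    print(f"{group}: false positive rate {fpr:.2f}, false negative rate {fnr:.2f}")
```

A large gap between the false positive rates of two groups would be a strong lead that the scoring system treats them differently, which is exactly the kind of finding the "Machine Bias" series reported.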

Algorithmic Accountability is a logical next step for data journalism in a world of Automated Decision Making (ADM): democratic societies that rely more and more on governance by and with software have to be able to understand and control these "machines".






About

Lorenz Matzat

Lorenz Matzat is the co-founder of the NGO AlgorithmWatch.
