MFIA Study Calls for Algorithmic Accountability


A new study from the Media Freedom and Information Access (MFIA) Clinic examines whether Connecticut's current transparency laws are sufficient to ensure public accountability when algorithms are used to perform government functions.

Co-authored by MFIA students Chloe Francis ’23, David Froomkin ’22, Eli Pales ’22, Karen Sung ’23, and Kataeya Wooten ’22, under the supervision of Floyd Abrams Clinical Lecturer in Law David Schulz ’78 and Local Journalism Fellow Stephen Stich ’17, the study identifies major shortcomings in current disclosure laws and proposes potential reforms to promote algorithmic accountability and transparency.

MFIA undertook the study in response to reports of problems around the country. Many of those reports found that public sector algorithms were operating largely in the dark, leading to accounts of algorithms not performing as intended or having unanticipated disparate impacts across racial or socioeconomic lines. Because of these dangers, the study argues, such algorithms require robust public oversight.

To learn more about the state of algorithmic accountability and transparency in Connecticut, the MFIA researchers filed Freedom of Information (FOI) requests with three state agencies. The responses revealed that the impact of the algorithms was opaque. In some cases, the agency did little to evaluate the algorithm for effectiveness or bias, while in others the agency provided no meaningful response to the FOI request, according to the study. In a further barrier to transparency, all three state agencies invoked the Connecticut Freedom of Information Act’s trade secrets exemption to justify withholding basic information about the algorithm’s functioning.

“Algorithms are supposed to promote efficiency, accuracy, and fairness,” said Froomkin. “But in practice in a lot of cases there just isn’t sufficient transparency to know if they’re doing what they’re supposed to.”

At one state agency, the Department of Children and Families, an algorithm was being used to facilitate intervention in suspected cases of child neglect. But Illinois had used the same algorithm and found it to be ineffective.

“In cases like this the fact that an algorithm is ineffective actually does affirmative harm to many people,” said Pales. “In view of the evidence from Illinois, the fact that Connecticut was not undertaking robust monitoring is concerning.”

The report’s authors hope to spur the state legislature to form a commission to further investigate agencies’ use of algorithms. Such a commission could propose reforms to increase accountability and transparency in Connecticut and serve as a model for other states moving forward.

“At the end of the day, we’re not against algorithms,” said Froomkin. “On the contrary, maybe our report will help fulfill some of the promises and some of the hopes that came with their adoption.”

“This project is about accountability,” Pales added. “It is about making sure that algorithms are used for good, that they are effective, and that they’re not discriminatory. We want to ensure that at every step of the way — during the process of adoption, during implementation and, ultimately, if algorithms stop being used in a certain agency — there are checks in place to ensure that the algorithm is vetted and properly used.”

The Media Freedom and Information Access Clinic is dedicated to increasing government transparency, defending the essential work of news gatherers, and protecting freedom of expression through impact litigation, direct legal services, and policy work.