The Impact of Algorithms on Rights

JusticeAI | June 16, 2020 | By: Emily Rabbitt, CAE

As the use of algorithmic systems increases, so do questions about their built-in biases. New research from ASAE ForesightWorks outlines potential pitfalls and insights for associations that use these systems.

Algorithmic systems—human-coded computational processes for collecting and analyzing data to create outputs—are increasingly being used by governments, corporations, and other organizations. While they are adopted to give organizations more data and improved workflows, their implications are not yet fully understood. A dearth of both transparency and regulation around these systems exacerbates the uncertainty.

The forthcoming ASAE ForesightWorks “Algorithms and Rights” action brief provides an association lens on issues inherent in using algorithms in decision-making. Associations may use algorithms for human resources management or member engagement, but leaders should be wary when they do. These systems reflect the biases of their programmers—including, but not limited to, race and gender bias—which can perpetuate inequities. As the Algorithmic Responsibility pillar of ASAE’s Diversity + Inclusion Strategic Plan [PDF] notes, “Algorithms find patterns within data sets that reflect implicit bias and, in so doing, maintain some of the same biases that permeate society.”
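To make the mechanism in that quote concrete, here is a minimal sketch of how a model trained on biased historical decisions can reproduce that bias. The scenario, groups, and numbers are entirely synthetic and exaggerated for illustration; they do not come from the action brief.

```python
# Minimal sketch: a model absorbing bias from historical data (synthetic example).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2_000
skill = rng.normal(size=n)                  # the genuinely job-relevant signal
group = rng.integers(0, 2, size=n)          # 0 = group "A", 1 = group "B"

# Biased historical labels: group "A" applicants were hired more often than
# skill alone would justify.
hired = (skill + 0.9 * (group == 0) + rng.normal(scale=0.5, size=n) > 0.5).astype(int)

X = np.column_stack([skill, group])
model = LogisticRegression(max_iter=1_000).fit(X, hired)

# Two applicants with identical skill but different group labels receive
# different predicted hire probabilities: the model has learned the bias.
for g, name in [(0, "A"), (1, "B")]:
    prob = model.predict_proba([[0.0, g]])[0, 1]
    print(f"Group {name}: predicted hire probability {prob:.2f}")
```

Nothing in the code "intends" to discriminate; the disparity comes entirely from the patterns in the historical data it was given.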

The challenge with algorithmic systems is that the potential for bias does not stem from any single factor. According to the action brief, algorithms consume massive amounts of data in drawing conclusions. For example, U.K. payday lender Wonga analyzes 7,000 data points to assess whether an applicant is likely to default.
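For illustration only, the sketch below shows how a lending-style score might combine many applicant data points into a single default-risk estimate. The feature names, weights, and applicant record are invented assumptions; this is not Wonga's actual system.

```python
# Hypothetical sketch: combining many applicant data points into one risk score.
import math

# A handful of stand-in features (a real system might use thousands).
weights = {
    "missed_payments_last_year": 0.9,
    "months_at_current_address": -0.02,
    "existing_loan_balance": 0.0004,
    "income_monthly": -0.0008,
    "recent_credit_searches": 0.3,
}
intercept = -1.5

applicant = {
    "missed_payments_last_year": 2,
    "months_at_current_address": 14,
    "existing_loan_balance": 1200,
    "income_monthly": 1800,
    "recent_credit_searches": 4,
}

# Weighted sum of the data points, squashed to a probability with a logistic curve.
score = intercept + sum(weights[f] * applicant[f] for f in weights)
default_probability = 1 / (1 + math.exp(-score))
print(f"Estimated probability of default: {default_probability:.2f}")
```

The point is not the particular formula but the scale: with thousands of inputs, it becomes hard for anyone outside the system to see which data points are driving a decision, or whether any of them act as proxies for protected characteristics.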

Associations can combat this issue by evaluating the algorithmic systems they use and working to eliminate bias in them. Leaders can examine how algorithms are being deployed—and what thought leaders are saying about future uses. Consider which uses might be harmful: Gather information, engage experts and stakeholders, and be prepared to take a stand when you identify harmful biases.
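One simplified way to begin such an evaluation is to compare an algorithm's favorable-outcome rates across groups. The sketch below applies the "four-fifths rule" heuristic used in U.S. employment-selection guidance to invented decision data; the group labels, threshold, and decisions are illustrative assumptions, not a prescription from the action brief.

```python
# Simplified audit sketch: compare outcome rates across groups (synthetic data).
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, got_favorable_outcome) pairs."""
    totals, favorable = defaultdict(int), defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        favorable[group] += int(outcome)
    return {g: favorable[g] / totals[g] for g in totals}

def four_fifths_flags(rates, threshold=0.8):
    """Flag any group whose rate falls below `threshold` of the best group's rate."""
    best = max(rates.values())
    return {g: (rate / best) < threshold for g, rate in rates.items()}

# Invented screening decisions: (group label, whether the candidate advanced).
decisions = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False), ("B", False), ("B", False),
]
rates = selection_rates(decisions)
print(rates)                    # {'A': 0.75, 'B': 0.25}
print(four_fifths_flags(rates)) # {'A': False, 'B': True} -> group B is flagged
```

A flag from a check like this is a prompt for the information-gathering and expert engagement described above, not proof of discrimination on its own.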

Algorithmic systems will affect many decisions across society, including payer and insurance reimbursement, job candidate screening, and regulatory compliance. Not all use of algorithmic systems is regulated, and when it is, the rules vary by locality. Associations have the opportunity to translate information and help members understand laws and regulations where they exist. Leaders may also seek a seat at the table when legislation is being crafted and lobby for legislation that will be fair and advantageous for their members.

Algorithmic rights, particularly as they pertain to bias, will be an ongoing issue across all industries. If associations commit the time and resources to do so, they can help drive a more equitable digital environment.

Emily Rabbitt, CAE

Emily Rabbitt, CAE, is a former manager of research content and knowledge resources for the ASAE Foundation.