Computer science researchers Alexandra Meliou and Yuriy Brun at the University of Massachusetts Amherst have received a four-year, $1.05 million grant from the National Science Foundation to study how software systems can exhibit bias and how software engineers can develop fairer, more equitable systems.
Meliou says, “Software makes decisions about what products we are led to buy, who gets a loan, self-driving car actions that may lead to property damage or human injury, medical diagnoses and treatment, and every stage of the criminal justice system, including arraignment and sentencing, that determine who goes to jail and who is set free.”
“And yet, examples of discrimination have shown up in many software applications, including advertising, hotel bookings, facial recognition, and image search,” she adds. “And as software today plays an increased role in many decisions, fairness is becoming a critical property.”
Current software engineering processes do not support measuring bias in software, the researchers point out. Requirements specification, which lays out functional and non-functional requirements, is ad hoc. There are no validation and verification methodologies, and neither manual testing guidelines nor automated test-generation techniques exist to help ensure fairness. Their new project will address these and other shortcomings, following their NSF grant last year to explore the feasibility of using software engineering techniques to detect and enforce fairness properties.
As Brun explains, “We plan to create a theoretical foundation of software fairness, including defining a suite of fairness metrics that can be used to describe desired properties of software. We also plan to create algorithms for testing and verifying software fairness, for identifying the causes of bias, and for debugging discrimination bugs.”
He adds that the project will produce benchmarks and methodologies to evaluate the effectiveness of both existing and new fairness-seeking learning techniques. “A part of the project will focus on measuring causal relationships between sensitive attributes and software behavior and addressing potential bias in software with complex inputs, such as facial recognition and natural language processing.”
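The idea of measuring causal relationships between sensitive attributes and software behavior can be illustrated with a short sketch: generate test inputs, flip only the sensitive attribute, and count how often the software's output changes. The `loan_decision` function below is entirely hypothetical (including its deliberately planted bias), and the measurement is a simplified illustration of causal fairness testing, not the researchers' actual algorithms.

```python
import random

def loan_decision(income, credit_score, gender):
    # Hypothetical software under test. The dependence on `gender`
    # is planted deliberately so the measurement has bias to detect.
    score = income * 0.4 + credit_score * 0.6
    if gender == "male":
        score += 5  # planted bias
    return score > 60

def causal_discrimination(software, trials=10000, seed=0):
    """Estimate how often flipping only the sensitive attribute
    (here, gender) changes the software's output, holding every
    other input fixed -- a simplified causal fairness measure."""
    rng = random.Random(seed)
    flipped = 0
    for _ in range(trials):
        income = rng.uniform(0, 100)
        credit = rng.uniform(0, 100)
        if software(income, credit, "male") != software(income, credit, "female"):
            flipped += 1
    return flipped / trials

rate = causal_discrimination(loan_decision)
print(f"estimated causal discrimination: {rate:.2%}")
```

Because the two calls differ only in the sensitive attribute, any change in output is causally attributable to it; a fair decision procedure would score at or near zero.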
Meliou and Brun plan to organize international workshops on software fairness, such as FairWare 2018, which took place in May 2018 in Gothenburg, Sweden. Their project will also improve both undergraduate and graduate education in fairness-aware software and data engineering.
This work builds on Meliou and Brun’s award-winning work on automatically generating test suites to measure causal relationships that may exhibit bias, published at the 2017 Joint Meeting of the European Software Engineering Conference and the Association for Computing Machinery (ACM) SIGSOFT Symposium on the Foundations of Software Engineering, where it received an ACM SIGSOFT Distinguished Paper Award.