Is AEDT an ethical issue? As a business ethics consultant, business ethics speaker, and ethics book author, I can assure you it is. AEDT, or “automated employment decision tool,” is part of the boom in AI software. However, there may be a problem with the software: inherent bias. In fact, New York City’s government is pushing back on its indiscriminate use and legislating around it.
According to a July 2023 VentureBeat article:
“Under the AEDT law, it will be unlawful for an employer or employment agency to use artificial intelligence and algorithm-based technologies to evaluate NYC job candidates and employees — unless it conducts an independent bias audit before using the AI employment tools. The bottom line: New York City employers will be the ones taking on compliance obligations around these AI tools, rather than the software vendors who create them.”
Bias
Bias is all around us in the workplace. As a business ethics consultant and business ethics speaker, I have spoken to numerous groups about biases. According to business writer Jon Brown at Fox News (July 6, 2023):
“A new law in New York City is mandating that businesses using artificial intelligence programs to find employees must demonstrate the process is free from racism and sexism.
Under legislation that went into effect Wednesday [June 28, 2023] a third party is required to conduct a ‘bias audit’ on any automated employment decision tool (AEDT), which increasing numbers of companies are using to search for potential hires and eliminate candidates.”
However, the most ardent critics of AEDT also claim that built-in biases could extend to age, religious affiliation, and disability.
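To make the required “bias audit” concrete: the city’s published rules have auditors compare selection rates across demographic categories and report an “impact ratio” for each. The sketch below is illustrative only; the data, the impact_ratios helper, and the four-fifths review threshold are hypothetical stand-ins, not any auditor’s actual tooling.

```python
# Minimal sketch of the arithmetic behind a "bias audit": compare each
# group's selection rate to the most-selected group's rate (the
# "impact ratio"). All data and the threshold here are hypothetical.
from collections import Counter

def impact_ratios(outcomes):
    """outcomes: list of (group, was_selected) pairs."""
    applied = Counter(group for group, _ in outcomes)
    selected = Counter(group for group, ok in outcomes if ok)
    rates = {g: selected[g] / applied[g] for g in applied}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical screening results; note that only the categories we
# pass in ever get measured -- age and disability are simply absent.
results = [("group_a", True), ("group_a", True), ("group_a", False),
           ("group_b", True), ("group_b", False), ("group_b", False)]

for group, ratio in impact_ratios(results).items():
    # The familiar "four-fifths" rule of thumb flags ratios under 0.8.
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

Notice what the arithmetic cannot see: if age or disability is never passed in as a category, no ratio is ever computed for it. That is precisely the critics’ point.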
The new law on the use of automated employment decision tools also requires those using them to make clear that the screening mechanism is in place, including after the hiring process has concluded.
Said Julia Stoyanovich, a computer science professor at New York University:
“It’s an important start but still very limited…I’m really glad the law is on the books, that there are rules now and we’re going to start enforcing them. But there are also lots of gaps. So, for example, the bias audit is very limited in terms of categories. We don’t look at age-based discrimination, for example…”
The “For Example” Equation
As a business ethics keynote speaker and business ethics consultant, I am not debating AI itself. In the right hands, it is an incredible tool that will help many aspects of modern life. However, the much larger issue is this: is the software as ready for all of the situations we would like to believe it is, or are there ‘lots of gaps’ in the technology that the experts have overlooked? Worse, are there gaps where the experts refuse to look?
There is an old expression about people not knowing what they don’t want to know. In this case, could that describe the hiring managers of New York City government?
So, for example, if I am an excellent candidate who needs a mobility device such as a wheelchair, does the AI software automatically trigger a bias over the question of accommodation?
Suppose another candidate, a hearing-impaired applicant who requires common assistive technology, is in fact the perfect hire, but the AI-driven software flags that requirement as a problem?
Unless humans are the final arbiters, AEDTs, or “automated employment decision tools,” may be as exclusive as they claim to be inclusive.
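One way to read that principle as a design rule: the software may score and flag, but it never issues a final rejection on its own. Here is a minimal sketch of that routing; the names, scores, and cutoff are all hypothetical.

```python
# Minimal sketch of "humans as final arbiters": the tool may rank or
# flag, but no candidate is rejected without human review.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    aedt_score: float          # whatever the vendor's model outputs
    needs_accommodation: bool  # e.g., wheelchair access, assistive tech

def screen(candidates, cutoff=0.5):
    advanced, human_review = [], []
    for c in candidates:
        if c.aedt_score >= cutoff:
            advanced.append(c)
        else:
            # Never auto-reject: a low score produces a referral to a
            # person, especially where an accommodation may have
            # depressed the score.
            human_review.append(c)
    return advanced, human_review

pool = [Candidate("A", 0.9, False), Candidate("B", 0.4, True)]
advanced, review = screen(pool)
print([c.name for c in advanced], [c.name for c in review])
```

The design choice lives in the else branch: the tool’s low score is the start of a human conversation, not the end of a candidacy.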
How is the national debate on AI, specifically AEDT, framed, and does that debate need to be re-examined? What good is being equitable to everyone along racial lines while “accidentally” kicking out a person of color because they are disabled? And what about age? What does AEDT propose to do about a qualified 63-year-old who is automatically discriminated against because of a bias built in at age 55?
The “for example” questions surrounding this software are more ethical than technical. Perhaps it is long past time to discuss avoiding ethical problems rather than patting ourselves on the back for producing lists of candidates who still fit a mold of bias.
LEAVE YOUR COMMENTS!