Technology’s only as good as its imperfect human programmers.
That’s been the general rule of thumb for climate-change modeling, and it’s been the talked-about challenge for artificial intelligence development for years.
And now, given a new report from Reuters revealing that Amazon’s recruitment A.I. disregarded and dismissed qualified candidates solely because they were women, it’s apparently the still-unfixed, unchanged challenge of the technology world.
Amazon scrapped its algorithm-based program. But that’s like putting a Band-Aid on a long-festering wound.
Bias in A.I. has been a long-studied, long-standing problem. So far, the solution’s proven elusive.
In May of 2016, ProPublica found that COMPAS, an algorithm used to estimate how likely a criminal was to re-offend, was racially biased, predicting that black defendants were at far higher risk of recidivism than white defendants.
Also in 2016, the policing tool PredPol, revered for its so-called ability to predict crimes before they occur — and in so doing, enable local law enforcement to better utilize and manage budgets, manpower and resources — was outed by one human rights group for unfairly targeting neighborhoods with large racial minority populations. Part of the technological bias, critics said, came from the fact that the A.I.-fueled software based its predictive powers solely on reports made to police, not on confirmed crimes and actual arrests.
In February of 2018, researchers with the Massachusetts Institute of Technology discovered that three emerging, commercially available facial recognition programs were prone to skin-type and gender biases.
“In the researchers’ experiments,” MIT News reported, “the three programs’ error rates in determining the gender of light-skinned men were never worse than 0.8 percent. For darker-skinned women, however, the error rates ballooned — to more than 20 percent in one case and more than 34 percent in the other two.”
[email protected] or on Twitter, @ckchumley.
Copyright © 2018 The Washington Times, LLC.