AI expert calls for end to UK use of ‘racially biased’ algorithms | THE GUARDIAN

Prof Noel Sharkey, who is also a leading figure in a global campaign against “killer robots”, said algorithms were so “infected with biases” that their decision-making processes could not be fair or trusted.

Decision-making algorithms should be treated like prescription drugs and subjected to rigorous tests before general release, says a University of Sheffield professor. THE GUARDIAN reports that Professor Noel Sharkey wants prediction algorithms to be tested “on millions of people, or at least hundreds of thousands of people, in order to reach a point that shows no major inbuilt bias.”

Sharkey told the Guardian he typically favours only “light regulation” so that innovation is not curtailed, but algorithms that impact human lives have changed his thinking. “Why? Because they are not working and have been shown to be biased across the board,” he said.

OUR TAKE

  • This is the second pharmaceutical analogy used by a leading AI researcher when expressing concern about the power of algorithms.
  • In an August 2019 op-ed in WIRED, three US experts advocated for regulation of algorithms using a Food and Drug Administration (FDA) model.

SEE FULL STORY

AI expert calls for end to UK use of ‘racially biased’ algorithms
THE GUARDIAN | December 12, 2019 | by Henry McDonald
