New innovation challenge launched to tackle bias in AI systems

UK companies can now apply for up to £400,000 in government investment to fund innovative solutions tackling discrimination and bias in AI systems.

The competition will support up to three groundbreaking homegrown solutions, with successful bids each securing a funding boost of up to £130,000.

It comes ahead of the UK hosting the world’s first major AI Safety Summit, which will consider how best to manage the risks posed by AI while harnessing its opportunities in the long-term interest of the British people.

Tackling bias in AI systems is a major priority

The first round of submissions to the Department for Science, Innovation and Technology’s Fairness Innovation Challenge, delivered through the Centre for Data Ethics and Innovation (CDEI), will nurture new approaches that ensure fairness underpins the development of AI models.

The challenge will tackle the threats of discrimination and bias in AI by encouraging new approaches that see participants build wider social context into their models from the outset.

Fairness in AI systems is one of the government’s key principles for AI, as set out in the AI Regulation White Paper. AI is a powerful tool for good, presenting near-limitless opportunities to grow the global economy and deliver better public services.


Minister for AI, Viscount Camrose, said: “The opportunities presented by AI are enormous, but to fully realise its benefits we need to tackle bias in AI.

“By ensuring AI models do not reflect bias found in the world, we can not only make AI less potentially harmful but ensure the AI developments of tomorrow reflect the diversity of the communities they will help to serve.”

Harnessing a new, UK-led approach

While there are a number of technical bias audit tools on the market, many of these are developed in the US.

Although companies can use these tools to check for potential bias in AI systems, they are often a poor fit for UK laws and regulations.
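In broad terms, such audit tools compare a model’s outcomes across demographic groups. As a rough illustration only, the minimal sketch below shows one common check, a demographic parity comparison of positive decision rates (for instance, shortlisting rates in recruitment) between groups; the data, function names, and structure are hypothetical assumptions, not taken from any particular product.

```python
# Illustrative sketch only: a simplified demographic-parity check of the
# kind a bias audit tool might run. All names and data are hypothetical.

from collections import defaultdict

def selection_rates(predictions, groups):
    """Return the positive-decision rate for each demographic group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += int(pred == 1)
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(predictions, groups):
    """Largest difference in selection rate between any two groups."""
    rates = selection_rates(predictions, groups)
    return max(rates.values()) - min(rates.values())

if __name__ == "__main__":
    # Hypothetical shortlisting decisions (1 = shortlisted) by group.
    preds = [1, 1, 1, 1, 0, 1, 0, 0, 1, 0]
    grps = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
    print(selection_rates(preds, grps))         # {'A': 0.8, 'B': 0.4}
    print(demographic_parity_gap(preds, grps))  # 0.4
```

A purely statistical check like this is exactly where the UK-led approach described below differs: what counts as an acceptable gap, and which groups must be compared, depends on the legal and social context in which the system is deployed.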

The challenge will promote a new UK-led approach which puts the social and cultural context at the heart of how AI systems are developed, alongside wider technical considerations.

This will focus on two areas. First, a new partnership with King’s College London will offer participants from across the UK’s AI sector the chance to tackle potential bias in a real healthcare AI model. The model, developed with Health Data Research UK and with the support of the NHS AI Lab, is trained on the anonymised records of more than ten million patients to predict possible health outcomes.

Second is a call for ‘open use cases’. Applicants can propose new solutions that tackle discrimination in their own models and areas of focus, such as combating fraud, building new law enforcement AI tools, or helping employers develop fairer systems for analysing and shortlisting candidates during recruitment.

Companies currently face various challenges in tackling bias in AI, including insufficient access to data on demographics and ensuring potential solutions meet legal requirements.

The CDEI is working closely with the Information Commissioner’s Office (ICO) and the Equality and Human Rights Commission (EHRC) to deliver the challenge. This partnership allows participants to tap into the expertise of regulators, ensuring their solutions align with data protection and equality legislation.

Stephen Almond, Executive Director of Technology, Innovation and Enterprise at the ICO, explained: “The ICO is committed to realising the potential of AI for the whole of society, ensuring that organisations develop AI systems without unwanted bias.”

Baroness Kishwer Falkner, Chairwoman of the Equality and Human Rights Commission, added: “Without careful design and proper regulation, bias in AI systems has the potential to disadvantage protected groups, such as people from ethnic minority backgrounds and disabled people.

“Tech developers and suppliers have a responsibility to ensure that their AI systems do not discriminate.”
