Research initiative launched to address questions about responsible AI use

A £24.4m initiative has been launched to help organisations answer the most pressing questions around the responsible use of artificial intelligence (AI).

Through a series of fellowships, researchers will address a range of AI-related challenges in public organisations, industry, and the third sector.

Appointed from universities across the UK, the Fellows will apply research expertise from the arts and humanities, including data ethics, copyright law, digital design, and qualitative analysis, to address questions about responsible AI use.

The Bridging Responsible AI Divides (BRAID) Fellowships are part of the BRAID programme.

The £15.9m six-year programme is led by the University of Edinburgh in collaboration with the Ada Lovelace Institute and the BBC.

BRAID is funded by the Arts and Humanities Research Council, part of UK Research & Innovation (UKRI).

Project partners

All 17 Fellows will partner with an organisation from the public, private, or third sector, combining their expertise to tackle AI challenges.

Partners from the technology sector include Datamind Audio, Diverse AI, Adobe, Mozilla Foundation, and Microsoft.

Project partners from regulatory and public organisations include The Alan Turing Institute, BBC, Ada Lovelace Institute, Public Media Alliance, and Institute for the Future of Work.

The Fellows will also work with cultural institutions, including Arts Council England, the Edinburgh International Book Festival, Serpentine Galleries, and the Royal Botanic Gardens, Kew.

BRAID Co-director Professor Ewa Luger, Chair in Human-Data Interaction at Edinburgh College of Art, said: “The 17 Fellowships offer opportunities for deeper relationships and joint impact, moving towards a genuine embedding of arts and humanities knowledge within how we think about, develop and deploy AI in practice and in the world.

“It is our hope that with these connections, and working towards common challenges across sectors and diverse communities, we will take substantial strides towards a more responsible AI ecosystem.”

The initiative will develop a responsible AI framework

The projects will address several pressing questions, including examining approaches for the use of generative AI in the media and exploring the societal and ethical factors shaping the adoption of AI in a medical setting.

The initiative will also develop a responsible AI innovation framework for the arts and culture sector to support the needs of creatives when using AI.

Other collaborations will research the issue of copyright and generative AI in the creative and cultural industries, including the impact of generative AI on novel writing, the creation and ownership of AI-generated sounds, and the use of generative AI in publishing.

The Fellowships will support the creation of an AI ecosystem that will allow both researchers and industry leaders to develop a deeper understanding of the challenges and opportunities surrounding AI.

