Exeter research examines the development and use of recommendation systems in public service media organisations in the UK and Europe


Recommendation algorithms have become an inescapable feature of how we consume content in the digital age. We experience them on a near-daily basis, whether we’re streaming podcasts, browsing boxsets or reading news articles. This has benefits, but also brings ethical and societal risks, for example around ‘echo chambers’, polarisation, transparency and accountability.

In this context of rapid change and innovation, the Institute’s research into algorithmic recommendations within public service media organisations has identified ways to challenge the commercial status quo established by big technology platforms.

The research also found that the idea of ‘public service value’ needs to be redefined for the digital age and that more research into algorithmic recommendation systems is needed to address some of the ethical challenges they may pose.

Public service media organisations recognise the challenges involved in creating and using recommendation systems, and many are actively working to address them.

The Ada Lovelace Institute encourages organisations to build on this work by increasing algorithmic transparency, giving users and wider society greater control and developing ways for them to participate in the research and development of these systems.

The report makes nine specific recommendations for future research, experimentation and collaboration between public service media organisations, academics, funders and regulators:

Define public service value for the digital age
Fund a public R&D hub for recommendation systems and responsible recommendation challenges
Publish research into audience expectations of personalisation
Communicate and be transparent with audiences
Balance user control with convenience
Expand public participation in design and evaluation
Standardise metadata
Create shared recommendation system resources
Create and empower integrated teams

These recommendations were developed through a literature review and interviews with engineering, product and editorial staff at the BBC, who partnered with the Institute on the research, as well as interviews with the European Broadcasting Union, NPO (Netherlands), ARD (Germany), VRT (Belgium), BR (Bavaria), SR (Sweden), academics, civil society and regulators.

They address some of the ethical issues raised by the use of recommendation systems in public service media, and indicate further areas for research which could support the development of recommendation systems in a way that works for people and society.

Dr Silvia Milano, Lecturer in Philosophy of Data at the University of Exeter and a member of Egenis, the Centre for the Study of Life Sciences, said: ‘Recommender systems are the lifeblood of the internet and serve a huge number of goals – from navigating through vast pools of options, to allowing content to be discovered and businesses to ultimately succeed. Yet their operation can often be opaque, which raises several ethical challenges.

‘By automating some editorial judgements, and increasing personalisation, recommender systems can help public service media to achieve important objectives, including reaching new audiences and adapting their communication for the digital age.

‘We have a key opportunity to shape the public conversation around which values are enshrined in technology through our recommendation to make this part of the national AI strategy.’

Carly Kind, Director at the Ada Lovelace Institute, said: ‘There is a real opportunity for public service media to develop a new, responsible approach to algorithmic recommendation, one that works for people and society and offers an alternative to the commercial paradigms of big technology platforms.

‘We encourage funders and regulators to support public service media organisations to engage in responsible innovation as they develop and use recommendation algorithms.’