University of Southampton Report Advocates for New Rights to Safeguard Workers Against Unfair Algorithmic Practices

A report published today [6 June] calls for a new generation of rights to protect workers from the rise of ‘management by algorithm’.

The report, published by the Institute of Employment Rights, says that algorithmic management threatens to degrade workers’ rights and conditions, and that current legal protections are inadequate in the face of technological change.

To address this, the report’s authors, from the universities of Southampton and Bristol, are setting out a new generation of rights for the era of algorithmic management at a launch event in London today, attended by fellow legal experts, union leaders, and policymakers.

“Technology has revolutionised the way we work in the last 30 years,” says Dr Joe Atkinson, a lecturer in employment law at the University of Southampton and co-author of the report. “Now, it is radically changing the way that we are managed at work.

“Algorithms are doing the jobs of line managers in the way many workers are recruited, directed, and disciplined. These management systems generate recommendations through complex algorithms, underpinned by huge amounts of data processing about workers and workplaces.

“These practices pose a pressing threat to the enjoyment of decent working conditions, as well as to the effective use of worker voice and workers’ exercise of their human rights.”

More and more workers are being managed by algorithms, from warehouse staff who carry handheld devices that issue instructions and track their movements, to drivers who can be issued with a ‘log off’ penalty if they fail to comply with rules regarding cancellation and acceptance of tasks. The use of such systems accelerated during the pandemic as employers sought to monitor, manage and control newly remote workforces.

“The UK has so far taken a ‘hands-off’ approach to regulating algorithmic management, and has not introduced any legislation specifically targeting these practices,” says Dr Philippa Collins, a senior lecturer in law at the University of Bristol and co-author of the report.

“While there are some existing legal frameworks that should guide and constrain employers’ use of algorithmic management systems, these are not capable of effectively protecting workers.”

The report sets out three areas in which algorithmic management impinges on workers: worker voice, quality of work and working conditions, and workers’ human rights.

The complexity and lack of transparency of algorithmic management systems are major barriers to workers challenging their outputs. The Trades Union Congress (TUC) found that only 21 per cent of workers surveyed were confident that they or their union could effectively challenge decisions made by these systems.

The report highlights several ways algorithmic management reduces the quality of work, including an intensified pace of work; reduced autonomy and the deskilling of labour; and increased control over the workforce through constant monitoring and evaluation.

This systematic collection of data also affects workers’ right to privacy. Systems have been known to make biased and discriminatory decisions that undermine workers’ right to equality and non-discrimination: in one case, a hiring algorithm marked applicants down if their CVs contained the word ‘women’s’.

The report sets out how current regulation falls short in protecting workers from these threats, including legislation on collective bargaining, health and safety, data protection, equality, and unfair dismissal.

Dr Collins adds: “For example, workers will struggle to use the Equality Act to challenge algorithmic discrimination. In most cases, workers will be unaware that they have been subject to a potentially discriminatory algorithmic system. Even when they are, it will be difficult for them to access the information needed to challenge the algorithm successfully.

“The problems of discovering and proving discrimination are not unique to algorithms, but they are particularly pronounced given the lack of transparency and understanding of such systems.”

In addition, some key employment rights are not available to those working in the ‘gig economy’ who are not classed as employees.

“Rethinking worker protection for the era of algorithmic management provides a real opportunity to break away from the classifications of status that employment law is currently founded upon,” says Dr Atkinson.

The new protections set out in the report include a requirement that systems be deployed in a manner consistent with workers’ human rights; a prohibition on certain practices deemed ‘automatically unfair’; and a right to have a human explain and review decisions taken or supported by an algorithmic system.

To enforce these rights, the report calls for a specialist regulator to be set up, and for any company that develops or sells algorithmic management systems that breach employment law to face joint liability alongside the implementing organisation.

Dr Atkinson says: “These reforms are necessary if we are to secure decent working conditions and workers’ rights in the age of algorithmic management.”

“Currently, however, there is a real risk that the challenges presented by algorithmic management are being overlooked. Any incoming Government needs to make sure they are future-proofing employment law for the age of AI, not just addressing the problems of the past.”