UNI Global Union, representing 20 million service workers in more than 150 countries, is launching a campaign today to include workers’ input on how algorithmic management is implemented on the job.
The global union federation’s Professionals & Managers group is reaching out to its member unions to support collective bargaining for the ethical implementation of these tools, which are being increasingly used in workplaces around the world.
Algorithmic management can range from simple software tracking employees’ working time or scanning C.V.s for keywords, to much more sophisticated tools that use machine learning or other forms of artificial intelligence to predict customer footfall in shops, allocate shift patterns, give tasks to workers, or even decide who to hire, promote, or reassign based on a potentially huge amount of collected data.
“Algorithmic tools can unlock efficiencies, but they also pose several significant risks—particularly increased surveillance and data collection, dehumanisation of work, and exacerbating workplace discrimination,” said Christy Hoffman, General Secretary of UNI Global Union. “With the spike of remote work during the Covid-19 pandemic, employers have accelerated their use of these tools to monitor staff, and unions must be proactive in making sure that their use is transparent, non-discriminatory, and negotiated with workers.”
Algorithmic management is most often used in recruitment, performance management, and everyday workplace decision-making in a wide variety of circumstances. Its uses include the haptic feedback devices worn by Amazon warehouse workers, which vibrate to guide their arms to the correct shelf so they can pull merchandise more efficiently. This kind of hyper-efficiency can be extremely stressful for workers who feel constantly under pressure. It also strips away their autonomy, making them feel like mere machines, not even trusted to decide their own limb movements, which box size to use, or how much tape to cut to seal it.
Call centre operators use algorithmic tools to monitor employees’ language and tone of voice to ensure they keep a positive attitude with customers. For instance, programs like Cogito or Voci use voice analytics A.I. to give workers real-time feedback on whether they are speaking too fast, sounding too tired, or lacking empathy. Beyond the reams of data gathered on each employee, workers have said that these programs add stress to their jobs and that it feels like the algorithmic tool is dictating what emotion they should feel.
Even worse, workers report that these kinds of programs are more likely to misinterpret the words and expressions of women, employees with regional accents, and racial and ethnic minorities. Too often, their performance is misevaluated because of biases built into the algorithm, and human managers cannot correct the error.
“Just because a decision is made by an algorithm, it does not mean that employers get to wash their hands of responsibility for the outcomes,” said Alex Högback, Head of UNI Professionals & Managers. “As with any new technology, our aim is to make the implementation of these tools as fair as possible, so they benefit both the employee and the employer.”
UNI notes that algorithms, when appropriately designed and implemented, can help reduce bias in human resources. Studies have shown that hiring managers’ own biases inform their decision-making: one U.S. experiment, for example, found that job applicants with Black-sounding names received 50 per cent fewer interview offers than identical C.V.s with Anglo-Saxon names on top. A well-considered algorithmic approach could help reduce overall bias in the interview process and give candidates who might face discrimination a fairer hearing.
UNI believes unions should aim to negotiate an ‘algorithmic use agreement’ with employers that covers key demands for the ethical use of algorithmic management. These demands include workers’ right to know what tools are being used, knowledge of what data is being collected and why, and the right to access the data collected about them through these tools. Additionally, UNI advocates for a human-in-command approach to these tools and for audits to ensure non-discriminatory outcomes. A full list of demands and a bargaining guide are available in English here.
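To make the audit demand concrete, one established check an outcome audit can run is the “four-fifths rule” from U.S. equal-employment guidance: if one group’s selection rate falls below 80 per cent of the most-favoured group’s rate, the tool is flagged for review. The sketch below is illustrative only; the group names, numbers, and threshold convention are assumptions for the example, not part of UNI’s guide.

```python
# Illustrative audit check: the "four-fifths rule" (adverse impact ratio).
# All group names and figures below are hypothetical.

def selection_rate(selected, applicants):
    """Share of applicants from a group who were selected by the tool."""
    return selected / applicants

def adverse_impact_ratio(group_rate, reference_rate):
    """Ratio of a group's selection rate to the most-favoured group's rate.
    Values below 0.8 are conventionally flagged for further review."""
    return group_rate / reference_rate

# Hypothetical outcomes from a CV-screening tool
outcomes = {
    "group_a": {"applicants": 200, "selected": 60},   # 30% selected
    "group_b": {"applicants": 180, "selected": 36},   # 20% selected
}

rates = {g: selection_rate(v["selected"], v["applicants"])
         for g, v in outcomes.items()}
reference = max(rates.values())  # most-favoured group's rate

for group, rate in rates.items():
    ratio = adverse_impact_ratio(rate, reference)
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} [{flag}]")
```

In this made-up example, group_b is selected at two-thirds of group_a’s rate (ratio 0.67), so the audit would flag the tool for human review rather than let the disparity pass unexamined.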