Remote-monitoring and surveillance tools could devastate employee relations unless efforts are made to put more power into the hands of workers, the author of a report by the European Commission’s Joint Research Centre (JRC) warns.
Kirstie Ball, who spent five months compiling the JRC’s extensive Electronic Monitoring and Surveillance in the Workplace report, says an increase in employee surveillance threatens to undermine trust and commitment to work amongst staff who are left in the dark about why and how data on them is gathered.
A spike in the use of “quick and dirty” monitoring apps prompted by pandemic-era remote working is especially concerning, Ball tells ZDNet, particularly those that use more invasive techniques to snoop on people working in their own homes.
These tools threaten the mental wellbeing of workers upon whom the COVID-19 pandemic has already taken a significant psychological toll. “One of the difficulties with remote working is that a lot of people were dropped into remote working very quickly,” the University of St Andrews professor says.
“In the pandemic, your house was everything. It was where you worshipped, it was where you worked and your school. If you drop invasive monitoring on top of all that, it’s just going to be devastating to people when they don’t have support and are isolated in their homes.”
The JRC report, based on findings from some 400 articles, found that workplace surveillance has grown more pervasive through the ‘datafication’ of work, particularly with the expansion of algorithmic platforms used widely in the gig economy by companies like Uber, Deliveroo and Amazon.
Often, gig workers rely entirely on algorithms to judge their performance and reward them accordingly. This reliance on technology, combined with a lack of autonomy and managerial support, poses significant psychosocial risks to gig workers.
“What we have is heavily datafied work; surveillance algorithms allocating work and rewarding work, and there is no human contact to mitigate it,” says Ball.
“For some platform workers, that’s fine, because they only do it to earn a bit of money from a hobby, or as a part-time top-up of their income. But for those who rely on it, it’s very, very difficult.”
THE CONTROL CHALLENGE
The shift to remote working during the pandemic has also brought an increase in the use of monitoring technologies, some of which – such as email monitoring, biometrics, wearables, and webcam and screen recording – prompt a “very strong sense of privacy invasiveness” amongst workers.
In November, a committee of MPs and peers warned that tougher regulations are needed around the use of AI decision-making tools in the workplace to counteract the “pronounced negative impacts” that constant monitoring and micromanagement have on employee wellbeing.
Ball acknowledges that the urgency of 2020’s work-from-home orders might have led some organizations to implement employee-monitoring tools arbitrarily as a solution to the management challenge they presented.
“When it’s quick and cheap, it is going to be a temptation. Because at the end of the day, the remote working [during the pandemic] did present a control challenge. How do you keep track of people? You want to keep them in a job, but how do you keep track of what they’re doing? How do you understand what they’re doing?”
However, Ball says that technology is not a replacement for proper management techniques: “If you’re a manager in an organization who is trying to suddenly scramble to work from home and you’ve been given technology that tells you what your colleagues are doing at their desks, and take pictures of them, that might be seen as a surrogate for the performance side of it. It’s not.”
Even as surveillance creeps into employees’ homes, workers are being left in the dark about the exact nature and intention of the data being gathered on them. This situation threatens to erode trust and create resistance from employees, driving turnover rates upwards at a time when many workers are already thinking about leaving their jobs.
“The main problem with workplace surveillance is that people can sometimes feel it’s either invasive, authoritarian or excessive,” says Ball.
“When people start to feel that way about the surveillance that they’re subject to, they get a sense that work conditions are less fair and less just, they have lower job satisfaction, they have lower commitment, they have lower creativity and autonomy, and they feel they’re not trusted. Their stress levels go up, and what that means is that they are more likely to quit.”
THE CREEPS
Function creep is another problem presented by monitoring technologies, whereby employers gradually gather more data on their workers than is necessary for the original purpose.
This issue has also been exacerbated by the COVID-19 health crisis, with some organizations having rushed to deploy remote management and monitoring tools without robust policies or clear directives around their use.
“When people start to perceive surveillance as authoritarian intrusiveness… it can sometimes be because the purpose is not clear, or it’s suspected that the purpose has been exceeded, or not properly communicated,” says Ball.
The way that monitoring is perceived by people is also connected with how feedback is used. Organizations can chart a better course for surveillance – which Ball accepts “isn’t going to go away” – by looking to empower employees, as opposed to questioning their ability to do their job or their moral integrity.
“The big question for me is whether it will ever be possible for organizations to make worker data available to workers and equip them with the skills and knowledge to look at the data… so that they themselves may start to make decisions about their own personal development and career development, rather than it just being harnessed by organizations to turn the screw?”
When it comes to the growth of AI-driven decision-making in the workplace, organizations should be wary of the very real issue of data bias and discrimination. Decisions made by algorithms should likewise be subject to close scrutiny, argues Ball.
“There should be more of a culture of actually questioning these outputs and interrogating them, in terms of their consequences and veracity… we need to be transparent about how our data is processed, we need to know that it’s not a dangerous or harmful process, and discuss it and challenge it if we wish.”
Source: ZDNet