18 April 2018
Author: Peter Maxwell
Slack CEO Stewart Butterfield described the company’s proposed system as a form of ‘personal analytics’. ‘These are analytics that no one else has access to except you,’ said Butterfield, speaking at the Wharton People Analytics Conference. ‘And they don’t present you with any real moral value either way, but [they answer questions such as] do you talk to men differently than you talk to women?’
As several commentators have noted, this is likely to sound to many of Slack’s more than 6m daily active users like a new form of surveillance. With the enforcement date of the General Data Protection Regulation (GDPR) looming and Mark Zuckerberg’s congressional hearing still fresh in many people’s memories, the notion that another technology unicorn wants to create a personal profile of its users is likely to cause some consternation.
Although the system Butterfield envisages would be a means of self-assessment, essentially providing a mirror for employees to assess their own conduct, as Quartz reporter Leah Fessler notes: ‘[Slack’s] potential to put data behind damaging (and positive) communication dynamics on a person-by-person basis is unprecedented.’ What is more, given that many large organisations already stipulate that they have access to all internal communications, it is not that far-fetched to think that such data might eventually be used to inform the staff appraisal process.
And would that be such a bad thing? After all, employees’ interactions with their colleagues in real life will be noted by their employer, and are likely to form part of any professional assessment. Indeed, several initiatives already use data analytics to measure how gender inequality manifests in analogue business conversations. Launched in 2017, All.ai measures how often you speak up in a meeting and who you are speaking to (or over), as well as assessing whether you sounded positive or nervous. ‘Women in meetings are often overlooked or ignored or interrupted,’ explains Rumman Chowdhury, an artificial intelligence specialist at Accenture who helped develop the app. ‘In some situations men will get the credit. We wanted to disrupt how meetings are handled.’
Even though it is not the intention at present, Butterfield believes that enabling brands to have this sort of internal oversight could have real benefits. ‘If the result of that [data] is not: “Hey, it turns out you’re a jerk and we’re firing you,” but “Hey, it turns out we’ve identified some problems around communication or management structure or organisational design, which inhibits the kind of progress we want to make, and therefore we’re going to rectify them,” that’s a good thing.’
The conflict here lies – and this is where the debate about GDPR is particularly relevant – in ensuring full transparency about the parameters against which this data is judged, and in establishing frameworks that allow individuals to review and challenge their personal profiles. If those safeguards can be achieved, perhaps we should welcome a little more surveillance into this part of our existence. We recently explored how many HR experts already believe that AI systems will increasingly mediate our working lives and eventually play a fundamental role in determining our future career progression. Slack’s initiative might be the first step on that road. If it places gender equality at the centre of how such systems develop in the future, so much the better.
For more on how to build the right ethical architecture for your business, read our Morality Recoded macrotrend.