
The Conversation

  • Written by Uri Gal, Associate Professor in Business Information Systems, University of Sydney

Using computer algorithms to make decisions about employees might seem like an objective management strategy, but it could actually give an inaccurate picture of productivity and compromise employees’ rights in the process.

Many businesses are turning to algorithms to make decisions about hiring and firing employees, assessing their performance and enhancing their productivity. This practice, known as people analytics, is fundamentally reshaping today’s workplace.

People analytics relies on the comprehensive collection of digital data about employees’ behaviour. The data can come from employees’ key performance indicator reports, email traffic, in-office interaction patterns, and social networking activity. Once collected and aggregated, the data are analysed by algorithms for patterns that inform managerial decisions.

The increasing use of people analytics gives rise to several ethical issues, as well as questions over whether it actually works.

Ethical issues with people analytics

The application of people analytics invades employees’ privacy by tracking their phone, email and internet browsing activity to understand their work interactions and level of engagement. In some cases, it requires employees to wear badges that monitor their physical movements, tone of voice and conversation patterns.

It also threatens to limit employees’ ability to express their creativity and individuality in the workplace.

Many companies, like UK supermarket giant Tesco, use digital tracking devices that closely watch every step their employees take. This is meant to increase productivity by breaking down work to a sequence of simple processes, each of which is made transparent and subject to optimisation. However, this practice elevates micro-management to new heights and dehumanises work.

When a human makes a decision, an employee can discuss the decision with their manager to understand why and how it was made. Conversely, algorithms are complex, and often proprietary and inaccessible. People can experience algorithmic decisions as arbitrary since they have no way of understanding the logic behind them.

The use of predictive analytics – applying algorithms to identify current patterns to predict the probability of future patterns – can lead to situations where people are discriminated against or punished before they have done anything wrong.

For example, many companies – such as financial services company Credit Suisse – employ algorithms to identify employees who might quit. This is done by detecting pattern similarities between current employees and those who have quit in the past.

When companies use these data to make training, promotion or firing decisions, they are basing their decisions on what may happen – rather than on what employees have done.
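To make the mechanics concrete, here is a minimal sketch in Python of how such an attrition model is commonly built: a classifier is fitted to historical records of employees labelled by whether they later quit, then applied to current staff. The column names, figures and use of scikit-learn here are illustrative assumptions, not a description of any company’s actual system.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Invented historical records: behavioural features plus whether the person later quit.
past = pd.DataFrame({
    "tenure_years":   [1.0, 6.0, 2.5, 0.5, 8.0],
    "emails_per_day": [40, 15, 55, 70, 20],
    "engagement":     [0.3, 0.8, 0.4, 0.2, 0.9],
    "quit":           [1, 0, 1, 1, 0],
})

features = ["tenure_years", "emails_per_day", "engagement"]
model = RandomForestClassifier(random_state=0).fit(past[features], past["quit"])

# Scoring a current employee: the output is a probability based on resemblance
# to past leavers, not evidence of anything this employee has actually done.
current = pd.DataFrame({"tenure_years": [1.2], "emails_per_day": [48], "engagement": [0.35]})
print(model.predict_proba(current)[0][1])

The score says only that this employee looks like people who quit before; acting on it means acting on a prediction, which is exactly the concern raised above.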

Does people analytics actually work?

There are reasons to believe the promise of people analytics may not live up to the hype.

Most importantly, the use of algorithms does not make decisions rational or objective. This is because their design involves subjective human judgement. For example, when developing an algorithm to measure employee performance, different metrics can be captured, such as annual revenue, 360-degree feedback score and engagement level. Which metrics to use and what weighting to give each one are subjective human decisions.
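To see where that subjectivity lives, here is a minimal sketch assuming three already-normalised metrics on a 0–1 scale; the metrics and the weights are arbitrary illustrative choices, not an actual scoring formula used anywhere.

# Combine three normalised metrics into one "performance score".
# Which metrics to include and how to weight them is a human design decision.
def performance_score(annual_revenue, feedback_360, engagement, weights=(0.5, 0.3, 0.2)):
    w_rev, w_fb, w_eng = weights
    return w_rev * annual_revenue + w_fb * feedback_360 + w_eng * engagement

# The same employee looks quite different under different weightings:
print(performance_score(0.9, 0.4, 0.5))                           # revenue-heavy view: 0.67
print(performance_score(0.9, 0.4, 0.5, weights=(0.1, 0.6, 0.3)))  # feedback-heavy view: 0.48

Shifting the weights reshuffles who counts as a top performer, even though nothing about the employees themselves has changed.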

Moreover, algorithms learn from records of past human decisions and can reproduce the societal biases embedded in them. For instance, if a company has historically promoted more employees of a particular ethnic group to senior levels, an algorithm trained on that history will learn what an “appropriate” promotion looks like and go on to make similarly biased promotion decisions.
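A toy example (invented data, not any company’s records) shows how this happens: a model fitted to past promotion decisions in which group 0 was favoured ends up scoring otherwise identical candidates differently by group.

from sklearn.linear_model import LogisticRegression

# Each row is [merit, group]; the label records whether the person was promoted.
# In this invented history, group 0 was promoted far more often than group 1.
X = [[0.9, 0], [0.7, 0], [0.6, 0], [0.9, 1], [0.8, 1], [0.6, 1]]
y = [1, 1, 1, 0, 1, 0]

model = LogisticRegression().fit(X, y)

# Two candidates with identical merit, differing only in group membership:
print(model.predict_proba([[0.8, 0]])[0][1])  # group 0 candidate
print(model.predict_proba([[0.8, 1]])[0][1])  # group 1 candidate scores lower

The model has not been told anything about ethnicity as such; it has simply learned that past promotions correlate with group membership and reproduces that pattern.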

Therefore, algorithms cannot be said to be reflective of an objective truth.

In addition, people analytics is unlikely to accurately measure human behaviour. Research shows that when employees do not perceive technology as helping boost productivity, they are likely to game the system by feeding it inaccurate data. This is particularly common in professional services and law firms, where employees are required to report their activities in minute detail.

Therefore, the veracity of data captured by people analytics systems should not be taken for granted.

Finally, people analytics’ positive impacts on innovation should also be questioned. It may actually curb organisational innovation, rather than increase it.

For example, many companies seek to hire applicants who share the traits of their top-performing employees. By designing a hiring algorithm that looks for those traits in an applicant pool, companies are likely to keep hiring the same type of people. This will reduce diversity of skills, psychological attributes and viewpoints.
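As a rough sketch of the mechanism (with made-up trait vectors), screening applicants by how closely they resemble the average profile of current top performers shortlists near-clones and rejects candidates with different but potentially valuable profiles.

import numpy as np

# Invented trait vectors, e.g. [extraversion, quantitative score, years in the same industry].
top_performers = np.array([[0.80, 0.90, 0.70],
                           [0.70, 0.80, 0.80],
                           [0.90, 0.85, 0.75]])
applicants = {
    "applicant_A": np.array([0.82, 0.88, 0.74]),  # near-clone of existing staff
    "applicant_B": np.array([0.30, 0.95, 0.20]),  # different profile, strong quantitative score
}

centroid = top_performers.mean(axis=0)
for name, traits in applicants.items():
    distance = np.linalg.norm(traits - centroid)
    print(name, "shortlisted" if distance < 0.3 else "rejected", round(float(distance), 2))

Run repeatedly over hiring rounds, a rule like this steadily narrows the workforce towards a single profile.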

What should companies do?

The use of people analytics raises ethical questions and may not deliver all that its proponents promise. But an informed approach to using it, which considers its potential adverse effects and limitations, can be beneficial.

Using people analytics can enhance the speed and efficiency of decision-making. However, organisations would be well advised to do so with human oversight, and to make the process visible to those affected by the resulting decisions.

People analytics is not a cure-all solution. Organisations should use it only if their culture is compatible with the technology’s logic. A large company with established hierarchy and processes is likely to get more out of people analytics than a small and innovative start-up whose employees have highly flexible working arrangements.

Companies are drawn to people analytics in the hope that algorithmic power will enhance their performance. However, managers and employees alike will get the most out of it only if they approach it with a critical eye.

Authors: Uri Gal, Associate Professor in Business Information Systems, University of Sydney

Read more http://theconversation.com/why-algorithms-wont-necessarily-lead-to-utopian-workplaces-73132
