
How to Prevent Algorithms from Ruling the Workplace

By Swapnil Mishra - December 07, 2021

Following a huge rise in workplace monitoring, tougher rules are needed around the use of algorithms to track employees and around their role in decision making. Better rules must be drafted urgently to keep control of the AI systems that are already making decisions about jobs.

Algorithms are making increasingly high-stakes decisions about people’s lives in the workplace, and yet current laws are failing to protect workers from the risk of unfair treatment and discrimination that this might cause.

The growing dependence on algorithmic surveillance and management tools has significant negative impacts on the conditions and quality of work. Invasive monitoring and automated decision-making pose a profound risk to employees' well-being. Workers face the pressure of constant, real-time micro-management and automated assessment, which harms their mental and physical health.

Here are a few recommendations that can help businesses manage the rapid rise of automated workplace-monitoring and decision-making tools.


The Accountability for Algorithms Act

The creation of an Accountability for Algorithms Act, or 'the AAA', would provide a comprehensive, principles-driven framework for managing AI in the workplace, as well as establish ways to ensure that people maintain oversight of any important decisions made by algorithms.

The AAA would incorporate new rights and obligations to ensure that all significant impacts of algorithmic decision-making on work and employees are taken into account.

Updating digital protection

The Accountability for Algorithms Act would fill gaps in existing technological protections in the workplace, including by providing employees with easily accessible information. Employees should be aware of the purpose, results, and impact of algorithmic systems in the workplace and have the right to be involved in shaping their design and implementation.

It also requires greater protection for employees who spend a growing portion of their time online and who are increasingly exposed to the negative aspects of work in the digital age.


While the AAA should be a vehicle that specifies the use and purpose of workplace monitoring tools, there should also be safeguards in place to protect engineers from copyright infringement and to stop algorithms from exploiting or "playing" employees.

Enabling a partnership approach

To ensure that AI-based tools are designed with the interests of the wider community in mind, firms must develop relationships with developers and with the broader AI ecosystem.

Unions and NGOs should also be given additional rights and a greater role in how algorithms are used in the workplace.

This should start with employers informing the relevant unions when algorithmic systems with significant impacts are adopted in the workplace, so that meaningful consultation can begin. Employees should also be allowed to develop new roles within the AI ecosystem to address the growing inequality of knowledge and power, and to help bring about AI that genuinely serves the public interest.

Now is the time for businesses to start putting the brakes on algorithms and to ramp up regulatory efforts.




AUTHOR

Swapnil Mishra

Swapnil Mishra is a Business News Reporter with OnDot Media. She is a journalism graduate with 5+ years of experience in journalism and mass communication. Previously, Swapnil worked with media outlets such as NewsX, MSN, and News24.
