Workday adds AI that recommends manager actions

Workday has introduced new artificial intelligence and machine learning updates to its products that can recommend manager actions, the company said Sept. 27.

The new features provide insights and recommended actions around managerial workflow, including items such as employee skills, goals and important dates. Based on the recommendations, managers can view connections across their teams and create opportunities for collaboration and development.

“Managers play such a pivotal role in the growth and development of their teams but face increasing pressure to improve productivity and performance while navigating evolving workplace policies,” David Somers, group general manager for the office of the chief human resource officer product at Workday, said in a statement.

For instance, the update can offer personalized recommendations, including connections, mentors and gigs, to help managers identify opportunities for their employees based on skill interests. This can improve talent mobility and employee engagement across a team, Workday said.

It also will allow managers to identify talent across the organization to assemble a team and define roles. When launching a new product, for instance, managers can view and add suggested employees based on their skills.

Finally, it can give managers curated insights that offer a holistic view of information relevant to their teams, such as important dates, time-off schedules and quick actions to complete tasks. Integrations with other programs, such as Microsoft Teams and Slack, can make the workflow even smoother, Workday said.

As AI capabilities expand this year, companies have been racing to incorporate new tools into HR software. LinkedIn, for example, announced updates to help recruiters create a more efficient experience and aid job seekers with training. However, experts have suggested caution around adopting these tools without considering safety and security concerns, as well as bias in job posts and the hiring process. 

In response, cities and states have begun to take action on AI regulation. Earlier this year, New York City implemented a new law that requires employers to audit any automated employment decision tools and notify candidates about the use of these tools. Employers can’t use these tools if they haven’t been audited for bias, particularly around demographic categories such as sex, race and ethnicity, and intersectional categories of sex, race and ethnicity.

To help with these types of questions, the U.S. Equal Employment Opportunity Commission issued a guide for employers to audit AI and other automated solutions for discrimination or “disparate impact” under Title VII of the Civil Rights Act of 1964. The Consumer Financial Protection Bureau, U.S. Department of Justice, Federal Trade Commission and White House have also announced plans to examine AI tools.