Workplace Relations

Termination via ChatGPT

We have referred to ChatGPT and AI on various occasions in past editions of the WR News, and much has been said about roles and specific tasks being replaced by AI. We thought it would be interesting to show you an example where AI was used instead of human interaction to terminate an employee, and to consider the legal and ethical implications involved.

Background
Charlie works in customer services for a tech company. He received an email from the HR department asking him to join a meeting with a ChatGPT-powered virtual assistant. During the virtual meeting, the ChatGPT assistant informed him that, due to restructuring, his employment was to be terminated.

Charlie's experience raises three areas we should explore and whose impact we should consider:

  1. legal considerations
  2. ethical considerations
  3. recommendations going forward 

Legal considerations
Under the Fair Work Act 2009, employees are entitled to a fair process before termination, which includes:

  • the right to respond to any reasons for termination
  • the opportunity to have a support person present during discussions

Using AI to conduct Charlie’s termination meant he was not afforded the opportunity to respond or to have a support person present. The use of AI here would fail to meet the standards of procedural fairness. 

What could happen here? Charlie could potentially file an unfair dismissal claim if he believes his termination was harsh, unjust or unreasonable. In this case, the Fair Work Commission would consider factors such as:

  • the reasons for Charlie’s dismissal
  • whether Charlie was notified of the reason
  • whether Charlie was given the opportunity to respond
  • whether company policies and procedures were followed

Ethical considerations
Termination is a stressful and emotional experience, and Charlie no doubt felt distressed and upset during the process. What should have been considered?

  • Human dignity and respect – delivering such news through AI can be perceived as impersonal and disrespectful. Charlie may have expected the opportunity to ask questions and express concerns directly to a human. 
  • Duty of care – employers have a duty of care not only to ensure the process is conducted fairly, transparently and sensitively, but also to protect the employee's wellbeing.
  • Trust – I think we can say that in Charlie's case the trust between him and his employer has now broken down. Charlie may feel that, by using AI to terminate his employment, the company has avoided accountability and undervalued the contributions and service he gave to the company.
  • Accountability – shifting responsibility for Charlie's termination from a human manager to technology raises the question of who is accountable in the event of any dispute or claim that Charlie may raise.

What can we learn from the experience Charlie has been through and the mistakes his employer made by using a ChatGPT-powered virtual assistant to terminate his employment?

If handled correctly, the company could have used AI successfully, in collaboration with human resources personnel, to manage Charlie's termination by:

  • having HR carry out the initial preparation, including communicating the termination process and focusing on the reasons for termination and the support offered.
  • providing ChatGPT with details of the termination process and the reasons for termination so that it can generate a draft termination letter that is clear, professional and legally compliant.
  • using ChatGPT to help draft a script for the termination meeting, ensuring the HR Manager can convey the decision with empathy while providing all necessary information.

Using ChatGPT to assist with the less sensitive tasks in the process of terminating Charlie's employment could have demonstrated the benefits of integrating AI tools, ultimately leading to a smooth and respectful termination process.

If you have any questions about AI/ChatGPT, contact the WR Team on 07 3872 2264 or at workplacerelations@amaq.com.au