| Written by: Johan Rodin
When IBM was recently invited to give their yearly guest lecture at Stockholm School of Economics within the course “Humans vs. Algorithms: Judgment, Prediction and Nudges”, AI & ethics had a spot in the schedule, with extra focus on interactive assistants.
As you may have guessed, centenarians were absent both in front of and behind the lectern that day. However, there were plenty of representatives of two-hundred-year-old institutions that over the past century have followed, or at times led, the development of modern times, albeit in different respects.
The Stockholm School of Economics trains, and has long trained, business leaders in subjects such as innovation, digitalization, and entrepreneurship, while IBM has been a leader in technology development, from punch cards to nanochips, Watson, and quantum computers. If one thing has been required of both the school and IBM to remain relevant year after year, it is the will and ability to embrace the spirit and change of the times. So when one of the students raised the highly relevant question of employers' ethical and moral responsibility when implementing technology that replaces manual work, we knew that we as IBMers could speak from experience.
Yes – AI will take over jobs
Let us start by reformulating the question a little, to a question we often get when we present virtual assistants and IBM’s vision in the field: Will AI take our jobs?
The short answer is yes. It has already happened and is happening all the time.
Systems that look for production defects, plan logistics, or process applications have all taken over tasks from living people. There are roles, such as professional drivers, that we can already imagine being replaced by systems, and other roles that we expect will be harder to replace in the near term, such as psychologists.
Our experience from introducing AI support in the form of interactive assistants in customer service is that employees have shifted tasks: from answering the most common questions to having more time for more advanced ones. In other words, they have gone from answering “What opening hours do you have?” and “How do I reset my password?” to cases where, for example, emotions are involved or where several decisions need to be made by connecting different sources of information.
The chatbot can answer during inconvenient working hours
There are case studies (for example for Crédit Mutuel and Vodafone) showing that up to 70% of the questions that reach customer service today can be answered automatically by an interactive assistant. Such a solution is often perceived as a good alternative by the customer, the company, and the customer service employees themselves. The chatbot gives the same answer to the same question no matter what time of day it is asked, and several measurements of customer satisfaction after the introduction of such systems have shown improved KPIs. In addition, employees avoid the most repetitive questions and answers. The working hours freed up can instead be spent on the more demanding matters where AI still falls short today.
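The division of labor described above, where the assistant answers well-understood routine questions and hands everything else to a human, can be sketched as a simple confidence-threshold router. This is an illustrative toy example, not IBM's actual implementation; the intent names, answers, and the 0.7 threshold are all assumptions for the sketch.

```python
# Illustrative sketch: route a classified question either to an
# automatic FAQ answer or to a human agent, based on how confident
# the intent classifier is. Threshold and intents are assumed values.

FAQ_ANSWERS = {
    "opening_hours": "We are open weekdays 9-17.",
    "password_reset": "Use the 'Forgot password' link on the login page.",
}

HANDOVER_THRESHOLD = 0.7  # assumed cut-off; tuned per deployment


def route(intent: str, confidence: float) -> str:
    """Answer automatically only for known intents classified with
    high confidence; otherwise escalate to a human agent."""
    if confidence >= HANDOVER_THRESHOLD and intent in FAQ_ANSWERS:
        return FAQ_ANSWERS[intent]
    return "HANDOVER_TO_AGENT"


print(route("opening_hours", 0.95))  # routine question, answered by the bot
print(route("complaint", 0.40))      # low confidence, escalated to a human
```

In a real deployment the confidence score would come from the assistant's intent classifier, and the handover would open a case for an agent rather than return a sentinel string.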
The employer has a moral responsibility to handle the change
So, on the question of the moral responsibility employers have when implementing AI systems that replace manual labor, we want to emphasize the value of supporting people with the help of new technology. Of course, every employee's experience is different, but we have seen many examples of gratitude when AI systems removed tasks that were perceived as repetitive and non-stimulating. In the limited cases where AI systems exclude the human operator entirely, or significantly reduce the workforce, the question of moral responsibility remains. Because change and development are part of IBM's heritage and identity, we have been raised on Heraclitus' dictum that “the only constant is change”. We therefore see change itself as inevitable and believe that the moral aspect lies in how well it is handled.