
Logistics Innovation and Social Sustainability: How to Prevent an Artificial Divide in Human–Computer Interaction

Matthias Klumpp
2019-01-01

Abstract

Human–computer interaction (HCI) is a cornerstone of successful technical innovation in the logistics and supply chain sector. As a major element of social sustainability, this interaction is changing as artificial intelligence applications (Internet of Things, autonomous transport, Physical Internet) are implemented, leading to greater machine autonomy and hence a transition of human operators from a primarily executive to a supervisory role. A fundamental question concerns the level of control transferred to machines such as autonomous vehicles and automated materials handling devices. Problems include a lack of human trust in automated decision making and an inclination to override the system when automated decisions are misperceived. This paper outlines a theoretical framework describing different levels of acceptance and trust as a key HCI element of technology innovation, and points to the possible danger of an artificial divide at both the individual and the firm level. Based on the findings of four benchmark cases, a classification of the roles of human employees in adopting innovations is developed. Measures at the operational, tactical, and strategic levels are discussed to improve HCI, in particular the capacity of individuals and firms to apply state-of-the-art techniques and to prevent an artificial divide, thereby increasing social sustainability.
artificial intelligence
social sustainability
logistics performance
human-computer interaction
File in this product:
J of Business Logistics - 2019 - Klumpp - Logistics Innovation and Social Sustainability How to Prevent an Artificial.pdf (Publisher's version, Adobe PDF, 1.94 MB, restricted access)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1234891
Citations
  • PMC: not available
  • Scopus: 54
  • Web of Science: 42