When technology becomes harmful. The contribution of designers at a crossroads between fashion, digital and ethics
M. Motta; R. Didero
2024-01-01
Abstract
Faced with the latest advancements in digital technologies, we feel torn between the excitement of exploring unprecedented innovations and the fear of being overtaken by the technologies themselves. AIs capable of learning, thinking and acting independently, without supervision, across a variety of urban spaces and domains (Crawford, 2021) can take the most diverse forms and effects, leading to unpredictable implications (Roco, 2016). With institutions not yet offering sufficient guidance and regulation in the field, the rise of digital technologies generates several ethical questions, together with fear and an urgent need for protection. Among these technologies, facial recognition and the collection of biometric data are particularly harmful, as they discriminate and infringe on several human rights (Amnesty International, 2020). With facial recognition cameras, a person’s facial signature can be captured and the data collected without any possibility of consent or dissent (Kohnstamm, 2012); and although data has become a primary economic resource (Zuboff, 2019), there is a tendency not to protect what constitutes our primary wealth and uniqueness, and most people are unaware that this technology is being deployed (Pew Research Center, 2019; Ada Lovelace Institute, 2019). If, as human beings, we face the concrete risk of being left in a condition of weakness, how do we position ourselves as designers? Design, as Bertola (in Bertola et al., 2021) wrote, is one of the critical drivers of innovation in navigating the ongoing transition, thanks to its capacity for “linking manufacturing and technological systems with cultural and societal evolution” and to its approaches grounded in users’ and societal values. Indeed, design has always been a human-centric discipline, and can therefore guide a twin transition in which the digital goes hand in hand with a sustainability that is not only environmental but also concerns human beings and their rights. Can design be a driver also when dealing with technologies as harmful as facial recognition? While legislators and scholars have widely investigated the concept of privacy, the contribution of design in this context has received much less attention (Wong & Mulligan, 2019). Privacy protection is a strongly technology-based field, in which the dominant engineering approaches assume that privacy is predefined and does not need to be challenged at the design level. Only recently has design, especially critical design and, in part, service and UX design, started to explore the topic, more as a form of socio-political activism and criticism (Zuboff, 2019) than as the design of producible solutions. Today, the principle of Privacy by Design (PbD), recently incorporated into the GDPR, introduces the human-centered design approach into the field of personal data protection and requires organizations to adopt the right tools and methods to protect personal data. Given these premises, the chapter investigates the possibility of designing in the anti-surveillance field, combining the critical dimension with the pragmatic-functional one. The goal is to understand the role that design and research play, and the approach they should take, towards a problem of our present that will shape our future. To address the ethical concept of individual privacy, the presented research adopts a multi-layered systemic approach and is framed at the crossroads of fashion and textile-knitwear design with engineering for AI, computer vision and machine learning.
Within this complexity, the human-centric approach is enriched by the contribution of other disciplines in an advanced co-design process that uses digital technologies to generate a fashion product protecting the identity of the wearer from harmful digital surveillance. The cultural assets of fashion, which shape individual and social identities through the material and immaterial values of its products (Bertola, 2021; Crane, 2012; Crane & Bovone, 2006), are combined with the high precision of machines and the high complexity of technology in the textile and knitwear field, which opens perspectives on innovative technical performance for the development of advanced products, including in fields other than fashion. Through a collaborative effort between engineering, fashion and knitwear design, the research developed an adversarial textile produced on computerized knitting machines, resulting in clothes that embed adversarial images able to deceive facial recognition systems, protecting people from artificial intelligence while keeping them aware and visible to the eyes of other human beings (Didero & Conti, 2022). By examining the methods, process and results of the research, the chapter reflects on how designers handled the contribution of engineering researchers and of experts in ethics and policy, not only in the development of a fashion product, but in the search for a multifaceted solution to such a complex global issue. The ultimate goal is to observe how the methodologies of design foreground the ethics of design practice, and how such research can reveal potentially hidden agendas and values, explore alternative design values (Bardzell & Bardzell, 2013), and offer directions for understanding how designers should position themselves at the edge between Computer Engineering, Design and Art and think of themselves as “an essential creative engine real-time informed about the impacts, actions and reactions of its surrounding cyberphysical ecosystem” (Bertola, in Bertola, Mortati & Vandi, 2021, p. 61).
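The chapter does not reproduce the technical implementation, but the general mechanism behind such adversarial garments, a pattern optimized so that a recognition model misreads the scene while humans still see an ordinary knitted motif, can be illustrated with a minimal sketch. The snippet below is purely hypothetical: it uses torchvision's ResNet-18 as a stand-in for a face recognition network, random tensors as stand-ins for camera frames, and arbitrary patch size, position and hyperparameters; none of this reproduces the authors' pipeline or the knitting workflow.

```python
# Minimal, hypothetical sketch of adversarial-patch optimization.
# ResNet-18 is only a placeholder for a face recognition model; real use
# would also apply the model's preprocessing transforms to the inputs.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18, ResNet18_Weights

device = "cuda" if torch.cuda.is_available() else "cpu"
model = resnet18(weights=ResNet18_Weights.DEFAULT).to(device).eval()
for p in model.parameters():
    p.requires_grad_(False)  # only the patch is optimized

# A learnable 64x64 patch, kept in [0, 1] by clamping after each step.
patch = torch.rand(1, 3, 64, 64, device=device, requires_grad=True)
optimizer = torch.optim.Adam([patch], lr=0.01)

def apply_patch(images, patch, top=80, left=80):
    """Paste the patch onto a fixed region of every image in the batch."""
    patched = images.clone()
    patched[:, :, top:top + 64, left:left + 64] = patch
    return patched

# Dummy batch standing in for camera frames, with the "identities"
# (here: ImageNet class indices) that the patch should suppress.
images = torch.rand(8, 3, 224, 224, device=device)
labels = torch.randint(0, 1000, (8,), device=device)

for step in range(200):
    optimizer.zero_grad()
    logits = model(apply_patch(images, patch))
    # Gradient ascent on the classification loss: push the model away
    # from recognizing the true label behind each patched image.
    loss = -F.cross_entropy(logits, labels)
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        patch.clamp_(0.0, 1.0)
```

Turning such an optimized pattern into an actual knitted fabric adds further constraints (stitch resolution, available yarn colors, deformation of the garment when worn), which is presumably where the collaboration between engineering and knitwear design described in the chapter becomes decisive.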
File | Description | Size | Format
---|---|---|---
MOTTA DIDERO_Chapter in Designing ethically in a complex world_Galluzzo Caratti.pdf | Publisher's version (open access) | 6.05 MB | Adobe PDF