The Ferdinand-Steinbeis-Institute investigates why technological developments often “get the cold shoulder”
Baden-Wuerttemberg is generally considered a technology state in Germany – extremely open to technology, especially new technology. But it still has its technology sceptics, and some people are openly hostile to technology. Against the current backdrop of digital transformation and sweeping technological change, there is therefore concern that acceptance of new technologies is waning in certain areas of the economy and society – and without sufficient acceptance, the competitive edge in innovation could be lost. As part of the #techourfuture initiative sponsored by the Baden-Wuerttemberg Ministry of Economic Affairs, Labor, and Housing, the Ferdinand-Steinbeis-Institute (FSTI) has started looking into the issue of technology acceptance in Baden-Wuerttemberg.
The experts at the FSTI set up panel sessions to discuss selected topics and provide an experimental forum that introduces businesses and members of the public to recent technological developments. Aside from sharing knowledge and discussing people’s practical experiences with new technology, one of the main aims of the sessions is to conduct a scientific assessment of technology acceptance.
NEW TECHNOLOGY MAY BE REJECTED DUE TO A FEAR OF LOSING CONTROL
The starting point for the project was the realization that reservations about new technologies – especially digital solutions and autonomous technology – frequently stem from a perceived loss of control. This perception is initially connected to the way control passes between two parties. Control may be ceded to other parties or, of course, to a technology to which control is transferred (even in part), but there are also quite ordinary situations or factors that imply control will be lost. A transition of control can also unfold under different scenarios: control can be lost voluntarily or involuntarily, or something that was previously uncontrolled may become controlled (by another entity) at a certain point in time.
One thing that also needs to be considered in these scenarios is whether the people affected are given the right to intervene. One also needs to remember that, by definition, technology and technical systems came into being as a result of human enterprise and creative energy; accordingly, they were also made available, set up, and sold in the context of specific people or groups of people. The specific circumstances must be examined to determine whether a technology results in control being handed over to people or groups of people, because this has substantial implications for the degree to which a loss of control is perceived as such. Autonomous agents such as Siri and Alexa, which are capable of taking control of situations as independently acting entities, are particularly important in this context.
USING A CYCLICAL MODEL OF CONTROL AS AN EXPERIMENTAL BASIS
The FSTI is using a cyclical model of control (and thus of the loss of control) as a basis for the research being carried out for the #techourfuture initiative. If a person understands or has an overview of a technology or technical system, this gives them fundamental knowledge (abstract or technical) that helps them control it; being able to (partially) control technologies and technical systems generates practical experience in dealing with them – and this, in turn, deepens understanding, and so on. The FSTI experts therefore view the issue as a control cycle, in which a person has to at least understand or control the underlying principles of a technology or technical system in order to perceive a state of control. If this cycle is interrupted at any point, the person can no longer (or only insufficiently) understand or control the technology or system, and this can lead to a feeling of lost control, expressed through reservations, concerns, or fears.
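The feedback loop described above can be pictured as a simple state model. The following is only an illustrative sketch, not part of the FSTI’s formal methodology; the names `Stage` and `ControlCycle` are invented for this illustration:

```python
from enum import Enum, auto

class Stage(Enum):
    UNDERSTANDING = auto()   # abstract/technical knowledge of the technology
    CONTROL = auto()         # (partial) control over the technical system
    EXPERIENCE = auto()      # practical experience gained through use
    PERCEIVED_LOSS = auto()  # cycle interrupted: reservations, concerns, fears

# Each stage feeds the next; experience loops back into understanding.
_NEXT = {
    Stage.UNDERSTANDING: Stage.CONTROL,
    Stage.CONTROL: Stage.EXPERIENCE,
    Stage.EXPERIENCE: Stage.UNDERSTANDING,
}

class ControlCycle:
    """Toy model of the cyclical control model described in the text."""

    def __init__(self):
        self.stage = Stage.UNDERSTANDING

    def step(self, interrupted: bool = False) -> Stage:
        # An interruption at any point breaks the cycle and produces
        # a perceived loss of control, which is then self-sustaining.
        if interrupted or self.stage is Stage.PERCEIVED_LOSS:
            self.stage = Stage.PERCEIVED_LOSS
        else:
            self.stage = _NEXT[self.stage]
        return self.stage

cycle = ControlCycle()
print(cycle.step())  # Stage.CONTROL
print(cycle.step())  # Stage.EXPERIENCE
print(cycle.step())  # Stage.UNDERSTANDING (the loop closes)
print(cycle.step(interrupted=True))  # Stage.PERCEIVED_LOSS
```

The point of the sketch is the closed loop: as long as no step is interrupted, understanding, control, and experience keep reinforcing one another; a single interruption moves the model into a terminal perceived-loss state.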
This FSTI concept also includes techno-sociological considerations. If autonomous technology starts to play a role in decision-making in an area previously reserved for humans, the instrumental relationship between humans and technology becomes an interactive one; the technology evolves into a partner and co-decision-maker in a cooperative process, which now takes place within a distributed, hybrid system (Weyer 2006). A key aspect of this and many other forms of loss of control induced by technology and technical systems is that technical functions are transferred from helpful, manageable, and accepted applications to risky or threatening ones – for example in military technology, surveillance technology, or election interference (Heinrich Böll Foundation 2019).
This raises a question: if a technology becomes more involved in an interaction or decision-making process, or takes control of it, does this affect the people involved to the extent that it limits their ability to intervene, i.e. their chance of taking action or shaping events? The FSTI experts also asked whether this is linked to a greater loss of control and whether it reduces people’s ability to intervene in or control a situation. It is often found that the more autonomous technologies and technical systems become, the less likely people are to participate, furthering the broader exclusion of human beings from decision-making in hybrid or entirely autonomous systems. Further developments in the automation, autonomization, and hybridization of technology, systems, and processes thus make it less likely that people will be allowed to intervene or take control, and tend to coerce them into adapting to “guidelines” laid down by technology (Weyer 2006). From a technological and social point of view, there is therefore scientific support for the subjective perception of a loss of control.
FIRST INSIGHTS: PREVIOUS EXPERIENCE IS DECISIVE
This concept model of a loss of control is the cornerstone of empirical studies taking place at the #techourfuture sessions. The people taking part in the events form the target group of a quantitative empirical survey based on standardized questionnaires. An initial look at the findings of the first #techourfuture forums, which took place in Sinsheim in 2019, already points to a number of interdependencies. Significant differences can be expected between sociodemographic groups when it comes to perceptions of a loss of control resulting from new technology: measurable differences exist between gender groups, age groups, education levels, and occupational groups. Across the entire group surveyed, concern about a loss of control is average to strong, so it is neither extreme nor marginal. One influence stands out, however: the degree of previous experience people have with a particular new technology. In particular, respondents are less likely to feel a loss of control if they have a basic understanding or general overview of a certain technology. The same applies to information on the safety and reliability of a new technology. Concerns about a loss of control rise rapidly, however, if there is uncertainty about how to use information acquired while trying out a new technology, particularly with respect to safety.
The target group felt very positive about the comprehensive and interactive approach to the different technology issues at the #techourfuture sessions. Participants were also extremely positive about the information supplied by the experts, whom they found credible, and about their pertinent and understandable examples. There was also positive feedback on the opportunity to enter into discussion with the experts and other participants at the sessions, and on the fact that there was sufficient time and space to do so.
Combined with output from the next #techourfuture sessions, these results will allow the FSTI experts to determine important points of reference for possible recommendations regarding technology acceptance. These will be used to develop guidance for future events relating to the goals of the Technologie*Begreifen (Grasp Technology) initiative. They will also make it possible to design instruments and procedural models for identifying new technologies and preparing them for use in business and society.
Heinrich-Böll-Stiftung (2019): Dem Kontrollverlust vorbeugen.
Weyer, Johannes (2006): Die Kooperation menschlicher Akteure und nicht-menschlicher Agenten. Ansatzpunkte einer Soziologie hybrider Systeme. Arbeitspapier Nr. 16 (August 2006) des Lehrstuhls für Wirtschafts- und Industriesoziologie. Universität Dortmund.