What actually is digital transformation? And how is it affecting different areas of industry and technology?
Nothing is more perennial than change. Few terms crop up in as many forms as “digital transformation” these days. It seems to be a driving force of our fast-paced modern society. But in non-linear environments, trends lose their ability to dictate the direction of the overall journey. And disruption has a tendency to challenge business models and the nature of products and services. What’s the best way to safeguard sustainability in the long term when everything is so ill-defined? Gunther Herr, lecturer at the School of Management and Technology at Steinbeis University Berlin (SHB), is convinced that future-proofing is only possible by engendering an as yet undiscovered culture of innovation – by galvanizing systematic processes within interdisciplinary teams in order to deliberately expand the boundaries of achievement.
Companies face a big challenge. They have to keep reinventing themselves by coming up with more and more innovations, yet establish a decisive core competence in the face of the competition. But there are ways to shape future success and overcome individual challenges – by thinking in terms of contradictions. Picturing the future – and drawing on radical ideals – makes it possible to challenge existing business models, posing questions and making predictions that will stand the test of time. And by turning to “shrouded” patterns of development, inspiration can be found for ways of thinking that focus firmly on the future.
There are so many examples of digital transformation doing the rounds at the moment, including a variety of instances of companies apparently enjoying breathtaking success – almost in an instant. And no publication worth its salt can get by without using buzzwords like pain points, customer value, MVP, or non-linear. Apparently, every C-level manager should have several months’ experience in Silicon Valley to point to. One often gets the impression that industrial processes and organizational structures are some kind of anachronism. Modern times are about agility and scrums. Academic books never tire of appealing for a culture of failure. Hardly a speech is given without some call to fill the “innovation funnel” with more ideas. Yet nobody seems to be questioning the success rates people usually quote – less than ten percent. One thing we do know, however, is that the rate of change is accelerating in ways we have never witnessed before.
So what would happen if people actually started adhering to all the demands they are hearing? How many manufacturing companies can really afford to make two huge bad investments in quick succession? What sort of brand can cushion that kind of blow? And what kind of shareholder would try to tweak output or tune their investment risk to startup levels? Can anyone who is invested honestly afford to implement a radical change in direction in the short term? Aside from taking all the required flexibility in decision-making into account, established companies also have to think about path dependency: their own past, with all the other things happening around them. These could be conditions that make things easier, but they might also be a hindrance. But they’re out there. So the worst thing you can do is ignore them. As a result, it might just be useful to come at challenges from a different angle, just one more time, because there’s no doubting that those challenges are there, and our entire societal system faces them.
What lies beneath the culture in Silicon Valley? Why, of all places, is it actually where it is? Why isn’t it in Florida? What makes that specific area so special? And why was it that the Industrial Revolution took place in Europe? There’s probably no obvious answer, or at least not without delving into history. The foundation of rapid development in central Europe actually dates back to the period after the Thirty Years’ War (1618-1648). It was during this time that Arabic manuscripts archived in Spain – a legacy of the Moorish occupation (711-1492 AD) – were being translated. These translations fueled rapid advancements in scientific understanding throughout Europe. This contrasted with the claims to power and even wisdom originating from the Roman Catholic Church, subsequently imposed through the Roman Inquisition (1542-1798). Thanks to Isaac Newton (who told us to create models to predict the future), Galileo (who said change should be measurable to develop things), Descartes (who believed that explaining things involves unraveling complex issues), and Aristotle (with his four possibilities of logic), Europe established a basis for scientific pursuit. A culture of science developed during this time, closely based on the reproducibility of measurements, quantification, and the possibility of analyzing things. These concepts still shape our thinking today. We want clarity and irrefutable evidence, ideally with causal justification. The world of scientific experimentation, the essential European institutions of education, but also industrial processes all go back to principles shaped in the 17th century. Then came social values influenced by the French Revolution (1789-1799), which in turn resulted in comprehensive social security systems.
So what about the west coast of the United States? How come it seems to be breaking with these fundamental principles? OK, we know Columbus reached America in 1492. That would give us one reason why a different culture has developed in the centuries since this date. But that’s not enough in itself. After all, from the very beginning there has been close communication between the newly discovered continent and Europe. Did something big happen – something that could shed more light on the differences? Well, on April 18, 1906, at 5:12 in the morning, the whole of the San Francisco area was rocked by an earthquake. The quake and the fires it started resulted in damage that in monetary terms would be about the equivalent of $11 billion today. The whole area was like a blank piece of paper. If anybody was insured, they were generally only covered for fire; earthquake damage wasn’t part of the insurance policy. So new standards started to evolve, quite spontaneously, in a way that had never happened before. Most people affected by the quake suddenly found themselves with nothing, aside from the knowledge that the only way to make things better would be to do something themselves. Without any kind of safety net to speak of, doing anything was better than doing nothing. So the only consequence of failure would be that things were “no better” than before. And things could certainly not get worse. As early as 1915, the “kick up the backside” this gave to the whole area led to the Panama-Pacific International Exposition, signaling and celebrating the emergence of the entire cosmopolitan area of San Francisco from the ashes.
So what we have with the social revolution of this day and age stems from a contrast between two fundamentally different starting positions. On one side of the Big Pond is a European culture underpinned by a comprehensive social security system and based on fundamental scientific principles. The goal is predictability. On the other side is a society that once had nothing after falling victim to a natural disaster. For them, doing anything has to be better than ceding to circumstance. Failure has no consequences. Things can’t get worse. Both starting points have resulted in the release of a significant amount of energy in recent decades. In each social environment, the result has been considerable success. If, however, we hold the classic Aristotelian either/or discussion in Europe and ask whether the principles of Silicon Valley are more “right” than the experience we have gathered in Europe, we’re wasting our breath. That’s not the right question. But if we look at it from a different angle, we could learn some interesting things: Software and data-centric business models, as well as recent startups, typically have a manageable volume of long-term capital commitments. This makes them flexible (or agile). When you incur little financial risk, it’s easier to “try things out.” Decisions get made more quickly – and they can be reversed. In times of rapid and intense change, that holds certain appeal. But unfortunately, that’s not much use when it comes to the fundamentals of manufacturing companies. Production systems and operating infrastructure tie up capital. Once an investment has been made, it automatically means you’re committed to something. Another factor fueling path dependency is human resources, especially if they are subject to binding contracts. Changing tack in the short term requires a significant amount of energy, not just due to capital requirements but also because of the need to share ideas and train people.
The modern concept of the Internet of Things (IoT) may already offer an answer to this problem. Digital business models thrive on the symbiosis of both worlds. Sure, it’s important to make good use of the flexibility and speed offered by digital technology. There’s no point living in denial. Despite this, the IoT needs things. And these things need producing. So we have to exploit the potential synergies of both worlds. And to do this, we need “mutual acceptance.” This is about exploiting degrees of freedom, exploring realms that will allow us to overcome previous obstacles. It’s about defining the obstacles between the two worlds so accurately, so succinctly, that they become workable. This work requires experts and people from certain disciplines to be brought around the same table in ways that have not happened before.
The main focus of the Strategic Innovation course offered to Executive MBA students at the School of Management and Technology lies in different aspects of innovation culture and innovation science. These are pulled together into an ecosystem that provides systematic support to “Innovation within the Value Creation Chain” in order to create new business models. What is crucial in this process is that people on the course understand the importance of mindset and a culture of discussion, breaking with the notion, shaped by our education system, that there are “desired solutions for given problems.” In times of digital transformation, this is about making our thinking principles and decision-making models compatible with a business environment that is undergoing non-linear change. The ability to “think abstractly” plays a decisive role in this respect.