The Dutch have no problem texting while they cycle and are happy to install solar panels, but they are troubled by nuclear energy and afraid of a terrorist attack. These are just a few examples of situations in which people act and react on the basis of intuition instead of reason. If you look at the statistics behind these situations, you will see that a terrorist attack is far less likely than causing an accident on your bike while texting. By the same token, figures suggest that a greater reliance on nuclear energy has more positive effects on the environment than a rise in the number of solar panels. Even so, we humans tend to trust our intuition. Welcome to the world of Professor Mariëlle Stoelinga, who on 23 November will give her inaugural lecture as Professor of Risk Management for High-Tech Systems at the University of Twente.
The professor knows where the trouble begins. “Not many people are trained in the principle of calculating risks in advance. And that means they often see techniques for calculating probability and risk models as time-consuming and complicated. But as systems become increasingly complex and diverse technologies are integrated, it is more essential than ever that we continue to oversee the entire process. Without risks there can be no progress and innovation is simply not possible. But the risks we take should be calculated risks, ones that we have examined systematically and explained clearly to the outside world.”
It is a phenomenon Professor Stoelinga sees in her field time and time again: the human factor is the most unpredictable. “We take shortcuts, innovate too quickly and tend to take too rosy a view of certain types of risk. For evidence, look no further than the recent tragic accident involving an electric cart (Stint) in the Dutch town of Oss, or the problems with the self-driving cars tested by Uber and Google. I am convinced that these incidents could have been prevented by sound risk management at an earlier stage. Especially in this time of rapid developments, many of which are technology-driven, this is more important than ever.”
The same principle of trusting your intuition rather than facts backed up by figures can also stand in the way of progress, as in the discussion about nuclear energy. Critics point directly to the Chernobyl and Fukushima disasters and all the attendant dangers. Yet the figures and studies show that this type of energy has more positive effects on the environment and that the safety of nuclear power plants has been optimized in recent decades thanks to technological developments. Of course, this conclusion does not mean that we should dismiss the need for further research into this area. Cybersecurity risks, in particular, are well worth investigating more closely. The problem here is that safety (absence of accidents, which are unintentional by nature) and security (absence of wilful attacks) often interfere with each other. The same measures you take to increase safety can reduce security. The Internet of Things (IoT) is a good example: it offers wonderful opportunities for improving the safety of nuclear power plants. But at the same time, IoT security is notoriously poor, offering hackers all kinds of opportunities to break into the system. Finding an approach that balances safety and security is therefore an important theme.
People and technology are always connected. Professor Stoelinga sees this as one of the most fascinating things about her field of study. “People are often the weakest link, but many factors lie at the root of this. Systematic and continuous risk management is essential when it comes to limiting problems with new technology. The core principle is to systematically map all possible failures. Then we have to formulate suitable measures. Two things are essential. Firstly, you need to incorporate the right expertise into the risk analyses: in the case of the Stint electric vehicle, you need electrical engineers who know how the engine works, mechanical engineers who understand the construction, and psychologists who know whether people are capable of handling this new product safely. This is the only way to balance out the relative importance of human, technological and environmental factors and properly assess the interaction between them. In addition, you have to accept that you can never eliminate every single risk. All you can do is estimate the probability that a problem will arise and what the potential impact of the problem might be, and then use this as a basis for developing a solid Plan B. Once you have these things well under control, a great deal of trouble can be prevented. The University of Twente has a strong track record in this kind of multidisciplinary research: there’s a good reason why our motto is ‘High-Tech, Human Touch’!”
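The core principle described above, estimating the probability of each possible failure and its potential impact, then using the result to prioritize measures, can be sketched in a few lines of code. This is a minimal, hypothetical illustration of a probability-times-impact ranking; the failure modes and numbers below are invented for the example and are not taken from any real risk analysis.

```python
# A minimal sketch of probability-times-impact risk ranking.
# All hazard names and figures are hypothetical illustrations.

def risk_score(probability: float, impact: float) -> float:
    """Expected loss: likelihood of a failure times the severity of its consequences."""
    return probability * impact

# Hypothetical failure modes for a new high-tech product,
# each with an estimated (probability, impact) pair.
hazards = {
    "motor controller fault": (0.02, 9.0),   # rare but severe
    "operator distraction":   (0.30, 4.0),   # common, moderate harm
    "sensor miscalibration":  (0.10, 2.0),   # fairly common, mild harm
}

# Rank hazards by expected loss, so mitigation (the "Plan B")
# targets the worst contributors first.
ranked = sorted(hazards, key=lambda h: risk_score(*hazards[h]), reverse=True)
print(ranked)
```

Note how the ranking differs from intuition: the rare, dramatic failure ends up below the mundane human-factor one, which is exactly the point Professor Stoelinga makes about people misjudging which risks matter most.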
She may work with risks on a daily basis, but Professor Stoelinga says she isn’t losing any sleep over them. “Our world is safer than ever, and we are living longer than ever before. What does sometimes keep me awake at night is the lack of research funding. The Netherlands is at the bottom of the pile when it comes to investing in research and development. Much of the research I carry out – and enjoy enormously – is practical in nature, but sustainable, fundamental research is also essential for us as a university. That’s what enables us to continue to innovate and to address the major challenges in our field: the risks generated by an increasing dependence on IT systems, digital transformation, cybersecurity threats and climate change.”
Prof. Dr. Mariëlle Stoelinga, appointed by the Executive Board of the University of Twente as Professor of Risk Management for High-Tech Systems, will give her inaugural lecture to mark the occasion of her appointment in the Prof.ir. M.P. Breedveld Room, Waaier Building, at 16.00 hrs on Friday 23 November 2018 (please be present at 15.30). She delivered her inaugural lecture at Radboud University the day before.