· Devrim Yazan
Title: SHAREBOX - Secure Management Platform for Shared Process Resources
Abstract: To pave the way for Industrial Symbiosis (IS) as a route to more efficient processing and energy systems in the process industry, we are developing SHAREBOX, a secure ICT platform for the flexible management of shared process resources. The platform provides plant operations and production managers with the robust, reliable, real-time information they need to share resources (plant, energy, water, residues, and recycled materials) effectively and confidently with other companies in a symbiotic eco-system. A suite of new analysis and optimization tools for flexible energy use and material-flow integration is being developed to optimize symbiosis among companies. These tools are based on input-output (IO) modelling for resource (waste and energy) supply-demand matching and process-efficiency analysis (to understand physical and technological conditions), a game-theoretical (GT) approach for integrating company behavior in cost-, benefit-, and resource-sharing (to understand economic conditions), and agent-based modelling (ABM) for designing economically, environmentally, and socially optimal symbiotic networks (to reach a holistic optimum). The outputs of the SHAREBOX controller provide plant and operations managers with commands for actions to be taken and/or recommendations for decision support.
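The IO-style supply-demand matching mentioned above can be illustrated with a minimal sketch. The company names, resource labels, and the greedy first-fit matching rule below are illustrative assumptions for exposition, not details of the SHAREBOX tools themselves:

```python
# Illustrative sketch of resource supply-demand matching for symbiosis
# screening. Plant names and quantities are hypothetical examples.

def match_supply_demand(supply, demand):
    """Greedily match waste/energy supply to demand, per resource type.

    supply, demand: dicts mapping (company, resource) -> quantity.
    Returns a list of (supplier, consumer, resource, quantity) links.
    """
    links = []
    for (consumer, resource), needed in demand.items():
        remaining = needed
        for (supplier, res), available in supply.items():
            if res != resource or available <= 0 or remaining <= 0:
                continue
            qty = min(available, remaining)          # transferable amount
            links.append((supplier, consumer, resource, qty))
            supply[(supplier, res)] = available - qty  # deplete the offer
            remaining -= qty
    return links

# Hypothetical example: two plants offer excess steam, one plant needs it.
supply = {("PlantA", "steam"): 50.0, ("PlantB", "steam"): 30.0}
demand = {("PlantC", "steam"): 60.0}
links = match_supply_demand(supply, demand)
```

A real IO model would additionally account for prices, transport costs, and process constraints; this sketch only shows the quantity-balancing core of the matching step.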
· Berend Alberts
Title: Design of a datafication framework
Abstract: Both decision makers and engineers like the idea of data as 'just data' - a neutral instrument, or in this case an objective representation of the world. For decision makers, this means they don't really have to understand the technology and can continue doing what they have always done. For engineers, it means they can simply focus on 'making the thing work' and don't have to take responsibility for the consequences; ethics becomes a matter of how you use the technology. But cases like the Cambridge Analytica scandal show us that data is not a neutral instrument. We see unintended consequences, data being re-used in new places, and AI that 'learns' to be racist. So how should we deal with this non-neutrality of data and its embedded values? How did the data get biased in the first place?
The term 'datafication' helps us understand that data aren't simply 'out there', just lying around to be 'gathered'. Instead, data have to be generated, designed, and imagined before there are any data for our analyses and algorithms to work on. Because all data are in some way human-made, our prejudices and biases are already part of the data before the first bit is registered and written.