Researchers of DMMP work in several research areas with applications in different fields, such as health care, traffic, energy, ICT, games and auctions, logistics, and timetabling. The overview of previously completed theses gives an indication of the kind of topics that are possible for a final assignment. For internships and final assignments we collaborate with external partners outside the UT, including DAT.mobility, ORTEC, Thales, NS, and CQM, among many others. Foreign universities are also an option. The list below is therefore indicative and shows a few of the open problems to work on. If you are interested in assignments for an internship or master's thesis, please contact any member of the group.
The following list of potential MSc topics is always under construction and will be updated regularly. To get more information, you can always contact any of the staff.
Internship & MSc thesis
In a Hot Strip Mill, thick steel slabs are hot rolled into long strips with a thickness of 2 to 25 mm. After hot rolling, the strip needs to be cooled down on the runout table (ROT) from about 900°C to about 500°C or lower, after which the strip is coiled. New market developments in hot rolled products are mainly in advanced high-strength steels. These products require precise and highly flexible control of the cooling path on the runout table. Not only the final temperature (coiling temperature), but also the cooling rates and intermediate temperatures are important for achieving the right mechanical properties of the steel.
Before a steel strip enters the ROT, there is limited time available for the controller to determine the optimum setup. As there are many variables involved (the settings of each individual bank, material properties, velocity), some of which are discrete (e.g. the valve settings: 0%, 70% or 100% open), it is very complex to find the minimum of the objective function within the limited available time: we have about 6 seconds to find the optimum among the possible control settings. Various algorithms are available; however, many of them are not suitable for finding the global minimum (they might return a local minimum as the optimum) and/or are not fast enough to be useful. A suitable method must be able to solve a non-convex (having both local minima and a global minimum), non-linear, and discrete problem.
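To see why this is hard, consider a toy version of the problem: a handful of banks with discrete valve settings and a synthetic, non-convex objective. All numbers below are illustrative assumptions, not Tata Steel's actual cooling model. Exhaustive search finds the global optimum only for tiny instances, while plain local search can stop at a local minimum:

```python
import itertools

VALVES = (0, 70, 100)   # hypothetical discrete valve settings (% open)
N_BANKS = 6             # toy instance; the real mill has far more variables

def cost(settings):
    # synthetic, non-convex stand-in for the real cooling-path objective:
    # squared deviation from a target coiling temperature plus a smoothness
    # penalty between neighbouring banks (purely illustrative numbers)
    temp = 900 - 0.8 * sum(settings)          # crude linear cooling model
    smooth = sum((settings[i] - settings[i + 1]) ** 2
                 for i in range(N_BANKS - 1))
    return (temp - 500) ** 2 + 1e-3 * smooth

# exhaustive search is only feasible for tiny instances (3**6 = 729 profiles)
best = min(itertools.product(VALVES, repeat=N_BANKS), key=cost)

def local_search(settings):
    # greedy single-bank improvement; it can get trapped in a local minimum,
    # which is exactly why plain local search is not good enough here
    settings = list(settings)
    improved = True
    while improved:
        improved = False
        for i in range(N_BANKS):
            for v in VALVES:
                cand = settings[:i] + [v] + settings[i + 1:]
                if cost(cand) < cost(settings):
                    settings, improved = cand, True
    return tuple(settings)

local = local_search((0,) * N_BANKS)
print(cost(best), cost(local))
```

With realistically many banks, the number of profiles grows exponentially, which is why a method is needed that is both fast and able to escape local minima.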
This is a project with R&D at Tata Steel (The Netherlands). For further details and a full problem description, please contact Johann Hurink or Marc Uetz.
Congestion games are a fundamental model in optimization and game theory, with applications e.g. in traffic routing. The price of stability is a game-theoretic concept that relates the quality of the best Nash equilibrium to that of an optimal solution. It is the "smaller brother" of the well-known price of anarchy as defined by Koutsoupias and Papadimitriou in 2001, and was first defined by Anshelevich et al. in 2004. The basic question asked here is if and how the combinatorial structure of the strategy spaces of players influences the quality of the possible equilibria. In that respect, recent progress was made for uniform matroids and the price of anarchy, which equals approximately 1.35188. The conjecture is that the price of stability for that (and maybe even for more general models) equals 4/3. The proof of this conjecture is the topic of this project. Background literature is the paper "Efficiency of Equilibria of Uniform Matroid Congestion Games" by de Jong, Klimm and Uetz, as well as the more recent paper "The asymptotic price of anarchy for k-uniform congestion games" by de Jong, Kern, Steenhuisen and Uetz. Both papers are available upon request.
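For intuition, both ratios can be computed by brute force on small games. The sketch below uses a hypothetical two-player game with one linear-cost and one constant-cost resource (an illustrative toy, not an instance from the cited papers): it enumerates all pure profiles, identifies the Nash equilibria, and evaluates the price of anarchy and price of stability:

```python
from itertools import product

# toy congestion game: each player picks exactly one resource
cost_fns = {
    "a": lambda load: load,  # linear cost: pay the number of users
    "b": lambda load: 2,     # constant cost, independent of the load
}
players = 2
strategies = list(cost_fns)

def player_cost(profile, i):
    r = profile[i]
    return cost_fns[r](profile.count(r))

def is_nash(profile):
    # pure Nash equilibrium: no strictly improving unilateral deviation
    for i in range(players):
        for s in strategies:
            dev = profile[:i] + (s,) + profile[i + 1:]
            if player_cost(dev, i) < player_cost(profile, i):
                return False
    return True

profiles = list(product(strategies, repeat=players))
social = lambda p: sum(player_cost(p, i) for i in range(players))
opt = min(social(p) for p in profiles)
nash = [p for p in profiles if is_nash(p)]
poa = max(social(p) for p in nash) / opt  # worst equilibrium vs. optimum
pos = min(social(p) for p in nash) / opt  # best equilibrium vs. optimum
print(poa, pos)  # 4/3 and 1.0 for this toy game
```

In this toy game, both players on resource "a" is a (weak) equilibrium of social cost 4 while the optimum splits them at cost 3, so the price of anarchy is exactly the 4/3 that appears in the conjecture above, while the price of stability is 1.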
For further questions, contact Jasper de Jong or Marc Uetz.
In a recent paper (de Jong and Uetz, https://arxiv.org/abs/1709.10289), we have analyzed the quality of several types of equilibria for so-called set packing and throughput scheduling games. In that model, players sequentially select items to maximize the total value of the selected items, yet each player is restricted in the feasible subsets she can choose. The results are bounds on the quality of Nash and other game-theoretic equilibria.
One of the distinguishing features of that model is that no item can be chosen by more than one player. That is a natural assumption in sequential games, but appears somewhat artificial when considering single-shot games.
The question to be analyzed in this MSc project is what happens when that assumption is relaxed. First, what type of models adequately capture the situation that several players choose one and the same item? What are the consequences for the resulting equilibria? And what is the price of anarchy for pure and mixed Nash equilibria in such a model?
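As a starting point for experiments, one conceivable relaxation (purely an assumption made here for illustration, not a model from the paper) lets several players pick the same item and splits its value equally among them. Pure equilibria and the price of anarchy of such a single-shot game can then be found by enumeration:

```python
from itertools import product

# hypothetical single-shot model: if k players select the same item,
# each receives value v / k (value-splitting is an assumption, not the paper's model)
values = {"x": 2.0, "y": 1.0}
players = 2
items = list(values)

def utility(profile, i):
    item = profile[i]
    return values[item] / profile.count(item)

def is_nash(profile):
    # pure Nash equilibrium: no strictly improving unilateral deviation
    for i in range(players):
        for item in items:
            dev = profile[:i] + (item,) + profile[i + 1:]
            if utility(dev, i) > utility(profile, i):
                return False
    return True

profiles = list(product(items, repeat=players))
welfare = lambda p: sum(utility(p, i) for i in range(players))
opt = max(welfare(p) for p in profiles)
nash = [p for p in profiles if is_nash(p)]
poa = opt / min(welfare(p) for p in nash)
print(poa)  # 1.5 for this toy instance: both players sharing "x" is a weak equilibrium
```

Already in this two-item toy instance, both players crowding onto the more valuable item is a (weak) equilibrium, so the value-splitting assumption visibly changes the equilibrium structure compared to the exclusive-items model.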
For more information, please contact Marc Uetz.
In a series of recent publications, several researchers have analyzed sequential games and subgame perfect equilibria in order to circumvent the sometimes bad quality of Nash equilibria. Specifically, de Jong and Uetz (2015) have done so for congestion games with two or three players, showing that the sequential price of anarchy equals 1.5 and 1039/488, respectively. Subsequently, Correa, de Jong, de Keijzer and Uetz (2016) considered network routing games and showed that, surprisingly, the sequential price of anarchy for games with n players can even be unbounded (while the price of anarchy is only 2.5). All these results are for pure-strategy Nash and subgame perfect equilibria. One of the open questions is what happens if we consider mixed strategies, or settings in which the demand of a player is splittable. As a starting point, one can consider games with two or three players. The underlying research papers are available upon request.
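On small instances, subgame perfect outcomes can be computed directly by backward induction. The sketch below does this for a hypothetical two-player sequential congestion game (illustrative costs, not an instance from the cited papers): player 2 best-responds to player 1's observed choice, and player 1 anticipates that response:

```python
# toy sequential congestion game: resource "a" costs its load, "b" costs 1.5
def cost(profile, i):
    return profile.count("a") if profile[i] == "a" else 1.5

def subgame_perfect():
    # backward induction: for each first move, compute player 2's best
    # response (ties broken towards "a"), then let player 1 pick the first
    # move whose induced outcome is cheapest for her
    best = None
    for s1 in "ab":
        s2 = min("ab", key=lambda s: cost((s1, s), 1))
        profile = (s1, s2)
        if best is None or cost(profile, 0) < cost(best, 0):
            best = profile
    return best

spe = subgame_perfect()
print(spe)  # ("a", "b"): the first mover grabs the cheap resource
```

Even this tiny example shows the first-mover advantage that drives the sequential analysis: committing first to the congestible resource forces the follower onto the constant-cost one.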
For further information, contact Jasper de Jong or Marc Uetz.
For many optimization problems, finding optimal solutions is prohibitive because the problems are NP-hard. This often holds even in the natural case where the instances of the optimization problem consist of points in the Euclidean plane. In order to still be able to solve these problems, heuristics have been developed that find close-to-optimal solutions in reasonable time. While many such heuristics show remarkable performance in practice, their theoretical performance is poor and fails to explain practical observations.
Smoothed analysis is a relatively new paradigm in the analysis of algorithms that aims at explaining the performance of such heuristics for which there is a gap between theoretical analysis and practical observation.
Recently, Bläser et al. (Smoothed Analysis of Partitioning Algorithms for Euclidean Functionals, Algorithmica, to appear) have developed a framework to analyze so-called partitioning heuristics for optimization problems embedded in the Euclidean plane. The goal of this thesis is to generalize this framework to higher dimensions and to apply it to analyze further heuristics for Euclidean optimization problems.
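As a concrete illustration of the smoothed input model, the sketch below perturbs an arbitrary Euclidean TSP instance with Gaussian noise and runs the classical 2-opt local search; smoothed analysis bounds the expected number of improving steps of exactly this kind of heuristic. The instance size and noise level are arbitrary choices:

```python
import math
import random

def tour_length(pts, tour):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(pts):
    # 2-opt local search: replace edges (a,b),(c,d) by (a,c),(b,d) while
    # this shortens the tour; `steps` counts the improving moves, which is
    # the quantity that smoothed analysis bounds in expectation
    n = len(pts)
    tour = list(range(n))
    steps = 0
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # would reverse the whole tour: same edges
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                old = math.dist(pts[a], pts[b]) + math.dist(pts[c], pts[d])
                new = math.dist(pts[a], pts[c]) + math.dist(pts[b], pts[d])
                if new < old - 1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    steps += 1
                    improved = True
    return tour, steps

# smoothed input model: adversarial base points plus Gaussian noise
random.seed(0)
base = [(random.random(), random.random()) for _ in range(30)]
sigma = 0.05  # perturbation magnitude; bounds typically depend on 1/sigma
perturbed = [(x + random.gauss(0, sigma), y + random.gauss(0, sigma))
             for x, y in base]
tour, steps = two_opt(perturbed)
```

Worst-case instances force 2-opt through exponentially many improving steps, but after perturbation the expected number of steps is polynomial; frameworks such as the one by Bläser et al. aim to prove statements of this kind systematically.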
For more information, please contact Bodo Manthey.
In Game Theory we learn that the existence of Nash equilibria (Nash's Theorem) follows from Brouwer's Fixed Point Theorem. How about the converse? The answer depends on which version of Nash's Theorem we take, but details must still be clarified. In addition: a literature study and overview to see what is known about the complexity of the two problems (computing fixed points vs. computing Nash equilibria).
Please contact Walter Kern.
Let N be a set of n=|N| players. A simple game is defined by a partition of the power set of N into two non-empty sets L and W, W the set of winning coalitions and L the set of losing coalitions, such that L is closed under taking subsets and hence W is closed under taking supersets. Most well-known are weighted voting games, where each player i in N has an associated nonnegative weight p(i), and a coalition W is winning exactly when its total weight p(W), the sum of the p(i) over i in W, is at least 1. To check whether a simple game is a weighted voting game, one is led to consider the min max value

min_p max { p(L) : L a losing coalition } subject to p(W) >= 1 for all winning coalitions W, and p >= 0.
For weighted voting games, this min max value is less than 1. In general, the value may be seen as measuring the distance of a simple game to the class of weighted voting games. For general simple games, a conjecture states that the optimum is at most n/4.
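The min max value is itself a linear program and can be computed directly for small games. The sketch below uses a toy four-player simple game chosen for illustration, whose winning coalitions are those containing {0,1} or {2,3}; its min max value is 1, certifying that this particular game is not a weighted voting game:

```python
from itertools import chain, combinations

from scipy.optimize import linprog

n = 4
players = range(n)

def winning(S):
    # toy simple game (an illustrative choice): a coalition wins iff it
    # contains {0, 1} or {2, 3}; closed under supersets by construction
    S = set(S)
    return {0, 1} <= S or {2, 3} <= S

subsets = list(chain.from_iterable(combinations(players, k)
                                   for k in range(n + 1)))

# LP variables: p_0, ..., p_{n-1}, t ; objective: minimize t
c = [0.0] * n + [1.0]
A_ub, b_ub = [], []
for S in subsets:
    row = [1.0 if i in S else 0.0 for i in players]
    if winning(S):
        A_ub.append([-v for v in row] + [0.0])  # p(W) >= 1
        b_ub.append(-1.0)
    else:
        A_ub.append(row + [-1.0])               # p(L) <= t
        b_ub.append(0.0)

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * n + [(0, None)])
print(res.fun)  # 1.0: not less than 1, so not a weighted voting game
```

The optimum is attained at p = (1/2, 1/2, 1/2, 1/2): every mixed pair such as {0, 2} is losing with weight exactly 1, so no nonnegative weights can separate winning from losing coalitions at threshold 1.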
Assignment: Study particular classes of simple games or try to prove (approximately) the conjecture. Please contact Walter Kern.
Resource allocation problems are a special type of highly structured mathematical programs. In these problems, the goal is to allocate a given amount of resource (e.g., money, products) over a set of activities (e.g., time) taking into account bounds on the allocated resource for each activity while minimizing a given convex objective function. This problem occurs widely in operations research and engineering (e.g., portfolio optimization, production planning, scheduling, energy management), especially as a subproblem in more complex problems (see Patriksson, 2008 https://www.sciencedirect.com/science/article/pii/S0377221706012215). In particular, we are interested in these problems for their application in device modelling in smart grids.
Much research on resource allocation problems focuses on the case where the objective function is separable, i.e., it can be written as the sum of single-variate functions. For this class of problems, many very efficient algorithms are known to solve the problem to optimality. In contrast, little research has been done on the general, nonseparable case, despite its wide range of applications. Recently, we derived a condition on the problem instance that enables us to apply solution methods for the separable version to solve certain nonseparable problems. This project builds on this work and its main goal is to analyze the nonseparable resource allocation problem and derive efficient algorithms to solve (subclasses of) this problem.
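To illustrate the separable case that the efficient algorithms target, the sketch below solves a small quadratic resource allocation problem by bisection on the Lagrange multiplier; the quadratic costs, bounds, and the energy-charging interpretation are illustrative assumptions, not the method of the project itself:

```python
def allocate(a, R, lo, hi, tol=1e-9):
    # Separable quadratic resource allocation (illustrative sketch):
    #   minimize  sum_i a_i * x_i**2
    #   s.t.      sum_i x_i = R,  lo_i <= x_i <= hi_i,
    # with a_i > 0 and sum(lo) <= R <= sum(hi).
    # KKT conditions give x_i(lam) = clip(lam / (2 a_i), lo_i, hi_i);
    # since sum_i x_i(lam) is nondecreasing in lam, bisect on lam.
    def x(lam):
        return [min(max(lam / (2 * ai), l), h)
                for ai, l, h in zip(a, lo, hi)]
    lam_lo, lam_hi = 0.0, 2 * max(a) * max(hi)
    while lam_hi - lam_lo > tol:
        lam = (lam_lo + lam_hi) / 2
        if sum(x(lam)) < R:
            lam_lo = lam
        else:
            lam_hi = lam
    return x((lam_lo + lam_hi) / 2)

# hypothetical example: spread 10 units of energy over three time intervals,
# each limited to at most 6 units, with interval-dependent quadratic costs
alloc = allocate(a=[1.0, 2.0, 4.0], R=10.0, lo=[0, 0, 0], hi=[6, 6, 6])
print(alloc)  # roughly [5.714, 2.857, 1.429]
```

The nonseparable problems studied in this project no longer decompose into such per-activity conditions, which is exactly why reductions to the separable case are valuable.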
For more information please contact Martijn Schoot Uiterkamp.