[M] Explainable AI for Drone Imagery

Master Assignment

Type: Master EE/CS/HMI

Period: TBD

Student: (Unassigned)

If you are interested, please contact:

Background:

Information about what is on the Earth’s surface is vital for sustainable development. Mapping the growth of cities helps us understand where and when they expand into hazardous areas. Yet such maps are either unavailable or outdated for significant parts of Low- and Middle-Income Countries. Artificial Intelligence is increasingly combined with satellite and drone imagery to fill this gap. But if AI-generated maps are to be used to design flood-prevention structures, or to distribute aid after a flood event, it is vital that we can trust the algorithm’s output.

Explainable AI (XAI) plays a key role in assessing the reliability of deep-learning classifications. It has been gaining considerable attention in the Computer Vision community, but is only starting to gain momentum in the Geosciences. The purpose of this Master’s Assignment is to investigate how XAI methods for fully convolutional networks can be adapted to the characteristics of drone imagery.
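To illustrate the kind of XAI method in scope, the sketch below computes an occlusion-sensitivity map: image patches are masked one at a time and the drop in the model's score is recorded, so regions the model relies on light up in the resulting map. The `toy_score` function is a purely hypothetical stand-in for a trained network; a real project would plug in a fully convolutional model's class score instead.

```python
import numpy as np

def occlusion_map(image, score_fn, patch=8, stride=8, fill=0.0):
    """Occlusion sensitivity: mask patches and record the score drop.

    image: 2-D array (H, W); score_fn: callable mapping an image to a float.
    Returns an (H, W) map where larger values mean the region mattered more.
    """
    base = score_fn(image)
    heat = np.zeros_like(image, dtype=float)
    h, w = image.shape
    for y in range(0, h, stride):
        for x in range(0, w, stride):
            occluded = image.copy()
            occluded[y:y + patch, x:x + patch] = fill  # mask one patch
            # Score drop caused by hiding this patch
            heat[y:y + patch, x:x + patch] = base - score_fn(occluded)
    return heat

# Hypothetical stand-in for a trained classifier: the score is the mean
# brightness of the top-left quadrant, so only that region should matter.
def toy_score(img):
    return float(img[:16, :16].mean())

rng = np.random.default_rng(0)
img = rng.random((32, 32))
heat = occlusion_map(img, toy_score)
# heat is positive inside the top-left quadrant and zero elsewhere,
# matching what the toy model actually uses.
```

Because the stride equals the patch size here, the patches tile the image without overlap; a finer stride with overlapping patches would give a smoother map at higher cost.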

Objectives:

To develop explainable-AI methods for the classification of remote sensing imagery from drones.

Related work: