Finding Blind Spots: Data quality & the European Information Systems for security, border and migration management

Project Supervisors:

Funding:

European Research Council

Description:

How does migration enact Europe? This question can be answered legally and politically, as most policy makers, sociologists and journalists do. Or it can be answered technically: how do data infrastructures for migrant processing co-produce citizens, Europe and territory?

Intensifying migration waves are changing not only EU policies, but also the way knowledge about individuals, institutions and space is created. Information systems are key enablers of this knowledge: they materialize the legislative, political and administrative dynamics in which citizenship, state and territory are co-produced. This is the point of departure of “Processing Citizenship. Digital registration of migrants as co-production of citizens, territory and Europe”, a five-year project carried out by a team of sociologists, ethnographers, software developers and policy analysts.

Thanks to the financial support of an ERC Starting Grant (2017-22), we are investigating the informational processing of third-country nationals as inter-governmental practices that challenge our established notions of “citizenship”, “state”, “Europe” and “territory” as these become embedded in digital infrastructures that cross member states. This is a pressing technical and operational issue. Technically, migrant data circulation requires infrastructural standardization and integration among agencies at the European, national and local levels. Operationally, gaps and misalignments in data collection, classification and circulation can lead to major drawbacks not only in the European migration machine, but also in European multi-level governance.

This doctoral research by W. Van Rossem is the part of the Processing Citizenship project that looks at interoperability of information systems: a widespread goal of governments to make better use of the data they hold by making it usable across systems and organisations. Semantic interoperability is a high priority, as it aims to make data understandable by other systems. We see interoperability as part of wider data quality strategies, such as standardisation or linking records between systems. This research aims to understand the methods used in interoperability projects by looking back at how data quality has been constituted within techno-social assemblages, what methods are used to harmonise data quality rules in interoperability-aimed projects, and what effects this has on how the rules are applied.
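To give a concrete, hedged illustration of what a data quality rule can look like in practice, the Python sketch below encodes two invented rules (a format constraint on a document number and a completeness check on a date of birth) and applies them to a toy record. All field names, formats and code lists are hypothetical and are not drawn from any of the systems studied.

    import re

    # Hypothetical quality rules for a toy registration record.
    # Field names, formats and the code list are invented for illustration.
    RULES = {
        "document_number": lambda v: v is not None
                           and re.fullmatch(r"[A-Z]{2}\d{7}", v) is not None,
        "date_of_birth":   lambda v: bool(v),                     # completeness check
        "nationality":     lambda v: v in {"SYR", "AFG", "ERI"},  # toy code list
    }

    def check_record(record):
        """Return the fields of a record that violate a quality rule."""
        return [field for field, rule in RULES.items()
                if not rule(record.get(field))]

    record = {"document_number": "AB1234567", "date_of_birth": "", "nationality": "SYR"}
    print(check_record(record))  # -> ['date_of_birth']

Even such a toy example shows that quality rules embed choices, for instance about which identifier formats or nationality codes count as valid; these are exactly the kinds of inscriptions this research examines.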

We will look at the European Information Systems for security, border and migration management and the organisation that manages these systems. These information systems store and process similar data, but each was built for a different original purpose. All of them perform identity management, but one, for example, is concerned with identifying asylum seekers, while another deals with travellers’ visa information. Interoperability between these systems is an explicit goal, which makes them an interesting case for analysing the different purposes and practices of the various social groups surrounding data quality.
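As a sketch of the semantic work such interoperability involves, the example below maps records from two invented systems, each with its own field names, name structure and date format, onto a shared schema. It is a toy construction of our own, not the actual design of any European information system.

    # Toy records from two invented systems with different conventions.
    asylum_record = {"applicant_name": "A. Example", "birth_date": "1990-05-01",
                     "citizenship": "Syrian"}
    visa_record = {"surname": "Example", "given_name": "A.", "dob": "01/05/1990",
                   "nationality_code": "SYR"}

    NATIONALITY_CODES = {"Syrian": "SYR"}  # invented correspondence table

    # Hand-written mappings onto a shared schema; a real interoperability
    # project must negotiate such correspondences across many more fields.
    def from_asylum(r):
        return {"name": r["applicant_name"],
                "date_of_birth": r["birth_date"],  # already ISO 8601
                "nationality": NATIONALITY_CODES[r["citizenship"]]}

    def from_visa(r):
        day, month, year = r["dob"].split("/")
        return {"name": r["given_name"] + " " + r["surname"],
                "date_of_birth": year + "-" + month + "-" + day,  # normalise date
                "nationality": r["nationality_code"]}

    print(from_asylum(asylum_record) == from_visa(visa_record))  # -> True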

We will combine several methods and introduce a novel perspective on how data standards and quality are inscribed within these techno-social assemblages. By analysing technical, design and legislative documents, we expect to trace the practical politics of data quality inscriptions and uncover otherwise hidden choices. Ethnographic fieldwork at the organisation that manages these information systems aims to understand the different cultures and historical changes surrounding data quality.