UT FieldLab Experiment 5 – Vegetation monitoring

From plant to pixel – what can camera pixels teach us about plants?

This UT FieldLab experiment falls under the theme Food security and biodiversity.

Introduction

Can you tell how healthy a plant is from a photo? In this experiment, we explore how camera images help us understand vegetation. Using a 12-metre-high camera arch, we capture hourly images of grass, crops, and natural vegetation. This allows us to learn how light, season, and weather affect the image – and how to distinguish real growth changes from shadow effects. Wietske Bijker: “What am I actually seeing? Are the differences in a series of photographs of plants the result of growth, drought, or simply variations in camera settings, lighting, and weather conditions?”

What are we researching?

Our goal is to calibrate cameras and develop reliable methods for vegetation monitoring. How do time of day, sun angle, and weather influence the image? And how can we detect signs of drought or pests? This knowledge is crucial for agriculture, nature management, and biodiversity research.

How does it work?

The camera arch is equipped with a variety of sensors, including a Tetracam µMCA 6 multispectral camera and a FLIR SC5000 thermal camera (mid-infrared). In addition, there is space to integrate other sensors into the calibration process, such as the Parrot Sequoia multispectral camera, several smaller thermal cameras, and the Citizen Science multispectral camera. These devices capture images across multiple wavelengths, allowing us to record not only colour but also temperature and reflection patterns across space and time.

By linking imagery from the different cameras with weather data, such as solar radiation, precipitation, and evaporation, we can better interpret growth processes. The data help us distinguish genuine changes in vegetation from variations caused by light or shadow. In time, we aim to connect these insights to drone and satellite observations, enhancing large-scale monitoring. We deliberately use a mix of lightweight, simple cameras alongside more advanced models, so we can learn where low-cost solutions suffice and where more sophisticated approaches are necessary.
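One standard way to turn multispectral pixels into a measure of plant vigour is a vegetation index such as NDVI, which contrasts near-infrared reflectance (high for healthy leaves) with red reflectance (absorbed by chlorophyll). The sketch below is purely illustrative and is not the FieldLab's actual processing pipeline; it assumes the red and near-infrared bands are already available as NumPy reflectance arrays.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalised Difference Vegetation Index: (NIR - red) / (NIR + red).

    Values near 1 indicate dense, healthy vegetation; values near 0
    indicate bare soil or stressed plants. `eps` avoids division by zero.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 scene: top row is healthy grass (strong NIR reflection),
# bottom row is bare or drought-stressed ground.
nir = np.array([[0.50, 0.48],
                [0.10, 0.12]])
red = np.array([[0.08, 0.10],
                [0.09, 0.11]])
print(np.round(ndvi(nir, red), 2))
```

Because NDVI is a ratio of bands captured at the same instant, it partly cancels out overall brightness changes from sun angle or cloud cover, which is one reason index time series are easier to compare across hours and seasons than raw images.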

Why is this important?

Reliable image analysis enables more accurate predictions of drought, pests, and crop yields. This helps farmers work more efficiently and nature managers protect ecosystems more effectively. It’s a step towards a future where technology and ecology go hand in hand.

Contact

dr.ir. W. Bijker (Wietske)
Assistant Professor