
The Butterfly Foundation (De Vlinderstichting), which has protected butterflies and dragonflies for 40 years, notes that butterfly populations have been declining for years. The foundation combats butterfly and insect mortality, combining a people-oriented approach, ecological craftsmanship and a go-getter mentality.


The Butterfly Foundation wants good ecological management of roadsides and green spaces to increase biodiversity in the landscape, giving butterfly populations the opportunity to recover. To achieve this, roadsides must be mowed in phases. Until now, this has been done manually: each square meter of green space is checked by hand for flowers and plants that attract butterflies. The Butterfly Foundation's question is whether this mowing process can be automated in such a way that flora attractive to butterflies is spared.

The process

The MowHawk plays a key role in this process. With this smart camera system, mounted on the roadside mower, we can label flowers by color in the collected images. This gives us more insight into the diversity of the flora present and which species are attractive to butterflies. Several times a year, the MowHawk can monitor roadsides and green spaces for the presence of flowers.
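As a rough sketch of what labeling flowers by color could look like: the snippet below bins a pixel by its hue. The hue thresholds and the color-to-flower mapping are illustrative assumptions, not the actual MowHawk model.

```python
import colorsys

# Illustrative hue ranges in degrees; real thresholds would be tuned
# on MowHawk imagery, not hard-coded like this.
HUE_RANGES = {
    "yellow": (40, 70),    # e.g. dandelion-like flowers
    "purple": (260, 300),  # e.g. knapweed-like flowers
}

def label_pixel(r, g, b):
    """Assign a coarse flower-color class to an RGB pixel (0-255)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    hue_deg = h * 360
    # White flowers have low saturation and high brightness,
    # so they are handled separately from the hue ranges.
    if s < 0.15 and v > 0.8:
        return "white"
    for color, (lo, hi) in HUE_RANGES.items():
        if lo <= hue_deg <= hi:
            return color
    return "background"  # grass, soil, asphalt, etc.
```

In a full pipeline this per-pixel rule would typically be replaced by a trained segmentation or detection model, with the color bins serving only as a first baseline.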


Together with Wim van Breda, we designed a smart camera system (the 'MowHawk') for mounting on the arm of a mower. Datacadabra developed the software and the camera prototype; Wim van Breda provided the hardware (camera, chassis and mounting) and domain expertise. The system uses computer vision (AI) together with location and orientation sensors to collect data about the roadside.


We developed a model for color detection. In addition to the MowHawk software, Datacadabra also supplied an algorithm for processing the data from the various sensors.
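One way such sensor processing could be structured is sketched below: synchronized camera detections are combined with the location and orientation readings and aggregated into geotagged roadside segments. The field names and segment logic are assumptions for illustration, not the actual Datacadabra algorithm.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    # One synchronized sample from the mower-mounted system; these
    # field names are illustrative, not the real MowHawk data format.
    lat: float
    lon: float
    heading_deg: float   # from the orientation sensor
    flower_counts: dict  # e.g. {"yellow": 3, "purple": 1}

def aggregate_by_segment(frames, segment_size=10):
    """Group consecutive frames into fixed-size roadside segments,
    summing the per-color flower counts and keeping the first GPS
    fix as each segment's reference position."""
    segments = []
    for i in range(0, len(frames), segment_size):
        chunk = frames[i:i + segment_size]
        totals = {}
        for f in chunk:
            for color, n in f.flower_counts.items():
                totals[color] = totals.get(color, 0) + n
        segments.append({
            "position": (chunk[0].lat, chunk[0].lon),
            "counts": totals,
        })
    return segments
```

Segments aggregated this way could then be mapped to decide which stretches of roadside to spare during phased mowing.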


The Butterfly Foundation is enthusiastic about the initial results. Together with Datacadabra and Wim van Breda, the foundation will expand the collaboration further.

The result

We have developed a basic model that we will take into the field in 2024 to test in practice. With the results from these field tests, we will continue to train, improve and expand the model. Depending on this experience, we will explore whether the system's capabilities can be extended further.