Scientists need to follow standardised methodology, but designing an effective protocol is not without its challenges.
It is yet another morning of driving to the field site, and we pull up the weather forecast for the last days of this sampling session. Will we finally have the 48-hour dry window we need to set invertebrate traps in the twenty-four cages of our project? Setting traps when rain is likely risks lower insect activity, and therefore unrepresentative counts; heavy rain could even flood the traps and render the data unusable. The previous samplings took place during good weather, when insects were more active. Over the two years of the project, samplings are scheduled at fixed points in the seasons so that changes can be compared over time. More importantly, we want our data to be comparable to our previous samplings and to other studies with the same aims.
To have relevant data for one's research questions, researchers need to think critically during protocol design. For example, differences as subtle as the colour and material of invertebrate traps will influence which invertebrates are drawn to them. Add to that the direction traps face relative to the sun, the proper time of day to set them, and the weather conditions, and every action one takes influences the end result. In a project such as RECODYN (ERC project #101043548), this is further complicated by sampling multiple components of the ecosystem. To understand how an ecosystem's recovery dynamics play out at the community scale, the project looks at both plant and invertebrate species at multiple trophic levels and collects data on environmental and biological factors such as temperature and soil respiration. Each element requires its own protocol, and standardising each of them is key.
Standardisation across studies
One challenge in conducting fieldwork is generating results that other researchers can replicate. During the design process, researchers should review state-of-the-art research in their field so that their work is comparable to similar studies, allowing trends in science to be seen at larger scales, across studies. The project employs well-established sampling methods such as quadrats, which delimit the area of plants to be sampled, and randomisation of sampling order and locations to eliminate selection bias. In the RECODYN project, we combed through the literature on sampling methods and reached out to researchers directly when methodological details were sparse.
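Randomising sampling order can be as simple as drawing a seeded permutation of the cage list. The sketch below is illustrative only; the cage labels and the seed are assumptions, not details of the project's actual procedure.

```python
import random

# Illustrative sketch: randomise the order in which cages are sampled
# so that time-of-day effects are not tied to cage identity.
# Cage labels and the fixed seed are assumptions, not project details.
cages = [f"cage_{i:02d}" for i in range(1, 25)]  # 24 cages, as at the site

rng = random.Random(2024)  # a fixed seed makes the drawn order reproducible
sampling_order = rng.sample(cages, k=len(cages))  # a random permutation
```

Seeding the generator means the same randomised order can be regenerated later, which helps when documenting exactly how a sampling session was run.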

This process led, for example, to the decision to follow the advice of Brown & Matthews (2016), who made a plea to those studying invertebrates to design traps consistently, with transparent materials of specific dimensions. In addition to these pitfall traps placed on the ground, each cage at the site also features a hanging Malaise trap, adapted to the dimensions of the cage. This well-known trap, named after the Swedish entomologist who designed it, catches invertebrates travelling by air. Choosing these two methods meant we would not focus on nocturnal insects, whose behaviour would require different traps altogether. Next, to see the relationship between herbivores and the plants growing on site, we needed to decide which feeding patterns we would search for and quantify.

Standardisation within a project
However, how is data collected consistently when up to ten team members are involved in data collection? In this sense, standardisation of data matters not just at the level of the broader scientific community but also within projects. The project chose to focus on three feeding guilds, not only because of their widespread occurrence in the literature on plant-herbivore interactions, but also because they would be the easiest for an interdisciplinary team to spot consistently. Beyond noting whether a plant was chewed, mined by larvae, or sucked by insects such as aphids, how we quantified the percentage of each plant eaten would also influence our accuracy as a team. Here we followed the recommendation of Johnson et al. (2016), who showed that estimating damage percentages visually in categories, or 'bins', is as accurate as, and more time-efficient than, employing software to do it for us. Similarly, Cornelissen et al. (2026) found visual estimation to be ten times faster, though prone to overestimation. To minimise the observer effect in our data, all team members underwent training and practice sessions. Additionally, once data collection began in earnest, the person estimating feeding damage was recorded so that this variable could be treated as a random factor during data analysis. These methods have also been shown to improve data quality for volunteers from the broader public who advance science through citizen/community science initiatives (Kosmala et al., 2016). With clear instructions and proper training, anyone can contribute to the generation of scientific knowledge.
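A binned estimate of this kind is straightforward to record consistently across observers. The following sketch uses illustrative bin edges and hypothetical field names (the project's actual categories and data sheet may differ); note that the observer is stored with each record so it can later enter analyses as a random factor.

```python
# Illustrative sketch: map a visual estimate of leaf damage (% eaten)
# to a coarse category. Bin edges and field names are assumptions,
# not the project's actual protocol.
BINS = [
    ("0%",       0,   0),
    ("1-5%",     1,   5),
    ("6-25%",    6,  25),
    ("26-50%",  26,  50),
    ("51-75%",  51,  75),
    ("76-100%", 76, 100),
]

def damage_bin(estimate_pct):
    """Return the damage category containing a visual estimate."""
    for label, lo, hi in BINS:
        if lo <= estimate_pct <= hi:
            return label
    raise ValueError(f"estimate out of range: {estimate_pct}")

# Each record keeps the observer's identity, so observer can be
# modelled as a random factor in later analyses.
record = {
    "plant": "plot3_plant07",   # hypothetical identifier
    "observer": "tech_A",       # hypothetical identifier
    "guild": "chewing",
    "damage": damage_bin(12),   # a ~12% visual estimate falls in 6-25%
}
```

Recording the category label rather than a raw percentage keeps estimates from different team members directly comparable.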
Yet how do we deal with mistakes and inconsistencies that surface despite all the protocols and training sessions? First, team members check each other's work in the moment. Still, long days in the field mean tired researchers, and errors can creep in. In this project, documenting everything in multiple ways has been key. Forgot to write down the feeding damage? The datasheet doesn't say whether there were aphids on the plant? We often go back to our photo records of the plant and invertebrate species sorted in the lab. Sometimes these photos reveal errors that are easily corrected; when doubts persist, we hold a meeting to review and make corrections.
Making a decision
Arriving at the field site, we need to decide how much to trust the weather forecast. Our best bet seems to be to set the traps this morning and collect them in 48 hours. Putting it off any longer risks a rainy weekend, and conditions in the foothills of the Pyrenees can change at a moment's notice. If the current forecast holds, we can even use our soil respiration machine tomorrow. From what we have read, it is best to avoid using it after heavy rainfall, as saturated soil yields respiration rates that are not representative of other samplings. A project as complicated as this one highlights the importance of improvising and adapting in the field while keeping protocols standardised.

It is thanks to fellow scientists that researchers are able to make decisions on how to sample multiple elements of an ecological community. Sharing best practices for how data should be collected helps us standardise our methods. The scientific community must work together to have robust and reliable data. We are eager to see and share the results of our efforts before, during, and after data collection. The transparency of sharing our methods will ensure our results are replicable and useful to the scientific community and society. It is through this knowledge generation that science points to trends in nature and helps inform decision-making on ecosystem restoration and conservation.
References
- Brown, G. R., & Matthews, I. M. (2016). A review of extensive variation in the design of pitfall traps and a proposal for a standard pitfall trap design for monitoring ground-active arthropod biodiversity. Ecology and Evolution, 6(12), 3953-3964.
- Cornelissen, T., Mendes, G. M., Silveira, F. A., Dáttilo, W., Guevara, R., Aguilar, R., … & Wetzel, W. C. (2026). Quantifying leaf herbivory: A guide to methodological trade‐offs and best practices. Ecology, 107(2), e70308.
- Johnson, M. T., Bertrand, J. A., & Turcotte, M. M. (2016). Precision and accuracy in quantifying herbivory. Ecological Entomology, 41(1), 112-121.
- Kosmala, M., Wiggins, A., Swanson, A., & Simmons, B. (2016). Assessing data quality in citizen science. Frontiers in Ecology and the Environment, 14(10), 551-560.
Disclaimers
Funded by the European Union (ERC, RECODYN, 101043548). Views and opinions expressed are, however, those of the author(s) only and do not necessarily reflect those of the European Union or the European Research Council Executive Agency. Neither the European Union nor the granting authority can be held responsible for them.
This work has benefited from state aid managed by the French national research agency under the Future Investments programme bearing the reference ANR-11-INBS-0001AnaEE-Services.
Claudia Christensen García
Adaia Cid-Alarcón
Tânia D Costa
Andrea Lirola-Jiménez
Project Technicians
Basque Centre for Climate Change
Please Note: This is a Commercial Profile
Please note, this article will also appear in the 26th edition of our quarterly publication.


