Spring is in the air, which means it’s time for our April Issue! This issue features two Applications and two Practical Tools articles, as well as methods for quantifying macronutrient content in invertebrates, mapping forest leaf area density, classifying avian vocal activity and much more! Plus, read on to discover more about our intriguing cover image.

Featured Articles

MEDI *Open Access – Practical Tools* Macronutrients underpin many ecological processes, but their quantification is often inaccurate and laborious. In this study, Cuff et al. present Macronutrient Extraction and Determination from Invertebrates (MEDI), a protocol for the direct, rapid and relatively low‐cost determination of macronutrient content from single small macroinvertebrates. Using MEDI, the total macronutrient content of over 50 macroinvertebrates can be determined within around 3 days of collection at a cost of ~$1.35 per sample.

Tree species mapping *Open Access* Information about the spatial distribution of species lies at the heart of many important questions in ecology, but logistical limitations and collection biases restrict the availability of such data. Remotely sensed information can alleviate some of these concerns, and recent advances in machine learning offer a promising and cost‐efficient approach for gathering large amounts of species distribution data from aerial photographs. Here, Tang et al. propose a novel machine learning framework, artificial perceptual learning (APL), to tackle the problem of weakly supervised pixel‐level mapping of tree species in forests.

Forest leaf area density Terrestrial lidar data are useful for estimating the three‐dimensional (3D) distribution of leaf area in forests. However, little is currently known about the potential and limits of this approach in dense forests. Here, Béland & Kobayashi work to fill these knowledge gaps, establishing initial guidelines for terrestrial lidar survey protocols for mapping leaf area density in forests. The resulting leaf area density voxel arrays are among the most accurate plot‐level 3D characterizations of foliage arrangement produced to date.
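
For readers unfamiliar with the term, a leaf area density voxel array is simply a 3D grid over the plot in which each cell holds an estimate of foliage area per unit volume. The toy Python snippet below only illustrates the binning step, using made-up points and an assumed 1 m voxel size; turning binned returns into actual leaf area density (correcting for pulse trajectories and occlusion) is the hard part that the Béland & Kobayashi guidelines address, and nothing here reproduces their method.

```python
import numpy as np

# Toy illustration of a "voxel array" (not the authors' method): bin lidar
# returns classified as foliage into 1 m voxels and store per-voxel counts,
# a crude stand-in for the leaf area density values the paper estimates.
rng = np.random.default_rng(0)
points = rng.uniform(low=[0, 0, 0], high=[20, 20, 15], size=(5000, 3))  # x, y, z in metres

voxel_size = 1.0
edges = [np.arange(0, hi + voxel_size, voxel_size) for hi in (20, 20, 15)]
counts, _ = np.histogramdd(points, bins=edges)  # 3D array of shape (20, 20, 15)

print(counts.shape, counts.sum())
```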

Avian vocal activity *Open Access* Acoustic indices combined with clustering and classification approaches have been increasingly used to automate detection of vocalising taxa or acoustic events of interest. However, large‐scale studies often require collaboration between research groups and integration of data from multiple sources to fulfil their objectives, which can lead to variation in recording equipment and data collection protocols. Here, Yip et al. investigate how analytical approaches, and the variation in data collection and processing typical of regional acoustic monitoring programmes, influence accuracy when identifying vocal activity in breeding birds.

Practical Tools

Investigating UV perception *Open Access* The ability to see UV light may be important for foraging, communication or navigation in many taxa. However, our knowledge of UV perception is constrained by the challenge of creating and calibrating stimuli that reflect or emit UV. To overcome this limitation, Powell et al. designed and constructed an RGB‐V‐UV LED display, a device suited to behavioural tests of colour vision across a broad spectrum (350–650 nm) visible to many animals. It can be used to investigate various questions concerning animal perception, including colour discrimination and categorisation.

Applications

maxnodf *free access* Nestedness measures tend to be correlated with fundamental properties of networks, such as size and connectance, so nestedness values must be normalised to enable fair comparisons between different ecological communities. Current approaches, such as null‐corrected nestedness values and z‐scores, suffer from extensive statistical issues. A new approach, NODFc, was therefore recently proposed, in which nestedness is expressed relative to network size, connectance and the maximum nestedness that could be achieved in a particular network. Here, Hoeppke & Simmons develop three highly optimised algorithms for calculating NODFc, based on greedy, hill-climbing and simulated annealing approaches and spread along a speed‐quality continuum. Users can therefore choose between a fast algorithm with a less accurate estimate, a slower algorithm with a more accurate estimate and an intermediate option.
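
To make the speed‐quality continuum concrete, here is a minimal, illustrative Python sketch of the fast end of it: a naive greedy search that fills a binary bipartite matrix one link at a time wherever the new link raises NODF the most. This is not the maxnodf implementation (which is heavily optimised and also offers hill-climbing and simulated annealing); the matrix dimensions, link count and naive NODF function are purely illustrative.

```python
import numpy as np
from itertools import combinations

def nodf(mat):
    """Naive NODF (0-100) for a binary bipartite matrix; illustration only."""
    def axis_score(m):
        fills = m.sum(axis=1)
        score = 0.0
        for i, j in combinations(range(m.shape[0]), 2):
            hi, lo = (i, j) if fills[i] >= fills[j] else (j, i)
            if fills[hi] > fills[lo] > 0:  # only pairs with decreasing fill contribute
                overlap = np.logical_and(m[hi], m[lo]).sum()
                score += 100.0 * overlap / fills[lo]
        return score
    n_r, n_c = mat.shape
    n_pairs = n_r * (n_r - 1) / 2 + n_c * (n_c - 1) / 2
    return (axis_score(mat) + axis_score(mat.T)) / n_pairs

def greedy_max_nodf(n_rows, n_cols, n_links):
    """Greedy sketch: seed a full first row and column (no empty rows/columns),
    then place each remaining link at the empty cell that raises NODF the most."""
    assert n_links >= n_rows + n_cols - 1, "too few links to avoid empty rows/columns"
    mat = np.zeros((n_rows, n_cols), dtype=int)
    mat[0, :] = 1
    mat[:, 0] = 1
    while mat.sum() < n_links:
        best_cell, best_score = None, -1.0
        for r, c in np.argwhere(mat == 0):
            mat[r, c] = 1
            score = nodf(mat)
            mat[r, c] = 0
            if score > best_score:
                best_cell, best_score = (r, c), score
        mat[best_cell] = 1
    return mat, nodf(mat)

max_matrix, max_nodf_value = greedy_max_nodf(6, 8, 20)
print(max_nodf_value)  # the kind of maximum an NODFc-style normalisation divides by
```

A hill-climbing or simulated annealing variant would instead start from a valid matrix and repeatedly move single links, accepting moves that raise (or, with some probability, lower) NODF, trading extra runtime for a better estimate of the maximum.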

metabaR *free access* Artefacts present in metabarcoding datasets often preclude a proper interpretation of ecological patterns. Here, Zinger et al. present metabaR, an R package that provides a comprehensive suite of tools to effectively curate DNA metabarcoding data after basic bioinformatic analyses. In particular, metabaR uses experimental negative or positive controls to identify different types of artefactual sequences, that is, contaminants and tag‐jumps. It also flags potentially dysfunctional PCRs based on the similarity of PCR replicates, when these are available. Finally, metabaR provides tools to visualise the characteristics and distribution of DNA metabarcoding data in their experimental context, and facilitates assessment of whether data curation thresholds are appropriate.
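
As a rough illustration of the kind of control-based curation described above (not metabaR’s actual API, which is in R), the Python sketch below flags a MOTU as a likely contaminant when its highest relative abundance across PCRs occurs in a negative control; the table layout, column names and toy numbers are assumptions made for the example.

```python
import pandas as pd

def flag_contaminants(counts: pd.DataFrame, negative_controls: list) -> pd.Series:
    """Flag MOTUs whose highest relative abundance occurs in a negative control.

    counts: MOTU x PCR table of read counts (illustrative layout).
    negative_controls: column names of the negative-control PCRs.
    Returns a boolean Series indexed by MOTU (True = likely contaminant).
    """
    # Convert read counts to within-PCR relative abundances
    rel = counts.div(counts.sum(axis=0), axis=1).fillna(0.0)
    # For each MOTU, find the PCR in which it is proportionally most abundant
    top_pcr = rel.idxmax(axis=1)
    return top_pcr.isin(negative_controls)

# Illustrative use with a toy table: two samples and one extraction blank
toy = pd.DataFrame(
    {"sample_1": [900, 40, 5], "sample_2": [800, 10, 2], "blank_1": [3, 0, 60]},
    index=["motu_A", "motu_B", "motu_C"],
)
print(flag_contaminants(toy, ["blank_1"]))  # only motu_C is flagged
```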

The Nose on the Cover

This month’s cover image shows a great crested newt (Triturus cristatus) detected by the specially trained nose of a wildlife detection dog. The nose belongs to Border Collie “Zammy”, one of four wildlife detection dogs owned by the authors of the review article by Grimm‐Seyfarth et al. on detection dogs in nature conservation. Zammy is trained to detect newts in their terrestrial habitat, helping the scientists to collect entirely new data on microhabitat conditions, with the overall goal of better protecting endangered amphibian species. To get an idea of the worldwide deployment of wildlife detection dogs, their target species, the breeds used and their performance compared with other methods, Grimm‐Seyfarth et al. collected and analysed 1220 publications from 1930 onwards. This first comprehensive review of the historical and current use of wildlife detection dogs is accompanied by a database and analyses differences in the dogs’ performance. Photo credit: ©Annegret Grimm‐Seyfarth. Read more about the study in her blog post.