Applying machine learning tools for automated behaviour classification in invasive lionfish and comparison with human observations
Abstract
Modern neuroscience and ecology are increasingly adopting machine learning (ML) methods to automate the tracking and classification of animal behaviour. These techniques are particularly valuable for quantifying fitness‑related behaviours such as hunting in invasive predators. Here, we evaluate the effectiveness of two ML tools, DeepLabCut for pose estimation and SimBA for random‑forest behaviour classification, at distinguishing four behaviours (hovering, resting, swimming, and hunting) in invasive lionfish (Pterois volitans and P. miles), and benchmark the ML outputs against annotations from trained human observers. We also introduce a customized, user‑friendly feature‑extraction script tailored to lionfish. The script converts the positional landmark coordinates extracted by DeepLabCut into a comprehensive set of kinematic metrics (e.g., body‑angle variance, fin‑beat frequency); this step is essential for behaviour classification because SimBA relies on these metrics rather than on raw body‑part positions. A companion GitHub guide further clarifies which specific metrics are most informative under different behavioural scenarios. To our knowledge, this is the first study to examine how a wide‑angle camera lens influences the DeepLabCut–SimBA workflow. Behavioural trials, conducted in controlled aquarium settings, showed that the models classified high‑motion behaviours (hunting and swimming) with high precision and recall, likely owing to their distinctive kinematic signatures. In contrast, low‑motion behaviours such as hovering and resting were harder to detect because of subtle movement cues, occasional suboptimal body orientation, and distortions introduced by the wide‑angle lens. Although the workflow is applicable to other mid‑bodied fishes, lionfish were chosen for their significant ecological impact: quantifying behaviours such as hunting can aid invasive‑species management and reef conservation.
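The feature‑extraction step described above converts per‑frame landmark coordinates into kinematic summaries. Purely as an illustration of the idea, and not the authors' actual script, a minimal sketch of one such metric (body‑angle variance over a sliding window of frames) might look like the following; the landmark names, window length, and toy trajectory are all hypothetical:

```python
import numpy as np

def body_angle(head, mid, tail):
    """Angle (radians) at the midbody landmark between head and tail vectors.

    Each argument is an (N, 2) array of x/y coordinates over N frames,
    e.g. as exported by a pose-estimation tool such as DeepLabCut.
    """
    v1 = head - mid
    v2 = tail - mid
    cos = np.sum(v1 * v2, axis=-1) / (
        np.linalg.norm(v1, axis=-1) * np.linalg.norm(v2, axis=-1)
    )
    # Clip guards against floating-point values slightly outside [-1, 1]
    return np.arccos(np.clip(cos, -1.0, 1.0))

def rolling_variance(x, window):
    """Variance of a 1-D signal over a sliding window of frames."""
    return np.array([x[i:i + window].var() for i in range(len(x) - window + 1)])

# Toy trajectory: three landmarks over N frames (hypothetical coordinates)
N = 120
t = np.arange(N)
head = np.stack([np.cos(0.2 * t), np.sin(0.2 * t)], axis=1) + [2.0, 0.0]
mid = np.zeros((N, 2))
tail = np.stack([-2.0 * np.ones(N), 0.1 * np.sin(0.5 * t)], axis=1)

angles = body_angle(head, mid, tail)        # one angle per frame
angle_var = rolling_variance(angles, 15)    # one feature value per window
```

A classifier such as SimBA's random forest would consume many features of this kind (per window, per landmark set) rather than the raw coordinates themselves, which is why high‑motion behaviours with distinctive kinematic signatures are easier to separate.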