Deep learning auto-segmentation of tomographic datasets: repeatability and biomechanical outcomes
Abstract
Automated segmentation of three-dimensional micro-computed tomography (μCT) scan data is a critical bottleneck in computational morphometrics and biomechanical modelling across musculoskeletal biology. Although advances in imaging have generated increasingly large and complex datasets, manual segmentation remains prohibitively time-consuming, while existing deep learning solutions are often application-specific and rarely validated for their impact on downstream analyses. Here we present a generalisable computational framework for automated segmentation and biomechanical validation of skeletal structures, implemented using an attention-augmented 3D U-Net architecture. Using the adult zebrafish (Danio rerio) mandible as a representative case study, the network was trained on 47 manually segmented specimens and evaluated using a combined Dice-Hausdorff metric that integrates volumetric and surface accuracy, capturing biologically relevant morphology better than the Dice score alone. To assess performance in a downstream biomechanical context, we directly compared automated segmentations with those produced by three expert human annotators and constructed finite element models from each. Quantitative comparisons of segmentation accuracy and statistical analyses of finite element outputs demonstrate that automated segmentations perform within the range of expert human annotations, with no systematic bias in mechanical predictions. By explicitly linking geometric accuracy to biomechanical outcomes, this work establishes an end-to-end pipeline for validating automated segmentation in computational biology. The framework is applicable to a wide range of musculoskeletal and palaeobiological contexts, including comparative anatomy, ageing and disease studies, and fossil or incomplete specimens, where scalable and mechanically faithful segmentation is essential.
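A combined Dice-Hausdorff metric of the kind described above can be illustrated with a minimal Python sketch. The abstract does not specify the authors' weighting or normalisation, so the blending weight alpha, the saturation distance d_max, and the linear surface term below are illustrative assumptions rather than the published definition; only the Dice coefficient and the symmetric Hausdorff distance (computed here with scipy.spatial.distance.directed_hausdorff) follow their standard forms.

import numpy as np
from scipy.spatial.distance import directed_hausdorff


def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Volumetric overlap between two binary masks (1.0 = perfect)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom else 1.0


def hausdorff_distance(pred: np.ndarray, truth: np.ndarray) -> float:
    """Symmetric Hausdorff distance between two masks, in voxel units."""
    pred_pts = np.argwhere(pred.astype(bool))    # (N, 3) voxel coordinates
    truth_pts = np.argwhere(truth.astype(bool))
    if pred_pts.size == 0 or truth_pts.size == 0:
        return np.inf  # one mask is empty; no surface to compare
    d_fwd, _, _ = directed_hausdorff(pred_pts, truth_pts)
    d_bwd, _, _ = directed_hausdorff(truth_pts, pred_pts)
    return max(d_fwd, d_bwd)


def combined_score(pred: np.ndarray, truth: np.ndarray,
                   alpha: float = 0.5, d_max: float = 10.0) -> float:
    """Blend Dice and a normalised Hausdorff term into one score in [0, 1].

    alpha (the Dice weight) and d_max (the distance, in voxels, at which
    the surface term saturates to zero) are placeholder choices here.
    """
    dice = dice_coefficient(pred, truth)
    hd = hausdorff_distance(pred, truth)
    surface_term = max(0.0, 1.0 - hd / d_max)  # 1.0 when surfaces coincide
    return alpha * dice + (1.0 - alpha) * surface_term

The rationale for blending the two terms is that Dice alone rewards bulk volumetric overlap and can score thin processes or surface detail poorly; a Hausdorff-based term penalises localised boundary errors that Dice barely registers. In practice a percentile Hausdorff distance (e.g. the 95th) is often preferred over the maximum, as it is less sensitive to isolated mislabelled voxels.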