Segment Anything for Microscopy

2025 | Journal article. A publication affiliated with the University of Göttingen.


Cite this publication

Archit, A.; Freckmann, L.; Nair, S.; Khalid, N.; Hilt, P.; Rajashekar, V.; Freitag, M. et al. (2025). Segment Anything for Microscopy. Nature Methods. DOI: https://doi.org/10.1038/s41592-024-02580-4

Documents & Media

License

GRO License

Details

Authors
Archit, Anwai; Freckmann, Luca; Nair, Sushmita; Khalid, Nabeel; Hilt, Paul; Rajashekar, Vikas; Freitag, Marei; Teuber, Carolin; Buckley, Genevieve; von Haaren, Sebastian; Pape, Constantin
Abstract
Accurate segmentation of objects in microscopy images remains a bottleneck for many researchers despite the number of tools developed for this purpose. Here, we present Segment Anything for Microscopy (μSAM), a tool for segmentation and tracking in multidimensional microscopy data. It is based on Segment Anything, a vision foundation model for image segmentation. We extend it by fine-tuning generalist models for light and electron microscopy that clearly improve segmentation quality for a wide range of imaging conditions. We also implement interactive and automatic segmentation in a napari plugin that can speed up diverse segmentation tasks and provides a unified solution for annotation across different microscopy modalities. Our work demonstrates the application of vision foundation models in microscopy, laying the groundwork for solving image analysis tasks in this domain with a small set of powerful deep learning models.
Issue Date
2025
Journal
Nature Methods 
ISSN
1548-7091
eISSN
1548-7105
Language
English
