Posted on 2021-07-21, 13:01. Authored by Helen Spiers, Harry Songhurst, Luke Nightingale, Joost De Folter, Zooniverse Volunteer Community, Roger Hutchings, Christopher J Peddie, Anne Weston, Amy Strange, Steve Hindmarsh, Chris Lintott, Lucy M Collinson, Martin L Jones
Advances in volume electron microscopy mean it is now possible to generate thousands of serial images at nanometre resolution overnight, yet the gold-standard approach for data analysis remains manual segmentation by an expert microscopist, creating a critical research bottleneck. Although some machine learning approaches exist in this domain, we remain far from realising the aspiration of a highly accurate yet generic automated analysis approach, with a major obstacle being the lack of sufficient high-quality ground-truth data. To address this, we developed a novel citizen science project, Etch a Cell, to enable volunteers to manually segment the nuclear envelope of HeLa cells imaged with Serial Blockface SEM. We present our approach for aggregating multiple volunteer annotations to generate a high-quality consensus segmentation, and demonstrate that data produced exclusively by volunteers can be used to train a highly accurate machine learning algorithm for automatic segmentation of the nuclear envelope, which we share here, in addition to our archived benchmark data.
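The aggregation method described in the associated work is more involved than simple voting, but as a rough illustration of the core idea of combining several volunteer annotations of the same image into one consensus mask, a minimal pixel-wise majority-vote sketch is shown below. This is an assumption-laden example, not the project's actual pipeline: the function name `majority_vote_consensus`, the agreement threshold, and the toy annotations are all hypothetical.

```python
import numpy as np

def majority_vote_consensus(masks, threshold=0.5):
    """Combine multiple volunteer binary masks into one consensus mask.

    masks: sequence of arrays of shape (H, W) with values in {0, 1},
           one per volunteer annotation of the same image tile.
    threshold: minimum fraction of volunteers that must mark a pixel
               for it to be kept in the consensus (hypothetical choice).
    """
    stack = np.asarray(masks, dtype=float)      # shape (n_volunteers, H, W)
    vote_fraction = stack.mean(axis=0)          # per-pixel agreement in [0, 1]
    return (vote_fraction >= threshold).astype(np.uint8)

# Toy example: three volunteers annotate the same 4x4 tile.
annotations = [
    np.array([[0, 1, 1, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 0]]),
    np.array([[0, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 0],
              [0, 0, 0, 0]]),
    np.array([[0, 0, 1, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 0]]),
]
print(majority_vote_consensus(annotations))
```

A consensus mask produced this way (or by the project's published aggregation method) can then serve as ground truth for training an automatic segmentation model of the nuclear envelope.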
Funding
Crick (Grant ID: 10233, Grant title: STP Scientific Computing)
Crick (Grant ID: 10004, Grant title: STP Electron Microscopy)