Neural Process Reconstruction from Sparse User Scribbles

Title: Neural Process Reconstruction from Sparse User Scribbles
Author: Roberts, Mike; Jeong, Won-Ki; Vázquez-Reina, Amelio; Unger, Markus; Bischof, Horst; Lichtman, Jeff; Pfister, Hanspeter

Note: Order does not necessarily reflect citation order of authors.

Citation: Roberts, Mike, Won-Ki Jeong, Amelio Vázquez-Reina, Markus Unger, Horst Bischof, Jeff Lichtman, and Hanspeter Pfister. 2011. “Neural Process Reconstruction from Sparse User Scribbles.” Lecture Notes in Computer Science: 621–628.
Abstract: We present a novel semi-automatic method for segmenting neural processes in large, highly anisotropic EM (electron microscopy) image stacks. Our method takes advantage of sparse scribble annotations provided by the user to guide a 3D variational segmentation model, thereby allowing our method to globally optimally enforce 3D geometric constraints on the segmentation. Moreover, we leverage a novel algorithm for propagating segmentation constraints through the image stack via optimal volumetric pathways, thereby allowing our method to compute highly accurate 3D segmentations from very sparse user input. We evaluate our method by reconstructing 16 neural processes in a 1024×1024×50 nanometer-scale EM image stack of a mouse hippocampus. We demonstrate that, on average, our method is 68% more accurate than previous state-of-the-art semi-automatic methods.
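To make the core idea concrete, here is a toy sketch of scribble-seeded segmentation: labels placed on a few pixels are grown outward across a 2D intensity image by breadth-first propagation, stopping where intensity changes sharply. This is NOT the variational model or the optimal-volumetric-pathway algorithm of Roberts et al.; the function name, tolerance parameter, and data layout are illustrative assumptions only.

```python
from collections import deque

def propagate_scribbles(image, scribbles, tol=0.2):
    """Grow segment labels outward from sparse scribble seeds.

    image:     2D list of floats (pixel intensities in [0, 1])
    scribbles: dict mapping (row, col) -> integer label
    tol:       max intensity difference allowed between neighbors

    Toy breadth-first propagation illustrating how sparse user
    scribbles can seed a dense segmentation; not the paper's method.
    """
    h, w = len(image), len(image[0])
    labels = dict(scribbles)          # start from the user's scribbles
    queue = deque(labels)
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in labels:
                # Only cross into a neighbor of similar intensity
                if abs(image[nr][nc] - image[r][c]) <= tol:
                    labels[(nr, nc)] = labels[(r, c)]
                    queue.append((nr, nc))
    return labels

# Two scribbles on a 4x4 image: bright left half, dark right half
img = [[0.9, 0.9, 0.1, 0.1] for _ in range(4)]
seg = propagate_scribbles(img, {(0, 0): 1, (0, 3): 2})
```

Propagation halts at the intensity edge between columns 1 and 2, so the left half receives label 1 and the right half label 2 from just two seed pixels. The actual paper replaces this greedy growth with a globally optimal 3D variational formulation that propagates constraints through the anisotropic stack.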
Published Version: doi:10.1007/978-3-642-23623-5_78
Terms of Use: This article is made available under the terms and conditions applicable to Open Access Policy Articles, as set forth at
