Expected exponential loss for gaze-based video and volume ground truth annotation

Lejeune, Laurent Georges Pascal; Christoudias, Mario; Sznitman, Raphael (16 July 2017). Expected exponential loss for gaze-based video and volume ground truth annotation. In: International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI), LABELS - Workshop. DOI: 10.1007/978-3-319-67534-3_12


Many recent machine learning approaches used in medical imaging rely heavily on large amounts of image and ground-truth data. In the context of object segmentation, pixel-wise annotations are extremely expensive to collect, especially in videos and 3D volumes. To reduce this annotation burden, we propose a novel framework that allows annotators to simply observe the object to segment and record where they looked with a $200 eye gaze tracker. Our method then estimates pixel-wise probabilities for the presence of the object throughout the sequence, from which we train a classifier in a semi-supervised setting using a novel Expected Exponential loss function. We show that our framework provides superior performance over existing strategies across a wide range of medical imaging settings, and that our method can be combined with current crowd-sourcing paradigms as well.
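As a rough illustration of the kind of objective the abstract refers to, the sketch below shows a binary exponential loss whose expectation is taken over per-pixel label probabilities (the kind of probabilities the gaze-based framework estimates). The function name, arguments, and exact formulation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def expected_exponential_loss(scores, pos_probs):
    """Illustrative expected exponential loss for soft (probabilistic) labels.

    scores:    classifier scores f(x_i), one per pixel/sample.
    pos_probs: estimated probabilities p_i that the pixel belongs to the object
               (i.e. that its true label is +1).

    The expectation of the usual exponential loss exp(-y * f(x)) over the
    label distribution induced by p_i is
        E[exp(-y_i f(x_i))] = p_i * exp(-f(x_i)) + (1 - p_i) * exp(f(x_i)).
    """
    scores = np.asarray(scores, dtype=float)
    pos_probs = np.asarray(pos_probs, dtype=float)
    per_sample = pos_probs * np.exp(-scores) + (1.0 - pos_probs) * np.exp(scores)
    return per_sample.mean()

# Example: confident scores that agree with the soft labels give a small loss,
# while uncertain labels (p close to 0.5) contribute a loss near 1 regardless.
print(expected_exponential_loss(scores=[2.0, -1.5, 0.1],
                                pos_probs=[0.95, 0.10, 0.50]))
```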

Item Type:

Conference or Workshop Item (Paper)

Division/Institute:

10 Strategic Research Centers > ARTORG Center for Biomedical Engineering Research > ARTORG Center - AI in Medical Imaging Laboratory

Graduate School:

Graduate School for Cellular and Biomedical Sciences (GCB)

UniBE Contributor:

Lejeune, Laurent Georges Pascal, Sznitman, Raphael

Subjects:

600 Technology > 610 Medicine & health
600 Technology > 620 Engineering

Series:

Lecture Notes in Computer Science

Language:

English

Submitter:

Raphael Sznitman

Date Deposited:

01 May 2018 09:58

Last Modified:

05 Dec 2022 15:09

Publisher DOI:

10.1007/978-3-319-67534-3_12

ArXiv ID:

1707.04905

BORIS DOI:

10.7892/boris.108438

URI:

https://boris.unibe.ch/id/eprint/108438
