ISINET: AN INSTANCE-BASED APPROACH FOR SURGICAL INSTRUMENT SEGMENTATION

Abstract

Results


Table 1. Comparison against the state-of-the-art on the EndoVis 2017 dataset.

Method          | DT | Challenge IoU | Mean class IoU
TernausNet [28] |    | 35.27         | 10.17
MF-TAPNet [15]  |    | 37.35         | 10.77
ISINet (Ours)   | ✗  | 53.55         | 26.92
ISINet (Ours)   | ✓  | 55.62         | 28.96
ISINet (Ours)   | ✗  | 66.27         | 36.48
ISINet (Ours)   | ✓  | 67.74         | 38.08

Table 2. Comparison against the state-of-the-art on the EndoVis 2018 dataset.

Method          | DT | Challenge IoU | Mean class IoU
TernausNet [28] |    | 46.22         | 14.19
MF-TAPNet [15]  |    | 67.87         | 24.68
ISINet (Ours)   | ✗  | 72.99         | 40.16
ISINet (Ours)   | ✓  | 73.03         | 40.21
ISINet (Ours)   | ✗  | 77.19         | 44.58
ISINet (Ours)   | ✓  | 77.47         | 45.29
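Both tables report IoU-based metrics computed per instrument class. As an informal illustration of how a per-class IoU and its class-wise mean can be computed from integer label masks, the sketch below may be useful; note that the authoritative metric definitions are those of the challenge report [2], and that the function names, the background label 0, and the rule of skipping classes absent from both masks are assumptions of this sketch, not the authors' implementation.

```python
import numpy as np

def per_class_iou(pred, gt, num_classes):
    """IoU per instrument class between integer label masks (0 = background)."""
    ious = {}
    for c in range(1, num_classes + 1):
        p, g = pred == c, gt == c
        union = np.logical_or(p, g).sum()
        if union == 0:
            continue  # class absent from both masks: do not penalize
        ious[c] = np.logical_and(p, g).sum() / union
    return ious

def mean_class_iou(preds, gts, num_classes):
    """Average each class's IoU over frames, then average over classes."""
    totals = {c: [] for c in range(1, num_classes + 1)}
    for pred, gt in zip(preds, gts):
        for c, iou in per_class_iou(pred, gt, num_classes).items():
            totals[c].append(iou)
    per_class = [np.mean(v) for v in totals.values() if v]
    return float(np.mean(per_class))
```

Averaging per frame first and over classes second, rather than pooling all pixels, is one common convention; the exact challenge evaluation code may differ in these details.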

Qualitative results


Figure 1. Each row shows an example result for the task of instrument type segmentation on the EndoVis 2017 and 2018 datasets. Columns, from left to right: image, annotation, segmentation by TernausNet [28], segmentation by MF-TAPNet [15], and segmentation by our method, ISINet. The instrument colors denote instrument types.

Downloads


References


[1] Allan, M., Shvets, A., Kurmann, T., Zhang, Z., Duggal, R., Su, Y.H., et al.: 2017 robotic instrument segmentation challenge. arXiv preprint arXiv:1902.06426 (2019)

[15] Jin, Y., Cheng, K., Dou, Q., Heng, P.A.: Incorporating temporal prior from motion flow for instrument segmentation in minimally invasive surgery video. In: Medical Image Computing and Computer Assisted Intervention – MICCAI 2019. pp. 440–448. Springer International Publishing, Cham (2019)

[28] Shvets, A.A., Rakhlin, A., Kalinin, A.A., Iglovikov, V.I.: Automatic instrument segmentation in robot-assisted surgery using deep learning. In: 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA). pp. 624–628. IEEE (2018)