PhD Seminar • Computer Vision (Artificial Intelligence) — Regularized Losses for Weakly-supervised CNN Segmentation

Friday, May 4, 2018 2:00 PM EDT

Meng Tang, PhD candidate
David R. Cheriton School of Computer Science

Minimization of regularized losses is a principled approach to weak supervision, well established in deep learning in general. However, it is largely overlooked in semantic segmentation, which is currently dominated by methods that mimic full supervision via "fake" fully-labeled training masks (proposals) generated from the available partial input. To obtain such full masks, the typical methods explicitly use standard regularization techniques for "shallow" segmentation, e.g., graph cuts or dense CRFs. In contrast, we integrate such standard regularizers and clustering criteria directly into the loss functions over partial input. This approach simplifies weakly-supervised training by avoiding extra MRF/CRF inference steps or layers explicitly generating full masks, while improving both the quality and efficiency of training.

This paper proposes and experimentally compares different losses integrating MRF/CRF regularization terms and pairwise clustering criteria. We juxtapose our regularized losses with earlier proposal-generation methods using explicit regularization steps or layers. Our approach achieves state-of-the-art accuracy in semantic segmentation with near full-supervision quality.
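To make the general recipe concrete, below is a minimal training-loss sketch, assuming PyTorch. The function names and the truncated local-window pairwise term are illustrative assumptions for brevity; the regularizers actually compared in the paper (e.g., relaxations of fully-connected CRF terms and pairwise clustering criteria) are more involved. The sketch only shows the overall structure: a partial cross-entropy on the labeled (scribbled) pixels plus a relaxed Potts-style pairwise regularizer computed directly from the network's soft predictions, with no proposal generation or explicit CRF inference step.

import torch
import torch.nn.functional as F

def partial_cross_entropy(logits, scribbles, ignore_index=255):
    # Cross-entropy evaluated only on the partially labeled pixels
    # (scribbles/clicks); unlabeled pixels are marked with ignore_index.
    return F.cross_entropy(logits, scribbles, ignore_index=ignore_index)

def relaxed_potts_loss(probs, image, radius=2, sigma_rgb=0.1, sigma_xy=6.0):
    # Quadratic relaxation of a Potts/CRF pairwise term over a small window:
    #   sum_{i,j} w_ij * (1 - sum_k p_ik * p_jk)
    # with bilateral Gaussian weights w_ij on color and position.
    # probs: (B, K, H, W) softmax outputs; image: (B, 3, H, W) scaled to [0, 1].
    B, K, H, W = probs.shape
    loss, count = probs.new_zeros(()), 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            # Overlapping crops so pixel i at (y, x) is paired with j at (y+dy, x+dx).
            y0, y1 = max(0, -dy), H - max(0, dy)
            x0, x1 = max(0, -dx), W - max(0, dx)
            p_i, img_i = probs[..., y0:y1, x0:x1], image[..., y0:y1, x0:x1]
            p_j = probs[..., y0 + dy:y1 + dy, x0 + dx:x1 + dx]
            img_j = image[..., y0 + dy:y1 + dy, x0 + dx:x1 + dx]
            # Bilateral affinity: similar-looking nearby pixels get large weight.
            w = torch.exp(-((img_i - img_j) ** 2).sum(1) / (2 * sigma_rgb ** 2)
                          - (dy * dy + dx * dx) / (2 * sigma_xy ** 2))
            # Expected label disagreement between the two soft labelings.
            disagreement = 1.0 - (p_i * p_j).sum(1)
            loss = loss + (w * disagreement).mean()
            count += 1
    return loss / count

def regularized_loss(logits, scribbles, image, lam=0.5):
    # Total loss: partial cross-entropy on scribbles + weighted pairwise regularizer.
    probs = F.softmax(logits, dim=1)
    return partial_cross_entropy(logits, scribbles) + lam * relaxed_potts_loss(probs, image)

In a training loop, regularized_loss(logits, scribbles, image) would simply replace the fully-supervised cross-entropy, so standard gradient descent propagates the regularizer through the network without any extra inference layer.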

Location
DC (William G. Davis Computer Research Centre), Room 2310
200 University Avenue West
Waterloo, ON N2L 3G1
Canada
