Master’s Thesis Presentation • Computer Graphics • Learning Sample-Based Monte Carlo Denoising from Noisy Training Data

Friday, January 14, 2022 5:00 PM EST

Please note: This master’s thesis presentation will be given online.

Andrew Tinits, Master’s candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Stephen Mann

Monte Carlo rendering allows for the production of high-quality photorealistic images of 3D scenes. However, producing noise-free images can take a considerable amount of compute resources. To lessen this burden and speed up the rendering process while maintaining similar quality, a lower-sample-count image can be rendered and then cleaned up with image-space denoising methods. These methods are widely used in industry and have recently enabled advances in areas such as real-time ray tracing. While hand-tuned denoisers are available, the most successful denoising methods are based on machine learning with deep convolutional neural networks (CNNs). These denoisers are trained on large datasets of rendered images, consisting of pairs of low-sample-count noisy images and the corresponding high-sample-count reference images. Unfortunately, generating these datasets can be prohibitively expensive because of the cost of rendering thousands of high-sample-count reference images.
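As a back-of-the-envelope illustration (a toy sketch of my own, not code from the thesis), the snippet below models a pixel as the mean of per-sample radiance estimates. Its standard error falls as 1/sqrt(spp), which is why near-noise-free reference images need orders of magnitude more samples, and therefore more compute, than the noisy inputs.

```python
import numpy as np

# Toy model: each sample is a noisy radiance estimate; the pixel value
# is the sample mean, so its standard error shrinks as 1/sqrt(spp).
rng = np.random.default_rng(0)
true_radiance = 0.5

def render_pixel(spp):
    samples = true_radiance + rng.normal(0.0, 0.2, size=spp)
    return samples.mean()

# Measure the pixel estimate's spread at several sample counts.
errors = {}
for spp in (8, 128, 4096):
    estimates = np.array([render_pixel(spp) for _ in range(2000)])
    errors[spp] = estimates.std()
    print(f"{spp:5d} spp: standard error ~ {errors[spp]:.4f}")
```

Halving the noise costs roughly four times the samples, so a clean reference is hundreds of times more expensive to render than the noisy image it supervises.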

A potential solution to this problem comes from the Noise2Noise method, in which denoisers are learned solely from noisy training data. Lehtinen et al. applied their technique to Monte Carlo denoising and achieved performance similar to training with clean reference images. However, their model was a proof of concept, and it is unclear whether the technique would work equally well with state-of-the-art Monte Carlo denoising methods. The authors also do not test their hypothesis that better results could be achieved by spending the compute budget previously allocated to clean reference images on additional noisy training data instead. Finally, it remains to be seen whether the authors' suggested parameters are equally effective when Noise2Noise is used with different denoising methods.
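The core Noise2Noise observation is that, under an L2 loss, replacing the clean target with an independently rendered noisy target leaves the optimal predictor unchanged as long as the target noise is zero-mean. The toy sketch below (my illustration, not the thesis or Lehtinen et al.'s code) fits the same linear "denoiser" twice by least squares, once against clean targets and once against noisy ones, and shows the two solutions nearly coincide.

```python
import numpy as np

# Toy Noise2Noise demonstration: a linear least-squares "denoiser"
# trained with noisy targets recovers (almost) the same weights as
# one trained with clean targets, because the target noise is zero-mean.
rng = np.random.default_rng(1)
n, d = 5000, 8
clean = rng.uniform(size=(n, d))                    # ground-truth "pixels"
noisy_in = clean + rng.normal(0, 0.3, (n, d))       # noisy input render
noisy_tgt = clean + rng.normal(0, 0.3, (n, d))      # independent noisy render

X = np.hstack([noisy_in, np.ones((n, 1))])          # linear model with bias
w_clean, *_ = np.linalg.lstsq(X, clean, rcond=None)
w_noisy, *_ = np.linalg.lstsq(X, noisy_tgt, rcond=None)
print("max weight difference:", np.abs(w_clean - w_noisy).max())
```

With enough training pairs the difference vanishes, which is what makes cheap noisy/noisy datasets a viable substitute for expensive noisy/clean ones.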

In this thesis, I answer the above questions by applying Noise2Noise to a state-of-the-art Monte Carlo denoising algorithm called Sample-Based Monte Carlo denoising (SBMC). I adapt the SBMC scene generator to produce a dataset of noisy image pairs, use this dataset to train an SBMC-like CNN, and conduct experiments to determine the impact of various parameters on the performance of the denoiser. My results show that the Noise2Noise technique can be effectively applied to a state-of-the-art Monte Carlo denoising algorithm: I achieve results comparable to the original implementation at a significantly lower cost. I find that using additional training data can further improve these results, although more investigation is needed in this area. Finally, I detail the parameters that were necessary to achieve these results.


To join this master's thesis presentation on MS Teams, please go to https://teams.microsoft.com/l/meetup-join/19%3ameeting_YTJkZmI1NDMtY2JlMy00MmJmLTliZTgtNjJlMmVlOWEzOGNm%40thread.v2/0?context=%7b%22Tid%22%3a%22723a5a87-f39a-4a22-9247-3fc240c01396%22%2c%22Oid%22%3a%225fa380f1-507c-4648-8d2d-638617cd2f47%22%7d.

Location
Online master’s thesis presentation
200 University Avenue West
Waterloo, ON N2L 3G1
Canada