ORCID ID
https://orcid.org/0000-0002-3973-2124
Date Awarded
2023
Document Type
Dissertation
Degree Name
Doctor of Philosophy (Ph.D.)
Department
Computer Science
Advisor
Pieter Peers
Committee Member
Robert M Lewis
Committee Member
Peter Kemper
Committee Member
Weizhen Mao
Committee Member
Steve Marschner
Abstract
Creating realistic computer-generated imagery is essential for modern movies and video games, and recreating the appearance of materials is integral to generating such photo-realistic images. While the problem of how to model materials is well studied, this dissertation focuses on how to recreate the appearance of specific materials found in the real world. We begin with a short introduction to rendering, followed by a discussion of various material models, techniques for measuring reflectance, and strategies for fitting these models to reflectance data. We then introduce a novel two-stage fitting process and demonstrate its efficacy both quantitatively, using perceptual metrics, and qualitatively, with a user study. Next, we investigate how our fitting strategy might be adapted to work without dense measurements, relying instead on an image as input. In the second part of this dissertation, we examine a deep-learning-based material relighting strategy that bypasses the need for an explicit reflectance model altogether. By directly transforming an image of a material under one lighting condition into an image of the same material under another, we avoid the ambiguity inherent in moving from the image domain to the higher-dimensional space of materials. We demonstrate not only that this relighting network faithfully reproduces the appearance of spatially varying materials, but also that the synthetic relightings it produces can serve as inputs to material estimation networks, yielding higher-quality results than a single real input image. We hope that the research presented here will inspire future efforts to recreate the appearance of real materials through both physically based and learning-based means. Our methods are tied together by the observation that techniques for reproducing material appearance are best judged by the perceptual similarity of the images they create. From this observation, we argue that techniques for recreating the appearance of real materials should be built with perception in mind.
DOI
https://dx.doi.org/10.21220/s2-xts5-z181
Rights
© The Author
Recommended Citation
Bieron, James Christopher, "Appearance Driven Reflectance Modeling" (2023). Dissertations, Theses, and Masters Projects. William & Mary. Paper 1697552547.
https://dx.doi.org/10.21220/s2-xts5-z181