Neural Bidirectional Texture Function Compression and Rendering

Research output: Contribution to conference (Poster)


The recent success of machine learning has encouraged research into using artificial neural networks (NNs) in computer graphics. A good example is the bidirectional texture function (BTF), a data-driven representation of surface materials that can encapsulate complex behaviors, such as self-shadowing and interreflections, that would otherwise be too expensive to compute in real-time applications. We propose two changes to the state of the art in neural BTFs, specifically NeuMIP. These changes, suggested by recent work in neural scene representation and rendering, aim to improve baseline quality, memory footprint, and performance. We conduct an ablation study to evaluate the impact of each change, test on both synthetic and real data, and provide a working implementation within the Mitsuba 2 rendering framework. Our results show that our method outperforms the baseline in all these metrics, and suggest that neural BTFs belong to the broader field of neural scene representation. Project website:
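To illustrate the idea of a neural BTF in the most general terms: a BTF maps a surface position plus light and view directions to reflectance, and a neural compression replaces the measured table with a small learned decoder. The sketch below is a hypothetical toy, not NeuMIP's actual architecture: a tiny MLP with random stand-in weights maps a 6-D query (2-D texture coordinates and 2-D projections of the light and view directions) to RGB. All names and shapes here are illustrative assumptions.

```python
import numpy as np

# Random weights standing in for a trained decoder (toy sketch only,
# not the architecture from the poster).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(6, 32)) * 0.5
b1 = np.zeros(32)
W2 = rng.normal(size=(32, 3)) * 0.5
b2 = np.zeros(3)

def query_btf(uv, wi, wo):
    """Evaluate the toy neural BTF.

    uv -- 2-D texture coordinates
    wi -- light direction, projected to 2-D
    wo -- view direction, projected to 2-D
    Returns an RGB reflectance value in [0, 1].
    """
    x = np.concatenate([uv, wi, wo])             # 6-D query vector
    h = np.maximum(0.0, x @ W1 + b1)             # ReLU hidden layer
    rgb = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid keeps RGB bounded
    return rgb

rgb = query_btf(np.array([0.25, 0.75]),   # texture coordinates
                np.array([0.0, 0.5]),     # light direction (projected)
                np.array([0.3, -0.2]))    # view direction (projected)
```

In a renderer integration such as the Mitsuba 2 implementation the abstract mentions, a query of this shape would be evaluated inside the material's BSDF evaluation, replacing a lookup into the raw measured BTF data.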

Original language: English
Publication status: Published - 9 Dec 2022
Event: SIGGRAPH Asia 2022 - Daegu, Korea, Republic of
Duration: 6 Dec 2022 - 9 Dec 2022


Conference: SIGGRAPH Asia 2022
Country/Territory: Korea, Republic of


Keywords

  • bidirectional texture function
  • neural materials
  • neural networks
  • neural representation


