

Poster

IReNe: Instant Recoloring of Neural Radiance Fields

Alessio Mazzucchelli · Adrian Garcia-Garcia · Elena Garces · Fernando Rivas-Manzaneque · Francesc Moreno-Noguer · Adrian Penate-Sanchez

Arch 4A-E Poster #109
[ Project Page ] [ Paper PDF ]

Poster: Wed 19 Jun 5 p.m. PDT — 6:30 p.m. PDT

Abstract:

Advancements in neural radiance fields have allowed for detailed 3D scene reconstructions and novel view synthesis. Yet, efficiently editing these representations while retaining photorealism is an emerging challenge. Recent methods face three primary limitations: they are slow for interactive use, lack precision at object boundaries, and struggle to ensure view consistency in the edits. In this paper, we introduce IReNe to address these three key limitations, enabling swift, near real-time color editing in NeRF. Leveraging a pre-trained NeRF model and a single training image with user-applied color edits, IReNe adjusts network parameters in seconds. This adjustment allows the model to generate new scene views that accurately reproduce the color changes from the training image while also controlling object boundaries and view-specific effects. Enhanced object-boundary control is achieved by integrating a trainable segmentation module into the model. The process gains efficiency by retraining only the weights of the last network layer. Moreover, we observe that neurons in this layer can be classified into those responsible for view-dependent effects and those contributing to color rendering. We introduce an automated classification approach to identify these neuron types and exclusively finetune the weights of the color-rendering neurons. This further accelerates training and ensures consistent color edits across different views. A thorough validation on a new dataset, meticulously edited for object colors, demonstrates significant quantitative and qualitative advancements over competitors, with speedups of 5× to 500×.
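The sketch below illustrates the general idea described in the abstract, not the authors' actual code or IReNe's API: take the final layers of a pre-trained NeRF color branch, classify the last hidden layer's neurons into "view-dependent" and "color-rendering" groups using a simple heuristic, and finetune only the last-layer weights that read from the color-rendering neurons against a single user-edited image. All names (ColorHead, classify_neurons, finetune_color_neurons, the variability threshold, etc.) and the classification rule itself are hypothetical assumptions made for illustration.

```python
# Minimal PyTorch sketch (assumptions, not the paper's implementation):
# classify last-hidden-layer neurons as view-dependent vs. color-rendering,
# then finetune only the color-rendering columns of the output layer.
import torch
import torch.nn as nn


class ColorHead(nn.Module):
    """Final layers of a NeRF color branch: features (+ encoded view dir) -> RGB."""

    def __init__(self, feat_dim=256, view_dim=27, hidden=128):
        super().__init__()
        self.hidden = nn.Linear(feat_dim + view_dim, hidden)
        self.out = nn.Linear(hidden, 3)  # the only layer we will (partially) finetune

    def forward(self, feats, view_enc):
        h = torch.relu(self.hidden(torch.cat([feats, view_enc], dim=-1)))
        return torch.sigmoid(self.out(h))


def classify_neurons(head, feats, view_encs, thresh=0.05):
    """Heuristic (an assumption, not the paper's exact rule): a hidden neuron is
    'view-dependent' if its activation varies strongly when only the viewing
    direction changes; otherwise it is treated as 'color-rendering'."""
    with torch.no_grad():
        acts = []
        for v in view_encs:  # same 3D points, different view directions
            x = torch.cat([feats, v.expand(feats.shape[0], -1)], dim=-1)
            acts.append(torch.relu(head.hidden(x)))
        acts = torch.stack(acts)                 # (n_views, n_pts, hidden)
        variation = acts.std(dim=0).mean(dim=0)  # per-neuron variability across views
    return variation < thresh                    # True = color-rendering neuron


def finetune_color_neurons(head, feats, view_enc, edited_rgb, color_mask,
                           steps=200, lr=1e-2):
    """Finetune only the output-layer weights attached to color-rendering neurons."""
    for p in head.parameters():
        p.requires_grad_(False)
    head.out.weight.requires_grad_(True)
    head.out.bias.requires_grad_(True)

    frozen_w = head.out.weight.detach().clone()
    opt = torch.optim.Adam([head.out.weight, head.out.bias], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        pred = head(feats, view_enc)
        loss = nn.functional.mse_loss(pred, edited_rgb)
        loss.backward()
        # Zero gradients on columns fed by view-dependent neurons so only the
        # color-rendering weights receive updates.
        head.out.weight.grad[:, ~color_mask] = 0.0
        opt.step()
        # Keep the view-dependent columns pinned to their pre-trained values.
        with torch.no_grad():
            head.out.weight[:, ~color_mask] = frozen_w[:, ~color_mask]
    return head
```

In this reading, restricting updates to the color-rendering columns of a single linear layer is what keeps the edit fast (few parameters, few steps) and view-consistent (the view-dependent pathway is left untouched); the actual classification criterion and training loss used by IReNe are described in the paper.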
