Point-Based Radiance Fields for Controllable Motion Synthesis

Deheng Zhang*1, Haitao Yu*1, Peiyuan Xie*1, Tianyi Zhang*1

(* denotes equal contribution)

1ETH Zürich  

Abstract


Figure: Pipeline of our controllable point radiance field.

We propose a novel controllable motion synthesis method for fine-level deformation based on static point-based radiance fields. Although previous editable neural radiance field methods can produce impressive results on novel-view synthesis and allow naive deformation, few algorithms achieve complex 3D human editing such as forward kinematics. Our method exploits an explicit point cloud to train the static 3D scene and applies the deformation by encoding per-point translations with a deformation MLP. To keep the rendering consistent with the canonical-space training, we estimate the local rotation of each point using SVD and interpolate the per-point rotations to the query view directions of the pre-trained radiance field. Extensive experiments show that our approach significantly outperforms the state of the art on fine-level complex deformation and generalizes to 3D characters beyond humans.
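The local rotation estimate admits a compact sketch. Given a point's neighborhood in canonical space and the same neighbors' positions after the deformation MLP, the best-fit rotation follows from an SVD of their cross-covariance (the Kabsch algorithm). The function name and NumPy implementation below are illustrative assumptions, not the authors' code; how the resulting rotations reach the renderer is sketched under "Effect of Ray Bending" below.

    # Minimal sketch of SVD-based local rotation estimation (Kabsch algorithm).
    # Names are hypothetical, not the authors' implementation.
    import numpy as np

    def estimate_local_rotation(canon_nbrs, deform_nbrs):
        """canon_nbrs, deform_nbrs: (k, 3) arrays of a point's neighbors in
        canonical space and after the deformation MLP, respectively.
        Returns the (3, 3) rotation mapping canonical -> deformed."""
        # Center both neighborhoods so only rotation remains.
        P = canon_nbrs - canon_nbrs.mean(axis=0)
        Q = deform_nbrs - deform_nbrs.mean(axis=0)
        # Cross-covariance and its SVD.
        H = P.T @ Q
        U, _, Vt = np.linalg.svd(H)
        # Guard against a reflection (det = -1) in the recovered transform.
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        D = np.diag([1.0, 1.0, d])
        return Vt.T @ D @ U.T

Interpolating these per-point rotations to each ray sample and rotating its view direction accordingly is what keeps view-dependent appearance consistent with the canonical-space training.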


Human Motion Synthesis



Character Motion Synthesis



Effect of Ray Bending
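A minimal sketch of the ray bending described in the abstract: the per-point rotations are interpolated to a query sample, and its view direction is rotated back into canonical space before querying the pre-trained field. The inverse-distance weighting over k nearest points and the SVD projection back onto SO(3) are assumptions for illustration, not necessarily the authors' exact scheme.

    # Hypothetical sketch of ray bending; weighting scheme is an assumption.
    import numpy as np

    def bend_view_dir(query_xyz, view_dir, point_xyz, point_R, k=8, eps=1e-8):
        """query_xyz: (3,) sample position in deformed space.
        view_dir:  (3,) view direction in deformed space.
        point_xyz: (n, 3) deformed point positions.
        point_R:   (n, 3, 3) per-point canonical -> deformed rotations."""
        # k nearest points and inverse-distance weights.
        d = np.linalg.norm(point_xyz - query_xyz, axis=1)
        idx = np.argsort(d)[:k]
        w = 1.0 / (d[idx] + eps)
        w /= w.sum()
        # Weighted blend of rotation matrices, projected back onto SO(3).
        R_blend = np.einsum('i,ijk->jk', w, point_R[idx])
        U, _, Vt = np.linalg.svd(R_blend)
        R = U @ Vt
        if np.linalg.det(R) < 0:  # fix a possible reflection
            R = U @ np.diag([1.0, 1.0, -1.0]) @ Vt
        # The inverse rotation maps the deformed-space direction to canonical space.
        return R.T @ view_dir

Querying the radiance field with this bent direction, rather than the raw deformed-space one, is what the abstract refers to as keeping the rendering consistent with the canonical-space training.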



Citation


@misc{yu2023pointbased,
      title={Point-Based Radiance Fields for Controllable Human Motion Synthesis}, 
      author={Haitao Yu and Deheng Zhang and Peiyuan Xie and Tianyi Zhang},
      year={2023},
      eprint={2310.03375},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}