PHYSPLAT: A Framework for Photorealistic Hybrid Simulation of Real and Synthetic Elements Using 3D Gaussian Splatting
Abstract
We present an integrated, end-to-end system that enables photorealistic real-world objects, reconstructed with 3D Gaussian Splatting (3DGS), to interact seamlessly with synthetic elements such as polygonal meshes, fluids, fabrics, and robotic systems within a unified simulation environment. By leveraging Material Point Method (MPM) simulation, our system keeps the 3DGS representation compatible with established physics engines while remaining extensible. Our workflow begins by capturing real-world scenes "in the wild" with 3DGS, from which we derive a simplified, appearance-agnostic particle proxy suitable for physics simulation. These particles, along with synthetic primitives, are imported into the system, where the simulator computes positions and deformation gradients for all bodies, including the 3DGS-derived particles, at each timestep. We validate our system on collision and deformation scenarios, and showcase a robotics application in which a manipulator plans and executes tasks involving both captured objects and synthetic elements. By selecting the most appropriate solver and constitutive model for each material, such as MPM for granular media and deformables, position-based dynamics (PBD) for cloth, or smoothed-particle hydrodynamics (SPH) for fluids, our approach delivers: (i) high visual fidelity, (ii) accurate, material-specific physical behavior, and (iii) minimal performance overhead. Our pipeline streamlines scene preparation, offering a significant advantage over traditional mesh-centric photogrammetry in time-sensitive reconstruction and emergency scenarios. This combination of flexibility and realism makes our system well-suited for robot task planning, photorealistic multiview dataset generation for autonomous navigation, and other embodied AI applications.
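To make the per-timestep coupling described above concrete, the following is a minimal sketch, not the paper's implementation, of how simulator outputs (particle positions and deformation gradients) could drive the 3DGS kernels for rendering. The names GaussianCloud, apply_simulation_state, mpm_step, and render_frame are hypothetical, and the convention of deforming each kernel covariance as Sigma' = F Sigma F^T is an assumption about how the deformation gradient is applied to the Gaussians.

import numpy as np

class GaussianCloud:
    """Appearance-agnostic view of a 3DGS reconstruction: one particle per kernel."""
    def __init__(self, means, covariances):
        self.means = means              # (N, 3) rest-state kernel centers
        self.covariances = covariances  # (N, 3, 3) rest-state kernel covariances

def apply_simulation_state(cloud, positions, deformation_gradients):
    """Advect kernel centers and deform covariances with the per-particle F.

    Assumed convention: Sigma' = F Sigma F^T, so each Gaussian stretches and
    rotates with the local material deformation computed by the solver.
    """
    deformed_means = positions                     # (N, 3) from the physics step
    F = deformation_gradients                      # (N, 3, 3) from the physics step
    deformed_covs = np.einsum('nij,njk,nlk->nil',
                              F, cloud.covariances, F)
    return deformed_means, deformed_covs

# Hypothetical usage: the physics engine advances all bodies for one timestep,
# then the renderer consumes the updated Gaussian parameters for that frame.
# x, F = mpm_step(particles, dt)                   # solver-specific call
# means_t, covs_t = apply_simulation_state(cloud, x, F)
# render_frame(means_t, covs_t, camera)            # standard 3DGS rasterization

Because the solver only has to produce positions and deformation gradients, the same interface could in principle be fed by MPM, PBD, or SPH back ends, which is the flexibility the abstract refers to.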