Robust Time-of-Flight-based Material Imaging using Three-Dimensional Deep Neural Networks on Spatial Neighborhoods of Pixels
Time-of-Flight (ToF) cameras are active depth sensors that estimate the distance between the camera and an object by measuring the time taken by modulated light to travel from an optical transmitter to a pixel array. Conventional material-imaging methods based on RGB cameras fail at the dense classification of look-alike materials. To date, the Material Impulse Response Function (MIRF), which yields valuable features for distinguishing materials with ToF cameras, has been considered mostly in the temporal dimension. Our novel approach introduces material imaging based on both the spatial and temporal dimensions, exploiting three-dimensional features. Firstly, we propose an innovative approach to per-pixel material imaging using a set of features over a spatial neighborhood. Secondly, we introduce a bilateral weight matrix to boost the quality of the three-dimensional ToF features. Thirdly, we reduce the misclassification of boundary-region pixels, while simultaneously lowering computation, by selecting the K nearest neighbors in the ToF feature space within a patch. Finally, the above approaches are validated on several datasets using newly proposed three-dimensional deep learning models.
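The abstract does not spell out the exact form of the bilateral weight matrix or the neighbor-selection rule, so the sketch below illustrates one plausible reading of both steps: the standard bilateral-filter weighting (a spatial Gaussian multiplied by a Gaussian over feature-space distance) and a simple K-nearest-neighbor selection around the patch center. All function names, parameters (sigma_s, sigma_r, k), and the patch layout are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def bilateral_weights(patch, sigma_s=2.0, sigma_r=0.5):
    """Bilateral weight matrix for an (H, W, T) patch of per-pixel ToF
    temporal features: spatial closeness to the patch center times
    similarity in ToF feature space (classic bilateral-filter form).
    Assumed formulation; the paper's exact weights may differ."""
    h, w, _ = patch.shape
    cy, cx = h // 2, w // 2
    ys, xs = np.mgrid[0:h, 0:w]
    spatial = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma_s ** 2))
    diff = patch - patch[cy, cx]                      # feature difference to center pixel
    range_term = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * sigma_r ** 2))
    return spatial * range_term                       # (H, W) weight matrix

def knn_in_patch(patch, k=9):
    """Flat indices of the K patch pixels closest to the center pixel in
    ToF feature space, intended to drop dissimilar boundary pixels and
    shrink the input fed to the 3D network (assumed selection rule)."""
    h, w, t = patch.shape
    flat = patch.reshape(-1, t)
    center = flat[(h // 2) * w + (w // 2)]
    dist = np.linalg.norm(flat - center, axis=1)
    return np.argsort(dist)[:k]                       # includes the center itself

# Example: a 5x5 spatial neighborhood with 16 temporal ToF samples per pixel.
patch = np.random.rand(5, 5, 16).astype(np.float32)
W = bilateral_weights(patch)
neighbours = knn_in_patch(patch, k=9)
```

In this reading, the weighted patch features (or only the K selected pixels) would form the three-dimensional input to the per-pixel classifier; the concrete coupling to the network is left unspecified by the abstract.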