Precise Euclidean distance transforms in 3D from voxel coverage representation
Distance transforms (DTs) are usually defined on a binary image as a mapping from each background element to the distance between its centre and the centre of the closest object element. DTs are closely related to many procedures and operations in image processing, such as computation of morphological operations, geometrical representations, image segmentation, template matching, and image registration, among many others. Interest in further development and improvement of methods and algorithms for DT computation, as well as in finding new applications of DTs, remains high. However, due to discretization effects, such DTs have limited precision, including reduced rotational and translational invariance. We show in this paper that the performance of Euclidean DTs can be significantly improved if voxel coverage values are utilized and the position of the object boundary is estimated with sub-voxel precision. We propose two linear-time algorithms for estimating the Euclidean DT with sub-voxel precision. The evaluation confirms that both algorithms provide increased accuracy compared to what is achievable from a binary object representation.
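As a rough 1D illustration of the underlying idea (a sketch only, not the paper's algorithms), the following compares a binary DT, which measures centre-to-centre distances, with a hypothetical coverage-based variant that shifts the estimated boundary inside a partially covered voxel by `(0.5 - alpha)`, where `alpha` is that voxel's coverage value. The function names, the threshold of 0.5, and the brute-force search are all assumptions made for illustration.

```python
import numpy as np

def binary_edt_1d(object_mask):
    # Binary DT: for each background voxel, the distance (in voxel units)
    # from its centre to the centre of the nearest object voxel.
    idx = np.arange(len(object_mask))
    obj = idx[object_mask]
    return np.array([0.0 if object_mask[i] else np.min(np.abs(obj - i))
                     for i in idx], dtype=float)

def coverage_edt_1d(coverage, threshold=0.5):
    # Hypothetical coverage-based DT: voxels with coverage >= threshold are
    # treated as object; the centre-to-centre distance is then corrected by
    # (0.5 - alpha), placing the boundary with sub-voxel precision.  A fully
    # covered boundary voxel (alpha = 1) thus contributes a -0.5 shift,
    # i.e. the distance is measured to the voxel face rather than the centre.
    object_mask = coverage >= threshold
    idx = np.arange(len(coverage))
    obj = idx[object_mask]
    out = np.zeros(len(coverage))
    for i in idx:
        if not object_mask[i]:
            j = obj[np.argmin(np.abs(obj - i))]  # nearest object voxel
            out[i] = abs(j - i) + (0.5 - coverage[j])
    return out
```

For example, with coverage values `[0, 0, 0, 0.3, 1, 1]` the background voxel at index 3 is assigned the distance 0.5 rather than the binary value 1, since its nearest object voxel is fully covered and the boundary therefore lies at that voxel's face.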