Deep Learning-Based 3D Face Recognition Using Derived Features from Point Cloud


Atik M. E., Duran Z.

5th International Conference on Smart City Applications, SCA 2020, Karabük, Turkey, 7-9 October 2020, vol.183, pp.797-808

  • Publication Type: Conference Paper / Full Text
  • Volume: 183
  • Doi Number: 10.1007/978-3-030-66840-2_60
  • City: Karabük
  • Country: Turkey
  • Page Numbers: 797-808
  • Keywords: Deep learning, Face recognition, Feature map, Point cloud

Abstract

© 2021, The Author(s), under exclusive license to Springer Nature Switzerland AG.

With developing technology and urbanization, smart city applications have increased, and this growth has brought challenges such as public security risks. Verifying people's identities is a requirement across smart city, smart environment, and smart interaction applications, and face recognition has great potential for this purpose. The development of deep learning methods has made face recognition feasible on larger databases and in more varied situations. 2D images are usually used for face recognition; however, challenges such as pose changes and illumination make 2D facial recognition difficult. Laser scanning technology enables the production of 3D point clouds that capture the geometric information of faces, and when point clouds are combined with deep learning techniques, 3D face recognition has great potential. In this study, 2D images were created for face recognition using feature maps derived from 3D point clouds. ResNet-18, ResNet-50, and ResNet-101, three variants of the ResNet architecture, were used for classification. The Bosphorus database, covering 105 people with different facial expressions and occlusions, was used for the experiments. Overall accuracies of 77.36%, 77.03%, and 81.54% were obtained with ResNet-18, ResNet-50, and ResNet-101, respectively.
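The abstract does not detail which feature maps were derived from the point clouds, so the following is only a minimal sketch of one common derivation of a 2D image from a 3D face scan: a depth map, where each (x, y) point is binned into a raster grid and the per-cell maximum z value becomes the pixel intensity. The grid size, normalization, and synthetic hemispherical "face" are illustrative assumptions, not the paper's method.

```python
import numpy as np

def point_cloud_to_depth_map(points, grid_size=64):
    """Project an Nx3 point cloud onto a 2D depth map.

    Each (x, y) point is binned into a grid_size x grid_size raster;
    the maximum z value per cell becomes the pixel intensity,
    normalized to 0-255. Empty cells remain 0.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    # Map x and y coordinates to integer grid indices (epsilon avoids /0).
    xi = ((x - x.min()) / (np.ptp(x) + 1e-9) * (grid_size - 1)).astype(int)
    yi = ((y - y.min()) / (np.ptp(y) + 1e-9) * (grid_size - 1)).astype(int)
    depth = np.zeros((grid_size, grid_size))
    # Keep the highest (closest) surface point per cell.
    np.maximum.at(depth, (yi, xi), z - z.min())
    if depth.max() > 0:
        depth = depth / depth.max() * 255.0
    return depth.astype(np.uint8)

# Example: a synthetic hemispherical surface standing in for a face scan.
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(5000, 2))
zs = np.sqrt(np.clip(1 - xy[:, 0] ** 2 - xy[:, 1] ** 2, 0, None))
cloud = np.column_stack([xy, zs])
depth_img = point_cloud_to_depth_map(cloud, grid_size=64)
print(depth_img.shape, depth_img.dtype)  # (64, 64) uint8
```

A single-channel image like this (or a stack of such geometric feature maps) can then be fed to a standard 2D CNN classifier such as ResNet-18, which is how a point-cloud recognition problem is reduced to ordinary image classification.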