Volume 20, Issue 4 (December 2024, Special Issue on ADLEEE)   IJEEE 2024, 20(4): 162-172


Amirfathiyan A, Ebrahimnezhad H. Efficient 3D Shape Matching: Dense Correspondence for non-isometric Deformation. IJEEE 2024; 20 (4) :162-172
URL: http://ijeee.iust.ac.ir/article-1-3504-en.html
Abstract:
This paper presents an application of deep learning in computer graphics, using learning-based networks for 3D shape matching. We propose an efficient method for matching 3D models under non-isometric deformation. Our method combines intrinsic and directional attributes in a structured manner. For this purpose, we use hybrid features derived from Diffusion-Net and spectral descriptors. Specifically, we combine learned intrinsic properties with orientation-preserving features and demonstrate the effectiveness of this combination. We first extract features with Diffusion-Net, then compute two maps with functional map networks to obtain intrinsic and directional correspondences. Finally, we combine them into a single map that resolves symmetry ambiguities on highly deformed models. Quantitative results on the TOSCA dataset show that the proposed method achieves the lowest average geodesic error, 0.0023, outperforming state-of-the-art methods and reducing the error by 70.66%. Our method outperforms similar approaches by leveraging an accurate feature extractor and effective geometric regularizers, handling non-isometric shapes better and reducing matching errors.
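
For readers who want a concrete picture of the pipeline sketched in the abstract, the following minimal PyTorch example shows how per-vertex descriptors (standing in for Diffusion-Net features) can be turned into a functional map and then into a dense vertex-to-vertex correspondence. This is an illustration under standard functional-map conventions, not the authors' implementation; the inputs feat_src, feat_tgt, evecs_src, evecs_tgt, mass_src, and mass_tgt are assumed precomputed (learned descriptors, Laplace-Beltrami eigenvectors, and per-vertex masses).

    # Hypothetical sketch of functional-map estimation and conversion to a
    # dense point-to-point correspondence; not the authors' code.
    import torch

    def functional_map(feat_src, feat_tgt, evecs_src, evecs_tgt, mass_src, mass_tgt):
        # Project per-vertex descriptors into each shape's Laplace-Beltrami
        # eigenbasis (mass-weighted inner product with the eigenvectors).
        a = evecs_src.T @ (mass_src.unsqueeze(1) * feat_src)   # (k, d) source coefficients
        b = evecs_tgt.T @ (mass_tgt.unsqueeze(1) * feat_tgt)   # (k, d) target coefficients
        # Least-squares estimate of C such that C @ a approximates b.
        C = torch.linalg.lstsq(a.T, b.T).solution.T             # (k, k) functional map
        return C

    def pointwise_map(C, evecs_src, evecs_tgt):
        # Convert the functional map to a vertex-to-vertex map: push each
        # source vertex's spectral embedding through C and take its nearest
        # neighbour among the target vertices' spectral embeddings.
        emb_src = evecs_src @ C.T               # (n_src, k)
        dists = torch.cdist(emb_src, evecs_tgt) # (n_src, n_tgt)
        return dists.argmin(dim=1)              # matched target vertex per source vertex

In the pipeline described above, two such maps (one from the intrinsic features and one from the orientation-preserving features) would be estimated and then combined, for example through joint regularization, before conversion to the final dense correspondence.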
Full-Text [PDF 604 kB]
Type of Study: Closed - 2024 Special Issue on Applications of Deep Learning in Electrical and Electronic Engineering | Subject: Image Processing
Received: 2024/10/19 | Revised: 2025/01/03 | Accepted: 2024/12/15

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

© 2022 by the authors. Licensee IUST, Tehran, Iran. This is an open access journal distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license.