Publication number: 18758967251335691

Neighbor-Aware Information Fusion for Point Cloud Classification and Segmentation

Shuifa Sun 1, 2
Yongheng Tang 2
Anning Xu 2
Xuchen Li 2
Yongwei Miao 1
Ben Wang 1
Yirong Wu 3
Publication type: Journal Article
Publication date: 2025-05-08
Scimago quartile: Q2
WoS quartile: Q4
SJR: 0.364
CiteScore: 4.2
Impact factor: 1.0
ISSN: 1064-1246, 1875-8967
Abstract

In recent years, with the development of technologies such as computer vision, machine learning, and deep learning, together with the spread of large-scale data-acquisition devices, 3D point cloud processing has become increasingly important. It is widely applied in fields such as object recognition, robot navigation, building information modeling (BIM), and urban planning. As the volume of acquired point cloud data grows, processing it accurately and efficiently has become a challenge for existing models. To improve the accuracy of point cloud classification and segmentation tasks, this study proposes an improved point cloud classification and segmentation model based on neighbor-aware information fusion. The model includes a Fusion Neighbor Information Feature Enhancement (FNIFE) module, which connects points in the local neighborhood and derives the features of the current point from the feature relationships between neighboring points. By enhancing each point's feature representation, it reduces the feature loss caused by the feature extraction operation and improves classification accuracy. Additionally, the model includes a Reverse Transmission of Point Features (RToPF) module, in which interpolation parameters are adjusted so that the enhanced feature information is transmitted effectively, improving both the accuracy and the computing speed of the model. Finally, to further improve classification accuracy, a module containing the X-Conv operator replaces the max-pooling in the original network, reducing the feature loss generated during feature extraction. Comparative experiments are conducted on the ModelNet40, ShapeNet, S3DIS, and ScanNet datasets. The results show that the overall accuracy of the proposed model reaches 92.4%, the average accuracy reaches 90.2% on the classification task, and the mean intersection over union reaches 84.5% on the segmentation task, outperforming state-of-the-art models on both tasks.
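The abstract describes the FNIFE module only at a high level: connect each point to its local neighborhood and enhance its feature from the feature relationships among neighbors. This record does not give the paper's implementation, so the following is a minimal NumPy sketch of one common form of such neighborhood fusion (kNN gather, concatenation of the center feature with relative neighbor features, max aggregation); the function names and the max-aggregation choice are illustrative assumptions, not the paper's method.

```python
import numpy as np

def knn_indices(points, k):
    """For each point, indices of its k nearest neighbors (self excluded)."""
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)  # (N, N)
    return np.argsort(d2, axis=1)[:, 1:k + 1]  # column 0 is the point itself

def fuse_neighbor_features(points, feats, k=3):
    """Fuse each point's feature with its neighborhood:
    concatenate [center feature, neighbor feature - center feature]
    per neighbor, then aggregate over the neighborhood by max."""
    idx = knn_indices(points, k)                   # (N, k)
    neigh = feats[idx]                             # (N, k, C)
    center = feats[:, None, :].repeat(k, axis=1)   # (N, k, C)
    edge = np.concatenate([center, neigh - center], axis=-1)  # (N, k, 2C)
    return edge.max(axis=1)                        # (N, 2C) enhanced features
```

A point with C-dimensional input features comes out with a 2C-dimensional enhanced feature that encodes both its own value and its relation to the neighborhood, which is the general idea behind reducing feature loss in the downsampling stages.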
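The RToPF module is said to adjust interpolation parameters so that enhanced features are transmitted back through the network effectively. The exact scheme is not specified on this page; a sketch assuming the widely used inverse-distance-weighted interpolation (the feature-propagation step popularized by PointNet++-style decoders), where `k` and `eps` are the tunable interpolation parameters:

```python
import numpy as np

def propagate_features(coarse_pts, coarse_feats, dense_pts, k=2, eps=1e-8):
    """Interpolate features from a coarse (downsampled) point set back onto a
    dense one using inverse-distance weights over the k nearest coarse points."""
    # Squared distances from every dense point to every coarse point: (M, N)
    d2 = np.sum((dense_pts[:, None, :] - coarse_pts[None, :, :]) ** 2, axis=-1)
    idx = np.argsort(d2, axis=1)[:, :k]                    # k nearest coarse points
    w = 1.0 / (np.take_along_axis(d2, idx, axis=1) + eps)  # inverse-distance weights
    w = w / w.sum(axis=1, keepdims=True)                   # normalize per dense point
    return np.einsum('mk,mkc->mc', w, coarse_feats[idx])   # (M, C)
```

A dense point that coincides with a coarse point recovers (almost exactly) that point's feature, while points in between receive a distance-weighted blend, which is how decoder stages hand enhanced features back to the full-resolution cloud.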

Cite this

GOST
Sun S. et al. Neighbor-Aware Information Fusion for Point Cloud Classification and Segmentation // Journal of Intelligent and Fuzzy Systems. 2025. 18758967251335691
GOST (all authors)
Sun S., Tang Y., Xu A., Li X., Miao Y., Wang B., Wu Y. Neighbor-Aware Information Fusion for Point Cloud Classification and Segmentation // Journal of Intelligent and Fuzzy Systems. 2025. 18758967251335691
RIS
TY - JOUR
DO - 10.1177/18758967251335691
UR - https://journals.sagepub.com/doi/10.1177/18758967251335691
TI - Neighbor-Aware Information Fusion for Point Cloud Classification and Segmentation
T2 - Journal of Intelligent and Fuzzy Systems
AU - Sun, Shuifa
AU - Tang, Yongheng
AU - Xu, Anning
AU - Li, Xuchen
AU - Miao, Yongwei
AU - Wang, Ben
AU - Wu, Yirong
PY - 2025
DA - 2025/05/08
PB - SAGE
SN - 1064-1246
SN - 1875-8967
ER -
BibTeX
@article{2025_SUN,
author = {Shuifa Sun and Yongheng Tang and Anning Xu and Xuchen Li and Yongwei Miao and Ben Wang and Yirong Wu},
title = {Neighbor-Aware Information Fusion for Point Cloud Classification and Segmentation},
journal = {Journal of Intelligent and Fuzzy Systems},
year = {2025},
publisher = {SAGE},
month = {may},
url = {https://journals.sagepub.com/doi/10.1177/18758967251335691},
pages = {18758967251335691},
doi = {10.1177/18758967251335691}
}