Capsule networks (CapsNet) have emerged as a promising architectural framework for various machine-learning tasks, offering advantages in capturing hierarchical relationships and spatial hierarchies within data. One of the most crucial components of CapsNet is the squash function, which plays a pivotal role in transforming capsule activations. Despite the success of the standard squash function, limitations remain: difficulty learning complex patterns from small vectors, vanishing gradients, and potential struggles with large datasets. To address these challenges, we enhance the squash function, building on our previous research, which recommended such an enhancement as a direction for future work. The goal is to improve CapsNet's performance in high-dimensional and complex data scenarios. Enhancing CapsNet for complex tasks such as bone marrow (BM) cell classification requires optimizing its fundamental operations, and the squash function directly affects feature representation and routing dynamics. The proposed enhancement improves feature representation, preserves spatial relationships, and reduces information loss during routing. It increased BM cell classification accuracy from 96.99% to 98.52%, showing that our method improves CapsNet performance, especially on complex, large-scale tasks such as BM cell classification. Comparing the improved model with the standard CapsNet across benchmark datasets supports this result: the enhanced-squash CapsNet outperforms the standard model on MNIST, CIFAR-10, and Fashion-MNIST with accuracies of 99.83%, 73%, and 94.66%, respectively. These findings show that the enhanced squash function improves CapsNet performance across diverse datasets, confirm its potential for real-world machine-learning applications, and highlight the need for further research.
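For context, the standard squash function that the abstract critiques (from the original CapsNet formulation by Sabour et al.) can be sketched as follows. This is a minimal NumPy version of the standard function only; the paper's enhanced variant is not specified in this abstract, so no attempt is made to reproduce it here.

```python
import numpy as np

def squash(s, eps=1e-8):
    """Standard CapsNet squash: v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||).

    Maps a capsule's input vector s to an output with the same direction
    and a norm in [0, 1). Note the abstract's criticism: for small ||s||,
    the scaling factor ||s||^2 / (1 + ||s||^2) is tiny, which can shrink
    activations and contribute to vanishing gradients.
    """
    sq_norm = np.sum(s ** 2, axis=-1, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)
```

For example, a long vector such as `[10.0, 0.0]` is squashed to a norm just under 1, while a short vector such as `[0.1, 0.0]` is compressed to a norm below 0.01, illustrating the small-vector behavior the paper aims to improve.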
| Primary Language | English |
|---|---|
| Subjects | Biomedical Imaging, Biomedical Engineering (Other) |
| Journal Section | Research Articles |
| Authors | |
| Early Pub Date | September 13, 2024 |
| Publication Date | September 15, 2024 |
| Submission Date | June 6, 2024 |
| Acceptance Date | September 12, 2024 |
| Published in Issue | Year 2024, Volume: 7, Issue: 5 |