ChebyNet: Boosting Neural Network Fitting and Efficiency through Chebyshev Polynomial Layer Connections

Abstract

Traditional deep neural networks (DNNs) predominantly adhere to a similar design paradigm: even with the incorporation of additive shortcuts, they lack explicit modeling of relationships between non-adjacent layers. This paradigm consequently constrains the fitting capability of existing DNNs. To address this issue, we propose ChebyNet, a novel network paradigm that builds Chebyshev polynomial connections between general network layers. Specifically, we establish a recursive relationship among adjacent layers and a polynomial relationship between non-adjacent layers to construct ChebyNet, which improves the representation capability of the network. Experimentally, we comprehensively evaluate ChebyNet on diverse tasks, including function approximation, semantic segmentation, and visual recognition. Across all of these tasks, ChebyNet consistently outperforms traditional neural networks under identical training conditions, demonstrating superior efficiency and fitting properties. Our findings underscore the potential of polynomial-based layer connections to significantly enhance neural network performance, offering a promising direction for future deep learning architectures.
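
To make the idea concrete, below is a minimal PyTorch sketch of what a Chebyshev-style layer connection might look like. It assumes the layer outputs follow the Chebyshev recurrence $h_{k+1} = 2\,f_k(h_k) - h_{k-1}$, mirroring $T_{n+1}(x) = 2x\,T_n(x) - T_{n-1}(x)$; the module name `ChebyBlock` and the choice of per-step transform $f_k$ are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class ChebyBlock(nn.Module):
    """Hypothetical sketch of a Chebyshev-style connection between layers.

    Features propagate via h_{k+1} = 2 * f_k(h_k) - h_{k-1}, so each layer
    is explicitly coupled to the output two steps back (a non-adjacent
    connection), in addition to the adjacent recursive step.
    """

    def __init__(self, dim: int, depth: int):
        super().__init__()
        # One learnable transform f_k per step of the recurrence (assumed form).
        self.layers = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU()) for _ in range(depth)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Analogous to T_0(x) = x and T_1(x) = f_0(x).
        h_prev, h_curr = x, self.layers[0](x)
        for f in self.layers[1:]:
            # Chebyshev recurrence: ties the new output to h_prev, the
            # activation from two layers earlier.
            h_prev, h_curr = h_curr, 2 * f(h_curr) - h_prev
        return h_curr

if __name__ == "__main__":
    block = ChebyBlock(dim=16, depth=4)
    out = block(torch.randn(8, 16))
    print(out.shape)  # torch.Size([8, 16])
```

Compared with a plain additive shortcut, which only sums a layer's input into its output, the recurrence above injects a signed contribution from a non-adjacent layer at every step, which is the kind of explicit cross-layer relationship the abstract describes.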

Publication
ICLR 2025
Yue Xin