Yue Xin

忻岳 | Second-Year Master’s Student

Shanghai Jiao Tong University (SJTU)

Biography

I am a second-year M.S. student at the Institute of Media, Information, and Network (min), Shanghai Jiao Tong University, advised by Prof. Hongkai Xiong and Prof. Wenrui Dai. I received my bachelor’s degree in Electronic Science and Technology from SJTU.

I have a keen interest in machine learning, interpretable AI, and large (language) models. My goal is to identify the fundamental optimization goal of deep learning and establish interpretable machine learning systems.

Download my CV.

Interests
  • Machine Learning
  • Large Language Models
  • Interpretable AI
Education
  • M.S. in Information and Communication Engineering, 2023

    SJTU

  • B.S. in Electronic Science and Technology (major) and Computer Science and Technology (minor), 2019

    SJTU

Publications

Clarifying the Behavior and the Difficulty of Adversarial Training
Towards the Dynamics of a DNN Learning Symbolic Interactions
ChebHiPoly: Hierarchical Chebyshev Polynomial Modules for Enhanced Approximation and Optimization
GLEAM: Global Share Local Transform MoE for Downstream Transferring With Enhanced Parameter Efficiency
Bootstrap Prompt Learning with Feature Adaptation for Vision-Language Efficient Tuning
RST: Residual Side Tuning with Cross-Layer Correlation for Parameter Efficient Transfer Learning
Enhancing Chain-of-Thought Reasoning with Critical Representation Fine-tuning
SalaMAnder: Shapley-based Mathematical Expression Attribution and Metric for Chain-of-Thought Reasoning
SCOT: Disentangled Controllable Generation via Side-Tuning Orthogonalized ControlNet

Research Experience

Institute of Media, Information and Network (min), SJTU
Machine Learning and Computer Vision Intern and Master’s Student
November 2022 – Present · Shanghai, China

Advised by Prof. Hongkai Xiong and Prof. Wenrui Dai, I have worked on the following research:

  • Proposed modular Chebyshev connections for general network layers to improve the approximation capability of neural networks. Specifically, established a recursive relationship between adjacent layers and a polynomial relationship between non-adjacent layers. Comprehensive experiments verify its strong approximation and optimization capability. (Submitted to ICME 2025)
  • Proposed the Residual Side Tuning (RST) framework to enhance information-extraction efficiency by employing a dual-block side-tuning structure with low-rank linear mapping on aggregated features, and introducing an element-wise feature-enhancement strategy to model cross-layer information. The properties and performance of RST are verified through mathematical proofs and extensive experiments. (Submitted to ICML 2025)
  • Proposed GLEAM, a parameter-efficient fine-tuning method for large models. It leverages the high similarity of LoRA parameter matrices to construct a low-rank decomposition, further reducing the number of trainable parameters while improving performance. (Submitted to ICME 2025)
  • Proposed PAT, a fault-tolerant multimodal classifier that resolves the conflict between prompt learning and adapter tuning. Specifically, it aligns model representations obtained from soft-prompt and adapter-based methods and incorporates a contrastive learning loss to enhance performance and generalization. (Submitted to ICML 2025)
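The recursive relationship exploited by the Chebyshev connections can be illustrated with the classical three-term recurrence T_{n+1}(x) = 2x·T_n(x) − T_{n−1}(x); the sketch below is my own minimal example of that recurrence, not the submitted implementation, and the function name is an assumption:

```python
def chebyshev_basis(x, order):
    """Return [T_0(x), ..., T_order(x)] via the three-term recurrence.

    Illustrative sketch only: adjacent polynomials are related recursively,
    while non-adjacent ones relate polynomially (e.g. T_m(T_n(x)) = T_{mn}(x)).
    """
    polys = [1.0, x]  # T_0(x) = 1, T_1(x) = x
    for _ in range(2, order + 1):
        # T_{n+1}(x) = 2x * T_n(x) - T_{n-1}(x)
        polys.append(2 * x * polys[-1] - polys[-2])
    return polys[: order + 1]
```

For example, `chebyshev_basis(0.5, 3)` yields `[1.0, 0.5, -0.5, -1.0]`, matching T_0 through T_3 evaluated at x = 0.5.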
Feitian Lab, Alibaba Cloud
Interpretable LLM Research Intern
March 2024 – October 2024 · Hangzhou, China

Advised by Prof. Jieping Ye, I worked on the following research:

  • Proposed SalaMAnder, a Shapley-value-based framework for quantifying component-level contributions in CoT reasoning. Specifically, we developed an efficient stratified sampling algorithm to compute Shapley values for mathematical-expression attribution, together with the CoSP (Cardinality of Shapley Positives) metric. Theoretical derivation and comprehensive validation across multiple models and benchmarks demonstrate a robust monotonic correlation between CoSP and model performance, providing a theoretical explanation for the empirical success of CoT. (Submitted to ACL 2025)
  • Proposed CRFT, a novel method that identifies and optimizes critical representations, i.e., those that integrate significant information from preceding layers or regulate the representations of subsequent layers. Guided by information-flow analysis, CRFT optimizes these representations in a low-rank linear subspace. (Submitted to ACL 2025)
  • Categorized and summarized research on the interpretability of large language models.
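For reference, the Shapley value that SalaMAnder approximates can be computed exactly for tiny games by enumerating permutations. This is a minimal sketch of the standard definition, not the paper's stratified sampling algorithm; `value_fn` and the player encoding are assumptions of mine:

```python
import itertools
import math

def shapley_values(players, value_fn):
    """Exact Shapley values by enumerating all player orderings.

    Feasible only for small games (n! permutations); value_fn maps a
    frozenset of players to a real-valued coalition payoff.
    """
    n = len(players)
    contrib = {p: 0.0 for p in players}
    for perm in itertools.permutations(players):
        coalition = frozenset()
        for p in perm:
            before = value_fn(coalition)
            coalition = coalition | {p}
            # Marginal contribution of p given the players before it
            contrib[p] += value_fn(coalition) - before
    # Average over all n! orderings
    return {p: c / math.factorial(n) for p, c in contrib.items()}
```

As a sanity check, for an additive game (payoff = sum of individual weights) each player's Shapley value equals its own weight, which is why sampling-based estimators like the one in SalaMAnder can be validated on such games.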
Interpretable ML Lab, SJTU
Interpretable Machine Learning Intern
February 2022 – November 2023 · Shanghai, China

Advised by Prof. Quanshi Zhang, I worked on the following research:

  • Theoretically derived the analytical solution of multi-step adversarial attacks, which explains the optimization difficulties in adversarial training; the analysis is validated by experiments. (Accepted by AAAI 2024)
  • Theoretically derived the two-stage dynamics by which DNNs learn interactions, proving that the learning process gradually encodes interactions of varying complexity. This provides a theoretical foundation for understanding overfitting. (Accepted by NeurIPS 2024)
  • Theoretically derived and validated the robustness of concepts with different complexities.
SunnyLab, SJTU
Machine Learning and Computer Vision Intern
May 2021 – May 2022 · Shanghai, China

Advised by Prof. Chongyang Zhang, I worked on the following research:

  • Developed a Swin-Transformer-based model to perform instance segmentation of workpiece welding areas.
  • Designed a spatio-temporal filter to remove false positives in pedestrian detection.
  • Developed a YOLOv5-based model to detect tower cranes, recognize dangerous tower cranes, and label electronic fences.

Engineering Experience

Qilin Investment
Quantitative Strategy Research and Machine Learning Intern
October 2024 – January 2025 · Shanghai, China

My research and engineering projects include:

  • Developed various models for intraday prediction, surpassing the baseline model on the global domain (A-shares) and improving live-trading performance within three weeks. Conducted ablation studies to explain the performance gains and validate the models’ generalizability.
  • Fine-tuned the above models to improve performance by 10% in local domains (ZZ800 and ZZ1000) while keeping the degradation in overall performance minimal.

Competitions

The 20th Chinese Graduate Mathematical Modeling Competition
The Mathematical Contest in Modeling
The Huawei Cloud ‘Cloud Pioneers’ Few-Shot Detection Competition
The 12th National College Student Mathematical Competition
The 2nd National ‘August 1st Cup’ Online Mathematics Competition
Chinese Physics Olympiad

Honors and Awards

Outstanding Undergraduate Graduate of Shanghai Jiao Tong University
National Scholarship
Shanghai Jiao Tong University A-Class Excellent Scholarship for Undergraduate
Shenzhen Stock Exchange Scholarship
Shanghai Jiao Tong University B-Class Excellent Scholarship for Undergraduate

Skills

Programming Languages and Frameworks

Python, C++, Matlab, LaTeX, Linux, PyTorch, NumPy, Anaconda, Git, OpenCV

Mathematics

Calculus, Linear Algebra, Probability and Statistics

Languages

Mandarin (native), English (fluent)

Activities and Leadership

Member of School Table Tennis Team
SJTU
September 2019 – Present
Head Coach of College Table Tennis Team and Club
Zhiyuan College, SJTU
September 2021 – Present
Captain of College Table Tennis Team
School of Electronic Information and Electrical Engineering, SJTU
September 2021 – December 2023

Contributions include:

  • Third Place in the Team Category at the Tizong Cup in 2021.
  • Second Place in the Team Category at the School Sports Meet in 2022.
Member of School Track and Field Team
SJTU
September 2020 – May 2021

Contributions include:

  • Second Place in the Men’s 4×100-Meter Relay at the School Sports Meet in 2020.
  • First Place in the Men’s 4×100-Meter Relay at the 2021 Track and Field Athletics Meet.
Counselor of Physics Subject Camp
SJTU
September 2020 – January 2021