May 01, 2025 | 🎉 (i) A preliminary version of MGPath has been accepted to the Workshop on Foundation Models in the Wild at ICLR 2025, and (ii) another paper on LLaMA-Adapter's prompt learning has been accepted at ICML 2025 (code is coming soon!). |
Apr 20, 2025 | 🎉 Our work on building a new Inductive Message Passing Network for Efficient Human-in-the-Loop Annotation of Mobile Eye Tracking Data has been accepted at Scientific Reports, Nature Portfolio. |
Feb 20, 2025 | Excited to share our latest work! 🎉 (i) On Zero-Initialized Attention: Optimal Prompt and Gating Factor Estimation – we introduce a Mixture of Experts (MoE) perspective to explain the mechanism behind LLaMA-Adapter's prompt learning. (ii) MGPath – a novel multi-granular prompt learning method for few-shot WSI pathology prediction, leveraging the power of foundation vision-language models. |
Oct 08, 2024 | 🇨🇭 Started my visiting research at the ETH AI Center, ETH Zurich, working on Multi-Modal LLMs for Healthcare empowered by Retrieval-Augmented Generation. |
Oct 07, 2024 | Excited to introduce our latest work on medical multi-modal LLMs: LoGra-Med, a novel pre-training algorithm that incorporates multi-graph alignment to effectively address the data-hungry nature of autoregressive learning. |
Oct 06, 2024 | The paper PiToMe has been accepted at NeurIPS 2024. Our code will be available soon! |
Jun 10, 2024 | Our new preprint PiToMe is online. We propose a new spectrum-preserving method for token merging in Transformers. |
May 01, 2024 | A paper submitted to ICML 2024 on molecular conformer aggregation networks has been accepted. |
Jan 15, 2024 | A paper submitted to ICLR 2024 on accelerating transformers has been accepted as an oral presentation. |
Sep 22, 2023 | A paper submitted to NeurIPS 2023 on large-scale medical image pre-trained models using second-order graph matching has been accepted. |