news

Feb 20, 2025 :bell: Excited to share our latest work! 🎉 (i) On Zero-Initialized Attention: Optimal Prompt and Gating Factor Estimation – we introduce a Mixture of Experts (MoE) perspective to explain the mechanism behind LLaMA-Adapter’s prompt learning. (ii) MGPath – a novel multi-granular prompt learning method for few-shot WSI pathology prediction, leveraging the power of foundation vision-language models.
Oct 08, 2024 🇨🇭 Started my visiting research at the ETH AI Center, ETH Zurich, working on Multi-Modal LLMs for Healthcare empowered by Retrieval-Augmented Generation.
Oct 07, 2024 :bell: Excited to introduce our latest work on medical multi-modal LLMs: LoGra-Med, a novel pre-training algorithm that incorporates multi-graph alignment to effectively address the data-hungry nature of autoregressive learning.
Oct 06, 2024 :rocket: The paper PiToMe has been accepted at NeurIPS 2024. Our code will be available soon!
Jun 10, 2024 :bell: Our new preprint PiToMe is online. We propose a new spectrum-preserving method for token merging in Transformers.
May 01, 2024 :rocket: A paper on molecular conformer aggregation networks has been accepted at ICML 2024.
Jan 15, 2024 :rocket: A paper on accelerating transformers has been accepted at ICLR 2024 as an oral presentation.
Sep 22, 2023 :rocket: A paper on large-scale medical image pre-trained models using second-order graph matching has been accepted at NeurIPS 2023.