Do Large Language Models Have Compositional Ability? An Investigation into Limitations and Scalability. Z Xu*, Z Shi*, Y Liang. COLM 2024: Conference on Language Modeling; also ICLR 2024 Workshop on …, 2024. Cited by 33.
Towards Few-Shot Adaptation of Foundation Models via Multitask Finetuning. Z Xu, Z Shi, J Wei, F Mu, Y Li, Y Liang. ICLR 2024: The Twelfth International Conference on Learning Representations, 2024. Cited by 25.
Conv-Basis: A New Paradigm for Efficient Attention Inference and Gradient Computation in Transformers. Y Liang*, H Liu*, Z Shi*, Z Song*, Z Xu*, J Yin*. arXiv preprint arXiv:2405.05219, 2024. Cited by 23.
Improving Foundation Models for Few-Shot Learning via Multitask Finetuning. Z Xu, Z Shi, J Wei, Y Li, Y Liang. ICLR 2023 Workshop on Mathematical and Empirical Understanding of Foundation …, 2023. Cited by 11.
Generalized Tensor Regression with Covariates on Multiple Modes. Z Xu, J Hu, M Wang. arXiv preprint arXiv:1910.09499, 2019. Cited by 8.
Out-of-Distribution Generalization via Composition: A Lens through Induction Heads in Transformers. J Song, Z Xu, Y Zhong. arXiv preprint arXiv:2408.09503, 2024. Cited by 6.
AdaInf: Adaptive Inference for Resource-Constrained Foundation Models. Z Xu, KD Nguyen, P Mukherjee, S Chaterji, Y Liang, Y Li. Workshop on Efficient Systems for Foundation Models II @ ICML 2024, 2024. Cited by 3.
Spatial Transcriptomics Dimensionality Reduction Using Wavelet Bases. Z Xu, K Sankaran. arXiv preprint arXiv:2205.11243, 2022.
Spatial Transcriptomics Dimensionality Reduction Using Wavelet Bases [version 1; peer review: 3 approved with reservations]. Z Xu, K Sankaran. 2022.
Towards Better Adaptation of Foundation Models. Z Xu.