X-LoRA: Mixture of Low-Rank Adapter Experts, a Flexible Framework for Large Language Models with Applications in Protein Mechanics and Molecular Design
ericlbuehler/mistral.rs • 11 Feb 2024
Starting with a set of pre-trained LoRA adapters, our gating strategy uses the hidden states to dynamically mix adapted layers, allowing the resulting X-LoRA model to draw upon different capabilities and create…
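The gating idea in the excerpt — predicting per-adapter mixing weights from the hidden state and combining the adapters' outputs — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the names `xlora_mix` and `W_gate`, the single-vector shapes, and the plain softmax gate are all assumptions for the sketch.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()

def xlora_mix(hidden, adapter_outputs, W_gate):
    """Mix per-adapter outputs with weights predicted from the hidden state.

    hidden:          (d,) hidden-state vector feeding the gate
    adapter_outputs: list of (d,) output vectors, one per LoRA adapter
    W_gate:          (d, n_adapters) gating projection (illustrative name)
    """
    weights = softmax(hidden @ W_gate)                       # (n_adapters,)
    mixed = sum(w * out for w, out in zip(weights, adapter_outputs))
    return mixed, weights

# Tiny demo: 3 adapters acting on a 4-dim hidden state.
rng = np.random.default_rng(0)
d, n = 4, 3
hidden = rng.normal(size=d)
adapters = [rng.normal(size=d) for _ in range(n)]
W_gate = rng.normal(size=(d, n))
mixed, weights = xlora_mix(hidden, adapters, W_gate)
```

In the actual method this mixing happens per adapted layer, so the same set of adapters can be weighted differently at different depths of the model.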