Commit 64b5af1

Update tagline: Various LoRA adapters. One shared basis.
1 parent ef3fe3d

3 files changed: 5 additions, 5 deletions

README.md

2 additions, 2 deletions

@@ -3,10 +3,10 @@
 </p>

 <p align="center">
-  <strong>Shared low-rank subspaces for efficient LoRA adapter management.</strong>
+  <strong>Various LoRA adapters. One shared basis.</strong>
 </p>

-Based on the [Share paper](https://arxiv.org/abs/2602.06043): LoRA adapters across tasks share a common low-rank subspace. Instead of storing *N* separate adapters, maintain **one shared basis** and **per-task coefficient vectors**, achieving up to 122× compression at scale.
+Your adapters share more structure than you think. vLoRA finds the common basis and stores each adapter as a tiny coefficient vector — up to 122× compression at scale. Based on the [Share paper](https://arxiv.org/abs/2602.06043).

 ## Install

docs/index.md

2 additions, 2 deletions

@@ -1,8 +1,8 @@
 # vlora

-**Shared low-rank subspaces for efficient LoRA adapter management.**
+**Various LoRA adapters. One shared basis.**

-Based on the [Share paper](https://arxiv.org/abs/2602.06043): LoRA adapters across tasks share a common low-rank subspace. Instead of storing *N* separate adapters, maintain **one shared basis** and **per-task coefficient vectors**, achieving up to 122× compression at scale.
+Your adapters share more structure than you think. vLoRA finds the common basis and stores each adapter as a tiny coefficient vector — up to 122× compression at scale. Based on the [Share paper](https://arxiv.org/abs/2602.06043).

 ## Install

pyproject.toml

1 addition, 1 deletion

@@ -5,7 +5,7 @@ build-backend = "hatchling.build"
 [project]
 name = "vlora-dev"
 version = "0.2.1"
-description = "Shared low-rank subspaces for efficient LoRA adapter management"
+description = "Various LoRA adapters. One shared basis. Up to 122x compression at scale."
 readme = "README.md"
 license = "Apache-2.0"
 requires-python = ">=3.9"

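The new tagline compresses a concrete scheme into two sentences: keep one shared basis and store each adapter as a small coefficient vector. Below is a minimal NumPy sketch of that idea, not vLoRA's actual API; the shapes, the subspace dimension `m`, and every variable name are assumptions made for this demo. It stacks N flattened adapter updates, recovers a shared basis with a truncated SVD, and keeps only a coefficient vector per adapter.

```python
# Illustrative sketch only (not the vLoRA API): all names, shapes, and the
# subspace dimension m are assumptions made for this demo.
import numpy as np

rng = np.random.default_rng(0)

d, k = 64, 64        # per-layer weight shape (toy size)
n_tasks = 1000       # number of adapters, "at scale"
m = 8                # shared-subspace dimension (assumed)

# Synthesize adapters that exactly share an m-dimensional basis, so the
# compression effect is easy to see end to end.
true_basis = rng.normal(size=(m, d * k))
coeffs = rng.normal(size=(n_tasks, m))
updates = coeffs @ true_basis            # (n_tasks, d*k) flattened deltas

# Recover a shared basis from the stacked updates via truncated SVD.
_, _, vt = np.linalg.svd(updates, full_matrices=False)
basis = vt[:m]                           # (m, d*k) shared basis
per_task = updates @ basis.T             # (n_tasks, m) coefficient vectors

# Error is ~0 here because the toy data is exactly low-rank; real adapters
# only approximately share a subspace.
recon = per_task @ basis
print("max abs reconstruction error:", np.abs(recon - updates).max())

# Storage: one shared basis plus tiny coefficient vectors, versus N full
# flattened updates. (Real LoRA factor pairs change the exact accounting.)
naive = updates.size
shared = basis.size + per_task.size
print(f"compression: {naive / shared:.0f}x")
```

The basis is paid for once, so each additional adapter costs only m numbers instead of d·k; the ratio therefore grows with the number of adapters, which is the sense in which figures like 122× apply "at scale".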