npollock 17 hours ago | 2 replies - LoRA adapters modify the model's internal weights
jsight 14 hours ago - Yeah, I honestly think some of the language used with LoRA gets in the way of people understanding them. It becomes much easier to understand when looking at an actual implementation, as well as how they can be merged or kept separate.
make3 16 hours ago - Not unless they're explicitly merged, which isn't a requirement; merging is just a small speed optimization.
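A minimal numpy sketch of the point both replies make (all names, shapes, and the `alpha / r` scaling here are illustrative, not taken from the thread): a LoRA adapter can be applied separately at runtime, leaving the base weight untouched, or folded into the base weight once for speed. Both paths compute the same function.

```python
import numpy as np

# Illustrative sizes; real models use e.g. d=4096 with a small rank like r=8.
d, r = 16, 2
rng = np.random.default_rng(0)

W = rng.normal(size=(d, d))           # frozen base weight
A = rng.normal(size=(r, d)) * 0.01    # LoRA down-projection (trained)
B = rng.normal(size=(d, r)) * 0.01    # LoRA up-projection (trained; in real
                                      # LoRA, B starts at zero before training)
alpha = 16.0
scale = alpha / r                     # conventional LoRA scaling factor

x = rng.normal(size=(d,))

# Kept separate: base weights are never modified; the low-rank update
# is added on at inference time (extra matmuls per forward pass).
y_separate = W @ x + scale * (B @ (A @ x))

# Merged: fold the low-rank update into W once, then do a single matmul.
W_merged = W + scale * (B @ A)
y_merged = W_merged @ x

# Identical outputs up to floating-point error.
assert np.allclose(y_separate, y_merged)
```

Keeping the adapter separate is what lets you swap adapters per request or stack several on one base model; merging trades that flexibility for one fewer matmul per layer.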