
Fix Llama.close didn't free lora adapter #1679

Merged
abetlen merged 1 commit into abetlen:main from jkawamoto:free-lora-adapter
Aug 15, 2024
Conversation

@jkawamoto (Contributor)

Llama.close didn't free the LoRA adapter, but Llama.__del__ did. This PR moves the freeing of the LoRA adapter into the ExitStack that we already use to free the model, context, and other resources.

Additionally, this PR moves the initialization of the ExitStack to the top of __init__, ensuring that _stack always exists. As a result, we no longer need to check whether _stack exists or is None before using it.
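The pattern described above can be sketched roughly as follows. This is not the actual llama-cpp-python code; `_FakeNativeHandle` and the attribute names are illustrative stand-ins for the native model, context, and LoRA-adapter handles, showing how creating the `contextlib.ExitStack` first and registering every cleanup on it lets `close()` (and `__del__`) free everything, LoRA adapter included, without any `_stack` existence checks:

```python
import contextlib


class _FakeNativeHandle:
    """Hypothetical stand-in for a native resource (model, context, LoRA adapter)."""

    def __init__(self, name: str):
        self.name = name
        self.freed = False

    def free(self) -> None:
        self.freed = True


class Llama:
    def __init__(self) -> None:
        # Initialize the ExitStack before acquiring any resource,
        # so _stack is always present and never needs an existence check.
        self._stack = contextlib.ExitStack()

        self._model = _FakeNativeHandle("model")
        self._stack.callback(self._model.free)

        self._ctx = _FakeNativeHandle("context")
        self._stack.callback(self._ctx.free)

        # Registering the LoRA adapter's cleanup here is the fix:
        # close() now frees it, matching what __del__ already did.
        self._lora_adapter = _FakeNativeHandle("lora_adapter")
        self._stack.callback(self._lora_adapter.free)

    def close(self) -> None:
        # ExitStack.close() runs every registered callback in LIFO order.
        self._stack.close()

    def __del__(self) -> None:
        self.close()


llama = Llama()
llama.close()
assert llama._lora_adapter.freed and llama._model.freed and llama._ctx.freed
```

Calling `close()` a second time is harmless here, since `ExitStack.close()` runs each registered callback only once.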

@abetlen abetlen merged commit 3c7501b into abetlen:main Aug 15, 2024
@jkawamoto jkawamoto deleted the free-lora-adapter branch August 15, 2024 22:27
