
Add non-lora PEFT state into intermediate checkpoint saving. #1813

Open

yophis wants to merge 2 commits into haotian-liu:main from yophis:main

Conversation


@yophis yophis commented Dec 23, 2024

Add support for saving non-lora PEFT states in intermediate checkpoints during training.
This solves the problem where intermediate checkpoints trained with LoRA are not loadable: here
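A minimal sketch of what such a change might look like, assuming a `Trainer` subclass like LLaVA's `LLaVATrainer` and a LoRA fine-tuning setup where extra modules (e.g. the multimodal projector) are trained outside the adapter. The helper name, the `lora_enable` flag, and the `non_lora_trainables.bin` filename follow the conventions used at the end of LLaVA training, but the exact hook point here is an assumption, not the PR's verbatim diff:

```python
# Hypothetical sketch: persist non-LoRA trainable weights alongside the LoRA
# adapter at every intermediate checkpoint, so the checkpoint can be reloaded
# without waiting for the final save at the end of training.
import os
import torch
from transformers import Trainer


def get_non_lora_trainable_state(named_params):
    # Keep trainable parameters whose names do not contain "lora_"
    # (e.g. mm_projector weights). Under DeepSpeed ZeRO-3 these would
    # additionally need to be gathered before saving.
    return {
        name: param.detach().cpu().clone()
        for name, param in named_params
        if param.requires_grad and "lora_" not in name
    }


class LLaVATrainer(Trainer):
    def _save_checkpoint(self, model, trial, metrics=None):
        # Let the HF Trainer / PEFT write the usual checkpoint first
        # (adapter weights, optimizer state, scheduler, ...).
        super()._save_checkpoint(model, trial, metrics=metrics)

        if getattr(self.args, "lora_enable", False):
            checkpoint_folder = f"checkpoint-{self.state.global_step}"
            output_dir = os.path.join(self.args.output_dir, checkpoint_folder)

            # Collect trainable parameters that are NOT part of the LoRA
            # adapter; without this they are missing from intermediate
            # checkpoints and the checkpoint cannot be loaded for inference.
            non_lora_state_dict = get_non_lora_trainable_state(
                self.model.named_parameters()
            )
            if self.args.local_rank in (0, -1):
                torch.save(
                    non_lora_state_dict,
                    os.path.join(output_dir, "non_lora_trainables.bin"),
                )
```

With this in place, an intermediate `checkpoint-XXXX` directory contains both the LoRA adapter and `non_lora_trainables.bin`, mirroring what the final save already produces.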
