
Add support for HuBERT batch norm instead of weight norm in pos_conv_emb #34229

Closed
gallilmaimon opened this issue Oct 17, 2024 · 3 comments · Fixed by #34389
Labels
Feature request Request for a new feature

Comments

@gallilmaimon
Contributor

gallilmaimon commented Oct 17, 2024

Feature request

Motivation

  • This would make it possible to support other textlesslib/fairseq models, in particular the newer, better-performing HuBERT variant mhubert-base-25hz, which uses batch norm and is currently unsupported.
  • This model is frequently used for training speech language models, so supporting it would benefit the community as well as a project I am working on.

Your contribution

I can create a PR to implement this, but would love some guidance @ylacombe

@gallilmaimon gallilmaimon added the Feature request Request for a new feature label Oct 17, 2024
@avishaiElmakies
Contributor

Hi, I would love this feature as well @ylacombe

@ylacombe
Contributor

Hey @avishaiElmakies and @gallilmaimon , this would indeed be a great addition.

Would you like to open a PR to add this?
You'd have to add an option to use batch norm in configuration_hubert.py, propagate the change to the modeling file and the conversion script, and finally add an integration test. How does that sound?
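A minimal sketch of what the modeling-side change could look like. The flag name `use_batch_norm`, the default dimensions, and the placement of the batch norm before the convolution are all illustrative assumptions, not the final API; the real change would thread a config attribute from configuration_hubert.py into the HuBERT positional conv embedding module.

```python
import torch
import torch.nn as nn


class PositionalConvEmbedding(nn.Module):
    """Simplified positional conv embedding. `use_batch_norm` is an
    illustrative stand-in for a new flag in configuration_hubert.py:
    when True, batch norm replaces the usual weight-normalized conv."""

    def __init__(self, hidden_size=768, kernel_size=128, groups=16,
                 use_batch_norm=False):
        super().__init__()
        self.conv = nn.Conv1d(
            hidden_size, hidden_size,
            kernel_size=kernel_size,
            padding=kernel_size // 2,
            groups=groups,
        )
        if use_batch_norm:
            # Assumed placement: normalize channels before the conv
            # (mirroring the fairseq mhubert-base-25hz setup).
            self.batch_norm = nn.BatchNorm1d(hidden_size)
        else:
            # Current HuBERT behavior: weight-normalized conv.
            self.batch_norm = None
            self.conv = nn.utils.weight_norm(self.conv, name="weight", dim=2)

    def forward(self, hidden_states):
        # (batch, seq, hidden) -> (batch, hidden, seq) for Conv1d
        hidden_states = hidden_states.transpose(1, 2)
        if self.batch_norm is not None:
            hidden_states = self.batch_norm(hidden_states)
        hidden_states = self.conv(hidden_states)
        if self.conv.kernel_size[0] % 2 == 0:
            # Even kernel + symmetric padding adds one extra frame; trim it.
            hidden_states = hidden_states[:, :, :-1]
        hidden_states = nn.functional.gelu(hidden_states)
        return hidden_states.transpose(1, 2)
```

Both branches keep the output shape equal to the input shape, so the rest of the model is unaffected by the switch; the conversion script would then just have to map the fairseq batch-norm parameters onto `batch_norm` when the flag is set.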

Thanks

@gallilmaimon
Contributor Author

@ylacombe Sounds good. I will work on something and let you know when the PR is ready
