
Is there a way to find the earliest version of transformers that has a certain model? #35097

Open
ZihaoZheng98 opened this issue Dec 5, 2024 · 5 comments
Labels
Feature request Request for a new feature

Comments

@ZihaoZheng98

Feature request

Is there a way to find the earliest version of transformers that has a certain model? For example, I want to use CLIP in my project, but my existing transformers version is too old. I would like to upgrade transformers to the lowest version that supports CLIP, so that the rest of my code does not need to change.

Motivation

There are situations where I need to use a new model in an existing codebase. But when upgrading transformers, some parts of the code can become outdated and need to be modified.

Your contribution

I don't know, but I will try to help.

@ZihaoZheng98 ZihaoZheng98 added the Feature request label Dec 5, 2024
@egojoseph

egojoseph commented Dec 5, 2024

Hi @ZihaoZheng98,

This is a great question, and it’s a common concern when integrating new models into an existing codebase with older dependencies. Here's how you might approach this:

Solution

  1. Use the Hugging Face Model Hub

    • Each model on the Hugging Face Model Hub (e.g., CLIP) specifies the minimum version of transformers required in its documentation. You can find this information under the "Files and versions" tab or in the model's README.
  2. Check the Release Notes

    • The release notes in the Transformers GitHub Repository provide detailed information on new features and models introduced in each version. Searching for "CLIP" in the release history should help you identify the earliest version that includes this model.
  3. Determine Compatibility Programmatically

    • If you’re programmatically managing dependencies, you can install different versions of transformers and test their compatibility with CLIP. Use the following steps:
      pip install transformers==<version_number>
      python -c "from transformers import CLIPModel; print(CLIPModel.from_pretrained('openai/clip-vit-base-patch32'))"
      Replace <version_number> with specific versions to find the lowest compatible version.
  4. Propose a Compatibility Script

    • While there isn't currently an automated script to identify the lowest compatible version, it could be worth proposing as a feature for the Transformers library. This could be a valuable addition for users managing legacy systems.
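The check in step 3 can also be scripted so it doesn't download any model weights. A minimal sketch, assuming only the Python standard library; the helper name `model_available` is mine, not a transformers API:

```python
import importlib


def model_available(class_name, module="transformers"):
    """Return True if the installed version of `module` exposes `class_name`.

    This only checks that the class is importable; it does not download
    or load any model weights.
    """
    try:
        mod = importlib.import_module(module)
    except ImportError:
        return False
    return hasattr(mod, class_name)


# Demonstrated with a stdlib module, since transformers may not be installed:
print(model_available("Path", module="pathlib"))   # pathlib.Path exists
print(model_available("Nope", module="pathlib"))   # no such attribute
```

You could then `pip install transformers==<version_number>` in a loop and call `model_available("CLIPModel")` after each install to find the first version where it returns True.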

Advice
In the short term, you can:

  • Upgrade transformers incrementally (one version at a time) to minimize disruptions.
  • Pin dependencies to a specific version to ensure reproducibility:
    pip install transformers==4.X.X
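If you script the version checks, you don't have to step through every release one at a time: availability is monotone (once a model ships in a release, later releases keep it), so a binary search over the sorted release list finds the earliest supporting version in O(log n) installs. A sketch; `supports` is a caller-supplied predicate that would wrap a pip-install-and-import check:

```python
def lowest_supported(versions, supports):
    """Given `versions` sorted oldest-to-newest, return the first version
    for which supports(version) is True, or None if none qualifies.

    Assumes monotonicity: once supports() is True for some version, it is
    True for all later ones, which holds for "model X exists in release Y".
    """
    lo, hi = 0, len(versions)
    while lo < hi:
        mid = (lo + hi) // 2
        if supports(versions[mid]):
            hi = mid          # earliest supporting version is at mid or before
        else:
            lo = mid + 1      # everything up to mid lacks the model
    return versions[lo] if lo < len(versions) else None


# Toy predicate standing in for "pip install transformers==v, then try the import".
# (These toy strings compare correctly lexicographically; real version strings
# should be compared with packaging.version.Version instead.)
releases = ["4.0.0", "4.1.0", "4.2.0", "4.3.0", "4.4.0"]
print(lowest_supported(releases, lambda v: v >= "4.2.0"))
```

Since each probe may involve a slow `pip install`, cutting the number of probes from linear to logarithmic matters in practice.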

If desired, I can help draft a proposal for adding a utility to automate identifying the minimum version of transformers required for a given model. This feature could potentially benefit many users with similar needs.

Let me know if you would like further assistance.

Best,
Ego Joseph

@ZihaoZheng98
Author

Thanks! Very helpful.

@egojoseph

egojoseph commented Dec 6, 2024


Hi @ZihaoZheng98, I am glad you found the suggestions helpful.

To expand on this idea, I’d like to tag @Rocketknight1 to gather his thoughts on whether a utility for automating compatibility checks could align with the project’s vision. This feature might streamline workflows for developers handling legacy systems and could bring value to the community.

I would love to hear any thoughts or feedback.

@Rocketknight1
Member

I thought I replied to this yesterday, but apparently not! We're unlikely to add this - the reason is that the earliest version that supports a model is more likely to contain bugs or small issues in that model's code, and so we don't really want to encourage users to use it.

Instead, we just recommend that users stay current with the latest versions of transformers, which should have better performance and stability across the board.

@egojoseph


Thank you for clarifying. I completely understand the focus on encouraging users to stay current with the latest stable versions. It's great to see how the project prioritizes stability and performance. I appreciate the opportunity to engage in this discussion and learn from the team's approach.
