
Ensure that we get the cuda version of faiss. #1078

Merged (1 commit, Dec 9, 2022)

Conversation

vyasr
Contributor

@vyasr vyasr commented Dec 8, 2022

If the current dependency spec is used in an environment where other enabled channels contain the non-CUDA build of FAISS, we need to ensure that the CUDA-enabled package is the one that gets downloaded.

@vyasr vyasr added bug Something isn't working 3 - Ready for Review non-breaking Non-breaking change labels Dec 8, 2022
@vyasr vyasr requested a review from a team as a code owner December 8, 2022 00:46
@vyasr vyasr self-assigned this Dec 8, 2022
@bdice
Contributor

bdice commented Dec 8, 2022

See this comment for additional details: #1065 (comment)

This is problematic specifically when the pytorch channel has higher priority than conda-forge: pytorch has a package matching libfaiss>=1.7.0, but it lacks a selector package like conda-forge's faiss-proc, which ensures the CUDA build is preferred.
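The faiss-proc mechanism described above can be sketched roughly as follows; this is an illustrative environment fragment, not the exact pins from this PR:

```yaml
# Illustrative conda environment spec (hypothetical pins).
# On conda-forge, faiss-proc is a mutex package whose build string
# ("cuda" vs "cpu") steers the solver toward the matching libfaiss build.
# With only "libfaiss>=1.7.0", a higher-priority channel (e.g. pytorch)
# can satisfy the spec with a CPU-only build instead.
dependencies:
  - faiss-proc=*=cuda   # force the CUDA-enabled variant
  - libfaiss>=1.7.0
```

With the extra faiss-proc pin, the solver rejects any libfaiss build that does not match the CUDA variant, regardless of channel priority for the libfaiss package itself.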

@cjnolet
Member

cjnolet commented Dec 8, 2022

We are actively working to remove the FAISS dependency (aiming for 23.02), so hopefully we won't need to do this dance much longer.

cc @benfred @teju85 @dantegd

@cjnolet
Member

cjnolet commented Dec 9, 2022

@gpucibot merge

@rapids-bot rapids-bot bot merged commit e9ce36a into rapidsai:branch-23.02 Dec 9, 2022
@vyasr vyasr deleted the feat/faiss_gpu branch December 9, 2022 18:48