Using precomputed MSA and PDB files for running massive 3D structure prediction #274
Comments
Did this not work? sokrypton/ColabFold#563 (comment) I use
So, basically, I appended all my peptide sequences together, using ":" as the separator between them (see the sketch below); let's say that file's name is tmp.fasta, and I received the following error:
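For reference, the layout described above would look roughly like the sketch below. The sequences are placeholders, and note that ColabFold treats sequences joined by ":" within a single entry as chains of one complex rather than as separate targets.

```bash
# Hypothetical reconstruction of the tmp.fasta described above: all peptides
# in a single entry, joined by ":" (the sequences are placeholders).
cat > tmp.fasta << 'EOF'
>peptides
ACDEFGHIKL:MNPQRSTVWY:ACDEFGHIKL
EOF
```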
Please show me your commit hash. For example, ColabFold on my machine has
If your commit hash is old, updating LocalColabFold will fix this issue.
Just in case, I freshly installed localcolabfold with the script
I am running this on CPUs and my gcc version is 9.4.0.
Hello,
I have a fasta file containing thousands of peptide sequences. I wanted to predict their 3D structures using LocalColabFold 1.5.5 installed on an HPC cluster, and I have access to GPU clusters as well. I was successfully able to generate PDB & MSA files by following the post/issue sokrypton/ColabFold#563 (a rough sketch of that step is included below).
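For readers who land here, a rough sketch of that CPU-side search step, assuming a combined FASTA and a local copy of the ColabFold databases; the database path and thread count are placeholders, and the flag names should be verified against `colabfold_search --help` for your version.

```bash
# Sketch of the MMseqs2-based search from sokrypton/ColabFold#563: run it on
# the CPU cluster to write one .a3m per query (and, with --use_templates 1,
# a template hit file) into msas/. Paths and values are placeholders.
colabfold_search \
  --use_env 1 \
  --use_templates 1 \
  --threads 32 \
  all_peptides.fasta /path/to/colabfold_databases msas/
```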
However, as I mentioned, I have multiple peptides in my fasta file, and I would like to use my GPU access to run the 3D structure predictions with the colabfold_batch command, using the PDB & MSA files I precomputed on the HPC cluster. This was asked in the linked issue but seems to have flown under the radar.
Currently, does LocalColabFold support massive prediction of peptides with the --pdb-hit-file flag?
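In case it helps, a minimal sketch of the kind of invocation being asked about, assuming the precomputed MSAs are .a3m files in msas/ and the template hits are in a single .m8 file. The file names are placeholders, and --local-pdb-path is an assumption on my part; please check both flags against `colabfold_batch --help` for your install.

```bash
# Sketch: point colabfold_batch at the directory of precomputed .a3m files
# instead of a FASTA, so the GPU run skips MSA generation, and supply the
# precomputed template hits. All paths are placeholders.
colabfold_batch \
  --templates \
  --pdb-hit-file msas/pdb70.m8 \
  --local-pdb-path /path/to/local/pdb/mmcif \
  msas/ predictions/
```

As far as I know, a directory input is treated as one job per .a3m file, which is what makes per-peptide batch prediction workable without contacting the MSA server again.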