torch #6132
Replies: 2 comments 2 replies
I was in there trying to add GrokAdamW, and I saw patterns that would force me to import the PyTorch version of the optimizer (thus limiting us to optimizers that are already implemented in PyTorch?).
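For context, here is a minimal sketch of the kind of torch-coupled pattern being described: a helper that steps a tinygrad optimizer and its torch counterpart in lockstep and compares the results. The `compare_one_step` helper, the toy objective, the learning rate, and the tolerance are all illustrative assumptions, not the actual contents of test_optim.py:

```python
import numpy as np
import torch
from tinygrad import Tensor
from tinygrad.nn.optim import AdamW

def compare_one_step(np_init, tiny_opt_cls, torch_opt_cls, **kwargs):
  # tinygrad side: one optimizer step on the toy objective sum(x*x)
  t = Tensor(np_init.copy(), requires_grad=True)
  tiny_opt = tiny_opt_cls([t], **kwargs)
  with Tensor.train():
    tiny_opt.zero_grad()
    (t * t).sum().backward()
    tiny_opt.step()
  # torch side: this only works if the optimizer already exists in torch,
  # which is exactly the limitation for something like GrokAdamW
  p = torch.tensor(np_init.copy(), requires_grad=True)
  torch_opt = torch_opt_cls([p], **kwargs)
  torch_opt.zero_grad()
  (p * p).sum().backward()
  torch_opt.step()
  np.testing.assert_allclose(t.numpy(), p.detach().numpy(), atol=1e-4)

compare_one_step(np.random.randn(4).astype(np.float32), AdamW, torch.optim.AdamW, lr=0.01)
```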
-
|
It does not need to be torch; it's fine as long as there's a reliable source to compare with. We have LARS and LAMB that compare with the TensorFlow version because that's the reference for MLPerf.
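A minimal sketch of what a torch-free test of this kind could look like, using a few lines of hand-written numpy as the reliable reference. The toy objective, step count, and tolerance here are illustrative assumptions, not tinygrad's actual test values:

```python
import numpy as np
from tinygrad import Tensor
from tinygrad.nn.optim import SGD

np.random.seed(0)
x0 = np.random.randn(4).astype(np.float32)

# tinygrad side: five plain-SGD steps on the toy objective sum(x*x)
x = Tensor(x0.copy(), requires_grad=True)
opt = SGD([x], lr=0.1)
with Tensor.train():
  for _ in range(5):
    opt.zero_grad()
    (x * x).sum().backward()
    opt.step()

# reference side: the gradient of sum(x*x) is 2*x, so plain SGD is one line
ref = x0.copy()
for _ in range(5):
  ref -= 0.1 * (2.0 * ref)

np.testing.assert_allclose(x.numpy(), ref, atol=1e-6)
```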
Do we really need to import torch? It seems like a bad idea.
(referencing tinygrad/test/test_optim.py, line 2 at d051308)