▲ | que-encrypt 8 hours ago |
jax is very much working software (and in my view better software, aside from the smaller community), and it very much has support, especially if you use their images (which they do).

> Tensorflow

They have been using jax/flax/etc rather than tensorflow for a while now. They don't really use pytorch, from what I can see of their research work from the outside. For instance, they released siglip/siglip2 with flax linen: https://github.com/google-research/big_vision

TPUs very much have software support, hence why SSI etc use TPUs.

P.S. Google gives their tpus away for free at https://sites.research.google/trc/about/, which I've used for the past 6 months now.
▲ | throwaway314155 4 hours ago | parent |
> They have been using jax/flax/etc rather than tensorflow for a while now

Jax has a harsher learning curve than Pytorch in my experience. Perhaps it's worth it (yay FP!) but it doesn't help adoption.

> They don't really use pytorch, from what I can see of their research work from the outside

Of course not - there is no outside world at Google. If internal tooling exists for a problem, their culture effectively mandates using it before anything else, no matter the difference in quality. That basically explains the whole TF1/TF2 debacle, which understandably left a bad taste in people's mouths. In any case, while they don't use Pytorch, the rest of us very much do.

> P.S. Google gives their tpus away for free at https://sites.research.google/trc/about/, which I've used for the past 6 months now.

Right, and in order to use it effectively you basically have to use Jax. Most researchers don't have the advantage of free compute, so Google is effectively trying to buy mindshare rather than winning on quality. That's fine, but it's worth repeating because it biases the discussion heavily: many proponents of Jax just happen to be on TRC or to have been given TPU credits through some other mechanism.