michaelt 7 hours ago
nvidia, who make AI chips with kinda good software support, and who have sales reflecting that, are worth $3.5T.
google, who make AI chips with barely-adequate software, are worth $2.0T.
AMD, who also make AI chips with barely-adequate software, are worth $0.2T.

Google made a few decisions with TPUs that might have made business sense at the time, but with hindsight haven't helped adoption.

They closely bound TPUs to their 'TensorFlow 1' framework (which was kinda hard to use), then released 'TensorFlow 2', which was incompatible enough that it was just as easy to switch to PyTorch - which has TPU support in theory, but not in practice.

They also decided TPUs would be Google Cloud only. That might make sense if TPUs need water cooling or have special power requirements. But it turns out the sort of big corporations that have multi-cloud setups, and a workload where a 1.5x improvement in performance-per-dollar is worth pursuing, aren't big open source contributors. And understandably, the academics and enthusiasts who are giving their time away for free aren't eager to pay Google for the privilege.

Perhaps Google's market cap already reflects the value of being a second-place AI chipmaker?
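To make the "in theory" part concrete: PyTorch reaches TPUs through the torch_xla package, which bolts XLA's lazy-tensor model onto ordinary PyTorch. A minimal sketch, assuming a TPU VM with torch_xla installed (the exact entry points have shifted between torch_xla releases):

    import torch
    import torch_xla.core.xla_model as xm

    device = xm.xla_device()            # picks up the attached TPU core
    model = torch.nn.Linear(128, 128).to(device)
    x = torch.randn(4, 128, device=device)

    y = model(x)                        # ops are recorded lazily into an XLA graph
    xm.mark_step()                      # compile and execute the accumulated graph

The mark_step() call is the tell: plain PyTorch runs eagerly, so code that inspects tensor values mid-loop keeps cutting and recompiling the XLA graph. That gap between "it imports" and "it runs fast without rewrites" is a big part of the "not in practice".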
que-encrypt 2 hours ago | parent
jax very much is working (and in my view better, aside from the lack of community) software support, especially if you use their images (which they do).

> Tensorflow

They have been using jax/flax/etc rather than tensorflow for a while now. They don't really use pytorch, from what I can see of their research work from the outside. For instance, they released siglip/siglip2 with flax linen: https://github.com/google-research/big_vision

TPUs very much have software support, hence why SSI etc use TPUs.

P.S. Google gives their TPUs away for free at https://sites.research.google/trc/about/, which I've used for the past 6 months now.
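To give a feel for what "working software support" means here, a minimal flax linen sketch (the MLP module is just an illustration; assumes jax with its TPU backend plus flax installed, as on the TRC VMs). The same jitted code targets CPU, GPU, or TPU with no porting:

    import jax
    import jax.numpy as jnp
    import flax.linen as nn

    print(jax.devices())   # on a TPU VM this lists TpuDevice entries, no setup needed

    class MLP(nn.Module):  # toy flax linen module, purely for illustration
        features: int

        @nn.compact
        def __call__(self, x):
            return nn.relu(nn.Dense(self.features)(x))

    model = MLP(features=16)
    params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 8)))

    @jax.jit               # XLA compiles this for whatever backend jax finds
    def apply(params, x):
        return model.apply(params, x)

    print(apply(params, jnp.ones((4, 8))).shape)  # (4, 8)

Since jax is XLA-native end to end, there's no mark_step-style graph bookkeeping leaking into user code, which is why TPUs feel first-class there in a way they don't from pytorch.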