jauntywundrkind 4 hours ago

Google's work on Jax, pytorch, tensorflow, and the more general XLA underneath are exactly the kind of anti-moat everyone has been clamoring for.

morkalork 3 hours ago | parent [-]

Anti-moat like commoditizing the compliment?

sharpy 2 hours ago | parent | next [-]

If they get things like PyTorch to work well without caring what hardware it is running on, it erodes Nvidia's CUDA moat. Nvidia's chips are excellent, without doubt, but their real moat is the ecosystem around CUDA.
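A minimal sketch of what that hardware-agnostic style looks like in PyTorch (assuming a recent PyTorch build; the backend probes shown are standard PyTorch APIs, not something specific to this thread). The model code itself never mentions a vendor:

```python
import torch

def pick_device() -> torch.device:
    # Prefer an Nvidia GPU, then Apple's MPS backend, else fall back to CPU.
    # The rest of the program is identical regardless of which one we get.
    if torch.cuda.is_available():
        return torch.device("cuda")
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()

# The same model/tensor code runs unchanged on any of the backends above.
model = torch.nn.Linear(4, 2).to(device)
x = torch.randn(8, 4, device=device)
y = model(x)
```

This is the sense in which PyTorch erodes the moat at the framework level; the thread's follow-up point is that plenty of hand-written CUDA kernels live below this abstraction and don't port so easily.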

qeternity 2 hours ago | parent [-]

PyTorch is only part of it. There is still a huge amount of CUDA that isn’t just wrapped by PyTorch and isn’t easily portable.

svara an hour ago | parent [-]

... but not in deep learning, or am I missing something important here?

layer8 6 minutes ago | parent | prev [-]

*complement