the__alchemist 3 months ago

Rust support next? RN I am manually [de]serializing my data structures as byte arrays to/from the kernels. It would be nice to have truly shared data structures like CUDA gives you in C++!
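(For readers unfamiliar with the pain point: without shared host/device data structures, each struct has to be packed into a flat byte buffer before a kernel launch and unpacked after. A minimal sketch of that manual round-trip in plain Rust, using only std; `Particle` and its fields are hypothetical, and the resulting bytes would be what gets copied to/from the device.)

```rust
// Sketch of manually [de]serializing a struct to/from a byte array,
// as one must do without truly shared host/device data structures.
// `Particle` is a hypothetical example type.

#[derive(Debug, PartialEq, Clone, Copy)]
struct Particle {
    pos: [f32; 3],
    mass: f32,
}

impl Particle {
    // Pack the struct into little-endian bytes for a kernel launch.
    fn to_bytes(&self) -> Vec<u8> {
        let mut buf = Vec::with_capacity(16);
        for v in self.pos.iter().chain(std::iter::once(&self.mass)) {
            buf.extend_from_slice(&v.to_le_bytes());
        }
        buf
    }

    // Reconstruct the struct from bytes copied back from the device.
    fn from_bytes(buf: &[u8]) -> Self {
        let f = |i: usize| f32::from_le_bytes(buf[i * 4..i * 4 + 4].try_into().unwrap());
        Particle {
            pos: [f(0), f(1), f(2)],
            mass: f(3),
        }
    }
}

fn main() {
    let p = Particle { pos: [1.0, 2.0, 3.0], mass: 0.5 };
    let bytes = p.to_bytes();
    assert_eq!(bytes.len(), 16);
    assert_eq!(Particle::from_bytes(&bytes), p);
    println!("round-trip ok");
}
```

This is exactly the boilerplate that CUDA C++ avoids by letting host and device share one `struct` definition; every field offset here is maintained by hand and silently breaks if the kernel-side layout changes.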

KeplerBoy 3 months ago | parent | next [-]

Isn't Rust still very seldom used in the areas where CUDA shines (e.g. number crunching of any kind, be it simulations or linear algebra)? IMO C++ or even Fortran are perfectly fine choices for those things, since the memory allocation patterns aren't that complicated.

IshKebab 3 months ago | parent | next [-]

Mainly because number crunching code tends to be very long-lived (hence why FORTRAN is still in use).

nine_k 3 months ago | parent [-]

Not only that. Fortran is very good for writing number-crunching code. Modern Fortran is a pretty ergonomic language, it gives you really easy ways to auto-parallelize things, and new Fortran code is being produced unironically. Of course it normally draws on the treasure trove of existing numerical Fortran code. (Source: a friend who worked at CERN.)

pjmlp 3 months ago | parent | prev [-]

Yes, and the new kid in town, Slang, has a better chance of adoption.

KeplerBoy 3 months ago | parent [-]

Sorry, could you link to the project? There seem to be quite a few languages called Slang.

_0ffh 3 months ago | parent [-]

I guess he might mean this one: https://shader-slang.org/ though at first glance at least it looks more graphics- than GPGPU-oriented.

Edit: Hmm, this part of the same project looks general purpose-y and apparently integrates with PyTorch https://slangpy.shader-slang.org/en/latest/

pjmlp 3 months ago | parent [-]

Yes, that is the one, and all shader languages also support compute, not only graphics.

_0ffh 3 months ago | parent [-]

Thanks, yes. Though I did not mean the bare possibility, but the intended use case, which may lead to different design choices.

chasely 3 months ago | parent | prev | next [-]

The Rust-CUDA project just recently started up again [0]. I've started digging into it a little bit and am hoping to contribute, since the summers are a little slower for me.

[0] https://github.com/rust-gpu/rust-cuda

the__alchemist 3 months ago | parent [-]

Still broken though! Has been for years. In a recent GH issue regarding desires for the reboot, I asked: "Try it on a few different machines (OS, GPUs, CUDA versions etc), make it work on modern RustC and CUDA versions without errors." The response was "That will be quite some work." Meanwhile, Cudarc works...

LegNeato 3 months ago | parent | next [-]

Maintainer here. It is not broken, it works. See https://rust-gpu.github.io/blog/2025/03/18/rust-cuda-update

the__alchemist 3 months ago | parent [-]

Thanks! Will give it a try, and report back.

edit: I'm still showing the latest release as from 2022, which I've already verified doesn't work.

chasely 3 months ago | parent | prev [-]

Totally, it's going to take a minute to get it all working. On a positive note, they recently got some sponsorship from Modal [0], which is supplying GPUs for CI/CD, so they should be able to expand their hardware coverage.

LegNeato 3 months ago | parent | prev | next [-]

https://github.com/rust-gpu/rust-cuda

the__alchemist 3 months ago | parent [-]

Not functional.

Micoloth 3 months ago | parent | prev | next [-]

What do you think of the Burn framework? (Honest question, I have no clue what I’m talking about)

airstrike 3 months ago | parent | next [-]

I used it to train my own mini-GPT and I liked it quite a bit. I tend to favor a different style of Rust with fewer generics but maybe that just can't be avoided given the goals of that project.

The crate seems to have a lot of momentum, with many new features, releases, active communities on GH and Discord. I expect it to continue to get better.

the__alchemist 3 months ago | parent | prev [-]

Have not heard of it. Looked it up. Seems orthogonal?

I am using Cudarc.

taminka 3 months ago | parent | prev [-]

Even putting aside how Rust's ownership semantics map poorly onto GPU programming, ML researchers will never learn Rust. This will never ever happen...

pjmlp 3 months ago | parent | next [-]

While I agree in principle, CUDA is more than only AI, as people keep forgetting.

taminka 3 months ago | parent [-]

Everyone else who uses CUDA isn't going to learn Rust either.

pjmlp 3 months ago | parent [-]

First Rust needs to have tier 1 support for CUDA, in a way that doesn't feel like yak shaving when coding for CUDA.

int_19h 3 months ago | parent | prev | next [-]

The ML researchers are relying on libraries written by someone else. Today, those libraries are mostly C++, and they would benefit from Rust the same as most other C++ codebases.

malcolmgreaves 3 months ago | parent | prev | next [-]

ML researchers don’t write code, they ask ChatGPT to make a horribly inefficient, non-portable notebook that has to be rewritten from scratch :)

staunton 3 months ago | parent [-]

It's made easier by the fact that the notebook only has to work once, to produce some plots for the paper/press release/demo.

saagarjha 3 months ago | parent | prev | next [-]

I don’t think this is true. It seems to me more that nobody has put in a serious effort to make a nice interface built using Rust.

the__alchemist 3 months ago | parent | prev [-]

GPGPU programming != ML.