timmg 2 days ago

I (vaguely) think the Mojo guys' goal makes a lot of sense. And I understand why they thought Python was the way to start.

But I just think Python is not the right language to try to turn into this super-optimized parallel processing system they are trying to build.

But their target market is Python programmers, I guess. So I'm not sure what a better option would be.

It would be interesting for them to develop their own language and make it all work. But "yet another programming language" is a tough sell.

cactusfrog 2 days ago | parent | next [-]

What language do you think they should have based Mojo off of? I think Python syntax is great for tensor manipulation.

fluidcruft 2 days ago | parent [-]

I wouldn't mind a Python flavor with a syntax for tensors/matrices that felt a bit less bolted on in parts compared to Matlab. You get used to Python and numpy's quirks, but it is a bit jarring at first.
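Roughly the kind of thing I mean (a minimal numpy sketch, guessing at the quirks: '*' being elementwise while '@' is matrix multiply, and broadcasting needing explicit axis juggling):

    import numpy as np

    # Matlab: A * B is matrix multiply, A .* B is elementwise.
    # numpy:  *  is elementwise, @ (np.matmul) is matrix multiply.
    A = np.arange(6).reshape(2, 3)
    B = np.arange(6).reshape(3, 2)

    elementwise = A * A   # shape (2, 3), elementwise product
    matprod = A @ B       # shape (2, 2), matrix product

    # Broadcasting: adding a length-3 row works automatically,
    # but a length-2 column needs an explicit new axis.
    row = np.array([10, 20, 30])
    col = np.array([1, 2])
    ok = A + row                # broadcasts along the last axis
    also_ok = A + col[:, None]  # must reshape to (2, 1) by hand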

Octave has a very nice syntax (it extends Matlab's syntax to provide the good parts of numpy broadcasting). I assume Julia uses something very similar. I have wanted to work with Julia, but it's so frustrating to have to build so much of the non-interesting stuff that already exists in Python. And back when I looked into it, there didn't seem to be an easy way to just plug Julia into Python things and incrementally move over. For example, you couldn't swap out the numerics and keep the matplotlib code you already had; you had to go learn Julia's way of plotting and doing everything. It would have been nice if there were an incremental approach.

One thing I am on the fence about is indexing with '()' vs '[]'. In Matlab, both function calls and indexing use '()', which is a Fortran style. That ambiguity lets you swap functions for matrices to reduce memory use, which can sometimes be nice, though all of that is possible with '[]' in Python too. Anyway, with something like Mojo you're wanting to work directly with indices again, and I haven't done that in a long time.
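The memory trick I mean, sketched with '[]' in plain Python (a toy example, not any real library's API): an object that computes entries on demand can stand in for a materialized array because both are read the same way.

    import numpy as np

    class LazyGrid:
        """Looks like a 2-D array via [], but computes entries on the
        fly, so the full grid is never materialized in memory."""
        def __getitem__(self, idx):
            i, j = idx
            return np.sin(i) * np.cos(j)

    dense = np.fromfunction(lambda i, j: np.sin(i) * np.cos(j), (1000, 1000))
    lazy = LazyGrid()

    # Callers index both the same way and don't care which one they got.
    print(dense[3, 7], lazy[3, 7])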

Ultimately, I don't think anyone would care if Mojo and Python just play nicely together with minimal friction. (Think: "hey, run this Mojo code on these numpy blobs".) If I can build GUIs, interact with the OS, parse files, and talk to the web in Python to prep data while simultaneously crunching in Mojo, that seems wonderful.
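Something like this is all I'm after. The kernel here is an invented stand-in (just numpy under the hood, so the sketch runs); the point is what the Python side of the workflow would look like, not any real Mojo API:

    import numpy as np

    # Hypothetical stand-in for a compiled Mojo kernel. Here it's just
    # numpy so the sketch runs, but the Python call site would look the
    # same if Mojo were doing the crunching underneath.
    def fused_matmul(a, b):  # invented name, not a real Mojo API
        return a @ b

    # Prep data with the usual Python/numpy tooling...
    rng = np.random.default_rng(0)
    data = rng.standard_normal((1024, 1024)).astype(np.float32)

    # ...then hand the buffers off for the heavy crunching and get a
    # numpy-compatible result back, ideally without copies.
    result = fused_matmul(data, data.T)
    print(result.shape)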

I just hate that Julia requires immediately learning all the dumb crap that doesn't matter to me. Although LLMs seem very good at the dumb crap, so some sort of LLM translation for it could be another option.

In summary: all Mojo actually needs is to be better than numba- and cython-type things, with performance that at least matches C++, Fortran, and the GPU libraries. Once that happens, things like a Mojo version of pandas will be developed (and will replace things like polars).
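For concreteness, the numba-style baseline I have in mind is a tight loop like this (standard @njit usage, nothing Mojo-specific):

    import numpy as np
    from numba import njit

    @njit(fastmath=True)
    def saxpy(a, x, y):
        # The kind of tight numeric loop that numba/cython compile down
        # to machine code; Mojo has to at least match this and the
        # equivalent C++/Fortran.
        out = np.empty_like(y)
        for i in range(x.shape[0]):
            out[i] = a * x[i] + y[i]
        return out

    x = np.ones(1_000_000, dtype=np.float32)
    y = np.arange(1_000_000, dtype=np.float32)
    print(saxpy(np.float32(2.0), x, y)[:3])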

golly_ned 2 days ago | parent | prev | next [-]

The syntax is based on Python, but the runtime is not. So there is no inconsistency between the Python language and Mojo's use as a super-optimized parallel processing system.

pjmlp 2 days ago | parent | prev | next [-]

This is attempt number 2; it was already tried before with Swift for TensorFlow.

Guess why it wasn't a success, or why Julia is having adoption issues among the same community.

Or why, although Zig's type system is basically Modula-2's, Zig is getting more hype than Modula-2 ever has since 1978 (Modula-2 is even part of GCC nowadays).

Syntax and familiarity matter.

a96 2 days ago | parent [-]

I think the only Zig hype I'm seeing is about its compiler and compatibility. Those might well be the same two reasons why you never hear about Modula-2.

pjmlp 2 days ago | parent [-]

I am older than Modula-2, so I heard a lot about it; many of the folks hyping Zig still think the world started with UNIX.

ziofill 2 days ago | parent | prev | next [-]

Exactly. The idea of not having to learn yet another new language is very compelling.

mempko 2 days ago | parent | prev [-]

Except that by all accounts they succeeded. I believe they have the fastest matmul on NVIDIA chips in the industry.

timmg 2 days ago | parent | next [-]

I was under the impression that their uptake is slow or non-existent. Am I wrong about that?

ozgrakkurt 2 days ago | parent | prev | next [-]

Is it really faster than cuBLAS?

melodyogonna 2 days ago | parent [-]

In some things, yes. They're mostly identical in performance, though.

saagarjha 2 days ago | parent | prev | next [-]

CUTLASS would like to have a word with you.

fooblaster 2 days ago | parent | prev [-]

evidence?

chrislattner 2 days ago | parent [-]

Modular/Mojo is faster than NVIDIA's libraries on their own chips, and is open source instead of a binary blob. See the 4-part series that culminates in https://www.modular.com/blog/matrix-multiplication-on-blackw... for Blackwell, for example.

fooblaster 2 days ago | parent [-]

thanks