pjmlp 6 days ago

I have followed Sebastian Aaltonen's work for quite a while now, so maybe I am a bit biased, but this is a great article.

I also think that the way forward is to go back to software rendering; however, this time around those algorithms and data structures are actually hardware accelerated, as he points out.

Note that this is already an ongoing trend in the VFX industry; about 5 years ago OTOY ported OctaneRender to CUDA as its main rendering API.

gmueckl 6 days ago | parent | next [-]

There are tons of places within the GPU where dedicated fixed function hardware provides massive speedups within the relevant pipelines (rasterization, raytracing). The different shader types are designed to fit inbetween those stages. Abandoning this hardware would lead to a massive performance regression.

formerly_proven 6 days ago | parent | next [-]

Just consider the sheer number of computations offloaded to TMUs. If you removed them, shaders would do nothing but interpolate texels.
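
To make the scale of that concrete, here is roughly the per-sample arithmetic a TMU does in fixed hardware (ignoring mipmaps, formats, compression and caching), as a plain C++ sketch; the Texture struct and names are made up for illustration:

    // Roughly what one bilinear texture sample costs in software: address
    // wrapping, four neighbour fetches and three lerps -- per sample, per
    // texture, per pixel. Texture layout here is hypothetical.
    #include <cmath>
    #include <vector>

    struct Texture {
        int width = 0, height = 0;
        std::vector<float> texels;  // single channel for brevity

        float fetch(int x, int y) const {
            // Wrap addressing, as a sampler would in repeat mode.
            x = ((x % width) + width) % width;
            y = ((y % height) + height) % height;
            return texels[y * width + x];
        }
    };

    // One bilinear sample at normalized coordinates (u, v).
    float sample_bilinear(const Texture& tex, float u, float v) {
        const float x = u * tex.width - 0.5f;
        const float y = v * tex.height - 0.5f;
        const int x0 = static_cast<int>(std::floor(x));
        const int y0 = static_cast<int>(std::floor(y));
        const float fx = x - x0, fy = y - y0;

        const float t00 = tex.fetch(x0, y0);
        const float t10 = tex.fetch(x0 + 1, y0);
        const float t01 = tex.fetch(x0, y0 + 1);
        const float t11 = tex.fetch(x0 + 1, y0 + 1);
        const float top = t00 + (t10 - t00) * fx;
        const float bottom = t01 + (t11 - t01) * fx;
        return top + (bottom - top) * fy;
    }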

efilife 6 days ago | parent | prev [-]

Offtop, but sorry, I can't resist. "Inbetween" is not a word. I started seeing many people having trouble with prepositions lately, for some unknown reason.

> “Inbetween” is never written as one word. If you have seen it written in this way before, it is a simple typo or misspelling. You should not use it in this way because it is not grammatically correct as the noun phrase or the adjective form. https://grammarhow.com/in-between-in-between-or-inbetween/

Antibabelic 5 days ago | parent | next [-]

"Offtop" is not a word. It's not in any English dictionary I could find and doesn't appear in any published literature.

Matthew 7:3 "And why beholdest thou the mote that is in thy brother's eye, but considerest not the beam that is in thine own eye?"

Joker_vD 5 days ago | parent | next [-]

Oh, it's a transliteration of Russian "офтоп", which itself started as a borrowing of "off-topic" from English (but as a noun instead of an adjective/stative) and then underwent some natural linguistic developments, namely loss of the hyphen and degemination, surface analysis of the trailing "-ic" as the Russian suffix "-ик" [0], and its subsequent removal to obtain the supposed "original, non-derived" form.

[0] https://en.wiktionary.org/wiki/-%D0%B8%D0%BA#Russian

fngjdflmdflg 5 days ago | parent [-]

>subsequent removal to obtain the supposed "original, non-derived" form

Also called a "back-formation". FWIF I don't think the existence of corrupted words automatically justifies more corruptions nor does the fact that it is a corruption automatically invalidate it. When language among a group evolves, everyone speaking that language is affected, which is why written language reads pretty differently looking back every 50 years or so, in both formal and informal writing. Therefore language changes should have buy-in from all users.

speed_spread 5 days ago | parent | prev [-]

Language evolves in mysterious ways. FWIW I find offtop to have high cromulency.

dist-epoch 6 days ago | parent | prev | next [-]

If enough people use it, it will become correct. This is how language evolves. BTW, there is no "official English language specification".

And linguists think it would be a bad idea to have one:

https://archive.nytimes.com/opinionator.blogs.nytimes.com/20...

mikestorrent 5 days ago | parent | prev | next [-]

Surely you mean "I've started seeing..." rather than "I started seeing..."?

dragonwriter 5 days ago | parent [-]

Either the present perfect that you suggest or the simple past originally presented is correct, and the denotation is basically identical. The connotation is slightly different: the simple past puts more emphasis on the "started...lately" and the emergent nature of the phenomenon, and the present perfect on the ongoing state of what was started, but there's no giant difference.

cracki 5 days ago | parent | prev [-]

Your entire post does not once mention the form you call correct.

If you intend for people to click the link, then you might just as well delete all the prose before it.

torginus 5 days ago | parent | prev | next [-]

I really want to make a game using a software rasterizer sometime - just to prove it's possible. Back in the good ol' days, I had to get by on my dad's PC, which had no graphics acceleration, but a fairly substantial Pentium 3 processor.

Games like the original Half-Life, Unreal Tournament 2004, etc. ran surprisingly well and at decent resolutions.

With the power of modern hardware, I guess you could do a decent FPS in pure software even with naively written code. Not having to deal with the APIs, and having the absolute creative freedom to say 'this pixel is green', would be liberating.
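
For the 'this pixel is green' part, a minimal sketch of what owning the framebuffer looks like (how it reaches the screen, via SDL or a window-system blit, is omitted, and the names are made up):

    // Minimal "this pixel is green": own a framebuffer as plain memory and
    // write to it directly. Presentation is left out.
    #include <cstdint>
    #include <vector>

    struct Framebuffer {
        int width, height;
        std::vector<std::uint32_t> pixels;  // 0xAARRGGBB

        Framebuffer(int w, int h) : width(w), height(h), pixels(w * h, 0) {}

        void put(int x, int y, std::uint32_t color) {
            if (x >= 0 && x < width && y >= 0 && y < height)
                pixels[y * width + x] = color;
        }
    };

    int main() {
        Framebuffer fb(640, 480);
        // No API, no pipeline state, no driver: this pixel is green.
        fb.put(320, 240, 0xFF00FF00u);
    }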

Fun fact: due to the divergent nature of the computation, many ray tracers targeting real-time performance were written for the CPU even when GPUs were quite powerful. Software ray tracers were quite good, until the hardware APIs started popping up.

darzu 5 days ago | parent | next [-]

You should! And you might enjoy this video about making a CPU rasterizer: https://www.youtube.com/watch?v=yyJ-hdISgnw

Note that when the parent comment says "software rendering" they're referring to software (compute shaders) on the GPU.

pjmlp 4 days ago | parent | prev [-]

You could start by staying on the CPU side and making use of AVX, Larrabee style, which is easier to debug.

Going with mesh shaders or GPU compute would be the next step.
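
As a toy illustration of the CPU/AVX starting point (not how Larrabee actually worked, just the shade-8-pixels-per-iteration idea; the gradient "shader" and the function name are made up):

    // Toy "shade 8 pixels per iteration" with AVX: a real rasterizer would do
    // edge tests and attribute interpolation in these lanes too; this just
    // fills a gradient so the intrinsics stay readable.
    #include <immintrin.h>

    void shade_row_avx(float* row, int width, float y_norm) {
        const __m256 inv_w = _mm256_set1_ps(1.0f / static_cast<float>(width));
        const __m256 yv = _mm256_set1_ps(y_norm);
        const __m256 lane = _mm256_setr_ps(0, 1, 2, 3, 4, 5, 6, 7);

        int x = 0;
        for (; x + 8 <= width; x += 8) {
            // Normalized x coordinate of each of the 8 lanes.
            const __m256 xv = _mm256_mul_ps(
                _mm256_add_ps(_mm256_set1_ps(static_cast<float>(x)), lane), inv_w);
            // Trivial "shader": brightness = x * y, evaluated for 8 pixels at once.
            _mm256_storeu_ps(row + x, _mm256_mul_ps(xv, yv));
        }
        for (; x < width; ++x)  // scalar tail
            row[x] = (static_cast<float>(x) / width) * y_norm;
    }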

mrec 6 days ago | parent | prev | next [-]

Isn't this already happening to some degree? E.g. UE's Nanite uses a software rasterizer for small triangles, albeit running on the GPU via a compute shader.

jsheard 6 days ago | parent | next [-]

Things are kind of heading in two opposite directions at the moment. Early GPU rasterization was all done in fixed-function hardware, but then we got programmable shading, and then we started using compute shaders to feed the HW rasterizer, and then we started replacing the HW rasterizer itself with more compute (as in Nanite). The flexibility of doing whatever you want in software has gradually displaced the inflexible hardware units.
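
For anyone curious what "replacing the HW rasterizer with compute" can look like, the commonly described trick is to pack depth and a payload into 64 bits and resolve visibility with one atomic max per covered pixel. This is not Nanite's actual code, just a sketch of that idea, shown here with std::atomic on the CPU for readability; on the GPU it would be a 64-bit atomicMax/InterlockedMax on a buffer, and every name is made up:

    // Software visibility resolve: pack reversed depth (bigger = closer) in
    // the high 32 bits and a triangle id in the low 32, then an atomic max
    // per pixel keeps the nearest surface without any ROP/blend hardware.
    #include <atomic>
    #include <cstdint>
    #include <vector>

    struct VisBuffer {
        int width, height;
        std::vector<std::atomic<std::uint64_t>> pixels;

        VisBuffer(int w, int h) : width(w), height(h), pixels(w * h) {}

        void write(int x, int y, float rev_depth01, std::uint32_t triangle_id) {
            const std::uint64_t d =
                static_cast<std::uint64_t>(rev_depth01 * 4294967295.0);
            const std::uint64_t packed = (d << 32) | triangle_id;
            std::atomic<std::uint64_t>& px = pixels[y * width + x];
            // Emulated atomic max: retry while our fragment is the closest seen.
            std::uint64_t prev = px.load(std::memory_order_relaxed);
            while (packed > prev &&
                   !px.compare_exchange_weak(prev, packed, std::memory_order_relaxed)) {
            }
        }
    };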

Meanwhile GPU raytracing was a purely software affair until quite recently when fixed-function raytracing hardware arrived. It's fast but also opaque and inflexible, only exposed through high-level driver interfaces which hide most of the details, so you have to let Jensen take the wheel. There's nothing stopping someone from going back to software RT of course but the performance of hardware RT is hard to pass up for now, so that's mostly the way things are going even if it does have annoying limitations.

HelloNurse 4 days ago | parent [-]

And hardware raytracing is on the same trajectory as hardware rasterization: devs finding ways to repurpose it, leading to pressure for more general APIs, which enable further repurposing, until hardware raytracing evolves into a flexible hardware accelerated facility for indexing, reordering, etc.

gmueckl 3 days ago | parent | prev | next [-]

Nanite is just working around an inefficiency that occurs with small triangles that require screen-space derivatives, which the hardware approximates using finite differences between neighbors, e.g. for texture footprint estimation in mipmapping. The rasterizer invokes additional shader instances around triangle borders to get the source values for these operations, and that gets excessive when triangles are tiny. This is an edge case, but it becomes important when there is lots of tiny geometric detail on screen.
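
To spell out the derivative part: pixels are shaded in 2x2 quads so that ddx/ddy can be taken as finite differences between neighbours, and mip selection comes from those derivatives, so a one-pixel triangle still pays for the whole quad. A plain C++ stand-in for the usual ddx/ddy-to-LOD recipe (not any specific GPU's exact formula):

    // Why tiny triangles hurt: the extra "helper" pixels exist only to feed
    // these finite differences.
    #include <algorithm>
    #include <cmath>

    struct Vec2 { float x, y; };

    // uv holds the texture coordinates of one 2x2 pixel quad: uv[row][col].
    float mip_level(const Vec2 uv[2][2], float tex_width, float tex_height) {
        // Finite differences between horizontal / vertical neighbours.
        const float dudx = (uv[0][1].x - uv[0][0].x) * tex_width;
        const float dvdx = (uv[0][1].y - uv[0][0].y) * tex_height;
        const float dudy = (uv[1][0].x - uv[0][0].x) * tex_width;
        const float dvdy = (uv[1][0].y - uv[0][0].y) * tex_height;

        // Texture-space footprint of one pixel; its log2 picks the mip level.
        const float rho = std::sqrt(std::max(dudx * dudx + dvdx * dvdx,
                                             dudy * dudy + dvdy * dvdy));
        return std::max(0.0f, std::log2(rho));
    }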

djmips 6 days ago | parent | prev [-]

Why do you say 'albeit'? I think it's established that 'software rendering' can mean running on the GPU. That's what Octane is doing with CUDA in the comment you are replying to. But good callout on Nanite.

mrec 6 days ago | parent [-]

No good reason, I'm just very very old.

Q6T46nT668w6i3m 6 days ago | parent | prev [-]

But they still rely on fixed functions for a handful of essential ops (e.g., intersection).
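
For reference, the intersection op that the fixed-function units bake in is, in software, roughly the classic Möller-Trumbore ray/triangle test; a self-contained sketch (hardware adds box tests and BVH traversal scheduling on top):

    // Ray/triangle intersection (Moller-Trumbore), the core op RT cores
    // implement in fixed function.
    #include <cmath>

    struct Vec3 {
        float x, y, z;
        Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    };

    static Vec3 cross(const Vec3& a, const Vec3& b) {
        return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
    }
    static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Returns true (and the hit distance t) if ray (orig, dir) hits triangle (v0, v1, v2).
    bool intersect_triangle(const Vec3& orig, const Vec3& dir,
                            const Vec3& v0, const Vec3& v1, const Vec3& v2, float& t) {
        const Vec3 e1 = v1 - v0, e2 = v2 - v0;
        const Vec3 p = cross(dir, e2);
        const float det = dot(e1, p);
        if (std::fabs(det) < 1e-8f) return false;  // ray parallel to the triangle plane
        const float inv_det = 1.0f / det;
        const Vec3 s = orig - v0;
        const float u = dot(s, p) * inv_det;
        if (u < 0.0f || u > 1.0f) return false;
        const Vec3 q = cross(s, e1);
        const float v = dot(dir, q) * inv_det;
        if (v < 0.0f || u + v > 1.0f) return false;
        t = dot(e2, q) * inv_det;
        return t > 0.0f;  // hit is in front of the ray origin
    }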