webdevver 5 days ago

they all suck. someone needs to make an open source gpu already, it's been way too long.

timschmidt 4 days ago | parent | next [-]

We did back in 2007: https://en.wikipedia.org/wiki/Open_Graphics_Project

And there have been some others as well: https://en.wikipedia.org/wiki/Free_and_open-source_graphics_...

Recently https://www.furygpu.com/

Part of the problem is that every ASIC manufacturer (and indeed each fabrication process) has a different toolchain with a different set of primitives for circuit design. Yosys and other open tooling for FPGAs has helped a great deal in lowering the barrier to chip design, and by association the reuse of circuits. But every ASIC, at the moment, is tied to some vendor's PDK. Here's the one Google open sourced for Cypress Semi's SKY130 process node: https://github.com/google/skywater-pdk
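The core of what ties a design to a PDK is technology mapping: rewriting a netlist of generic primitives into whatever standard cells that particular process provides. A toy sketch of the idea in Python (the cell and net names are invented for illustration, not taken from SKY130 or any real PDK):

```python
# Toy technology mapper: rewrite generic gates into cells from a
# vendor-specific library. Cell names are made up for illustration;
# real PDKs define their own, with different drive strengths, timing,
# and layout rules.

# Netlist of generic gates: (gate_type, input_a, input_b, output_net)
GENERIC_NETLIST = [
    ("AND", "a", "b", "n1"),
    ("NOT", "n1", None, "y"),
]

# Two hypothetical "PDKs" offering different primitive sets.
PDK_A = {"AND": "pdkA_and2_x1", "NOT": "pdkA_inv_x1"}
PDK_B = {"AND": "pdkB_and2", "NOT": "pdkB_inv"}

def tech_map(netlist, cell_lib):
    """Replace each generic gate with the library's cell for it."""
    mapped = []
    for gate, in_a, in_b, out in netlist:
        if gate not in cell_lib:
            raise KeyError(f"no cell for {gate} in this library")
        mapped.append((cell_lib[gate], in_a, in_b, out))
    return mapped

print(tech_map(GENERIC_NETLIST, PDK_A))
print(tech_map(GENERIC_NETLIST, PDK_B))
```

The same logical design maps to two different netlists, which is why "porting" a chip to another fab means re-running synthesis against that fab's PDK; tools like Yosys automate exactly this step (plus optimization) for the processes they support.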

Findecanor 4 days ago | parent | prev | next [-]

It is at least theoretically possible to build a headless "GPU" from RISC-V processors that have the vector extension (RVV). RVV was designed so that programs compiled for the SIMT execution model that most GPUs use can run on it.
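The SIMT-to-vector mapping alluded to here can be sketched in plain Python: each GPU "thread" becomes one lane of a vector, and branch divergence becomes per-lane masking, which is roughly what RVV's mask registers provide in hardware. This is an illustration of the execution model only, not real RVV code:

```python
# A SIMT-style kernel, written per-thread: each thread doubles its
# element, but only if it is nonzero (a divergent branch).
def kernel(tid, data):
    if data[tid] != 0:
        data[tid] *= 2

# A vector unit gets the same effect by computing a predicate mask and
# applying the operation only to active lanes -- conceptually what RVV
# mask registers do for a whole vector at once.
def vector_execute(data):
    mask = [x != 0 for x in data]  # one predicate bit per lane
    return [x * 2 if m else x for x, m in zip(data, mask)]

data = [0, 1, 2, 0, 3]

# Run the kernel once per "thread", as a GPU would in lockstep.
scalar_result = list(data)
for tid in range(len(scalar_result)):
    kernel(tid, scalar_result)

# Both execution strategies must agree.
assert scalar_result == vector_execute(data)
```

A compiler targeting such a design would lower each SIMT kernel into vector instructions plus mask setup, with the vector length register standing in for the "number of threads per wave".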

This Orange Pi RV2 has a small vector unit in some of its cores, and could be used at least for prototyping the software until more powerful chips are available.

BTW, there have also been a couple of hardware startups working on commercial GPUs based on RISC-V's vector extension, with their own GPU-specific instruction set extensions for texture lookup and the like.

LeFantome 4 days ago | parent | next [-]

https://www.tomshardware.com/pc-components/gpus/startup-clai...

Joel_Mckay 4 days ago | parent [-]

The 94% market dominance of CUDA GPUs will roast new competition for sure.

RISC-V has a fragmented ISA standard, and every version is a magical unicorn part (the worst facet of ARM6.)

A standard doesn't need to be good, but it must be consistent to succeed. =3

dlcarrier 4 days ago | parent | prev [-]

The vector instructions are only in four of the eight cores. Those cores also have extra cache, but they are otherwise identical to the other four.

ekianjo 5 days ago | parent | prev [-]

It's probably a series of patent landmines...

AnthonyMouse 4 days ago | parent [-]

Hardware patents are orthogonal to open source software. If a patent covers the hardware, then someone who wants to manufacture it needs to license the patent, but you were never going to get free-as-in-beer hardware anyway. And a hardware patent is independent of whether the hardware is fully documented, or whether its firmware has published source code and a license that allows users to make changes to it.

Joel_Mckay 4 days ago | parent [-]

Indeed, most IP blocks used in silicon design are licensed, cost real money, and sit under NDA.

I wouldn't say "never", but a clone is highly unlikely for another decade or so. =3

AnthonyMouse 3 days ago | parent [-]

NDAs in the context of a patent license should legitimately be considered grounds for patent invalidation. Patent literally means "open to public inspection" and the entire premise is that you get a temporary monopoly in exchange for openly documenting how your invention works in the published patent. If there is something about the invention for an NDA to cover it means you left it out of the patent application, which is essentially an admission of wrongdoing.

A patent is an alternative to a trade secret. You can't eat your cake and still have it.

Joel_Mckay 3 days ago | parent [-]

Patents do exist on popular IP cores, but in general standards-compliant Verilog libraries are just vendor-specific IP products. Thus copyright, NDAs, and license agreements already keep the IP fairly locked down. For example, figuring out DRAM timing and DDR bus control yourself is nontrivial.

Some groups have attempted open IP cores, and made some progress:

https://opencores.org/

However, the effort involved in getting a standards-compliant ASIC built puts folks in the fabless manufacturing sector. Most firms that survive choose to stay with a generic FPGA option and avoid custom silicon unless absolutely necessary.

Patents are often vague or useless in many jurisdictions, but on occasion they may prevent platform decay for a few years. One can be sure a unique new design will not go to fab unless such protection is in place. =3

AnthonyMouse 3 days ago | parent [-]

It seems like the fab companies like TSMC/Samsung/GF/Intel are missing an opportunity here. Commoditize your complement:

https://gwern.net/complement

Publish the software that does this for free so that more customers come to you instead of using FPGAs or just not making the attempt. Make it easier to design new chips so that more people do it and you get more customers.

Joel_Mckay 3 days ago | parent [-]

When it comes to hardware, the OSS model stops making economic sense.

People have tried, but helping your competitors for free rapidly erodes the market. =3

Rule #23: Don't compete to be at the bottom, as you just might actually win.

AnthonyMouse 3 days ago | parent [-]

The point isn't to help your competitors, it's to help your customers. Qualcomm or Apple has no incentive to help their competitors design chips, but TSMC or Samsung does have an incentive to help Qualcomm's competitors design chips: more companies making chips means more business for the fab, and it prevents any one customer from getting too much leverage over them.

Joel_Mckay 2 days ago | parent [-]

Samsung actually had a technological lead in many areas, and it translated into real revenue. They are a business, and exist to provide utility to consumers in exchange for investor profit:

https://www.youtube.com/watch?v=KCWDzWG1BcI

Pushing technology forward requires high-risk, expensive investments, and expecting the public or customers to willingly help pay that cost is naive:

https://www.youtube.com/watch?v=cru2bkqwSYk

I assure you academic funding does not cover such large costs, government grants are only a fraction of the taxes expected back at late-stage Technology Readiness Levels, and markets fractured by competitors and cloners erode the returns needed to pay back the total incurred project cost.

Apple makes minimal-utility products, but relies on intangible branding to maintain perceived value. Thus, only Apple could get away with selling designer handbags containing zero chips and still make absurd revenue (not a real product yet). Steve Jobs observed very early that selling raw motherboards was a low-margin business, which is why the company shifted into consumer products.

For almost every other brand, consumers have shown they prefer goods priced near material cost from opportunistic factories in China and India, and thus simply ignore most firms' products whose prices carry 10 years of R&D costs to pay back the investors.

There are many shelved technologies that will never see a patent or the public markets. This is because the conditions are not ready for advanced products yet, and competitors irrationally nurture the lowest-value volume market sectors. Thus, everyone gets a 15% value boost at regular intervals, and people remain excited about three-decade-old technology.

Qualcomm's cellular chip product lines essentially lived off iPhone sales. Like any loyal dog, they are unlikely to bite the hand that feeds them...

Rule #3: popularity is not an indication of utility.

Have a wonderful day =3