adastra22 2 days ago

Any computer with a display has a GPU.

om8 a day ago | parent [-]

Sure, but integrated graphics usually lacks the dedicated VRAM needed for LLM inference.

adastra22 a day ago | parent [-]

Which means that inference would run at approximately the same speed as the suggested CPU inference engine, since the iGPU is streaming weights from the same system RAM; only the compute is offloaded.
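
A rough Python sketch of that intuition, treating token generation as memory-bandwidth bound. All bandwidth and model-size figures are illustrative assumptions, not benchmarks:

    # Back-of-the-envelope: decode is memory-bandwidth bound, so an iGPU
    # reading the same system RAM tops out near the CPU's speed.
    # All numbers below are illustrative assumptions, not measurements.

    MODEL_BYTES = 7e9 * 0.5   # hypothetical 7B model at 4-bit quantization (~3.5 GB)
    DDR5_BW     = 60e9        # ~60 GB/s dual-channel DDR5, shared by CPU and iGPU
    GDDR6_BW    = 500e9       # ~500 GB/s dedicated VRAM on a discrete GPU

    def tokens_per_sec(bandwidth: float, model_bytes: float) -> float:
        """Each generated token streams roughly the whole model through memory once."""
        return bandwidth / model_bytes

    print(f"CPU or iGPU (system RAM): ~{tokens_per_sec(DDR5_BW, MODEL_BYTES):.0f} tok/s")
    print(f"Discrete GPU (VRAM):      ~{tokens_per_sec(GDDR6_BW, MODEL_BYTES):.0f} tok/s")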
