warrenm 7 days ago
GPUs are [effectively] irrelevant for many use cases (IoT, embedded, most servers, etc.)
Joel_Mckay 4 days ago | parent
On the Raspberry Pi, the GPU is the only thing that makes a responsive GUI or web browser feasible, and it is the primary reason most people use the HDMI LCD screens for games etc. It also took a large effort to bring up a v4l2 kernel driver for the camera modules.

For example, streaming a USB camera or software-decoding h264 on the CPU can pin all cores. Doing the same decode or stream through the SoC GPU via the v4l2 interface might take around 30% of one core (mainly to handle the network traffic). The Raspberry Pi boards are not the fastest or "best" option (most focus on h264 or MJPEG hardware codecs), but the software/kernel ecosystem provides real value. Also, the foundation doesn't EOL their hardware often, or abandon software support after a single OS release.

A cheap RISC-V SBC is great, but the ISA versions are so fractured (they copied the worst ideas of the ARMv6 era) that few OSes will waste resources targeting a platform with 5 new variants a year and proprietary drivers. A standard doesn't even need to be good, but it must be consistent to succeed. =3
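To make the v4l2 path above concrete, here is a minimal C sketch that probes for the Pi's stateful memory-to-memory decoder. The device path /dev/video10 is an assumption based on where the bcm2835-codec decoder typically appears on recent Raspberry Pi OS kernels; enumerate /dev/video* on your own system. This is just a capability check, not a full decode pipeline.

    /* Sketch: probe a V4L2 mem-to-mem decoder node.
       /dev/video10 is an assumed path; check your kernel. */
    #include <stdio.h>
    #include <string.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    int main(void)
    {
        int fd = open("/dev/video10", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        struct v4l2_capability cap;
        memset(&cap, 0, sizeof(cap));
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) {
            perror("VIDIOC_QUERYCAP");
            close(fd);
            return 1;
        }

        printf("driver: %s, card: %s\n",
               (const char *)cap.driver, (const char *)cap.card);

        /* A stateful hardware codec exposes itself as a
           multi-planar mem-to-mem device. */
        if (cap.capabilities & V4L2_CAP_VIDEO_M2M_MPLANE)
            printf("M2M decoder present (hardware h264 path)\n");

        close(fd);
        return 0;
    }

In practice most people never touch this interface directly: ffmpeg's h264_v4l2m2m decoder and GStreamer's v4l2h264dec element sit on top of the same kernel API, which is where the ~30%-of-one-core figure comes from.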
_zoltan_ 4 days ago | parent
the title says "... AI projects". now, maybe our definitions are different, but you probably want some hardware acceleration.