cheema33 a day ago
I did use whisper last night to get the captions out of a video file. The standard whisper tool from OpenAI runs on the CPU. It took more than 20 minutes to fully process a video that was a little over an hour long, and during that time my 20-core CPU was pegged at 100% utilization and the fan got very loud. I then downloaded an Intel version that used the NPU. CPU usage stayed close to 0%, the fans remained quiet, and the whole task completed in about 6 minutes. NPUs can be useful in some cases. The AI PC crap is ill thought out, however.
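For reference, the CPU path with the openai-whisper Python package looks roughly like this; the model size and file names are placeholders, not necessarily what was run here:

    # Rough sketch of the CPU path with the openai-whisper package
    # (pip install openai-whisper; ffmpeg must be on PATH).
    # Model size and file names are placeholders.
    import whisper

    model = whisper.load_model("base")     # runs on CPU unless a CUDA device is available
    result = model.transcribe("talk.mp4")  # ffmpeg extracts/resamples the audio internally

    # Write a simple SRT-style caption file from the timestamped segments.
    def fmt(t):
        h, rem = divmod(t, 3600)
        m, s = divmod(rem, 60)
        return f"{int(h):02}:{int(m):02}:{int(s):02},{int((s % 1) * 1000):03}"

    with open("talk.srt", "w", encoding="utf-8") as f:
        for i, seg in enumerate(result["segments"], start=1):
            f.write(f"{i}\n{fmt(seg['start'])} --> {fmt(seg['end'])}\n{seg['text'].strip()}\n\n")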
BeetleB a day ago
I suggest trying whisper-cpp if you haven't. It's probably the fastest CPU-only version. But yeah, NPUs will likely be faster.
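Roughly, a whisper.cpp run looks like this; the binary name, model path, thread count, and flags below follow its stock example and may differ between versions:

    # Hypothetical wrapper around the whisper.cpp example CLI.
    # Binary and model paths are placeholders; newer builds ship the
    # example binary as "whisper-cli" instead of "main".
    import subprocess

    # The whisper.cpp example expects 16 kHz mono WAV, so convert the video's audio first.
    subprocess.run(
        ["ffmpeg", "-y", "-i", "talk.mp4",
         "-ar", "16000", "-ac", "1", "-c:a", "pcm_s16le", "talk.wav"],
        check=True,
    )

    # -m: ggml model file, -f: input audio, -t: CPU threads, -osrt: write talk.wav.srt
    subprocess.run(
        ["./main", "-m", "models/ggml-base.en.bin", "-f", "talk.wav", "-t", "8", "-osrt"],
        check=True,
    )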
zamadatix a day ago
If you mean OpenVINO, it uses CPU+GPU+NPU, not just the NPU. On something like a 265K, the NPU only provides 13 of the 36 total TOPS. Overall, I wish they would just put a few more general compute units in the GPU instead, ending up with something like 30 TOPS but more general-purpose performance overall.
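For reference, OpenVINO exposes the device list and lets you pin a model to a single device or hand it a priority list; a rough sketch (the model.xml path is a placeholder, and which device names show up depends on installed drivers):

    # Minimal OpenVINO device check / device-pinning sketch.
    # "model.xml" stands in for any exported OpenVINO IR model.
    import openvino as ov

    core = ov.Core()
    print(core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU'] if the drivers are installed

    # Pin the whole model to one device...
    compiled_npu = core.compile_model("model.xml", "NPU")

    # ...or let AUTO pick among a priority-ordered list of devices.
    compiled_auto = core.compile_model("model.xml", "AUTO:GPU,NPU,CPU")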