daemonologist | 4 days ago
ONNX Runtime purports to support CoreML: https://onnxruntime.ai/docs/execution-providers/CoreML-Execu... , which gives a decent amount of compatibility for inference. I have no idea to what extent workloads actually end up on the ANE though. (Unfortunately ONNX Runtime doesn't have a Vulkan execution provider, which limits it on other platforms. It's always something...)
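
For what it's worth, opting into it is just a matter of listing the CoreML EP when creating the session. A minimal Python sketch (the model path is a placeholder, and whether ops actually land on the ANE is entirely up to CoreML's internal scheduling):

    import onnxruntime as ort

    # Request CoreML, falling back to CPU for any ops the EP can't handle.
    sess = ort.InferenceSession(
        "model.onnx",  # placeholder path
        providers=["CoreMLExecutionProvider", "CPUExecutionProvider"],
    )

    # Reports which providers were actually applied; if CoreML failed to
    # load, you'll only see CPUExecutionProvider here.
    print(sess.get_providers())

Checking get_providers() at least tells you whether the CoreML EP attached; it still won't tell you whether CoreML chose the ANE, GPU, or CPU under the hood.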