song 4 hours ago
So, on a Mac, what's a good alternative to Ollama that supports MLX for acceleration? My main use case: I have an old M1 Max MacBook Pro with 64 GB of RAM that I use as a model server.
wrxd 3 hours ago | parent
I've read good things about https://omlx.ai, but I don't know the ecosystem well enough to say whether there are better options. If someone has opinions, please let us know!