Comment by flutetornado 3 days ago

3 replies

I was able to compile ollama for AMD Radeon 780M GPUs, and I use it regularly on my AMD mini-PC, which cost me $500. It does require a bit more work. I get pretty decent performance with LLMs — just a qualitative statement, as I didn't do any formal testing, but the performance felt comparable to an NVIDIA 4050 GPU laptop I also use.

https://github.com/likelovewant/ROCmLibs-for-gfx1103-AMD780M...
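For context, a common workaround reported for gfx1103 parts like the 780M (once patched ROCm libraries such as the ones linked above are installed) is to point ROCm's runtime at a supported gfx11 target via the `HSA_OVERRIDE_GFX_VERSION` environment variable. A minimal sketch, assuming a Linux install of ollama; the specific override value `11.0.2` is an assumption based on what 780M users commonly report, not something stated in the comment:

```shell
# Assumption: ollama and a ROCm runtime (possibly with patched gfx1103
# libraries) are already installed. HSA_OVERRIDE_GFX_VERSION tells the
# ROCm runtime to treat the iGPU as the given gfx target.
export HSA_OVERRIDE_GFX_VERSION=11.0.2

# Then start the server as usual, e.g.:
#   ollama serve
```

Whether an override is needed at all depends on the ROCm build; the patched libraries in the linked repo may make it unnecessary.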

vkazanov 3 days ago

Same here on a Lenovo ThinkPad 14s with an AMD Ryzen™ AI 7 PRO 360, which has a Radeon 880M iGPU. Works OK on Ubuntu.

Not saying it works everywhere, but it wasn't even that hard to set up — comparable to CUDA.

Hate the name though.