Used Open WebUI for the first time with the llama.cpp ROCm server + AMD RX 5700 + Llama 3 8B Instruct. Very slick local AI workflow. Speed is very fast. Slightly dumber, but it's not too bad.
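In case anyone wants to reproduce the setup: Open WebUI just talks to llama.cpp's OpenAI-compatible server endpoint. A rough sketch of hitting that same endpoint directly in Python (the port, model filename, and launch flags in the comments are assumptions, not my exact config):

```python
# Minimal sketch, assuming llama-server was started roughly like:
#   llama-server -m llama-3-8b-instruct.Q4_K_M.gguf --port 8080 -ngl 99
# (model file, port, and GPU-layer count are illustrative)
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",  # llama-server's OpenAI-compatible route
    json={
        "model": "llama-3-8b-instruct",            # name is whatever the server reports
        "messages": [{"role": "user", "content": "Say hi in one sentence."}],
        "temperature": 0.7,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Open WebUI is pointed at the same base URL as an OpenAI-compatible connection, so nothing model-specific is needed on the frontend side.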
Yeah I use that too!!!
I run it on my Mac Studio. Works well!
What else did you try?
There seems to be an active community, and there are several new releases every week! Haven’t found a better frontend yet.