Ok smart devs of nostr, I need your help.
I am doing this for learning purposes really, not that I expect real output from it.
I have Ollama installed (llama3.2, gemma3, qwen2.5) and I'm trying to connect Stacks to it.
I run Ollama, and in the Stacks configuration I set the provider to OpenAI-compatible.
API secret: localhost
URL: 127.0.0.1:11434
I've tried so many different combos, but it does not seem to be working. Any suggestions?
I tried it with Goose and it worked flawlessly.
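For reference, here's a minimal sketch of how I understand the OpenAI-compatible endpoint is supposed to be reached (assuming Ollama's default /v1 path and that the API key can be any non-empty placeholder, since Ollama doesn't actually check it):

# rough sanity check of Ollama's OpenAI-compatible API from Python
# assumes the openai package is installed and llama3.2 is already pulled
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:11434/v1",  # note the /v1 suffix
    api_key="ollama",  # placeholder; the value is ignored
)

resp = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "say hi"}],
)
print(resp.choices[0].message.content)

If that works but Stacks still fails, my guess is the URL field needs the full http://127.0.0.1:11434/v1 rather than just the host and port, but that's an assumption on my part.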
#asknostr
Alex Gleason, any words of wisdom on this?