Hacker News
buyucu on Jan 31, 2025 | on: Llama.cpp supports Vulkan. why doesn't Ollama?
llama.cpp already supports Vulkan, and that is where all the hard work lives. Ollama needs to do very little on top of it to support Vulkan: just check whether the Vulkan libraries are available and query the available VRAM. That is all. It is very simple.
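To illustrate the "check if the libraries are available" step, here is a minimal sketch in Go (Ollama's implementation language). The library name and search directories are illustrative Linux defaults, not Ollama's actual detection logic, and the VRAM query is left out since it would require real Vulkan bindings:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// vulkanAvailable reports whether a Vulkan loader library can be found
// in any of the given directories. The library name "libvulkan.so.1"
// and the caller-supplied search paths are assumptions for this sketch.
func vulkanAvailable(searchDirs []string) bool {
	for _, dir := range searchDirs {
		if _, err := os.Stat(filepath.Join(dir, "libvulkan.so.1")); err == nil {
			return true
		}
	}
	return false
}

func main() {
	// Typical Linux library locations; adjust for your distribution.
	dirs := []string{"/usr/lib", "/usr/lib/x86_64-linux-gnu", "/usr/local/lib"}
	fmt.Println("Vulkan loader found:", vulkanAvailable(dirs))
}
```

A real backend check would go further (e.g. actually loading the library and enumerating devices to read their VRAM), but the gating logic is essentially this simple presence test.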