Show HN: Timber – Ollama for classical ML models, 336x faster than Python github.com 42 points by kossisoroyce 3 hours ago
mehdibl 12 minutes ago Ollama is quite a bad example here. Despite being popular, it's a simple wrapper, and it's increasingly overshadowed by the app it wraps, llama.cpp. I don't understand the parallel here.
Dansvidania 18 minutes ago Can’t check it out yet, but the concept alone sounds great. Thank you for sharing.
I have been waiting for this! Nice.