So we can run it with llama.cpp.
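A minimal sketch of what that would look like, assuming a GGUF conversion of this model exists; `model.gguf` is a placeholder filename, not a confirmed release:

```shell
# Build llama.cpp from source (CPU-only) and run a GGUF model with it.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build && cmake --build build --config Release

# model.gguf is a placeholder: point this at the actual GGUF file
# for this model once a quantized conversion is available.
./build/bin/llama-cli -m model.gguf -p "Hello" -n 64
```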
Also waiting for one
Same here
Waiting