ref:
ggerganov/ggml#302
#1991
This PR paves the way for integrating more models into llama.cpp. It changes the file format in which we convert the models by extending it with key-value pairs meta...
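The key-value metadata idea can be sketched roughly like this. This is a hypothetical minimal parse of the GGUF header only, assuming the layout described in the spec draft (magic `GGUF`, then version, tensor count, and metadata KV count; 64-bit counts as in spec v2+) — field names and the in-memory demo are my own, not llama.cpp's actual API:

```python
import struct

GGUF_MAGIC = b"GGUF"

def parse_gguf_header(buf: bytes) -> dict:
    """Parse the fixed-size GGUF header: magic, version,
    tensor count, and metadata key-value count (v2+ layout assumed)."""
    if buf[:4] != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    version, = struct.unpack_from("<I", buf, 4)          # uint32 version
    n_tensors, n_kv = struct.unpack_from("<QQ", buf, 8)  # uint64 counts
    return {"version": version, "n_tensors": n_tensors, "n_kv": n_kv}

# Synthetic header built in memory just to demonstrate the parse:
# version 2, 2 tensors, 5 metadata key-value pairs.
header = GGUF_MAGIC + struct.pack("<IQQ", 2, 2, 5)
print(parse_gguf_header(header))  # {'version': 2, 'n_tensors': 2, 'n_kv': 5}
```

After this header come the KV pairs themselves (length-prefixed string key, a type tag, then the value), which is what lets new architectures add metadata without another breaking format change.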
They still consider it a beta but there we go! It’s happening :D
Sorry, I’m trying to get in the loop on this stuff — what’s the significance of this, and who will it affect?
AFAIK GGUF is a more extensible format that contains (or can contain) more metadata types, making it usable for different model architectures. The main advantage is that this should be the last breaking format change, as future changes can be added in a more modular way.
The significance is that we have a new file format standard. The bad news is it breaks compatibility with the old format, so you’ll have to update to use newer quants and you can’t use your old ones.
The good news is this is the last time that’ll happen (it’s happened a few times so far), as this one is meant to be a lot more extensible and flexible, storing a ton of extra metadata for better compatibility.
The great news is that this paves the way for better model support, as we’ve already seen with support for Falcon being merged: https://github.com/ggerganov/llama.cpp/commit/cf658adc832badaaa2ca119fe86070e5a830f8f6