The transformer-based Falcon 3 models use the Llama architecture, so they are already supported. I played around with the 1B, 3B, and 7B models yesterday.
I did run into an issue because the Falcon 3 models do not specify a BOS token. Once I set one manually, the models appeared to work correctly.
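For reference, a minimal sketch of that workaround using the Hugging Face `transformers` tokenizer. The model id and the choice of which token to reuse as BOS are assumptions here, not something the checkpoints specify; check the actual Falcon 3 vocabulary before relying on this.

```python
from transformers import AutoTokenizer

# Assumed checkpoint id -- adjust to the Falcon 3 model you are converting/running.
model_id = "tiiuae/Falcon3-7B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)

if tokenizer.bos_token is None:
    # Assumption: reuse an existing special token (here EOS) as BOS so that no new
    # token has to be added and no embedding resize is needed. Substitute whatever
    # BOS string the model card or vocabulary actually defines.
    tokenizer.bos_token = tokenizer.eos_token

print(tokenizer.bos_token, tokenizer.bos_token_id)
```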
Falcon3 was just released today:
https://huggingface.co/collections/tiiuae/falcon3-67605ae03578be86e4e87026
More details:
https://huggingface.co/blog/falcon3