These are excellent models, and they were recently added to AutoAWQ to maximize compatibility. EXAONE 3.5 beats Qwen 2.5 even in my custom RAG benchmark. I've tested version 3.5, but apparently a 3.0 was released a while back:
https://huggingface.co/LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct
https://github.com/casper-hansen/AutoAWQ/releases/tag/v0.2.7.post3
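For anyone who wants to reproduce the AWQ side of this, here's a rough sketch of quantizing the 3.5 checkpoint with AutoAWQ and loading it back. The output directory name is made up, and since the modeling code ships with the checkpoint rather than in transformers, `trust_remote_code` is needed:

```python
# Rough sketch: quantize EXAONE 3.5 with AutoAWQ (>= 0.2.7.post3) and reload it.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct"
quant_path = "exaone-3.5-7.8b-instruct-awq"  # hypothetical local output dir
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# Modeling code lives in the checkpoint repo, so trust_remote_code is required
# until the architecture lands in transformers proper.
model = AutoAWQForCausalLM.from_pretrained(model_path, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)

# Later, load the 4-bit weights for inference:
model = AutoAWQForCausalLM.from_quantized(quant_path, fuse_layers=True, trust_remote_code=True)
```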
Apparently people confuse it with Llama's architecture, but the nuances are discussed here:
huggingface/transformers#34652
HF has been slow to adopt it upstream, but I figured CTranslate2 could!
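For reference, here's a sketch of the workflow being requested, assuming an eventual EXAONE converter would follow the usual Transformers path in CTranslate2. The conversion step does not work today; that's exactly what this issue is asking for, and the output directory name is made up:

```python
# Hypothetical workflow once EXAONE support exists in CTranslate2.
#
#   ct2-transformers-converter \
#       --model LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct \
#       --output_dir exaone-3.5-7.8b-ct2 --quantization int8
#
import ctranslate2
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct", trust_remote_code=True
)
generator = ctranslate2.Generator("exaone-3.5-7.8b-ct2", device="cuda")

prompt = "Explain retrieval-augmented generation in one sentence."
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))
results = generator.generate_batch([tokens], max_length=128, sampling_temperature=0.7)
print(tokenizer.decode(results[0].sequences_ids[0]))
```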