Getting a traceback when generating helptext #86
Thanks for the report! The issue is very simple :) The python-llama-cpp library cannot find certain symbols inside the llama-cpp library in F40 - these 2 libraries are out of sync. So I thought I'd just update python-llama-cpp and we'd be good. Unfortunately, it's not that easy. llama-cpp was recently updated in F40:
Even when I update python-llama-cpp to "0.3.1", the latest upstream release, logdetective will still pick the older version because of our dependency setting:
Which causes:
So in order to fix F40, we need to:
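To illustrate the resolution behavior described above: with an upper-bound pin, the resolver excludes the newest release even when it is packaged. This is a minimal sketch only; the `< 0.3.0` bound and the version list are hypothetical stand-ins, since the actual dependency setting is not shown here.

```python
# Sketch of why a version pin selects an older release. The "< 0.3.0"
# upper bound is hypothetical; the real constraint in logdetective's
# dependency setting is not reproduced in this thread.

def parse(version: str) -> tuple:
    """Turn a version like "0.3.1" into (0, 3, 1) for comparison."""
    return tuple(int(part) for part in version.split("."))

available = ["0.2.90", "0.3.1"]   # hypothetical available builds
upper_bound = "0.3.0"             # hypothetical pin: llama-cpp-python < 0.3.0

# The resolver keeps only versions satisfying the pin, then takes the newest.
eligible = [v for v in available if parse(v) < parse(upper_bound)]
chosen = max(eligible, key=parse)
print(chosen)  # the newer 0.3.1 is excluded by the pin
```

This is why simply updating the python-llama-cpp package is not enough while the constraint still excludes the new release.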
Should be helped by #87
Finally figured it out, thanks everyone for help! https://src.fedoraproject.org/rpms/python-llama-cpp-python/pull-request/10 Will merge, build and do a bodhi update soon. Smoke-tested in a f40 container and all worked well:
https://bodhi.fedoraproject.org/updates/FEDORA-2024-25db690b63 please provide karma if the update fixes this problem @jherrman thanks again for reporting!
I'm getting the following error message when running `logdetective --help`:
Traceback (most recent call last):
  File "/usr/bin/logdetective", line 5, in <module>
    from logdetective.logdetective import main
  File "/usr/lib/python3.12/site-packages/logdetective/logdetective.py", line 6, in <module>
    from logdetective.utils import process_log, initialize_model, retrieve_log_content, format_snippets
  File "/usr/lib/python3.12/site-packages/logdetective/utils.py", line 7, in <module>
    from llama_cpp import Llama
  File "/usr/lib64/python3.12/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/usr/lib64/python3.12/site-packages/llama_cpp/llama_cpp.py", line 1434, in <module>
    @ctypes_function(
     ^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/site-packages/llama_cpp/llama_cpp.py", line 122, in decorator
    func = getattr(lib, name)
           ^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/ctypes/__init__.py", line 392, in __getattr__
    func = self.__getitem__(name)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/ctypes/__init__.py", line 397, in __getitem__
    func = self._FuncPtr((name_or_ordinal, self))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: /usr/lib64/libllama.so: undefined symbol: llama_model_apply_lora_from_file
System: Fedora 40
Hardware: ThinkPad T14
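The final `AttributeError` in the traceback comes from `ctypes` failing to resolve a symbol in the shared library, which is the signature of the two packages being out of sync. The failure mode can be reproduced generically; this is a minimal sketch, not logdetective code, and the missing symbol name used here is made up.

```python
import ctypes

# CDLL(None) loads the running process's own symbol namespace
# (dlopen(NULL) on Linux), so this sketch does not depend on
# any particular .so being installed.
lib = ctypes.CDLL(None)

# Looking up a symbol the library does not export raises
# AttributeError from ctypes' __getattr__/__getitem__, exactly
# like the libllama.so failure in the traceback above.
try:
    lib.made_up_missing_symbol_xyz  # hypothetical symbol, does not exist
except AttributeError as exc:
    print("undefined symbol lookup failed:", exc)
```

In the reported case the lookup that fails is `llama_model_apply_lora_from_file`, a symbol the updated libllama.so no longer exports.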