
Error when using more than 999 characters in prompt. #143

Open
ebildebil opened this issue Oct 6, 2024 · 2 comments
Labels
bug Something isn't working

Comments

ebildebil commented Oct 6, 2024

Which version of assistant are you using?

1.1.0 as configured by Nextcloud AIO

Which version of Nextcloud are you using?

29.0.7

Which browser are you using? In case you are using the phone App, specify the Android or iOS version and device please.

Chrome or Firefox latest

Describe the Bug

This may be connected to #59.

I have configured an external API for use with Nextcloud Assistant, in this case Mistral.
When I tried a prompt today I got the error below. My prompt has a total of 1009 characters and 182 words.

Text generation error: An exception occurred while executing a query: SQLSTATE[22001]: String data, right truncated: 7 ERROR: value too long for type character varying(1000)

In the logs I see the error below:

{"reqId":"V85FoTrYUb0uA2GqxpJj","level":3,"time":"2024-10-01T16:46:29+00:00","remoteAddr":"127.0.0.1","user":"bisu","app":"PHP","method":"POST","url":"/apps/assistant/f/process_prompt","message":"Undefined array key 7 at /var/www/html/lib/private/AppFramework/Http.php#128","userAgent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36","version":"29.0.7.1","data":{"app":"PHP"},"id":"66fc27c87b130"}

General googling suggests that this is likely a limitation on the number of characters that can be stored in the DB for that field (I could be wrong).
If so, note that newer models have a context size of 128k tokens, roughly 80k English words, or even 2 million tokens (Gemini Pro).
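
If the column really is declared as character varying(1000), PostgreSQL will reject any value longer than 1000 characters with SQLSTATE 22001, exactly like the error above. A minimal sketch of that behaviour (assuming a reachable local PostgreSQL instance and psycopg2 installed; the table and column names are hypothetical, not the Assistant's actual schema):

```python
import psycopg2

conn = psycopg2.connect("dbname=test user=test")  # assumption: local test database
cur = conn.cursor()
# Hypothetical column mirroring the varchar(1000) limit from the error message
cur.execute("CREATE TEMP TABLE demo (prompt character varying(1000))")

try:
    # 1009 characters, the same length as the failing prompt in this report
    cur.execute("INSERT INTO demo VALUES (%s)", ("x" * 1009,))
except psycopg2.errors.StringDataRightTruncation as e:
    print(e.pgcode, e)  # 22001 value too long for type character varying(1000)
```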

I have set 18000 as the prompt token limit.

When I edit my prompt down to 991 characters and 178 words, I do not get an error.
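
For scale, a rough comparison of the configured token limit against a 1000-character column (a sketch; the ~4 characters per English token figure is only a common approximation and varies by tokenizer):

```python
# Rough comparison; the chars-per-token ratio is an assumption, not a measured value.
prompt_token_limit = 18_000
approx_prompt_chars = prompt_token_limit * 4   # ~72,000 characters
db_column_limit_chars = 1_000                  # character varying(1000) from the error

# The column limit, not the configured token limit, is the far tighter bound.
print(approx_prompt_chars, db_column_limit_chars)  # 72000 1000
```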

Expected Behavior

There should not be any errors.
It should be able to handle large token sizes in relation to the models being used.

To Reproduce

Try a prompt with more than 1000 characters.
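
A quick way to build a reproduction prompt just over that boundary (a sketch; any text works, only the length matters):

```python
# Build a prompt slightly over the 1000-character limit reported above.
prompt = ("Summarize the following meeting notes in three bullet points. " * 20)[:1009]
print(len(prompt))  # 1009 characters, enough to trigger the varchar(1000) error
```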

ebildebil added the bug label Oct 6, 2024
julien-nc (Member)

Hey, this looks like a limitation of the Mistral API or the model you're using. I tried with OpenAI/GPT-3.5-turbo and it accepted a prompt with 3K chars. The Assistant and the Nextcloud task processing API are fine with long prompts.
Which model are you using?

The "Text generation error: An exception occurred..." error you get is actually the response of the network request to MistralAI.

ebildebil (Author)

Hi, I am using the Mistral-Large model. I tried this again today and the prompt larger than 1k characters failed.

Text generation error: An exception occurred while executing a query: SQLSTATE[22001]: String data, right truncated: 7 ERROR: value too long for type character varying(1000)

Also, I am using nextcloud-aio and have configured it to start the Local-AI container, although in the admin page for AI I use the Mistral API URL and API key, and have prompt tokens set to 18000. Not sure if that matters.

Anyhow, I will try this in a couple of days with a different frontend that also interacts with the Mistral API (e.g. lollms) and report back.
