
LocalAI API endpoint not working #129

Open
userbox020 opened this issue Sep 20, 2024 · 22 comments
Labels
bug Something isn't working

Comments

@userbox020

Which version of assistant are you using?

1.0

Which version of Nextcloud are you using?

v30

Which browser are you using? In case you are using the phone App, specify the Android or iOS version and device please.

Firefox and Google Latest

Describe the Bug

I have set up the LocalAI API endpoint and it works fine; Nextcloud even detects LocalAI's model list. But when I use any of the features to talk to the LLM, nothing is fetched and I get the following error:

GET /apps/assistant/chat/check_generation?taskId=20&sessionId=21 HTTP/1.1
Accept: application/json, text/plain, */*
Accept-Encoding: gzip, deflate, br, zstd
Accept-Language: en-US,en;q=0.9
Connection: keep-alive
Cookie: oc_sessionPassphrase=bgHjp8Tyr9s29ZBaTrH7iPAFHR1qe67PWMhdXBo9DKozrPQgxWP2lZyVfWrmlaIuyLddLrkLtacrstoEGQfwx55Sd%2BouuDeo5JZFoeezhAQpd6rLG4tFPCbiDbCQkMRy; nc_sameSiteCookielax=true; nc_sameSiteCookiestrict=true; ocj2thp9x9dg=a39aba2bb49eec84a8c772bb1b874c6f; nc_username=admin; nc_token=5H%2FNfg4NPPo%2BtewCKTl4%2FAtTRQQRviM9; nc_session_id=a39aba2bb49eec84a8c772bb1b874c6f
Host: localhost:8070
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-origin
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36
X-Requested-With: XMLHttpRequest, XMLHttpRequest
requesttoken: OUgglgKesBwaxlFwTA2Fq4dbST6P5ssrktPuoLYErE4=:SQNNo3DJ6SR9nDQfCUPun98sE2n+sK9Pxb6jxsB1gw0=
sec-ch-ua: "Chromium";v="128", "Not;A=Brand";v="24", "Google Chrome";v="128"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"


HTTP/1.1 417 Expectation Failed
Date: Fri, 20 Sep 2024 09:07:03 GMT
Server: Apache/2.4.62 (Debian)
Referrer-Policy: no-referrer
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-Permitted-Cross-Domain-Policies: none
X-Robots-Tag: noindex, nofollow
X-XSS-Protection: 1; mode=block
X-Powered-By: PHP/8.2.23
Content-Security-Policy: default-src 'none';base-uri 'none';manifest-src 'self';frame-ancestors 'none'
X-Request-Id: 1CXMQ5oeDFUkY6qEm2b4
Cache-Control: no-cache, no-store, must-revalidate
Feature-Policy: autoplay 'none';camera 'none';fullscreen 'none';geolocation 'none';microphone 'none';payment 'none'
Content-Length: 17
Keep-Alive: timeout=5, max=66
Connection: Keep-Alive
Content-Type: application/json; charset=utf-8

{"task_status":1}

Expected Behavior

Interact with the LLM normally

To Reproduce

Set up LocalAI, enter the URL in the AI settings, and interact with the AI

@userbox020 userbox020 added the bug Something isn't working label Sep 20, 2024
@userbox020
Author

(screenshot attached)

@userbox020
Author

(screenshot attached)

@userbox020
Author

 ChattyLLMInputForm.vue:574

{
  "stack": "X@http://localhost:8080/custom_apps/assistant/js/assistant-axios-lazy.js?v=fc9be637c540e816beb7:2:33123\njt@http://localhost:8080/custom_apps/assistant/js/assistant-axios-lazy.js?v=fc9be637c540e816beb7:2:44544\ng@http://localhost:8080/custom_apps/assistant/js/assistant-axios-lazy.js?v=fc9be637c540e816beb7:2:49220\nEventHandlerNonNull*70715/Vt</<@http://localhost:8080/custom_apps/assistant/js/assistant-axios-lazy.js?v=fc9be637c540e816beb7:2:49479\n70715/Vt<@http://localhost:8080/custom_apps/assistant/js/assistant-axios-lazy.js?v=fc9be637c540e816beb7:2:48808\nhe@http://localhost:8080/custom_apps/assistant/js/assistant-axios-lazy.js?v=fc9be637c540e816beb7:2:55714\n_request@http://localhost:8080/custom_apps/assistant/js/assistant-axios-lazy.js?v=fc9be637c540e816beb7:2:58541\nrequest@http://localhost:8080/custom_apps/assistant/js/assistant-axios-lazy.js?v=fc9be637c540e816beb7:2:57068\n70715/</we.prototype[t]@http://localhost:8080/custom_apps/assistant/js/assistant-axios-lazy.js?v=fc9be637c540e816beb7:2:58836\n70715/o/<@http://localhost:8080/custom_apps/assistant/js/assistant-axios-lazy.js?v=fc9be637c540e816beb7:2:27421\n22974/pollGenerationTask/</this.pollMessageGenerationTimerId<@http://localhost:8080/custom_apps/assistant/js/assistant-assistant-modal-lazy.js?v=5858c70ce6cd3b1bf029:1:75703\nsetInterval 
handler*22974/pollGenerationTask/<@http://localhost:8080/custom_apps/assistant/js/assistant-assistant-modal-lazy.js?v=5858c70ce6cd3b1bf029:1:75674\npollGenerationTask@http://localhost:8080/custom_apps/assistant/js/assistant-assistant-modal-lazy.js?v=5858c70ce6cd3b1bf029:1:75619\nrunGenerationTask@http://localhost:8080/custom_apps/assistant/js/assistant-assistant-modal-lazy.js?v=5858c70ce6cd3b1bf029:1:74766\nasync*newMessage@http://localhost:8080/custom_apps/assistant/js/assistant-assistant-modal-lazy.js?v=5858c70ce6cd3b1bf029:1:73843\nasync*handleSubmit@http://localhost:8080/custom_apps/assistant/js/assistant-assistant-modal-lazy.js?v=5858c70ce6cd3b1bf029:1:70457\nasync*va@http://localhost:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:221685\na@http://localhost:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:214375\nva@http://localhost:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:221685\n85471/e.prototype.$emit@http://localhost:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:243546\nsubmit@http://localhost:8080/custom_apps/assistant/js/assistant-assistant-modal-lazy.js?v=5858c70ce6cd3b1bf029:1:67271\nva@http://localhost:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:221685\na@http://localhost:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:214375\nva@http://localhost:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:221685\n85471/e.prototype.$emit@http://localhost:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:243546\nonEnter@http://localhost:8080/custom_apps/assistant/js/assistant-vendors-node_modules_nextcloud_vue_dist_Components_NcAppNavigationNew_mjs-node_modules_nextcl-28eca6.js?v=8fe271131d67e837ec65:2:567606\n42796/R/<.on.keydown<@http://localhost:8080/custom_apps/assistant/js/assistant-vendors-node_modules_nextcloud_vue_dist_Components_NcAppNavigationNew_mjs-node_modules_nextcl-28eca6.js?v=8fe271131d67e837ec65:2:571115\nva@http://local
host:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:221685\na@http://localhost:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:214454\n85471/Do/s._wrapper@http://localhost:8080/custom_apps/assistant/js/assistant-main.js?v=06ec47aa-0:2:254591\n",
  "message": "Request failed with status code 417",
  "name": "AxiosError",
  "code": "ERR_BAD_REQUEST",
  "config": {
    "transitional": {
      "silentJSONParsing": true,
      "forcedJSONParsing": true,
      "clarifyTimeoutError": false
    },
    "adapter": [
      "xhr",
      "http",
      "fetch"
    ],
    "transformRequest": [
      null
    ],
    "transformResponse": [
      null
    ],
    "timeout": 0,
    "xsrfCookieName": "XSRF-TOKEN",
    "xsrfHeaderName": "X-XSRF-TOKEN",
    "maxContentLength": -1,
    "maxBodyLength": -1,
    "env": {},
    "headers": {
      "Accept": "application/json, text/plain, */*",
      "requesttoken": "tSslCxlJ4AS/CVIa0wZOUpo1uRefeiiG48WlInf+J0g=:mnFuOkMmrUjUXhhVvEAHBMIA+CX6CRDljbbXVAKWYhk=",
      "X-Requested-With": "XMLHttpRequest"
    },
    "params": {
      "taskId": 1,
      "sessionId": 1
    },
    "method": "get",
    "url": "/apps/assistant/chat/check_generation"
  },
  "request": {},
  "response": {
    "data": {
      "task_status": 1
    },
    "status": 417,
    "statusText": "Expectation Failed",
    "headers": {
      "cache-control": "no-cache, no-store, must-revalidate",
      "connection": "Keep-Alive",
      "content-length": "17",
      "content-security-policy": "default-src 'none';base-uri 'none';manifest-src 'self';frame-ancestors 'none'",
      "content-type": "application/json; charset=utf-8",
      "date": "Fri, 20 Sep 2024 21:04:42 GMT",
      "feature-policy": "autoplay 'none';camera 'none';fullscreen 'none';geolocation 'none';microphone 'none';payment 'none'",
      "keep-alive": "timeout=5, max=35",
      "referrer-policy": "no-referrer",
      "server": "Apache/2.4.62 (Debian)",
      "x-content-type-options": "nosniff",
      "x-frame-options": "SAMEORIGIN",
      "x-permitted-cross-domain-policies": "none",
      "x-powered-by": "PHP/8.2.23",
      "x-request-id": "uUTJzffketuPRiXStzqx",
      "x-robots-tag": "noindex, nofollow",
      "x-xss-protection": "1; mode=block"
    },
    "config": {
      "transitional": {
        "silentJSONParsing": true,
        "forcedJSONParsing": true,
        "clarifyTimeoutError": false
      },
      "adapter": [
        "xhr",
        "http",
        "fetch"
      ],
      "transformRequest": [
        null
      ],
      "transformResponse": [
        null
      ],
      "timeout": 0,
      "xsrfCookieName": "XSRF-TOKEN",
      "xsrfHeaderName": "X-XSRF-TOKEN",
      "maxContentLength": -1,
      "maxBodyLength": -1,
      "env": {},
      "headers": {
        "Accept": "application/json, text/plain, */*",
        "requesttoken": "tSslCxlJ4AS/CVIa0wZOUpo1uRefeiiG48WlInf+J0g=:mnFuOkMmrUjUXhhVvEAHBMIA+CX6CRDljbbXVAKWYhk=",
        "X-Requested-With": "XMLHttpRequest"
      },
      "params": {
        "taskId": 1,
        "sessionId": 1
      },
      "method": "get",
      "url": "/apps/assistant/chat/check_generation"
    },
    "request": {}
  },
  "status": 417
}

@ctft1

ctft1 commented Sep 21, 2024

I can confirm there is also something wrong with my LocalAI setup and Nextcloud. Everything works fine when using the LocalAI GUI, or even a curl API call: results come back really fast (1–2 seconds; I have 2x GPUs with CUDA enabled).

But when querying through the Assistant, the answer takes a very long time (usually at least 2–3 minutes). I watched LocalAI with debug enabled, and in the meantime LocalAI does... nothing. It just waits, and at some point returns the answer really quickly.

I also had a look at the Nextcloud Assistant page, and I can see this 417 response doing nothing until it finally gets a 200 response (the top "new_message" is from when the chat button is pressed to send the message).

(screenshot, 2024-09-21)

@ctft1

ctft1 commented Sep 21, 2024

I forgot to mention my versions:

  • Nextcloud v30.0.0 from AIO Docker on Ubuntu (fresh install)
  • Nextcloud Hub 9
  • Nextcloud Assistant v2.0.4
  • OpenAI and LocalAI integration v3.1.0

@ctft1

ctft1 commented Sep 21, 2024

I found more information about my issue.

The "Chat with AI" queries (and those from the other AI sections) are only processed once every 5 minutes (at 9:00 / 9:05 / 9:10 / ...), which explains the delay before getting an answer; that is of course not the expected behavior.

Can someone have a look and try to reproduce this? Thank you.

@userbox020
Author

@ctft1 it's definitely a Nextcloud or Assistant app problem bro, because LocalAI works perfectly and integrates with any other framework through its API without any problem.
Hopefully we can get some help with this from the Nextcloud or Assistant team before they release Assistant v2.

@kinimodmeyer

kinimodmeyer commented Oct 7, 2024

@ctft1 to get tasks processed faster than the 5 minutes (which is your default cron job schedule), you need to keep the following running at all times:

occ background-job:worker 'OC\TaskProcessing\SynchronousBackgroundJob'

But I can confirm the Expectation Failed issue.

@kinimodmeyer

Since updating the Assistant today, it works in my case.

@ramezquitao

I just updated to 2.1.1 and the problem still persists for me

@userbox020
Author

@ramezquitao they're trolling bro, Nextcloud just focuses on sales, not on fixing errors

@hdnh2006

hdnh2006 commented Nov 7, 2024

I just updated to 2.1.1 and for me the problem still continues

I can confirm the problem persists. This could be a wonderful feature for Nextcloud, but as @userbox020 says, they are focusing all their efforts on other things.

I have my own "ChatGPT" (anyone can use it, by the way: chat.privategpt.es), and I cannot connect the Nextcloud Assistant using OpenAI or my own app; both fail with the same errors:
(screenshot attached)

I hope they pay attention to us, because this application should not be hard to build and it could help a lot.

@julien-nc
Member

Are you running occ background-job:worker? Some things have changed in NC 30: the tasks now run in background jobs. More details:
#142 (comment)
https://docs.nextcloud.com/server/latest/admin_manual/ai/overview.html#ai-overview-improve-ai-task-pickup-speed

There is almost no warning about this change. We'll add it in the admin settings of Assistant and integration_openai.

Let's try to stay constructive here please.

@hdnh2006

hdnh2006 commented Nov 8, 2024

Are you running occ background-job:worker? Some things have changed in NC 30, the tasks are running in background jobs. More details: #142 (comment) https://docs.nextcloud.com/server/latest/admin_manual/ai/overview.html#ai-overview-improve-ai-task-pickup-speed

There is almost no warning about this change. We'll add it in the admin settings of Assistant and integration_openai.

Let's try to stay constructive here please.

I don't know exactly what you mean. First, I am running a dockerized Nextcloud. I am not an expert in PHP, but it is clear that NC is not working as the user expects: if I send a request to OpenAI or any other AI service like chat.privategpt.es, I expect to get a streaming response as usual.

In any case, it is clear that the call from NC to the AI service never arrives, as I can see in the logs of chat.privategpt.es.

This is my docker compose file:

version: '3.7'

services:
  nextcloud:
    image: nextcloud
    container_name: nextcloud
    ports:
      - 8080:80
    volumes:
      - nextcloud_data:/var/www/html/data
      - nextcloud_config:/var/www/html/config
      - nextcloud_apps:/var/www/html/apps
      - /home/artivant/Public:/mnt/public
    environment:
      - NEXTCLOUD_ADMIN_USER=myuser
      - NEXTCLOUD_ADMIN_PASSWORD=mypass
    restart: unless-stopped

volumes:
  nextcloud_data:
    name: nextcloud_data_volume
  nextcloud_config:
    name: nextcloud_config_volume
  nextcloud_apps:
    name: nextcloud_apps_volume

Could you please try any AI service and see if you can reproduce this error?

@julien-nc
Member

julien-nc commented Nov 8, 2024

@hdnh2006 I can't reproduce the issue.
I'm pretty sure you aren't running a worker. Could you try launching:

docker compose exec nextcloud occ background-job:worker -v "OC\TaskProcessing\SynchronousBackgroundJob"

and launch a task (or send a message in the chat UI) while the occ command is running?

I am not an expert in PHP, but it is clear that NC is not working as the user expects.

If the admin sets up a worker that executes the background jobs without delay, the assistant + integration_openai should work as a user expects 😁

@ramezquitao

The solution given by @julien-nc works for me. Thanks!

@SchBenedikt

SchBenedikt commented Jan 1, 2025

@julien-nc

occ background-job:worker -v "OC\TaskProcessing\SynchronousBackgroundJob"

Running your command while asking a question with "Chat with AI", this is my terminal log:

Waiting for new jobs to be queued
Waiting for new jobs to be queued
Waiting for new jobs to be queued
Waiting for new jobs to be queued
Waiting for new jobs to be queued
Waiting for new jobs to be queued
Waiting for new jobs to be queued
Running job OC\TaskProcessing\SynchronousBackgroundJob with ID 4689148
Job class:            OC\TaskProcessing\SynchronousBackgroundJob
Arguments:            null
Type:                 queued

Last checked:         2025-01-01T20:32:35+00:00
Reserved at:          2025-01-01T20:32:35+00:00
Last executed:        1970-01-01T00:00:00+00:00
Last duration:        0
Job 4689148 has finished
Running job OC\TaskProcessing\SynchronousBackgroundJob with ID 4689149
Job class:            OC\TaskProcessing\SynchronousBackgroundJob
Arguments:            null
Type:                 queued

Last checked:         2025-01-01T20:32:35+00:00
Reserved at:          2025-01-01T20:32:35+00:00
Last executed:        1970-01-01T00:00:00+00:00
Last duration:        0
Job 4689149 has finished
Waiting for new jobs to be queued
Waiting for new jobs to be queued
Waiting for new jobs to be queued

It worked once or twice - but more often than not, it didn't.

(PS: I use Apache with MySQL)

@SchBenedikt

It worked with: sudo -u www-data php /var/www/nextcloud/occ background-job:worker 'OC\TaskProcessing\SynchronousBackgroundJob'

@winoiknow

The solution given by @julien-nc works, but I have to keep the terminal open. Is there a way to make the worker process run permanently?
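One common way to keep such a worker running permanently (a sketch, not from this thread: the unit name, the occ path, and the www-data user are assumptions that must match your installation; the Nextcloud admin docs linked earlier describe the canonical setup) is a systemd service:

```ini
# /etc/systemd/system/nextcloud-ai-worker.service  (hypothetical unit name)
[Unit]
Description=Nextcloud TaskProcessing background worker
After=network.target

[Service]
# Adjust the user and the path to occ for your installation.
# If systemd mangles the backslashes in the class name, try escaping them as \\.
User=www-data
ExecStart=/usr/bin/php /var/www/nextcloud/occ background-job:worker 'OC\TaskProcessing\SynchronousBackgroundJob'
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl enable --now nextcloud-ai-worker`; since the worker loops forever waiting for jobs, `Restart=always` simply brings it back if it ever exits.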

@winoiknow

winoiknow commented Jan 1, 2025

@SchBenedikt Awesome! Works like a champ now. Thanks much.

@skleffmann

Just running occ background-job:worker works, and the AI chat answers within a couple of seconds.
But using occ background-job:worker 'OC\TaskProcessing\SynchronousBackgroundJob' gives me the following error message:

Invalid job class: OCTaskProcessingSynchronousBackgroundJob

I am running linuxserver's NC container.
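That mangled class name is exactly what you get when the argument passes through an extra unquoted shell layer (one plausible cause, for example a wrapper script inside the container; this is an assumption, not something confirmed in this thread): a POSIX shell consumes each unquoted backslash as an escape character. A minimal sketch:

```shell
# Unquoted: the shell eats each backslash as an escape character.
unquoted=$(printf '%s' OC\TaskProcessing\SynchronousBackgroundJob)
printf '%s\n' "$unquoted"    # OCTaskProcessingSynchronousBackgroundJob

# Single-quoted: the backslashes reach the program intact.
quoted='OC\TaskProcessing\SynchronousBackgroundJob'
printf '%s\n' "$quoted"      # OC\TaskProcessing\SynchronousBackgroundJob
```

So if the single-quoted form still produces the stripped name, something between your shell and PHP (e.g. the container's occ wrapper) is unquoting the argument a second time; doubling the backslashes, or entering the container and running occ directly, may help.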
