
Issue with 502 Bad Gateway and 499 Errors on /api/jobs/<job_id>/preview Endpoint #8760

Open · 2 tasks done
Martin-Ball opened this issue Nov 30, 2024 · 1 comment
Labels
bug (Something isn't working) · need info (Need more information to investigate the issue)

Comments

@Martin-Ball

Actions before raising this issue

  • I searched the existing issues and did not find anything similar.
  • I read/searched the docs

Steps to Reproduce

I am experiencing persistent issues with the /api/jobs/<job_id>/preview endpoint in my CVAT setup. The following are the details:

The whole system is deployed on a Google Cloud Kubernetes cluster. It worked very well for two weeks of heavy labelling, including exporting some very large annotation zips to Google Cloud Storage (I love CVAT after that; it exceeded all my expectations). Then I had an issue with a PVC inside the cluster, and since resolving it I have been hitting this error, which I cannot fix.

Error:
When I try to preview an image on the Jobs tab, or inside a job, I get similar errors.

(Screenshots: Captura de pantalla 2024-11-30 172951, Captura de pantalla 2024-11-30 162959)

Environment:

  • CVAT version: [Include your CVAT version or Docker image tag, e.g., cvat/server:latest]
  • Deployment: Kubernetes
  • Backend: NGINX, Uvicorn (Unix socket at /tmp/uvicorn.sock)

Issue Details:

  • When attempting to preview data for a specific job (e.g., /api/jobs/2/preview), I receive a 502 Bad Gateway from the server.
  • Additionally, the NGINX logs show frequent 499 errors, which indicate that the client is closing the connection prematurely.
Log Snippets:

NGINX Access Log:

```
35.191.32.27 - - [30/Nov/2024:21:02:17 +0000] "GET /api/jobs/2/preview?org= HTTP/1.1" 499 0 "https://preventapp.site/jobs" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36"
```
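A hedged way to see whether the 499s are simply the browser giving up while the upstream hangs is to time the same request from outside the cluster. This is only a sketch: the host comes from the log line above, but the cookie values are placeholders for a real authenticated browser session, not something taken from this report.

```bash
# Time the preview request end to end; -m caps the wait at 120 s so a hang becomes visible
# instead of waiting forever. sessionid/csrftoken are placeholders - copy them from the
# browser's devtools for an authenticated CVAT session.
curl -s -o /dev/null -m 120 \
  -H 'Cookie: sessionid=<session-id>; csrftoken=<csrf-token>' \
  -w 'http_code=%{http_code} connect=%{time_connect}s ttfb=%{time_starttransfer}s total=%{time_total}s\n' \
  'https://preventapp.site/api/jobs/2/preview?org='
```

If the time to first byte keeps growing until NGINX gives up, the upstream is slow or stuck; if the request fails immediately with 502, NGINX cannot reach the socket at all.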

Backend response when accessed directly via the Unix socket:

```bash
$ curl -v --unix-socket /tmp/uvicorn.sock http://localhost/api/jobs/2/preview?org=
* Connected to localhost (/tmp/uvicorn.sock) port 80 (#0)
> GET /api/jobs/2/preview?org= HTTP/1.1
> Host: localhost
> User-Agent: curl/7.81.0
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
< content-type: application/vnd.cvat+json
<
{"detail":"Authentication credentials were not provided."}
```
NGINX Error Log:

Permission Listing for Relevant Directories:

Folders for job data are created successfully after a new task:

```shell
$ ls -l /home/django/data/data/2
drwxrwxrwx 2 django django 4096 Nov 30 20:15 compressed
drwxrwxrwx 2 django django 4096 Nov 30 20:15 original
drwxrwxrwx 2 django django 4096 Nov 30 20:15 raw
```

Permissions for /tmp/uvicorn.sock (adjusted manually):

```shell
$ chmod 777 /tmp/uvicorn.sock
$ ls -l /tmp/uvicorn.sock
srwxrwxrwx 1 django django 0 Nov 30 20:42 /tmp/uvicorn.sock
```
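One thing worth checking, sketched below, is whether the user the NGINX workers actually run as can connect to the socket, and whether the chmod survives an Uvicorn restart (the socket file is recreated on restart, so manual permissions are lost). The www-data name is an assumption; substitute whatever the worker processes report.

```bash
# See which user the NGINX worker processes run as.
ps -o user= -o comm= -C nginx

# Try the socket as that user (www-data is a guess). A 401 status proves connectivity;
# "Permission denied" or "Connection refused" would explain the 502 directly.
su -s /bin/sh -c \
  'curl -sS -o /dev/null -w "%{http_code}\n" --unix-socket /tmp/uvicorn.sock "http://localhost/api/jobs/2/preview?org="' \
  www-data
```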

Steps Taken:

  • Increased proxy_read_timeout, proxy_send_timeout, and proxy_connect_timeout in nginx.conf.
  • Verified that /tmp/uvicorn.sock exists and is accessible.
  • Adjusted client_max_body_size in nginx.conf.
  • Tested the backend response directly with curl (401 Unauthorized).
  • Confirmed permissions for directories and files in /home/django/data.
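To see the exact upstream error behind the 502, a hedged sketch of pulling the NGINX error log from the pod that fronts CVAT; the namespace, pod, container, and log path are placeholders for whatever this deployment uses:

```bash
# Grab the most recent lines of the NGINX error log from the fronting pod (names are placeholders).
kubectl -n <namespace> exec <nginx-pod> -c <nginx-container> -- tail -n 100 /var/log/nginx/error.log
```

A 502 against a Unix-socket upstream usually appears there as "connect() to unix:/tmp/uvicorn.sock failed (13: Permission denied)", "connect() to unix:/tmp/uvicorn.sock failed (2: No such file or directory)", or "upstream prematurely closed connection while reading response header from upstream", and each of those points to a different fix.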
Expected Behavior:
The /api/jobs/<job_id>/preview endpoint should return a valid preview for the given job instead of resulting in a 502 Bad Gateway.

Questions:

  • Is there a specific CVAT image tag or version where this issue is addressed?
  • Are there additional configurations required to fix the 502 Bad Gateway or 499 errors?
  • Could the manual permission adjustments for /tmp/uvicorn.sock be related to this issue?

Any guidance or suggestions to resolve this issue would be greatly appreciated!

Expected Behavior

No response

Possible Solution

No response

Context

No response

Environment

No response

@Martin-Ball added the bug label on Nov 30, 2024
@azhavoro (Contributor) commented on Dec 4, 2024

Any logs from the server pod?
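For reference, a hedged sketch of how those logs could be collected, assuming a Helm-style deployment; the namespace, pod, and container names are guesses and should be replaced with whatever the actual release uses:

```bash
# List pods, then dump recent logs from the CVAT backend pod (names are placeholders).
kubectl -n <namespace> get pods
kubectl -n <namespace> logs <cvat-backend-pod> -c <backend-container> --tail=200

# If the pod restarted around the PVC incident, the previous container's logs may matter too.
kubectl -n <namespace> logs <cvat-backend-pod> -c <backend-container> --previous --tail=200
```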

@bsekachev added the need info label on Dec 10, 2024