
coomer cannot sleep hard enough, maybe needs a more advanced sleep setup? #6597

Closed
left1000 opened this issue Dec 3, 2024 · 7 comments

left1000 commented Dec 3, 2024

I am getting a standard 403 from coomer even when using --sleep-request 8:

```
[coomerparty][debug] Sleeping 7.98 seconds (request)
[urllib3.connectionpool][debug] https://coomer.su:443 "GET /api/v1/onlyfans/user/ANYLARGECOOMERRIP?o=2500 HTTP/1.1" 403 1090
[coomerparty][error] HttpError: '403 Forbidden' for 'https://coomer.su/api/v1/onlyfans/user/ANYLARGECOOMERRIP?o=2500'
[coomerparty][debug] Traceback (most recent call last):
  File "gallery_dl\job.pyc", line 151, in run
  File "gallery_dl\extractor\kemonoparty.pyc", line 82, in items
  File "gallery_dl\extractor\kemonoparty.pyc", line 558, in _pagination
  File "gallery_dl\extractor\kemonoparty.pyc", line 551, in _call
  File "gallery_dl\extractor\common.pyc", line 244, in request
gallery_dl.exception.HttpError: '403 Forbidden' for 'https://coomer.su/api/v1/onlyfans/user/ANYLARGECOOMERRIP?o=2500'
```

A possible solution? Sleeps need a way to grow the more often 403s occur... so something like

--sleep-request 8 --sleep-request-growth-acceleration 2

or some such... it would either double the seconds to sleep or add 2 seconds each time, whichever method you'd prefer...

This is something rclone does to deal with Google Drive; that's where I got the idea...
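Roughly what I have in mind, as a minimal sketch (the --sleep-request-growth-acceleration flag above is hypothetical, and this is a generic requests retry loop, not actual gallery-dl code):

```python
import time
import requests

def fetch_with_backoff(url, base_delay=8.0, factor=2.0,
                       max_delay=640.0, max_tries=8):
    """GET a URL, growing the sleep between attempts after each 403."""
    delay = base_delay
    for _ in range(max_tries):
        time.sleep(delay)
        response = requests.get(url)
        if response.status_code != 403:
            return response
        # grow the delay after every 403, the way rclone's pacer backs off
        delay = min(delay * factor, max_delay)
    raise RuntimeError(f"still 403 after {max_tries} tries: {url}")
```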

The situation where this becomes a serious problem is onlyfans rips with thousands of posts; tiny coomer rips without that many pages won't face this issue.

left1000 (Author) commented Dec 3, 2024

Even --sleep-request 68 wasn't good enough; I still ran into 403 errors... but if the sleep grew on each failure, it could sleep for 10/20/40/80/160/320/640 seconds or something, and that'd probably work...

edit: --sleep-request 168 failed, trying 368 now

edit2: coomer just went offline for me, so either this issue is irrelevant or mandatory, depending on whether coomer is now blocking me personally or everyone :)

left1000 (Author) commented Dec 4, 2024

I went from getting a 403 at ?o=2000 to getting it at ?o=50, even though I've gone from --sleep-request 60 to --sleep-request 1600, so the site is likely just f'd. Although... gallery-dl does not appear to have any way to resume, so every time I crash out due to a 403, I have to start at o=0...

If, after the 403, instead of crashing and restarting I could resume on whatever page of the coomer rip I left off on (each page being an o= offset in steps of 50, in case that's not clear from this issue)...

Usually I just repeat a rip to resume it... but coomer is so fragile today that backing off for ten minutes and then resuming without crashing out would be ideal :(

```
[coomerparty][error] HttpError: '403 Forbidden' for 'https://coomer.su/api/v1/onlyfans/user/ONLYFANSPAGENAME?o=50'
[coomerparty][debug] Traceback (most recent call last):
  File "gallery_dl\job.pyc", line 151, in run
  File "gallery_dl\extractor\kemonoparty.pyc", line 82, in items
  File "gallery_dl\extractor\kemonoparty.pyc", line 558, in _pagination
  File "gallery_dl\extractor\kemonoparty.pyc", line 551, in _call
  File "gallery_dl\extractor\common.pyc", line 244, in request
gallery_dl.exception.HttpError: '403 Forbidden' for 'https://coomer.su/api/v1/onlyfans/user/ONLYFANSPAGENAME?o=50'
```

Surviving this error without crashing out of the command would be ideal... at least the first once or twice I hit this error, I'd love it if gallery-dl just waited out my --sleep-request timer... in other words, no need to resume if it simply never crashes out...
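In the meantime, a rough sketch of that retry-and-resume loop as an external wrapper. It assumes gallery-dl writes its log to stderr and that an ?o= offset in the URL resumes pagination (as discussed below); USERNAME and the ten-minute wait are placeholders:

```python
import re
import subprocess
import time

URL = "https://coomer.su/onlyfans/user/USERNAME"  # USERNAME is a placeholder

offset = 0
while True:
    proc = subprocess.run(
        ["gallery-dl", "-v", f"{URL}?o={offset}"],
        capture_output=True,
        text=True,
    )
    if proc.returncode == 0:
        break  # rip finished cleanly
    # pull the failing offset out of the 403 error line, if one is present
    match = re.search(r"403 Forbidden' for '[^']*\?o=(\d+)", proc.stderr)
    if match is None:
        raise SystemExit(proc.returncode)  # some other failure; give up
    offset = int(match.group(1))
    time.sleep(600)  # back off for ten minutes, then resume at that page
```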

mikf (Owner) commented Dec 4, 2024

You can resume from a specific offset by providing o=... as a query parameter, so you don't have to go over all the already-downloaded posts over and over again:

Never mind, I'm stupid and broke this functionality in 74d855c

params["o"] = text.parse_int(params.get("o")) % 50

left1000 (Author) commented Dec 5, 2024

Oh, also: I'm either IP-banned or temporarily IP-banned from coomer :( which explains some of these issues. Although it sounds like, in some ways, the bug you're describing essentially turned my tests of the issue into a mini DDoS... I'll wait a few days and see if it works for me again before doing any more complaining about coomer :)

mikf added a commit that referenced this issue Dec 5, 2024
mikf (Owner) commented Dec 7, 2024

Using o= query parameters is fixed: f33ca82

left1000 closed this as completed Dec 7, 2024
MAZ06 commented Dec 9, 2024

@mikf This is not fixed. I think it only makes it last a little longer, but it still eventually fails. It's noticeable on bigger profiles with many pages. Whether it fails or not sometimes seems random.

left1000 (Author) commented

> @mikf This is not fixed. I think it only makes it last a little longer, but it still eventually fails. It's noticeable on bigger profiles with many pages. Whether it fails or not sometimes seems random.

Yeah, but that's not on gallery-dl; coomer is dying under heavy traffic. With the bugfix on our end, though, we can resume halfway through a rip instead of starting from the beginning. That's why I marked it closed. I created the issue in the first place because there was literally nothing I could do to work around the problem, but now there is.

Although maybe the code could be improved further; I don't personally know how.

```
gallery-dl -v "https://coomer.su/onlyfans/user/USERNAME?o=2050"
```

But yeah, if you get the 403 error at o=2050, you can now resume from that page.
