wget subprocess crashes on Ubuntu 22.04 #124
newnode deliberately runs wget under a low maximum file size limit as a way to cap the size of downloads, because I couldn't find a wget option that did that reliably. But wget shouldn't generate a core file when that limit is hit, so I'll try to fix that. The real fix is to implement this in some other way than spawning wget.
Yes, after reviewing the https_wget.c code, it does appear that you went on quite a journey down a rabbit hole to make it work. The curl multi API seems like an easier alternative at the moment. Interestingly, limiting the file size indeed does not work: the --quota option does not limit the download size of a single file by design (since the file size is not always known at the beginning of a download), and range requests can also be ignored by the HTTP server. libcurl appears to have the same limitation, for the same reasons. Have you considered using a timeout rather than file size as the limiting factor? Wget does support --read-timeout.
On 5/5/22 21:14, Anatoly Ivanov wrote:
> Yes, after reviewing the https_wget.c code, it does appear that you
> went on quite a journey down a rabbit hole to make it work.
Right. At the time I started on that approach, wget seemed like the easiest path. I've considered two or three alternatives, but there have always been more urgent things to implement.
> Have you considered using a timeout rather than file size as a
> limiting factor? Wget does support --read-timeout.
We do implement a timeout, but for different reasons.
Keith
The proxy works, but the wget subprocess crashes on Ubuntu 22.04.
The backtrace from wget looks like this: