Nginx, huge downloads protected by s2member

I ran into this problem in the past but forgot to mention it.

    [09-Nov-2022 10:47:00] WARNING: [pool www] child 1358706, script '/var/www/openmtbmap.org/htdocs/index.php' (request: "GET /index.php?s2member_file_download=10m/osx/gmapi_contours_europe10m.7z") execution timed out (1250.592901 sec), terminating
    [09-Nov-2022 10:47:00] WARNING: [pool www] child 1358706 exited on signal 15 (SIGTERM) after 2890.632954 seconds from start

When you see this warning, you would expect that something is wrong with one of the following settings:

php.ini:
max_execution_time = 27200

nginx.conf:
proxy_read_timeout 300;
proxy_connect_timeout 300;
proxy_send_timeout 300;

However, even if you set them to generous values, like 30000 (the values are in seconds), you will still not solve this problem, because s2member somehow requires nginx's FastCGI temp file size to be at least as big as the download:

    fastcgi_max_temp_file_size 18000M;
    fastcgi_read_timeout 27000;

fastcgi_max_temp_file_size needs to be at least as big as your largest protected download. I have some 10-12 GB downloads, so I had to increase it to 13000M or more. I also raised fastcgi_read_timeout, although I'm not sure whether that one actually matters here.
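For reference, this is roughly what the relevant location block in my vhost looks like after the change. This is a sketch: the fastcgi_pass socket path is just an example and has to match your PHP-FPM pool.

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        # example socket path - adjust to your PHP-FPM pool
        fastcgi_pass unix:/run/php/php-fpm.sock;

        # must be at least as big as the largest s2member-protected download
        fastcgi_max_temp_file_size 18000M;
        # give PHP enough time to stream the whole file to slow clients
        fastcgi_read_timeout 27000;
    }

(In theory "fastcgi_buffering off;" should avoid the temp file altogether, but I haven't tested whether that plays well with s2member.)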

It must be something peculiar about the way s2member protects downloads, because googling this you will likely not find a solution.
In general it's best not to protect large downloads with s2member, as no browser supports resuming them by default. The only way I've found to resume s2member downloads is Free Download Manager with its browser plugin, which lets FDM pick up the download. Maybe other download manager software is okay too, but so far FDM is the only one I found that resumes reliably. If s2member could be improved to work with the built-in resume support of Chrome/Edge/Safari, that would be great.
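By the way, you can quickly check whether a given URL supports resuming at all by requesting a small byte range and looking at the status code. The URL below is just a placeholder for one of your protected download links; for s2member-protected files you would also have to send your login cookies.

    curl -s -o /dev/null -D - -r 0-99 "https://example.org/?s2member_file_download=somefile.7z"
    # HTTP/1.1 206 Partial Content -> Range honored, resuming is possible
    # HTTP/1.1 200 OK              -> Range ignored, an interrupted download restarts from zero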

Thanks for the feedback on that, Felix. I’m making a note about it.

If self-hosting a protected file that large is a problem, maybe the S3 integration could be a solution?

:slight_smile:

Hosting huge protected files on S3 is way too expensive. I'm not really sure what the problem with resume support in s2member is; if the standard download managers of Chrome, Safari, Edge and Firefox could resume s2member downloads, that would be a really big improvement. (And I think this is independent of whether you run nginx with PHP over FastCGI or Apache with PHP as the frontend, because the downloads seem to be handled by PHP anyhow.)

I see. Got it.

When accessing a file protected by s2Member, s2 is not just forwarding the user to the file, it's serving the file via PHP. So maybe for the browser resuming works a bit differently than with a normal file download, although you say Free Download Manager can do it, so it is possible…

I checked the code that handles the downloads (s2member/src/includes/classes/files-in.inc.php) and see that it includes the Accept-Ranges header, so it already allows for partial downloads and resuming (which explains why Free Download Manager can do it).
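For anyone following along, serving a ranged request from PHP roughly follows the pattern below. This is a minimal sketch of the general technique for a single range, not s2Member's actual code, and the file path is made up.

    <?php
    // Minimal sketch: serve $file while honoring a single "Range: bytes=start-end" header.
    $file = '/var/downloads/big-file.7z'; // hypothetical path
    $size = filesize($file);
    $start = 0;
    $end   = $size - 1;

    header('Accept-Ranges: bytes'); // advertise that partial requests are supported

    if (isset($_SERVER['HTTP_RANGE']) && preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
        $start = (int) $m[1];
        if ($m[2] !== '') {
            $end = (int) $m[2];
        }
        http_response_code(206); // Partial Content
        header("Content-Range: bytes $start-$end/$size");
    }

    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . ($end - $start + 1));

    // Stream the requested slice in chunks so memory use stays flat.
    $fp = fopen($file, 'rb');
    fseek($fp, $start);
    $remaining = $end - $start + 1;
    while ($remaining > 0 && !feof($fp)) {
        $chunk = fread($fp, (int) min(8192, $remaining));
        echo $chunk;
        $remaining -= strlen($chunk);
        flush();
    }
    fclose($fp);

A download manager that resumes simply sends such a Range header with the byte offset where the previous attempt stopped.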

I’d need to study this further…

It would be great if you could find a solution to this. Yeah, FDM can resume, I think because it picks up the download the same way: it only works with the FDM browser plugin installed, which forwards the request information to the FDM program, and FDM then uses that same information for resuming.

In general, AWS S3 and the like are great for offering (many) small files for download, but for large downloads a host with a dedicated, unmetered 1 Gbit (or faster) connection like Hetzner is the way to go. On AWS I would likely pay a high three-digit USD amount per month, maybe over 1000 USD/month, while only paying about 50 USD/month for a dedicated server at Hetzner.
