Thread tagged as: Error, Hosting, Members

Members App: Secure Downloads: Downloads fail (timeout?)


File downloads by logged-in members start normally, but fail after a few minutes with a "Connection to server was reset" error.

The files are saved in a secure bucket outside the www root and can be quite large (from 70 KB to 1.2 GB). Failed downloads can happen even with small files.

Could it be a timeout issue with the PHP script that streams the secure files? The session timeout is set to 24 minutes (see diagnostics report below), but the connection reset happens after just a few minutes (approximately 5).

From Diagnostics:

Perch: 2.7.9, PHP: 5.4.34, MySQL: 5.1.73, with PDO
Server OS: Linux, fpm-fcgi
Installed apps: content (2.7.9), assets (2.7.9), categories (2.7.9), perch_members (1.1), perch_backup (1.2)
App runtimes: <?php $apps_list = array( 'content', 'categories', 'perch_members' );
Image manipulation: GD
PHP limits: Max upload 64M, Max POST 64M, Memory: 512M, Total max file upload: 64M
Resource folder writeable: Yes

PHP: 5.4.34
Zend: 2.4.0
OS: Linux
SAPI: fpm-fcgi
Safe mode: not detected
MySQL client: 5.1.73
MySQL server: 5.1.73-2+squeeze+build1+1-log
Extensions: Core, date, ereg, libxml, openssl, pcre, sqlite3, zlib, bcmath, bz2, calendar, ctype, curl, dba, dom, hash, fileinfo, filter, ftp, gd, gettext, gmp, SPL, iconv, session, intl, json, mbstring, mcrypt, mysql, mysqli, PDO, pdo_mysql, pdo_pgsql, pdo_sqlite, pgsql, standard, posix, pspell, Reflection, imap, SimpleXML, soap, sockets, Phar, exif, sysvmsg, sysvsem, sysvshm, tokenizer, wddx, xml, xmlreader, xmlrpc, xmlwriter, xsl, zip, memcache, cgi-fcgi, mhash, ionCube Loader, Zend OPcache
GD: Yes
ImageMagick: No
PHP max upload size: 64M
PHP max form post size: 64M
PHP memory limit: 512M
Total max uploadable file size: 64M
Resource folder writeable: Yes
Session timeout: 24 minutes
Native JSON: Yes
Filter functions: Yes
Transliteration functions: Yes

Any help appreciated.

Stéphane Mégécaze


It's more likely to be your PHP max upload size and PHP max form post size. Are you able to edit your server's php.ini (or equivalent) file? You'll need to raise these limits to cover the biggest file you're likely to transfer (1.2 GB is roughly 1229M). Hopefully that'll fix it :)
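If you do have php.ini access, the relevant directives look something like this (the values below are illustrative, sized for a 1.2 GB file; your host may enforce lower caps regardless):

```ini
; Example php.ini settings for large file transfers (values are illustrative)
upload_max_filesize = 1300M   ; must cover the largest single file
post_max_size = 1300M         ; must be >= upload_max_filesize
max_execution_time = 600      ; seconds before PHP kills the script
memory_limit = 512M           ; streaming in chunks keeps actual use low
```

After changing these, restart PHP-FPM (or reload the web server) for the new values to take effect.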

Also, check your form template has the right group declared in the perch:input's accept="" attribute. These groups are defined in perch/config/filetypes.ini, and all you have to do is declare the header that's in square brackets, e.g.

<perch:input type="file" id="file" label="File" accept="image pdf office" />

That one got me good and proper last week! Good luck :)

Hello Martin,

Thank you for the suggestions!

The files are uploaded to the secure buckets via FTP, so there is no issue with max upload size or Perch templates.

It's an issue during download, as if the connection was reset after about 5 minutes (tested with the latest Chrome and Safari). I suspect a PHP timeout issue, and wonder if any of you have been able to offer large files for download.

Drew McLellan 2638 points
Perch Support

When the download is terminated, do you get anything in your server's PHP error log?

Hello Drew,

Here is the server's error log:

[Tue Jan 27 17:32:42 2015] [error] [client] [host] (104)Connection reset by peer: FastCGI: comm with server "/" aborted: read failed, referer:
[Tue Jan 27 17:32:42 2015] [error] [client] [host] Handler for fastcgi-script returned invalid result code 1, referer:
Drew McLellan 2638 points
Perch Support

What does your download.php file do?

It's very basic:


    <?php
    // Remove the PHP time limit to avoid timeouts during download
    // (doesn't solve the issue)
    set_time_limit(0);

    // Language detection
    // ...

    // Perch
    // ...

    // config
    $bucket_name = 'active';
    $url_param   = 'file';

    // By default, deny downloads unless we've proved the member is allowed this file.
    $allow_download = false;

    // Check if member is logged in
    // if (perch_member_logged_in()) {
    //     $allow_download = true;
    // }

    if (perch_member_has_tag('active')) {
        $allow_download = true;
    }

    if ($allow_download) {
        perch_members_secure_download(perch_get($url_param), $bucket_name);
    }


and the services page calling the download script has links such as this one:

<a href="<?php if (perch_member_has_tag('active')) { ?>/services/download.php?file=/fw1617/<?php } else { echo "#"; } ?>" target="_blank" download="download" class="download-link"><?php echo _("Prints"); ?></a>
Drew McLellan 2638 points
Perch Support

That looks like it should work. To be honest, I'm not sure what's causing the problem.

In theory, how large could the downloadable files be? Have you been able to offer 1 GB downloads successfully? I'm trying to figure out whether the issue is with server settings (timeout, memory) or with the download script.
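For reference, a secure-download script generally avoids memory limits by streaming the file in small chunks rather than loading it whole. A minimal sketch of that pattern (the path and access check are hypothetical placeholders, not Perch's actual implementation):

```php
<?php
// Minimal chunked-streaming sketch. $path is a hypothetical, already
// access-checked file outside the www root.
$path = '/home/site/secure-bucket/example.zip';

set_time_limit(0); // lift PHP's own execution limit (the host may enforce another)

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . basename($path) . '"');

$fp = fopen($path, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192); // 8 KB chunks keep memory usage flat
    flush();               // push each chunk to the client as it's read
}
fclose($fp);
```

Note that even with set_time_limit(0), an external limit (FastCGI, proxy, or a host-enforced max execution time) can still terminate a long-running download mid-stream.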

I've got a file transfer system (both uploads and downloads) that works with files up to 2 GB. I had some help setting up the server, and I remember something being set on the server to allow the large downloads, just not exactly what… I'll ask my server friend and get back to you if we can work out/remember what we did.

Thanks Martin, that's encouraging! Curious to know which setting/rule had to be added to prevent the issues we're experiencing.

Clutching at straws a wee bit, but could it be something to do with this?:

I doubt it, as we are way below the bandwidth quota (the website hasn't launched yet), and it fails after only about 5 minutes.

We have no access to php.ini as it's a shared server.

I've got a dedicated virtual server set up to handle the site, so maybe there was something there that got it working.

How about hosting the uploads on something like S3? I think you may have to upgrade Perch to Runway, but it would probably be cheaper overall than moving to a new server.

Thank you Martin and Drew.

The issue was caused by the host, which kills scripts running for more than 300 seconds (max execution time).

We worked around the problem by creating a symlink to the file in a randomly named directory within the www root and letting the browser download the file directly (no PHP "streaming"). The symlink is then deleted by a cron job after a few minutes to prevent download links being shared.

Our solution was inspired by this script
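For anyone hitting the same host-enforced execution limit, the workaround described above can be sketched roughly like this (all paths and names are hypothetical; the access check from the original download.php would still run first):

```php
<?php
// Sketch of the symlink workaround. Instead of streaming the file
// through PHP, create a short-lived symlink inside the www root and
// redirect the browser to it, so the web server delivers the file
// directly and PHP's 300-second limit no longer applies.

$secure_path = '/home/site/secure-bucket/fw1617/prints.zip'; // outside www root (hypothetical)
$token       = bin2hex(openssl_random_pseudo_bytes(16));     // unguessable directory name
$link_dir    = $_SERVER['DOCUMENT_ROOT'] . '/dl/' . $token;

mkdir($link_dir, 0755, true);
symlink($secure_path, $link_dir . '/' . basename($secure_path));

header('Location: /dl/' . $token . '/' . basename($secure_path));
exit;
```

A cron job then sweeps up links older than a few minutes, e.g. `find /path/to/www/dl -mindepth 1 -mmin +5 -delete`. Note this requires a filesystem and web server configuration that follow symlinks (e.g. Apache's FollowSymLinks).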

Wow! I'll be sure to keep an eye out for that! Thanks Stéphane and congratulations on the fix :)