Just today I had a situation where I wanted to make a large ZIP file available for download, but the host has a memory limit that I cannot change. Memory should rarely be a problem in modern applications, yet in this case my code exhausted the RAM and aborted the process whenever the requested ZIP file was too large.

Boom: `Fatal error: Allowed memory size of 134217728 bytes exhausted` (that is exactly 128 MB, the common default `memory_limit`).

Every developer hates it when this happens, because these things can be hard to debug. In my case I knew it had something to do with reading the file and sending it to the browser.

I used the following code to send the file from the server to the browser, and this is where it failed:

```php
// ...
// Reads the WHOLE ZIP into memory at once: this is what exhausted the RAM
$data = file_get_contents($outputPath);
Assert::string($data);
$length = strlen($data);

$response = new Response();
$response->headers->set('Content-Type', 'application/zip');
$response->headers->set(
    'Content-Disposition',
    sprintf('attachment; filename="downloadZip-%s.zip"', $project->getIdentifier()),
);
$response->headers->set('Content-Length', (string) $length);
$response->setContent($data);

@unlink($outputPath);

return $response;
```

The problematic line was the one where I attempted to read the whole ZIP into the `$data` variable. `file_get_contents` is an easy tool for reading files, but it buffers the entire file in RAM, which definitely becomes a problem when the ZIP is too big.
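You can watch this happen with `memory_get_peak_usage()`. Just a quick sketch; the path to the large test file is made up:

```php
// Peak memory grows by roughly the size of the file, because
// file_get_contents() buffers all of it at once.
$before = memory_get_peak_usage(true);

$data = file_get_contents('/tmp/some-large-archive.zip'); // hypothetical test file

$after = memory_get_peak_usage(true);
printf("Additional peak memory: %d bytes\n", $after - $before);
```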

So I checked out potential solutions and eventually figured out that I would have to read this file chunk by chunk in order not to overload the RAM with one large file. This has the obvious downside of limiting the throughput slightly, but it works quite well on my small-RAM host. Note, also to myself: if increasing the RAM is the solution to a problem, then the problem still exists and will eventually come up again. Not that we tried that here... but still.

In this case the RAM could not have been increased to begin with, so luckily I didn't even have the option of that low-hanging fix. A manager would have suggested it right away. :-)

So, my less beautiful and not very Symfony-like solution now looks like this, and it works with files up to 5 GB (I have not tested any larger file yet):

```php
// Headers are sent manually here, bypassing Symfony's Response object
header('Pragma: public');
header('Expires: 0');
// A second header('Cache-Control: ...') call would replace the first one,
// so all directives go into a single call
header('Cache-Control: public, must-revalidate, post-check=0, pre-check=0');
header('Content-Type: application/zip');
header('Content-Transfer-Encoding: Binary');
header(sprintf('Content-Disposition: attachment; filename="downloadZip-%s.zip"', $project->getIdentifier()));
header('Content-Length: ' . filesize($outputPath));

$fp = @fopen($outputPath, 'rb');
if ($fp) {
    // Stream the file in 8 KB chunks so only one chunk sits in RAM at a time
    while (!feof($fp)) {
        echo fread($fp, 8192);
        flush();

        // Stop early if the client aborted the download
        if (connection_status() !== CONNECTION_NORMAL) {
            @fclose($fp);

            return;
        }
    }
    @fclose($fp);
}

@unlink($outputPath);
```

This is probably not the best possible solution, and I am still figuring out whether there is a more Symfony-like way to handle this, but for now I get to look at the happy faces of the clients using the software, knowing that they can download their ZIP archives as expected.
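Symfony does ship a `BinaryFileResponse` that streams a file from disk instead of buffering it in PHP's memory, so it might be that missing Symfony-like way. I have not tested it against my 5 GB archives yet, so take it as a sketch:

```php
use Symfony\Component\HttpFoundation\BinaryFileResponse;
use Symfony\Component\HttpFoundation\ResponseHeaderBag;

// Streams the file from disk instead of loading it into PHP memory
$response = new BinaryFileResponse($outputPath);
$response->headers->set('Content-Type', 'application/zip');
$response->setContentDisposition(
    ResponseHeaderBag::DISPOSITION_ATTACHMENT,
    sprintf('downloadZip-%s.zip', $project->getIdentifier()),
);
// Replaces the manual @unlink(): the file is removed once it has been sent
$response->deleteFileAfterSend(true);

return $response;
```

Under the hood it copies the file to the output stream in chunks, so it should avoid the memory problem for the same reason the manual loop above does.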

I hope this post can help some developers who are in dire need of a solution. Please let me know by e-mail if you have any questions. I'll be happy to assist.