Viewing 7 posts - 1 through 7 (of 7 total)
  • #3237
    Andrew Frueh
    Participant

    Hi there, love the plugin; it has been a huge help on our site. Recently we’ve run into a strange issue. When attempting to download a file through the URL created by “Download Attachments”, a 0 kB file is downloaded. The page still shows it as 352 MB, and if I link directly to the file it works fine. This appears to only be an issue on very large files (over 100 MB). We’re running WordPress 3.7.1 and version 1.0.8 of the plugin. Have you seen this issue before? Is there a workaround we can implement? Thanks in advance!

    #3239
    Bartosz
    Keymaster

    Hello Andrew,

    And thank you for reporting this.

    I’m wondering what might be the issue here. Before we dive into DA, please check one thing for us: insert into any post or page a simple link to one of the large downloadable files (.zip for example) and try to download it at the frontend of the site.

    #3240
    Andrew Frueh
    Participant

    Hi Bartosz,
    Thanks very much for the quick reply! As requested, I dropped a link into the same post using “Insert Media” and selecting the zip file. After publishing, that 300+ MB link does indeed download correctly, so it would appear not to be a server or WordPress issue with the large file. I’ve checked the issue in Firefox and Chrome and it behaves the same in both. Let me know if there’s anything else you’d like me to test.

    #3242
    Bartosz
    Keymaster

    So far we’ve tested this on a local server with 500 MB+ files and it worked well. On Monday we’ll run some tests on shared hosting and a VPS server and get back to you.

    #3243
    Andrew Frueh
    Participant

    Interesting. Thanks for looking into it. If there is any additional info I can get you about our specific setup, let me know.

    #3257
    Bartosz
    Keymaster

    Andrew, I just emailed you.

    #3265
    Andrew Frueh
    Participant

    Hey Bartosz,
    Thanks very much. No luck with the change to zlib.output_compression unfortunately, but it did help me a great deal in knowing where to look in the code to debug. It turned out that the server was only allocating 67 MB of memory to PHP, so as soon as a file was larger than that, the script failed. Rather than raising the memory allocation (because we’d always have a hypothetical ceiling), we changed the download call slightly in the da_download_attachment function. Instead of using fopen and reading the entire file, we used a single header call:

    header( 'Location: http://nameofsite.com/wp-content/uploads/' . $attachment );

    Obviously hardcoding the site URL is sloppy, but it’s good enough for a quick fix. I wonder, is there a benefit to reading the whole file as you currently do? Or could you do something similar to the above and just issue a header redirect to the file’s path, so you don’t run into memory constraints?
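    For reference, a minimal sketch of the chunked-read alternative mentioned above, which avoids both the hardcoded redirect and the memory ceiling: read and flush the file a fixed-size chunk at a time so PHP never holds the whole file in memory. The function name and `$filepath` parameter here are illustrative, not the plugin’s actual code, and `$filepath` is assumed to be an already-validated local path.

    ```php
    <?php
    // Illustrative sketch only: stream a file to the browser in fixed-size
    // chunks so the download is not limited by PHP's memory_limit.
    // $filepath is assumed to be a validated local path on the server.
    function stream_file_in_chunks( $filepath, $chunk_size = 1048576 ) {
        $handle = fopen( $filepath, 'rb' );
        if ( $handle === false ) {
            return false;
        }

        header( 'Content-Type: application/octet-stream' );
        header( 'Content-Disposition: attachment; filename="' . basename( $filepath ) . '"' );
        header( 'Content-Length: ' . filesize( $filepath ) );

        // Read and flush 1 MB at a time; only one chunk is ever in memory.
        while ( ! feof( $handle ) ) {
            echo fread( $handle, $chunk_size );
            flush();
        }

        fclose( $handle );
        return true;
    }
    ```

    Compared with the header redirect, this keeps the file path hidden from the client and still works for files larger than the PHP memory allocation.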

    Thanks again for being so responsive and helping me work out a solution. Really appreciate it!
