20 March 2014 at 21:55 #3237
Hi there, I love the plugin; it has been a huge help on our site. Recently we have run into a strange issue. When attempting to download a file through the URL created by “Download Attachments”, a 0 kB file is downloaded. The page still shows it as 352 MB, and if I link directly to the file it works fine. This appears to be an issue only with very large files (over 100 MB). We are running WordPress 3.7.1 with version 1.0.8 of the plugin. Have you seen this issue before? Is there a workaround we can implement? Thanks in advance!

21 March 2014 at 16:08 #3239
Thank you for reporting this.
I’m wondering what the issue might be here. Before we dive into DA, please check one thing for us: insert into any post or page a plain link to one of the large downloadable files (a .zip, for example) and try to download it on the frontend of the site.

21 March 2014 at 16:51 #3240
Thanks very much for the quick reply! As requested, I dropped a link into the same post using “Insert Media” and selecting the zip file. After publishing, that 300+ MB link does indeed download correctly, so it would appear not to be a server or WordPress issue with the large file. I’ve checked the issue in Firefox and Chrome and it behaves the same in both. Let me know if there’s anything else you’d like me to test.

21 March 2014 at 20:14 #3242
So far we have tested this on a local server with 500 MB+ files and it worked well. On Monday we’ll run some tests on shared hosting and a VPS server and get back to you.

21 March 2014 at 20:36 #3243
Interesting. Thanks for looking into it. If there is any additional info I can get you about our specific setup, let me know.

24 March 2014 at 13:24 #3257
Andrew, I just emailed you.

24 March 2014 at 17:59 #3265
Thanks very much. No luck with the change to zlib.output_compression, unfortunately, but it did help a great deal in knowing where to look in the code to debug. It turned out that the server was only allocating 67 MB of memory to PHP, so as soon as a file was larger than that, the script failed. Rather than raising the memory allocation (because we’d always have a hypothetical ceiling), we changed the download call in the da_download_attachment function. Instead of calling fopen and reading the entire file into memory, we used a single header call:
header( 'Location: http://nameofsite.com/wp-content/uploads/' . $attachment );
Obviously hardcoding the URL of the site is sloppy, but it is good enough for a quick fix. I wonder, is there a benefit to reading the whole file as you currently do? Or could you do something similar to the above and simply send a Location header pointing at the file’s path, so you don’t run into memory constraints?
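For what it’s worth, here is a rough sketch of a middle ground between the two approaches: keep the download going through PHP (so the plugin can still force a download and count it), but stream the file in fixed-size chunks instead of reading it all at once, so PHP’s memory limit is never a ceiling. The function name da_stream_file and the chunk size are illustrative only, not part of the plugin:

```php
<?php
// Hypothetical chunked-streaming alternative to reading the whole file:
// only one chunk is held in memory at a time, so a 67 MB memory_limit
// can still serve a 352 MB file.
function da_stream_file( $path, $chunk_size = 1048576 ) {
	if ( ! is_readable( $path ) ) {
		return false;
	}

	// In a web context these headers force a download; in CLI they are no-ops.
	header( 'Content-Type: application/octet-stream' );
	header( 'Content-Disposition: attachment; filename="' . basename( $path ) . '"' );
	header( 'Content-Length: ' . filesize( $path ) );

	$handle = fopen( $path, 'rb' );
	while ( ! feof( $handle ) ) {
		// Read and emit at most $chunk_size bytes per iteration.
		echo fread( $handle, $chunk_size );
		flush(); // push the chunk to the client instead of buffering it all
	}
	fclose( $handle );
	return true;
}
```

Compared with the Location redirect above, this keeps the real file path hidden and the plugin in control of the response, at the cost of tying up a PHP process for the duration of the download.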
Thanks again for being so responsive and helping me work out a solution. Really appreciate it!