Problem uploading big files


Sirmadsen

Recommended Posts

Hi.

I'm having problems uploading big files on my community. Below 1 GB is fine, but as soon as a file is above 1 GB I get this error: "There was a problem processing the uploaded file. Please contact us for assistance." The upload goes all the way to 100%, then it loads for a couple of minutes and throws the error.

I'm on a VPS, so I have access to all settings, and I've set php.ini as below:

file_uploads = On
max_execution_time = 20000
max_input_time = -1
max_input_vars = 1000
memory_limit = -1
session.save_path = ""
upload_max_filesize = 5G
session.gc_maxlifetime = 1440
zlib.output_compression = Off
post_max_size = 10G
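An aside on the shorthand values above: PHP's ini parser multiplies the K, M, and G suffixes by powers of 1024, and treats -1 as unlimited. A small Python sketch of that conversion (my reading of the php.ini shorthand convention, not code from the suite):

```python
def php_shorthand_to_bytes(value: str) -> int:
    """Convert a php.ini shorthand size (e.g. '5G', '512M') to bytes.

    PHP multiplies by powers of 1024 for the K/M/G suffixes;
    '-1' conventionally means 'no limit'.
    """
    value = value.strip()
    if value == "-1":
        return -1  # unlimited
    multipliers = {"K": 1024, "M": 1024 ** 2, "G": 1024 ** 3}
    suffix = value[-1].upper()
    if suffix in multipliers:
        return int(value[:-1]) * multipliers[suffix]
    return int(value)  # bare number: already bytes

print(php_shorthand_to_bytes("5G"))   # 5368709120
print(php_shorthand_to_bytes("10G"))  # 10737418240
```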

Not sure why it's not working...

Any tips?


3 hours ago, Nesdam1981 said:

Hi.

I'm having problems uploading big files on my community. Below 1 GB is fine, but as soon as a file is above 1 GB I get this error: "There was a problem processing the uploaded file. Please contact us for assistance." The upload goes all the way to 100%, then it loads for a couple of minutes and throws the error.

I'm on a VPS, so I have access to all settings, and I've set php.ini as below:

file_uploads = On
max_execution_time = 20000
max_input_time = -1
max_input_vars = 1000
memory_limit = -1
session.save_path = ""
upload_max_filesize = 5G
session.gc_maxlifetime = 1440
zlib.output_compression = Off
post_max_size = 10G

Not sure why it's not working...

Any tips?

Try using M instead of G (e.g. 10G = 10240M; PHP's shorthand multiplies by 1024 per suffix step).
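For reference, converting G to M under PHP's shorthand means multiplying by 1024 rather than 1000. A quick arithmetic sketch of the equivalents for the settings above:

```python
# PHP's ini shorthand multiplies by 1024 per suffix step (K, M, G),
# so a G value converts to M by multiplying by 1024, not 1000.
def g_to_m(gigs: int) -> int:
    return gigs * 1024

print(f"upload_max_filesize = 5G  -> {g_to_m(5)}M")   # 5120M
print(f"post_max_size       = 10G -> {g_to_m(10)}M")  # 10240M
```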


Quote

There was a problem processing the uploaded file. Please contact us for assistance.

1) Check the system logs in the AdminCP to see if anything has been logged.

2) If you are familiar with using your browser console, watch the network tab during the upload and look at the response received when the upload finishes. Is it an error/error page or something else? What's the HTTP response code?

If you aren't familiar with these things you're certainly welcome to submit a ticket so we can take a look directly.
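If you'd rather script this than watch the browser console, here is a minimal Python sketch of capturing the HTTP status code and body of a POST response, which is essentially what the network tab shows you (the URL below is a placeholder, not your real upload endpoint):

```python
import urllib.error
import urllib.request

def check_upload_response(url: str, payload: bytes) -> tuple[int, bytes]:
    """POST raw bytes and return (HTTP status code, response body).

    A 4xx/5xx error page is captured rather than raised, since the
    error response is exactly what we want to inspect.
    """
    req = urllib.request.Request(url, data=payload, method="POST")
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            return resp.status, resp.read()
    except urllib.error.HTTPError as err:
        return err.code, err.read()

# Placeholder usage; substitute your community's real upload URL:
# status, body = check_upload_response("https://example.com/upload", b"x" * 1024)
# print(status, body[:200])
```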


If you are using S3, then this means when the file is uploaded to the server it has to be read into memory and transferred to S3. Naturally, with extremely large files this may not be feasible. We will look into further performance improvements with S3 integration in a future release, but for ultra large files like the ones you're talking about, local file system is likely going to work best.
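I can't speak to the suite's actual S3 handler, but the general fix for this class of problem is streaming the file in fixed-size chunks instead of reading it into memory whole (for S3 specifically, boto3's upload_fileobj streams a file object in multipart chunks for you). A generic Python sketch of the chunked approach:

```python
CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB per read, instead of the whole file

def stream_copy(src_path: str, dst_path: str) -> int:
    """Copy a file in fixed-size chunks, so peak memory stays around
    CHUNK_SIZE no matter how large the file is. Returns bytes copied."""
    total = 0
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(CHUNK_SIZE)
            if not chunk:
                break
            dst.write(chunk)
            total += len(chunk)
    return total
```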


3 hours ago, bfarber said:

If you are using S3, then this means when the file is uploaded to the server it has to be read into memory and transferred to S3. Naturally, with extremely large files this may not be feasible. We will look into further performance improvements with S3 integration in a future release, but for ultra large files like the ones you're talking about, local file system is likely going to work best.

Glad I never tried to transfer to S3. I also use big files for the Downloads app (video files). A backup plan of mine, if my server space gets low, has been S3 or a second server to host files. Based on this, it sounds like neither of those options would actually work. YIKES!!

I am worried about what I will do if my disk space gets low on local server.


  • 3 weeks later...
On 4/11/2018 at 9:17 AM, bfarber said:

If you are using S3, then this means when the file is uploaded to the server it has to be read into memory and transferred to S3. Naturally, with extremely large files this may not be feasible. We will look into further performance improvements with S3 integration in a future release, but for ultra large files like the ones you're talking about, local file system is likely going to work best.

I'm having an issue with downloads app where anything over 35-40MB is giving me an error:

[screenshot: upload error message]

My Downloads app is using Amazon S3. Regular attachments in my forum are saving locally and uploading without issue. What could be causing this?


On 4/11/2018 at 9:17 AM, bfarber said:

If you are using S3, then this means when the file is uploaded to the server it has to be read into memory and transferred to S3. Naturally, with extremely large files this may not be feasible. We will look into further performance improvements with S3 integration in a future release, but for ultra large files like the ones you're talking about, local file system is likely going to work best.

Would it be possible to save them to the file system first, rather than having them read into memory, and then transfer them to S3 after that save is complete? Would that be an appropriate workaround?


From a technical standpoint, the file is stored locally on the file system and then transferred to S3. The issue you're running into is that a lot of memory is used during this transfer.

Ultimately your only (short term) solution is to remove or raise the memory limit.
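If the transfer really does buffer the entire file in memory, a back-of-the-envelope check (my rule of thumb, not an official formula) is whether memory_limit exceeds the largest allowed upload with some headroom:

```python
def memory_limit_ok(memory_limit_mb: int, max_upload_mb: int,
                    headroom: float = 2.0) -> bool:
    """Rough check: if a transfer buffers the entire file in memory,
    memory_limit should exceed the upload size with headroom left
    for the rest of the request. -1 means unlimited."""
    if memory_limit_mb == -1:
        return True
    return memory_limit_mb >= max_upload_mb * headroom

print(memory_limit_ok(512, 100))    # a 100MB cap under a 512M limit
print(memory_limit_ok(1024, 1024))  # a 1GB file needs well over 1024M
```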


On 5/4/2018 at 10:45 PM, SJ77 said:

Sounds like the issue described here

 

Something close to that.

On 5/4/2018 at 11:21 PM, AlexWright said:

Would it be possible to save them to the file system first, rather than having them read into memory, and then transfer them to S3 after that save is complete? Would that be an appropriate workaround?

I agree. I have no issues uploading large attachments, even with Cloudflare. My Downloads app, which is tied to Amazon S3, is failing on files over 40MB. I've checked server logs and increased memory and upload limits, and still the same. I'm stumped. I even have a support ticket out.

21 hours ago, bfarber said:

From a technical standpoint, the file is stored locally on the file system and then transferred to S3. The issue you're running into is that a lot of memory is used during this transfer.

Ultimately your only (short term) solution is to remove or raise the memory limit.

Raising the memory limit isn't working for me. What would you suggest raising it to? I've tried 256M and 512M and still can't get it working correctly. Small files are fine; it's files over 40MB for me, and I only allow up to 100MB. The AdminCP does not record what the issue is (which I wish it did), and the server logs show nothing, although I'm still running tests.


Try temporarily removing the memory limit entirely and see if it works then. If it does, you know the memory limit was blocking the transfer. If it still fails, you've ruled that out as the cause.

If the memory limit is blocking the transfer, it triggers a fatal PHP error, which isn't something the software can catch and log (although in most cases it should be logged to your PHP or server error log, depending on your configuration).


Archived

This topic is now archived and is closed to further replies.
