
Problem uploading big files


Sirmadsen

Recommended Posts

Posted

Hi.

I'm having problems uploading big files on my community. Below 1 GB is fine, but as soon as a file is above 1 GB I get this error: "There was a problem processing the uploaded file. Please contact us for assistance." The upload goes all the way to 100%, then it loads for a couple of minutes, and then throws the error.

I'm on a VPS, so I have access to all settings, and I've set php.ini as below:

file_uploads = On
max_execution_time = 20000
max_input_time = -1
max_input_vars = 1000
memory_limit = -1
session.save_path = ""
upload_max_filesize = 5G
session.gc_maxlifetime = 1440
zlib.output_compression = Off
post_max_size = 10G

Not sure why it's not working...

Any tips?
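One quick sanity check on those limits: PHP's shorthand suffixes multiply by 1024 per step, so "5G" is 5 × 1024³ bytes. A small sketch (Python, just for the arithmetic — the php.ini directives themselves are as posted above) to confirm a given file actually fits under both relevant limits:

```python
# Sanity-check helper: convert PHP shorthand sizes ("5G", "512M") to bytes.
# PHP's ini parser multiplies by 1024 per suffix step, so 1G = 1073741824.
def php_shorthand_to_bytes(value: str) -> int:
    value = value.strip()
    multipliers = {"K": 1024, "M": 1024 ** 2, "G": 1024 ** 3}
    suffix = value[-1].upper()
    if suffix in multipliers:
        return int(value[:-1]) * multipliers[suffix]
    return int(value)  # no suffix: the value is already in bytes

# A 1.5 GB upload must fit under BOTH upload_max_filesize and post_max_size.
file_size = int(1.5 * 1024 ** 3)
assert file_size < php_shorthand_to_bytes("5G")   # upload_max_filesize
assert file_size < php_shorthand_to_bytes("10G")  # post_max_size
```

If both assertions hold for the file size that fails, the bottleneck is probably not these two directives.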

Posted
3 hours ago, Nesdam1981 said:

I'm having problems uploading big files on my community. Below 1 GB is fine, but as soon as a file is above 1 GB I get this error: "There was a problem processing the uploaded file. Please contact us for assistance." [...]

Try using M instead of G (e.g. 10G = 10240M).
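In case the G shorthand is being mishandled somewhere, the same limits expressed with the M suffix would look like this (values assume the 1024 multiplier PHP uses for shorthand suffixes):

```ini
; equivalent limits expressed in megabytes (1G = 1024M)
upload_max_filesize = 5120M
post_max_size = 10240M
```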

Posted

That didn't do anything. I'm also still getting errors on some files below 1 GB, actually. Somehow I've always had this problem with IPB/IPS, while other software I use has no problem at all.

Could it have anything to do with caching?

Posted

For information, I just tried to upload the same big file to a fresh WordPress installation on a subdomain with the same php.ini settings... It worked just fine.

Posted
2 hours ago, Nesdam1981 said:

Also works on a XenForo install. So it's definitely IPS-related.

Perhaps submit a support ticket explaining all you have done (the more info they have, the quicker they may find the cause).

Posted

Maybe later. I'm using a beta version right now, so either that's the problem or it will be fixed in the full release. Anyhow, I can't create a support ticket while on a beta version.

Posted
Quote

There was a problem processing the uploaded file. Please contact us for assistance.

1) Check the system logs in the AdminCP to see if anything has been logged.

2) If you are familiar with using your browser console, watch the network tab during the upload and look at the response received when the upload finishes. Is it an error/error page or something else? What's the HTTP response code?

If you aren't familiar with these things you're certainly welcome to submit a ticket so we can take a look directly.

Posted
7 hours ago, Nesdam1981 said:

Yes, but I doubt support will actually help when I'm running a beta version. I might try it anyway.

Beta is supported, per the notes when you download.  

Posted

No, I'm not using Cloudflare. I decided to go back to using just the regular file system for now. I'm having issues with S3 as well. Moving the files back to the file system is taking forever, though. If it's even moving...

Posted

If you are using S3, then this means when the file is uploaded to the server it has to be read into memory and transferred to S3. Naturally, with extremely large files this may not be feasible. We will look into further performance improvements with S3 integration in a future release, but for ultra large files like the ones you're talking about, local file system is likely going to work best.
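The memory pressure described above can, in principle, be kept flat by transferring the file in fixed-size chunks rather than reading it into memory in one go. Purely an illustrative sketch (Python, not IPS's actual PHP implementation) of why chunking caps peak memory at the chunk size:

```python
# Illustrative sketch: copying a large file in fixed-size chunks keeps peak
# memory at the chunk size, whereas f.read() with no argument would hold
# the entire file in memory at once.
CHUNK_SIZE = 8 * 1024 * 1024  # 8 MB per read, regardless of total file size

def stream_copy(src_path, dst_path, chunk_size=CHUNK_SIZE):
    """Copy src to dst without ever holding more than one chunk in memory."""
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            dst.write(chunk)
```

S3's own multipart-upload API works on the same principle: each part can be sent and discarded before the next is read, so upload memory need not scale with file size.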

Posted
3 hours ago, bfarber said:

If you are using S3, then this means when the file is uploaded to the server it has to be read into memory and transferred to S3. [...]

Glad I never tried to transfer to S3. I also use big files (video files) for the Downloads app. A backup plan of mine, if my server space gets low, has been S3 or a second server to host files. Based on this it sounds like neither of those options would actually work. YIKES!!

I am worried about what I will do if my disk space gets low on local server.

Posted
On 4/11/2018 at 9:23 AM, SJ77 said:

I am worried about what I will do if my disk space gets low on local server.

Actually getting very worried. Now there are no viable options. Ideas would be appreciated.

  • 3 weeks later...
Posted
On 4/11/2018 at 9:17 AM, bfarber said:

If you are using S3, then this means when the file is uploaded to the server it has to be read into memory and transferred to S3. [...]

I'm having an issue with the Downloads app where anything over 35-40MB gives me an error:

[screenshot of the error message]

My Downloads app is using Amazon S3. Regular attachments in my forum are saved locally and upload without issue. What could be causing this?

Posted
On 4/11/2018 at 9:17 AM, bfarber said:

If you are using S3, then this means when the file is uploaded to the server it has to be read into memory and transferred to S3. [...]

Would it be possible to save them to the file system first instead, rather than having them read into memory, then transfer them out to S3 after that save is complete? Would that be an appropriate workaround?

Posted

From a technical standpoint, the file is stored locally on the file system and then transferred to S3. The issue you're running into is that a lot of memory is used during this transfer.

Ultimately your only (short term) solution is to remove or raise the memory limit.

Posted
On 5/4/2018 at 10:45 PM, SJ77 said:

Sounds like the issue described here

 

Something close to that.

On 5/4/2018 at 11:21 PM, AlexWright said:

Would it be possible to save them to the file system first instead, rather than having them read into memory, then transfer them out to S3 after that save is complete? [...]

I agree. I have no issues uploading large attachments, even with Cloudflare. My Downloads app, which is tied to Amazon S3, is failing on files over 40MB. I've checked server logs, increased memory and upload limits, and still the same. I'm stumped. I even have a support ticket out.

21 hours ago, bfarber said:

From a technical standpoint, the file is stored locally on the file system and then transferred to S3. [...]

Raising the memory limit isn't working for me. What would you suggest raising it to? I've tried 256M and 512M and still can't get it working correctly. Small files are fine; it's files over 40MB for me, and I only allow up to 100MB. The AdminCP does not record what the issue is (which I wish it did), and logs show nothing on the server, although I'm still running tests...

Posted

Try removing the memory limit entirely temporarily, and see if it works then. If it does, then you know it was the memory limit blocking the transfer. If it still fails, you've ruled that out as the cause.

If the memory limit is blocking the transfer, it represents a Fatal PHP error, which isn't something that the software can catch and log (although in most cases this should log to your PHP or server error log, depending upon your configuration).
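To make that test concrete, the temporary php.ini changes might look like the following (the error_log path is just an example; use whatever path your setup already logs to):

```ini
; temporarily lift the limit to see whether memory is the bottleneck
memory_limit = -1

; make sure any fatal error is captured somewhere
log_errors = On
; example path — adjust to your environment
error_log = /var/log/php_errors.log
```

Remember to restore a sensible memory_limit once the test is done; -1 is only safe as a short-lived diagnostic.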

Archived

This topic is now archived and is closed to further replies.
