

Posted

Hello.

Why can't I upload large files >200 MB?

[Thu Dec 02 18:36:05.815738 2021] [fcgid:warn] [pid 30965] [client xx.xx.xxx.xx:xxxxx] mod_fcgid: stderr: PHP Fatal error:  Allowed memory size of 3246391296 bytes exhausted (tried to allocate 1063040304 bytes) in /var/www/site/data/www/site.ru/system/Http/Request/Sockets.php on line 269, referer: https://site.ru/topic/4447-test/

The limits are set to maximum values everywhere.

php.ini

max_input_vars = "10000"
opcache.max_accelerated_files = "100000"
output_buffering = "4096"
session.save_path = "/var/www/site/data/tmp"
upload_max_filesize = "3000M"
log_errors = "On"
mail.add_x_header = "On"
max_execution_time = "1800"
memory_limit = "3096M"
sendmail_path = "/usr/sbin/sendmail -t -i -f 'admin@site.ru'"
upload_tmp_dir = "/var/www/site/data/tmp"
date.timezone = "Europe/Moscow"
post_max_size = "3000M"
opcache.blacklist_filename = "/opt/opcache-blacklists/opcache-*.blacklist"
short_open_tag = "On"
display_errors = "off"
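
For reference, one quick way to confirm which values the web server's PHP instance actually uses (mod_fcgid can load a different php.ini than the command line) is a small test script in the web root. This is only a minimal sketch; the file name check.php and the exact set of directives shown are just examples:

<?php
// check.php - show the limits as seen by the PHP instance serving the site.
// Open this in a browser rather than running it from the CLI, since the
// CLI may read a different php.ini.
echo 'memory_limit: ', ini_get('memory_limit'), PHP_EOL;
echo 'upload_max_filesize: ', ini_get('upload_max_filesize'), PHP_EOL;
echo 'post_max_size: ', ini_get('post_max_size'), PHP_EOL;
echo 'max_execution_time: ', ini_get('max_execution_time'), PHP_EOL;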

Edited by INDIG0

Solved by Stuart Silvester

  • Community Expert

Given the error message there, that would be a question for your hosting company.

  • Author

@Marc Stridgen

My sysadmin looked into it and still doesn't understand the cause, because the PHP limits are set to their maximums everywhere.

Can IPS limit the upload file size somewhere else, in addition to the settings in user groups?

  • Community Expert

That message is being logged because of your PHP instance's limits, unfortunately, and isn't being returned by the software.

  • Author
 

That message is being logged because of your PHP instance's limits, unfortunately, and isn't being returned by the software.

Maybe there is a problem with the cloud? S3 is connected to store the files.

  • Community Expert
 

Maybe there is a problem with the cloud? S3 is connected to store the files.

I'm not sure what you mean here. You are not using our cloud product at present.

  • Author
 

I'm not sure what you mean here. You are not using our cloud product at present.

I mean Amazon S3 and other such services.

  • Community Expert

Are you using Cloudflare?

  • Community Expert
  • Solution

Your upload_max_filesize, post_max_size and memory_limit are way too large.

Having these huge values (especially for post/upload) is actually counter-productive. When they are set this high, the files being uploaded won't be processed in chunks.

If you set upload_max_filesize/post_max_size to something like 10M, the uploader can send large files in chunks that fit within the limit, and you will have a much better chance of your upload succeeding.
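
As an illustration only (the exact figures are not prescriptive, and the memory_limit shown is just an example of a more typical value, not a number from this thread), the relevant lines from the php.ini above might be brought down to something like:

; illustrative values only; keep upload/post small enough that
; large uploads are sent in chunks rather than in a single request
upload_max_filesize = "10M"
post_max_size = "10M"
memory_limit = "512M"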

  • Author
 

Are you using Cloudflare?

Yes, I use a similar service, but their tech support said they do not impose any limits on requests passing through.

@Stuart Silvester OK, thanks, I will try testing with other values.

  • Community Expert
 

Yes, I use a similar service, but their tech support said they do not impose any limits on requests passing through.

It would always be good to disable that temporarily while testing, just to ensure that is the case.
