INDIG0 Posted December 6, 2021
Hello. Why can't I upload large files (>200 MB)?

[Thu Dec 02 18:36:05.815738 2021] [fcgid:warn] [pid 30965] [client xx.xx.xxx.xx:xxxxx] mod_fcgid: stderr: PHP Fatal error: Allowed memory size of 3246391296 bytes exhausted (tried to allocate 1063040304 bytes) in /var/www/site/data/www/site.ru/system/Http/Request/Sockets.php on line 269, referer: https://site.ru/topic/4447-test/

The limits in php.ini are already set to maximum values everywhere:

max_input_vars = "10000"
opcache.max_accelerated_files = "100000"
output_buffering = "4096"
session.save_path = "/var/www/site/data/tmp"
upload_max_filesize = "3000M"
log_errors = "On"
mail.add_x_header = "On"
max_execution_time = "1800"
memory_limit = "3096M"
sendmail_path = "/usr/sbin/sendmail -t -i -f 'admin@site.ru'"
upload_tmp_dir = "/var/www/site/data/tmp"
date.timezone = "Europe/Moscow"
post_max_size = "3000M"
opcache.blacklist_filename = "/opt/opcache-blacklists/opcache-*.blacklist"
short_open_tag = "On"
display_errors = "off"
Marc Posted December 6, 2021
Given the error message there, that would be a question for your hosting company.
INDIG0 (Author) Posted December 6, 2021
@Marc Stridgen My sysadmin looked and still doesn't understand the cause, because the PHP limits are set to maximum values everywhere. Can IPS limit the upload file size anywhere else, in addition to the settings in user groups?
Marc Posted December 6, 2021
That message is being logged because of your PHP instance's limits, unfortunately; it isn't being returned by the software.
INDIG0 (Author) Posted December 6, 2021
1 minute ago, Marc Stridgen said: That message is being logged because of your PHP instance's limits, unfortunately; it isn't being returned by the software.
Maybe there is a problem with the cloud? Because S3 is connected for file storage.
Marc Posted December 6, 2021
13 minutes ago, INDIG0 said: Maybe there is a problem with the cloud? Because S3 is connected for file storage.
I'm not sure what you mean here. You are not using our cloud product at present.
INDIG0 (Author) Posted December 6, 2021
59 minutes ago, Marc Stridgen said: I'm not sure what you mean here. You are not using our cloud product at present.
I mean Amazon S3 and the like.
Stuart Silvester (Solution) Posted December 6, 2021
Your upload_max_filesize, post_max_size and memory_limit are far too large. Setting such huge values (especially for post/upload) is actually counterproductive: with limits that large, uploaded files won't be processed in chunks. If you set upload_max_filesize/post_max_size to something like 10M, you will have a much better chance of your upload succeeding.
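Along the lines of Stuart's suggestion, a minimal php.ini sketch might look like this (the exact numbers are illustrative assumptions, not values given in this thread; tune them to your site):

```ini
; Keep per-request upload limits small so the client uploads in chunks
upload_max_filesize = "10M"
; post_max_size should be slightly larger than upload_max_filesize,
; since the POST body also carries the other form fields
post_max_size = "12M"
; A far smaller memory_limit is then sufficient
memory_limit = "512M"
```

After changing these values, restart the web server / PHP handler (fcgid in this case) so the new limits take effect.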
INDIG0 (Author) Posted December 6, 2021
1 hour ago, Circo said: Are you using Cloudflare?
Yes, I use a similar service, but their tech support said they do not impose any pass-through limits.
@Stuart Silvester OK, thanks, I will try testing with other values.
Marc Posted December 7, 2021
14 hours ago, INDIG0 said: Yes, I use a similar service, but their tech support said they do not impose any pass-through limits.
It would always be good to disable that temporarily while testing, just to make sure that is the case.