I have written some code that assembles an image file from user-uploaded partial files (a "chunker"). It is a simple for loop that does a file_get_contents() followed by a file_put_contents() with the FILE_APPEND flag set. When the total size is under 4MB, the code successfully produces an output file from the input parts. If the total exceeds 4MB, the append operation simply stops at the 4MB mark. Interestingly, the client never even receives an HTTP response from the script in that case, as if the thread had simply been killed.
I've tried adjusting .htaccess:
# Do not remove this line, otherwise mod_rewrite rules will stop working
php_value memory_limit 300M
php_value session.gc_maxlifetime 10800
php_value max_input_time 10800
php_value max_execution_time 10800
php_value upload_max_filesize 300M
php_value post_max_size 300M
to no avail.
The loop in question:
for ($i = 1; $i <= $count; $i++) {
    // echo $dir . $tempdir . $fileName . "_" . $i . "_" . $count . ".png";
    $cat_file_part = false;
    // Wait until this uploaded part exists on disk before reading it
    while (!file_exists($dir . $tempdir . $fileName . "_" . $i . "_" . $count . ".png")) {
        ; // busy-wait
    }
    $cat_file_part = file_get_contents($dir . $tempdir . $fileName . "_" . $i . "_" . $count . ".png");
    file_put_contents($dir . $fileName . ".png", $cat_file_part, FILE_APPEND);
    // $cat_file .= $cat_file_part;
}
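For what it's worth, here is a variant I considered to rule out memory limits: streaming each part into the output with file handles instead of loading it fully into memory. This is only a sketch, assuming the same variables ($dir, $tempdir, $fileName, $count) as the loop above:

$out = fopen($dir . $fileName . ".png", "ab"); // open output once, append + binary-safe
for ($i = 1; $i <= $count; $i++) {
    $part = $dir . $tempdir . $fileName . "_" . $i . "_" . $count . ".png";
    $in = fopen($part, "rb");
    stream_copy_to_stream($in, $out); // copies internally in chunks, constant memory
    fclose($in);
}
fclose($out);

stream_copy_to_stream() never holds a whole part in memory, so if the 4MB cutoff were caused by memory_limit this version should behave differently.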
As far as I can tell, there is no branching that could cause this behavior, so I assume it is a server-side setting. Is this a limitation of my free 000webhost account, or am I missing a simple setting (or some basic PHP rule)?