File_Get_Contents => PHP Fatal Error: Allowed Memory Exhausted

file_get_contents => PHP Fatal error: Allowed memory exhausted

Firstly, you should understand that when using file_get_contents you're fetching the entire file as one string of data into a variable, and that variable is stored in the host's memory.

If that string is larger than the memory allotted to the PHP process, then PHP will halt and display the error message above.

The way around this is to open the file as a pointer and then take a chunk at a time. That way, if you have a 500 MB file, you can read the first 1 MB of data, do what you will with it, discard that 1 MB from memory and replace it with the next MB. This lets you control how much data sits in memory at any one time.

An example of this can be seen below. I will create a function that acts like Node.js, handing each chunk to a callback:

function file_get_contents_chunked($file, $chunk_size, $callback)
{
    try {
        $handle = fopen($file, "r");

        // fopen() returns false on failure rather than throwing,
        // so check the handle before reading from it.
        if ($handle === false) {
            throw new Exception("Unable to open {$file} for reading");
        }

        $i = 0;
        while (!feof($handle)) {
            // Read one chunk and hand it to the callback, along with
            // the handle (by reference) and the iteration counter.
            call_user_func_array($callback, array(fread($handle, $chunk_size), &$handle, $i));
            $i++;
        }

        fclose($handle);
    } catch (Exception $e) {
        trigger_error("file_get_contents_chunked::" . $e->getMessage(), E_USER_NOTICE);
        return false;
    }

    return true;
}

And then use it like so:

$success = file_get_contents_chunked("my/large/file", 4096, function ($chunk, &$handle, $iteration) {
    /*
     * Do what you will with the {$chunk} here.
     * {$handle} is passed in case you want to seek
     * to different parts of the file.
     * {$iteration} is the number of chunks read so far, so
     * ($iteration * 4096) is your current offset within the file.
     */
});

if (!$success) {
    // It failed
}

One of the problems you will find is that you're trying to run a regex several times over an extremely large chunk of data. Not only that, but your regex is built to match the entire file.

With the above method your regex could become useless, since you may only be matching against a partial set of data. What you should do is fall back on the native string functions, such as

  • strpos
  • substr
  • trim
  • explode

for matching the strings. I have added support in the callback so that the handle and current iteration are passed in. This will allow you to work with the file directly within your callback, using functions like fseek, ftruncate and fwrite, for instance.
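As an illustration, here is a sketch (the file path and the counting logic are placeholders) that counts newline-delimited records using substr_count rather than a regex, and notes how fseek could rewind the pointer when a token might straddle a chunk boundary:

$recordCount = 0;

file_get_contents_chunked("my/large/file", 4096, function ($chunk, &$handle, $iteration) use (&$recordCount) {
    // Count complete lines in this chunk with a native string function.
    $recordCount += substr_count($chunk, "\n");

    // If matching a multi-byte token that could straddle a chunk boundary,
    // you could rewind the pointer so the next read re-covers the seam:
    // fseek($handle, -(strlen($token) - 1), SEEK_CUR);
});

echo "Records: {$recordCount}\n";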

The way you're building your string manipulation is not efficient at all, and the method proposed above is by far a better approach.

Hope this helps.

file_put_contents and file_get_contents exhaust memory size

The amount shown in "tried to allocate xxxx bytes" is the amount over and above the memory limit in PHP. This means you had already exhausted your 128 MB while you were trying to allocate an additional ~80 MB.

Even if you can fit the file into memory, when you know the file is going to be that large, it will be a lot better for you to use a combination of fopen/fread/fwrite/fclose.
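A minimal sketch of that combination, with placeholder file names, which copies a large file in fixed 1 MB chunks so memory use stays flat regardless of file size:

$in  = fopen('large-input.dat', 'rb');
$out = fopen('large-output.dat', 'wb');

if ($in === false || $out === false) {
    die('Unable to open input or output file.');
}

// Read and write 1 MB at a time; only one chunk is in memory at once.
while (!feof($in)) {
    fwrite($out, fread($in, 1024 * 1024));
}

fclose($in);
fclose($out);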

I assume that you're doing more than just reading the contents and writing them to another file, though, right? Because if that's all you need, you can just use the copy function.
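For completeness, the single-call version with the same placeholder names; copy streams the data internally rather than loading the whole file into memory:

copy('large-input.dat', 'large-output.dat');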

'safe' json_decode(...) to prevent exhausting memory

You must be getting some massive JSON responses if they manage to exhaust your server's memory. Here are some metrics with a 1 MB file containing a multidimensional associative array (containing data prepared for entry into three MySQL tables with diverse data types).

When I include() the file and it is loaded into memory as an array, my memory usage goes to 9 MB. If I get the raw data with file_get_contents(), it takes 1 MB of memory, as expected. So a PHP array has an approximate ratio of 1:9 to the strlen() of the data (as originally output with var_export()).

When I run json_encode(), peak memory usage doesn't increase. (PHP allocates memory in blocks so there's often a bit of overhead, in this case enough to include the string data of the JSON; but it could bump you up one block more.) The resulting JSON data as a string takes 670 KB.

When I load the JSON data with file_get_contents into a string, it takes an expected 0.75 MB of memory. When I run json_decode() on it, it takes 7 MB of memory. I would then factor a minimum ratio of 1:10 for JSON-data-bytesize decoded to native PHP array-or-object for RAM requirement.

To run a test on your JSON data before decoding it, you could then do something like this:

if (strlen($my_json) * 10 > ($my_mb_memory * 1024 * 1024)) {
    die('Decoding this would exhaust the server memory. Sorry!');
}

...where $my_json is the raw JSON response, and $my_mb_memory is your allocated RAM that's converted into bytes for comparison with the incoming data. (You can of course also use intval(ini_get('memory_limit')) to get your memory limit as an integer.)
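One caveat: ini_get('memory_limit') returns the shorthand string from php.ini (e.g. "128M"), so intval() gives you megabytes, not bytes. A small helper (hypothetical name) to convert the shorthand to bytes might look like this:

// Convert php.ini shorthand ("128M", "1G", "512K", "-1") to bytes.
function memory_limit_bytes() {
    $limit = ini_get('memory_limit');
    if ($limit === '-1' || $limit === false) {
        return PHP_INT_MAX; // unlimited (or unreadable)
    }
    $value = (int) $limit;
    switch (strtoupper(substr($limit, -1))) {
        case 'G': return $value * 1024 * 1024 * 1024;
        case 'M': return $value * 1024 * 1024;
        case 'K': return $value * 1024;
        default:  return $value;
    }
}

if (strlen($my_json) * 10 > memory_limit_bytes()) {
    die('Decoding this would exhaust the server memory. Sorry!');
}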

As pointed out below, the RAM usage will also depend on your data structure. For contrast, a few more quick test cases because I'm curious myself:

    1. If I create a uni-dimensional array with the integers 1-60000, the saved PHP array size is 1 MB, but peak RAM usage is between 10.5 and 12.5 MB (a curious oscillation), a ratio of roughly 1:12.

    2. If I create a 1 MB file's worth of data as 12000 random strings in a basic associative array, memory usage is only 5 MB when loaded; a ratio of 1:5.

    3. If I create a 1 MB file's worth as a similar associative array, where half the entries are arrays of strings with a numeric index, memory usage is 7 MB; a ratio of 1:7.

So your actual RAM mileage may vary a good deal. Also be aware that if you pass that bulk of data around in circles and do a bit of this and that, your memory usage may get much (or exponentially, depending on your code economy) higher than what json_decode() alone will cause.

To debug memory usage, you can use memory_get_usage() and/or memory_get_peak_usage() at major intervals in your code to log or output the memory used in different parts of your code.
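A minimal sketch of that kind of instrumentation; the labels and the logging destination are arbitrary:

// Log current and peak memory at key points in the script.
function log_memory($label) {
    error_log(sprintf(
        '%s: %.2f MB used, %.2f MB peak',
        $label,
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576
    ));
}

log_memory('before json_decode');
$data = json_decode($my_json, true);
log_memory('after json_decode');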

Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 262144 bytes)

Thanks to everyone who tried to help me solve this. Anyway, I found the problem, and it was my mistake: I had used the same method name, username_exists, in both UserView.class.php and User.class.php, so the controller method ended up calling itself recursively until memory ran out. Here is the problematic code.

// This is the method in the model class
protected function username_exists($username) {
    if ($this->if_tableEmpty() != true) {
        $this->username = $username;

        $sql = "SELECT username FROM user_login WHERE username = ?";
        $stmt = $this->connect()->prepare($sql);
        $stmt->execute([$this->username]);

        $result = $stmt->rowCount();
        return $result;
    } else {
        return 0;
    }
}

// This is the method in the Controller class
public function username_exists($username) {
    $this->username = $username;

    // BUG: $this->username_exists() resolves to this very method,
    // not the model's, so it recurses forever and exhausts memory.
    $result = $this->username_exists($this->username);

    if ($result > 0) {
        return true;
    } else {
        return false;
    }
}
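For clarity, a minimal sketch of the fix, assuming the controller class extends the model class: call the parent's method explicitly (or rename one of the two) so the controller no longer invokes itself:

public function username_exists($username) {
    $this->username = $username;

    // parent:: reaches the model's protected method instead of recursing.
    $result = parent::username_exists($this->username);

    return $result > 0;
}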

Thanks again to all of you!

PHP Fatal error: Out of memory (allocated 80740352) (tried to allocate 12352 bytes) in

The optimal memory_limit value depends on what you are doing with the uploaded files. Do you read the files into memory using file_get_contents or the GD library? In that case, increase memory_limit to at least the same as upload_max_filesize, preferably more.
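As an illustrative example only (the values are placeholders): memory_limit can be raised at runtime, while upload_max_filesize can only be set in php.ini or per-directory configuration:

// upload_max_filesize = 64M   (php.ini; cannot be set with ini_set)
ini_set('memory_limit', '256M'); // comfortably above the upload size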

If you are using GD, keep in mind that GD holds the entire image uncompressed in memory. This means that it takes memory in the range of width * height * bit-depth, e.g., 1024*768*32 = 25 165 824 bits = 3 MB for a screenshot, or as much as 55 MB for a 14 megapixel image.

Some operations may need to create a copy of the image, so consider setting memory_limit to the double of what you need to keep the image in memory. Also make sure to not load all images into memory at once if you don't have to. You can free the memory used by GD by calling imagedestroy on the handle when you are done working with the image.
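A sketch of that pattern with a placeholder file name: load the image, work on it, and explicitly free the uncompressed pixel data when done:

// GD holds the image fully uncompressed: a 14-megapixel JPEG that is
// only a few MB on disk can occupy ~55 MB of RAM once loaded.
$img = imagecreatefromjpeg('upload.jpg');

if ($img === false) {
    die('Could not load image.');
}

// ... resize, crop, watermark, etc. ...

// Free the pixel data as soon as you are done with the image,
// instead of holding it until the end of the request.
imagedestroy($img);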


