How to Get Around or Make PHP json_decode Not Alter My Very Large Integer Values

How to get around or make PHP json_decode not alter my very large integer values?

Thanks @Scott Gottreu and @pospi.

The answer was in the last comment for the accepted answer on this question.

Use the preg_replace() function to surround all integer values with quotes.

json_decode(preg_replace('/("\w+"):(\d+)/', '\\1:"\\2"', $jsonString), true);

Actually, after testing, the line above mangles JSON that has floating-point numbers as values, so to fix that issue I used the following to enclose all numbers (integer or floating point) in quotes:

json_decode(preg_replace('/("\w+"):(\d+(\.\d+)?)/', '\\1:"\\2"', $jsonString), true);
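
For instance, here is a minimal round trip with the second regex (the payload below is illustrative; also note the \w+ key pattern won't match keys containing spaces or hyphens):

// Hypothetical payload: an ID too large for a 32-bit PHP integer.
$jsonString = '{"id":123456789123456789,"score":1.5,"name":"foo"}';

// Quote every bare number (integer or float) that follows a quoted key.
$quoted = preg_replace('/("\w+"):(\d+(\.\d+)?)/', '\\1:"\\2"', $jsonString);

$data = json_decode($quoted, true);
var_dump($data['id']);    // string(18) "123456789123456789" - digits intact
var_dump($data['score']); // string(3) "1.5"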

json_decode AND json_encode long integers without losing data

As long as your PHP build can actually handle large integers, meaning you're running a 64-bit version of PHP (note that before PHP 7, Windows builds used 32-bit integers even on 64-bit systems), json_decode has no problem with it:

$json  = '{"foo":9223372036854775807}';
$obj = json_decode($json);
$json2 = json_encode($obj);

var_dump(PHP_INT_MAX, $obj, $json2);

int(9223372036854775807)
object(stdClass)#1 (1) {
["foo"]=>
int(9223372036854775807)
}
string(27) "{"foo":9223372036854775807}"

If the integer values you need to handle exceed PHP_INT_MAX, you simply cannot represent them in PHP's native types. So there's no way around the conundrum you have: you cannot use native types to track the correct type, and you cannot substitute other types (e.g. strings instead of integers), because that's ambiguous when encoding back to JSON.

In this case you will have to invent your own mechanism of tracking the correct types for each property and handle such serialisation with a custom encoder/decoder. For example, you'd need to write a custom JSON decoder which can decode to a custom class like new JsonInteger('9223372036854775808'), and your custom encoder would recognise this type and encode it to a JSON 9223372036854775808 value.

There's no such thing built into PHP.
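
A minimal sketch of that idea, assuming a hypothetical JsonInteger wrapper and a hand-rolled object encoder (none of this is a built-in PHP API):

// Hypothetical wrapper: stores the digits as a string, marks the type.
final class JsonInteger
{
    public function __construct(public string $digits) {}
}

// Custom encoder: emits JsonInteger values as bare JSON numbers,
// everything else via the normal json_encode().
function encodeWithBigInts(array $data): string
{
    $parts = [];
    foreach ($data as $key => $value) {
        $encoded = ($value instanceof JsonInteger)
            ? $value->digits      // bare number, no quotes, no precision loss
            : json_encode($value);
        $parts[] = json_encode((string) $key) . ':' . $encoded;
    }
    return '{' . implode(',', $parts) . '}';
}

echo encodeWithBigInts(['foo' => new JsonInteger('9223372036854775808')]);
// {"foo":9223372036854775808}

The matching decoder would have to spot out-of-range number tokens and construct JsonInteger instances instead of native ints; that side needs a real tokenizer, so it's omitted here.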

json_decode is rounding floats, how can I prevent it?

Just wrap the values in quotes: json_decode('[["3.2","1"],["4.8","2"]]');
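
Once decoded, the values arrive as strings, so cast only where arithmetic is unavoidable; a small sketch:

$pairs = json_decode('[["3.2","1"],["4.8","2"]]', true);

foreach ($pairs as [$price, $qty]) {
    // Keep $price as a string for exact storage/display; cast only
    // at the point where arithmetic actually happens.
    printf("%s x %d = %.2f\n", $price, (int) $qty, (float) $price * (int) $qty);
}
// 3.2 x 1 = 3.20
// 4.8 x 2 = 9.60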

Handling big user IDs returned by FQL in PHP

json_decode() can convert large integers to strings if you specify a flag in the function call:

$array = json_decode($json, true, 512, JSON_BIGINT_AS_STRING);
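
For example (the 20-digit uid below is made up; anything past PHP_INT_MAX behaves the same way):

$json = '{"uid":90000000000000000001}';

// With the flag, out-of-range integers survive as strings:
$array = json_decode($json, true, 512, JSON_BIGINT_AS_STRING);
var_dump($array['uid']); // string(20) "90000000000000000001"

// Without it, they silently become floats and lose precision:
$lossy = json_decode($json, true);
var_dump($lossy['uid']); // float(9.0E+19)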

JSON numeric value altered on PHP json_decode

When PHP displays numbers, it uses the php.ini precision setting to decide whether to show all the digits or use scientific notation. This is purely a display setting; it doesn't change the value internally.

However, that value is too large for a signed integer in 32-bit PHP, so it will be treated as a float.

From PHP 5.4.0 onwards, you can use an option flag to determine how large integer values are handled:

$decoded = json_decode($encoded, false, 512, JSON_BIGINT_AS_STRING);

'Safe' json_decode() to prevent exhausting memory

You must be getting some massive JSON responses if they manage to exhaust your server's memory. Here are some metrics from a 1 MB file containing a multidimensional associative array (holding data prepared for entry into three MySQL tables with diverse data types).

When I include() that file and it is loaded into memory as an array, my memory usage goes to 9 MB. If I get the raw data with file_get_contents(), it takes 1 MB of memory as expected. A PHP array, then, has an approximate ratio of 1:9 to the strlen() of the data (as originally output with var_export()).

When I run json_encode(), peak memory usage doesn't increase. (PHP allocates memory in blocks so there's often a bit of overhead, in this case enough to include the string data of the JSON; but it could bump you up one block more.) The resulting JSON data as a string takes 670 KB.

When I load the JSON data with file_get_contents() into a string, it takes an expected 0.75 MB of memory. When I run json_decode() on it, that jumps to 7 MB. I would therefore budget a minimum ratio of 1:10 between the byte size of the JSON data and the RAM required by the decoded native PHP array or object.

To run a test on your JSON data before decoding it, you could then do something like this:

if (strlen($my_json) * 10 > ($my_mb_memory * 1024 * 1024)) {
    die('Decoding this would exhaust the server memory. Sorry!');
}

...where $my_json is the raw JSON response, and $my_mb_memory is your allocated RAM in megabytes (converted to bytes in the comparison with the incoming data). (You can of course also use intval(ini_get('memory_limit')) to get your memory limit as an integer.)
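
Wrapped up as a reusable guard under the same 1:10 rule of thumb (the function name and default ratio are just this sketch's choices):

// Rough guard: refuse to decode if the estimated footprint won't fit.
function canSafelyDecode(string $rawJson, int $ratio = 10): bool
{
    // ini_get('memory_limit') returns strings like "128M"; intval()
    // extracts the leading number, assumed here to be megabytes.
    $limitMb = intval(ini_get('memory_limit'));
    if ($limitMb <= 0) {
        return true; // -1 means no memory limit is configured
    }
    $available = ($limitMb * 1024 * 1024) - memory_get_usage();

    return strlen($rawJson) * $ratio < $available;
}

if (!canSafelyDecode($my_json)) {
    die('Decoding this would exhaust the server memory. Sorry!');
}
$data = json_decode($my_json, true);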

As pointed out below, the RAM usage will also depend on your data structure. For contrast, a few more quick test cases because I'm curious myself:

    1. If I create a one-dimensional array with the integers 1-60000, the saved PHP array size is 1 MB, but peak RAM usage is between 10.5 and 12.5 MB (a curious oscillation), a ratio of roughly 1:12.

    2. If I create 1 MB worth of data as 12,000 random strings in a basic associative array, memory usage is only 5 MB when loaded; a ratio of 1:5.

    3. If I create 1 MB worth as a similar associative array where half the entries are arrays of strings with a numeric index, memory usage is 7 MB; a ratio of 1:7.

So your actual RAM mileage may vary a good deal. Also be aware that if you pass that bulk of data around and do a bit of this and that with it, your memory usage may climb much higher (even exponentially, depending on how economical your code is) than what json_decode() alone causes.

To debug memory usage, you can use memory_get_usage() and/or memory_get_peak_usage() at major intervals in your code to log or output the memory used in different parts of your code.
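
For example, a tiny logging checkpoint built on those two built-ins:

// Log current and peak memory usage at labelled checkpoints.
function logMemory(string $label): void
{
    error_log(sprintf(
        '%s: current %.2f MB, peak %.2f MB',
        $label,
        memory_get_usage() / 1048576,
        memory_get_peak_usage() / 1048576
    ));
}

logMemory('before decode');
$data = json_decode($my_json, true);
logMemory('after decode');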

Decoding JSON in PHP and the Facebook API

Use

$userdata = json_decode($response, true);

For the long user ID issue, you can use the following:

$userdata = json_decode(preg_replace('/"uid":(\d+)/', '"uid":"$1"', $response), true);

It can also be done as:

$userdata = json_decode($response, true, 512, JSON_BIGINT_AS_STRING);

It will give you an array.

In addition, you can check this thread for more variations:

How to get around or make PHP json_decode not alter my very large integer values?


