Calculate Size in Bytes of Json Payload Including It in the Json Payload in PHP

You can use this to calculate $content's size:

$size = strlen(json_encode($content, JSON_NUMERIC_CHECK));

This will give you the length of the json_encode()'d string of $content. If you want the size in bytes (relevant when multibyte characters are involved), this might be more helpful:

$size = mb_strlen(json_encode($content, JSON_NUMERIC_CHECK), '8bit');

Size of json object? (in KBs/MBs)

A complete answer to the actual question would also count the bytes spent on HTTP headers and take gzip compression into account, but I will ignore those things here.

You have a few options. They all output the same answer when run:

If Using a Browser or Node (Not IE)

const size = new TextEncoder().encode(JSON.stringify(obj)).length
const kiloBytes = size / 1024;
const megaBytes = kiloBytes / 1024;

If you need it to work on IE, you can use a polyfill.

If Using Node

const size = Buffer.byteLength(JSON.stringify(obj))

(which is the same as Buffer.byteLength(JSON.stringify(obj), "utf8")).

Shortcut That Works in IE, Modern Browsers, and Node

const size = encodeURI(JSON.stringify(obj)).split(/%..|./).length - 1;

That last solution will work in almost every case, but it will throw a URIError: URI malformed exception if you feed it a string that should not exist on its own, such as a lone surrogate taken from half of an emoji, e.g. let obj = { partOfAnEmoji: "🐶"[1] }. The other two solutions do not have that weakness.

(Credits: credit for the first solution goes to its original author. Credit for the second solution goes to the utf8-byte-length package (which is good, you could use that instead). Most of the credit for the last solution goes to its original source, though I simplified it a bit. I found the test suite of the utf8-byte-length package super helpful when researching this.)

Is there a limit on how much JSON can hold?

JSON is similar to other data formats like XML - if you need to transmit more data, you just send more data. There's no inherent size limitation to the JSON request. Any limitation would be set by the server parsing the request. (For instance, ASP.NET has the "MaxJsonLength" property of the serializer.)

Maximum json size for response to the browser

There is no maximum size for an HTTP response itself; any practical limit comes from things such as the maximum value of an int, the browser, or how the server has been configured.

The best approach is to use AJAX to load parts of the data only when they need to be shown, as in the sketch below.
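As a rough sketch of that idea (the /api/items endpoint and its page/pageSize parameters are hypothetical, not a real API), the client could fetch one page of data at a time:

// Sketch only: assumes a hypothetical /api/items endpoint that accepts
// page/pageSize query parameters and returns a JSON array.
async function loadPage(page, pageSize = 50) {
    const response = await fetch(`/api/items?page=${page}&pageSize=${pageSize}`);
    if (!response.ok) throw new Error(`Request failed: ${response.status}`);
    return response.json();
}

// Load the next page only when the user actually needs to see it.
loadPage(2).then(items => console.log(items.length, "items loaded"));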

Binary Data in JSON String. Something better than Base64

There are 94 Unicode characters which can be represented as one byte according to the JSON spec (if your JSON is transmitted as UTF-8). With that in mind, I think the best you can do space-wise is base85 which represents four bytes as five characters. However, this is only a 7% improvement over base64, it's more expensive to compute, and implementations are less common than for base64 so it's probably not a win.

You could also simply map every input byte to the corresponding character in U+0000-U+00FF, then do the minimum encoding required by the JSON standard to pass those characters; the advantage here is that the required decoding is nil beyond builtin functions, but the space efficiency is bad -- a 105% expansion (if all input bytes are equally likely) vs. 25% for base85 or 33% for base64.
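For a quick sanity check of those numbers, here is a small JavaScript sketch (assuming UTF-8 output, all 256 byte values equally likely, and only the escaping that JSON requires) that computes the expected output bytes per input byte for the raw byte-to-U+00FF mapping, next to the fixed ratios for base64 and base85:

// Expected UTF-8 JSON bytes per input byte for the byte -> U+0000..U+00FF mapping.
let total = 0;
for (let b = 0; b < 256; b++) {
    if (b === 0x22 || b === 0x5c) total += 2;                          // \" and \\
    else if ([0x08, 0x09, 0x0a, 0x0c, 0x0d].includes(b)) total += 2;   // \b \t \n \f \r
    else if (b < 0x20) total += 6;                                     // other controls need \u00XX
    else if (b < 0x80) total += 1;                                     // the 94 plain one-byte characters
    else total += 2;                                                   // U+0080..U+00FF take 2 bytes in UTF-8
}
console.log(total / 256);   // ~2.05, i.e. about 105% expansion
console.log(4 / 3);         // base64: ~1.33, i.e. 33% expansion
console.log(5 / 4);         // base85: 1.25, i.e. 25% expansion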

Final verdict: base64 wins, in my opinion, on the grounds that it's common, easy, and not bad enough to warrant replacement.

See also: Base91 and Base122

Is it worth the effort to try to reduce JSON size?

JSONH, aka hpack (https://github.com/WebReflection/JSONH), does something very similar to your example:

[{
    id: 12,
    score: 34,
    interval: 5678,
    sub: 9012
}, {
    id: 98,
    score: 76,
    interval: 5432,
    sub: 1098
}, ...]

Would turn into:

[["id","score","interval","sub"],12,34,5678,9012,98,76,5432,1098,...]

Put byte array to JSON and vice versa

Here is a good example of base64 encoding byte arrays. It gets more complicated when you throw unicode characters in the mix to send things like PDF documents. After encoding a byte array the encoded string can be used as a JSON property value.

Apache Commons offers good utilities:

byte[] bytes = getByteArr();
String base64String = Base64.encodeBase64String(bytes);
byte[] backToBytes = Base64.decodeBase64(base64String);

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Base64_encoding_and_decoding

Java server side example:

public String getUnsecureContentBase64(String url)
        throws ClientProtocolException, IOException {

    // getUnsecureContent will generate some byte[]
    byte[] result = getUnsecureContent(url);

    // use apache org.apache.commons.codec.binary.Base64
    // if you're sending back as a http request result you may have to
    // org.apache.commons.httpclient.util.URIUtil.encodeQuery
    return Base64.encodeBase64String(result);
}

JavaScript decode:

//decode URL encoding if encoded before returning result
var uriEncodedString = decodeURIComponent(response);

var byteArr = base64DecToArr(uriEncodedString);

// from Mozilla: decode a base64 string into a Uint8Array.

// Convert one base64 character code to its 6-bit value.
function b64ToUint6(nChr) {
    return nChr > 64 && nChr < 91 ?
            nChr - 65                    // 'A'-'Z' -> 0-25
        : nChr > 96 && nChr < 123 ?
            nChr - 71                    // 'a'-'z' -> 26-51
        : nChr > 47 && nChr < 58 ?
            nChr + 4                     // '0'-'9' -> 52-61
        : nChr === 43 ?
            62                           // '+' -> 62
        : nChr === 47 ?
            63                           // '/' -> 63
        :
            0;
}

function base64DecToArr(sBase64, nBlocksSize) {
    var sB64Enc = sBase64.replace(/[^A-Za-z0-9\+\/]/g, ""),
        nInLen = sB64Enc.length,
        nOutLen = nBlocksSize ?
            Math.ceil((nInLen * 3 + 1 >> 2) / nBlocksSize) * nBlocksSize :
            nInLen * 3 + 1 >> 2,
        taBytes = new Uint8Array(nOutLen);

    // Accumulate four 6-bit values into a 24-bit group, then emit up to three bytes.
    for (var nMod3, nMod4, nUint24 = 0, nOutIdx = 0, nInIdx = 0; nInIdx < nInLen; nInIdx++) {
        nMod4 = nInIdx & 3;
        nUint24 |= b64ToUint6(sB64Enc.charCodeAt(nInIdx)) << 18 - 6 * nMod4;
        if (nMod4 === 3 || nInLen - nInIdx === 1) {
            for (nMod3 = 0; nMod3 < 3 && nOutIdx < nOutLen; nMod3++, nOutIdx++) {
                taBytes[nOutIdx] = nUint24 >>> (16 >>> nMod3 & 24) & 255;
            }
            nUint24 = 0;
        }
    }

    return taBytes;
}

