How to Convert a Byte to Its Binary String Representation

How to convert a byte to its binary string representation

Use Integer#toBinaryString():

byte b1 = (byte) 129;
// & 0xFF masks off the sign extension so the byte is treated as its unsigned value (129)
String s1 = String.format("%8s", Integer.toBinaryString(b1 & 0xFF)).replace(' ', '0');
System.out.println(s1); // 10000001

byte b2 = (byte) 2;
String s2 = String.format("%8s", Integer.toBinaryString(b2 & 0xFF)).replace(' ', '0');
System.out.println(s2); // 00000010


Convert a byte[] to its string representation in binary

You can convert any numeric integer primitive to its binary representation as a string with Convert.ToString. Doing this for each byte in your array and concatenating the results is very easy with LINQ:

var input = new byte[] { 254 }; // put as many bytes as you want in here
var result = string.Concat(input.Select(b => Convert.ToString(b, 2)));

Update:

The code above will not produce a result having exactly 8 characters per input byte, and this will make it impossible to know what the input was just by looking at the result. To get exactly 8 chars per byte, a small change is needed:

var result = string.Concat(input.Select(b => Convert.ToString(b, 2).PadLeft(8, '0')));

Convert Byte to binary in Java

String toBinary( byte[] bytes )
{
    StringBuilder sb = new StringBuilder(bytes.length * Byte.SIZE);
    for( int i = 0; i < Byte.SIZE * bytes.length; i++ )
        // shift the bit of interest into the 0x80 position and test it (MSB first)
        sb.append((bytes[i / Byte.SIZE] << i % Byte.SIZE & 0x80) == 0 ? '0' : '1');
    return sb.toString();
}

byte[] fromBinary( String s )
{
    int sLen = s.length();
    byte[] toReturn = new byte[(sLen + Byte.SIZE - 1) / Byte.SIZE];
    char c;
    for( int i = 0; i < sLen; i++ )
        if( (c = s.charAt(i)) == '1' )
            // set the matching bit in the target byte (MSB first)
            toReturn[i / Byte.SIZE] = (byte) (toReturn[i / Byte.SIZE] | (0x80 >>> (i % Byte.SIZE)));
        else if( c != '0' )
            throw new IllegalArgumentException();
    return toReturn;
}

There are some simpler ways to handle this also (assumes big endian).

Integer.parseInt(hex, 16);
Integer.parseInt(binary, 2);

and

// b is a byte; these only work as written when b is negative (sign extension
// yields a full 32-bit / 8-hex-digit string), otherwise substring() runs out of range
Integer.toHexString(b).substring((Integer.SIZE - Byte.SIZE) / 4);
Integer.toBinaryString(b).substring(Integer.SIZE - Byte.SIZE);

Is it possible to convert a string to a byte array in binary representation

You can do this much easier:

string s = "29";
var buffer = new byte[s.Length];
for (int i = 0; i < buffer.Length; i++) {
    buffer[i] = (byte)(s[i] - '0');
}

Explanation:

  • We create a byte buffer with the same length as the input string since every character in the string is supposed to be a decimal digit.
  • In C#, a character is a numeric type. We subtract the character '0' from the character representing our digit to get its numeric value. We get this character by using the string indexer, which allows us to access single characters in a string.
  • The result is an integer that we cast to byte and then insert into the buffer.

Console.WriteLine(buffer[0]) prints 2 because numbers are converted to a string in decimal format for display. Everything the debugger, the console, or a textbox displays is a string that the data has been converted to; this conversion is called formatting. Therefore, you do not see the result as binary. But believe me, it is stored in the bytes in the requested binary format.

You can use Convert.ToString and specify the desired numeric base as the second parameter to see the result in binary.

foreach (byte b in buffer) {
    Console.WriteLine($"{b} --> {Convert.ToString(b, toBase: 2).PadLeft(4, '0')}");
}

If you want to store it in this visual binary format, then you must store it in a string array:

var stringBuffer = new string[s.Length];
for (int i = 0; i < stringBuffer.Length; i++) {
    stringBuffer[i] = Convert.ToString(s[i] - '0', toBase: 2).PadLeft(4, '0');
}

Note that everything is stored in a binary format with 0s and 1s in a computer, but you never see these 0s and 1s directly. What you see is always an image on your screen. And this image was created from images of characters in a specific font. And these characters result from converting some data into a string, i.e., from formatting your data. The same data might look different on PCs using a different culture, but the underlying data is stored with the same pattern of 0s and 1s.

The difference between storing the numeric value of the digit as a byte and storing this digit as a character (possibly an element of a string) is that a different encoding is used (a short example follows the list below).

  • The byte stores it as a binary number equivalent to the decimal number. I.e., 9 (decimal) becomes 00001001 (binary).
  • The string or character stores the digit using the UTF-16 character table of .NET. For Latin letters without accents or umlauts, for digits, and for the most common punctuation, this table is equivalent to the ASCII table, except that it uses 16 bits per character instead of 7 (expanded to 8 when stored as a byte). According to this table, the character '9' is represented by the binary 00111001 (decimal 57).
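
As a quick illustration of that difference (a minimal sketch; the variable names are just for this example):

byte numericNine = 9;     // stored as the bit pattern 00001001
char characterNine = '9'; // stored as the UTF-16 code unit 57 (00111001)

Console.WriteLine(Convert.ToString(numericNine, toBase: 2).PadLeft(8, '0'));        // 00001001
Console.WriteLine(Convert.ToString((int)characterNine, toBase: 2).PadLeft(8, '0')); // 00111001
Console.WriteLine((int)characterNine);                                              // 57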

The string "1001" is stored in UTF-16 as

00000000 00110001  00000000 00110000  00000000 00110000  00000000 00110001

where '0' is encoded as 00000000 00110000 (decimal 48) and '1' is encoded as 00000000 00110001 (decimal 49). In addition, a string stores extra data, such as its length, a NUL character terminator, and data related to its class nature.
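
If you want to inspect those UTF-16 code units yourself, here is a minimal sketch (it uses Encoding.BigEndianUnicode only so the byte order matches the pattern written above; the default Encoding.Unicode emits the low byte of each code unit first):

byte[] utf16 = System.Text.Encoding.BigEndianUnicode.GetBytes("1001");
foreach (byte b in utf16) {
    Console.Write(Convert.ToString(b, toBase: 2).PadLeft(8, '0') + " ");
}
// 00000000 00110001 00000000 00110000 00000000 00110000 00000000 00110001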


Alternative ways to store the result would be to use an array of BitArray objects or an array of byte arrays where each byte in the inner array stores one bit only, i.e., is either 0 or 1.
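
A rough sketch of the BitArray idea, reusing the buffer from the loop above (it wraps the whole buffer in a single BitArray rather than one per digit; note that BitArray exposes each byte least significant bit first, so the printout is mirrored compared to the usual notation):

var bits = new System.Collections.BitArray(buffer); // one bool per bit of the numeric digits
for (int i = 0; i < bits.Length; i++) {
    Console.Write(bits[i] ? '1' : '0');       // bit 0 (the LSB) of each byte comes first
    if ((i + 1) % 8 == 0) Console.Write(' '); // group the output per source byte
}
// 01000000 10010000 (the digits 2 and 9 of "29", LSB first)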

Convert a binary string representation to a byte array

In case you don't have this LINQ fetish, so common lately, you can try the normal way

string input ....
int numOfBytes = input.Length / 8;
byte[] bytes = new byte[numOfBytes];
for(int i = 0; i < numOfBytes; ++i)
{
    bytes[i] = Convert.ToByte(input.Substring(8 * i, 8), 2);
}
File.WriteAllBytes(fileName, bytes);

LINQ is great but there must be some limits.
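
For comparison, the LINQ flavour being alluded to would look roughly like this (a sketch, assuming the same input string and fileName, plus a using System.Linq directive):

byte[] bytes = Enumerable.Range(0, input.Length / 8)
    .Select(i => Convert.ToByte(input.Substring(8 * i, 8), 2))
    .ToArray();
File.WriteAllBytes(fileName, bytes);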

How to convert the string representation of a binary string from a text file back into the UTF-8 encoded text it came from?

Your problems are twofold:

  • you have the string representation of a byte array (from a file, but that's kind of irrelevant)
  • you want to get the byte array back to UTF-8 text

So the solution is two steps as well:

import ast

# convert string representation back into binary
string_rep = "b'\\xd0\\xbf\\xd1\\x80\\xd0\\xb8\\xd0\\xb2\\xd0\\xb5\\xd1\\x82'"
as_binary = ast.literal_eval(string_rep)

# convert binary to utf8
text = as_binary.decode("utf8")

to get 'привет' again.

The last part is a duplicate of Python3: Decode UTF-8 bytes converted as string

Convert bytes to bits in dart, and vice versa

Does something like this do the job?

void main() {
  final list = [211, 13, 67];

  for (final number in list) {
    print('${intTo8bitString(number)} = $number');
  }
  // 11010011 = 211
  // 00001101 = 13
  // 01000011 = 67

  for (final number in list) {
    print('${intTo8bitString(number, prefix: true)} = $number');
  }
  // 0x11010011 = 211
  // 0x00001101 = 13
  // 0x01000011 = 67

  print(binaryStringToInt('0x11010011')); // 211
  print(binaryStringToInt('1101')); // 13
  print(binaryStringToInt('01000011')); // 67
}

String intTo8bitString(int number, {bool prefix = false}) => prefix
    ? '0x${number.toRadixString(2).padLeft(8, '0')}'
    : '${number.toRadixString(2).padLeft(8, '0')}';

final _pattern = RegExp(r'(?:0x)?(\d+)');

int binaryStringToInt(String binaryString) =>
    int.parse(_pattern.firstMatch(binaryString)!.group(1)!, radix: 2);

Convert binary string representation of a byte to actual binary value in Python

Use the int function with a base of 2 to read a binary value as an integer.

n = int('01010101', 2)

Python 2 uses strings to handle binary data, so you would use the chr() function to convert the integer to a one-byte string.

data = chr(n)

Python 3 handles binary and text differently, so you need to use the bytes type instead. This doesn't have a direct equivalent to the chr() function, but the bytes constructor can take a list of byte values. We put n in a one-element list and convert that to a bytes object.

data = bytes([n])

Once you have your binary data, you can open a file in binary mode and write the data to it like this:

with open('out.bin', 'wb') as f:
    f.write(data)

How can I convert a bytes object to decimal or binary representation in Python?

Starting from Python 3.2, you can use int.from_bytes.

The second argument, byteorder, specifies the endianness of your byte string. It can be either 'big' or 'little'. You can also use sys.byteorder to get your host machine's native byte order.

import sys
int.from_bytes(b'\x11', byteorder=sys.byteorder) # => 17
bin(int.from_bytes(b'\x11', byteorder=sys.byteorder)) # => '0b10001'

