big-endian vs. little-endian

May 16, 2023

Big-endian and little-endian are two byte ordering schemes used to store multibyte data types (such as integers and floating-point numbers) in computer memory.

Big-endian vs. little-endian #

In a big-endian architecture, the most significant byte (MSB) of a multibyte value is stored first (at the lowest memory address), followed by the next most significant byte, and so on. The least significant byte (LSB) is stored last (at the highest memory address).

In contrast, in a little-endian architecture, the least significant byte is stored first (at the lowest memory address), followed by the next least significant byte, and so on. The most significant byte is stored last (at the highest memory address).

For example, let’s consider the 16-bit integer value 0x1234 (which is 4660 in decimal). In big-endian byte ordering, this value would be stored in memory as follows:

Address | Value
--------|------
0x1000  | 0x12
0x1001  | 0x34

In little-endian byte ordering, the same value would be stored as follows:

Address | Value
--------|------
0x1000  | 0x34
0x1001  | 0x12

It’s important to note that the byte ordering of multibyte data types can affect the way data is transferred and interpreted across different computer systems. When transferring data between big-endian and little-endian machines, the data must be properly converted to ensure that it is interpreted correctly.

Endianness detection #

#include <iostream>

// A union lets the same two bytes be viewed either as one
// 16-bit integer or as two individual bytes.
union Endian {
    unsigned short n;
    unsigned char ch[2];
};

int main()
{
    Endian e;
    e.n = 0x1122;
    // On a big-endian machine the most significant byte (0x11)
    // is stored at the lowest address; on a little-endian
    // machine the least significant byte (0x22) comes first.
    if (e.ch[0] == 0x11 && e.ch[1] == 0x22)
        std::cout << "big endian" << std::endl;
    else
        std::cout << "little endian" << std::endl;

    return 0;
}

Hex to decimal in the command line #

$ printf "%d\n" 0x7d
125
$ echo 'ibase=16; 7D' | bc    # bc needs uppercase hex digits
125

Hex to ASCII in command line #

$ hex=7d
$ printf "\\x$hex\n"    # 0x7d is the ASCII code for '}'
}