When writing the line `var n = 0x1234` in JavaScript, is it always true that `n == 4660`?
This question could also be phrased as follows: does `0x1234` represent a sequence of bytes, with `0x12` as the first byte and `0x34` as the last? Or does `0x1234` represent a number in base 16, where the leftmost digit is the most significant?
In the former case, interpreting `0x1234` as big-endian would yield 4660, while interpreting it as little-endian would yield 13330.
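If I understand the byte-sequence reading correctly, the two interpretations could be checked with a `DataView`, which lets you read the same bytes under either byte order (a minimal sketch; the variable names are my own):

```javascript
// Store the two bytes 0x12 and 0x34, then read them back as a
// single 16-bit unsigned integer under each byte order.
const buf = new ArrayBuffer(2);
const view = new DataView(buf);
view.setUint8(0, 0x12); // first byte
view.setUint8(1, 0x34); // last byte

console.log(view.getUint16(0, false)); // big-endian read: 4660 (0x1234)
console.log(view.getUint16(0, true));  // little-endian read: 13330 (0x3412)
```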
In the latter case, `0x1234` always equals 1 * 4096 + 2 * 256 + 3 * 16 + 4, which is 4660, regardless of the machine's byte order.
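This base-16 reading seems easy to confirm in a console (again, just a sketch of what I tried):

```javascript
const n = 0x1234;
console.log(n === 1 * 4096 + 2 * 256 + 3 * 16 + 4); // true
console.log(n === 4660);                            // true
console.log(n.toString(16));                        // "1234"
```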