Binary notation and Endianness


Problem description

Can we say that our 'traditional' way of writing in binary is Big Endian?

For example, the binary number one:

0b00000001 // Let's assume it's possible to write numbers like that in code and b means binary

Also, when I write the constant 0b00000001 in my code, will this always refer to the integer 1, regardless of whether the machine is big-endian or little-endian?

In this notation, is the LSB always written as the rightmost element, and the MSB as the leftmost element?

Recommended answer

Yes, humans generally write numerals in big-endian order (meaning that the digits written first have the most significant value), and common programming languages that accept numerals interpret them in the same way.

Thus, the numeral "00000001" means one; it never means ten million (in decimal) or 128 (in binary) or the corresponding values in other bases.
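As a small illustration (a sketch, not part of the original answer), the standard library's strtol parses the digit string "00000001" to one in every base, because the first-written digits are taken as the most significant:

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    /* The same digit string interpreted in base 10 and base 2:
       the first-written digits are the most significant ones. */
    long dec = strtol("00000001", NULL, 10);   /* 1, not ten million */
    long bin = strtol("00000001", NULL, 2);    /* 1, not 128 */
    printf("%ld %ld\n", dec, bin);             /* prints: 1 1 */
    return 0;
}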

Much of C semantics is written in terms of the value of a number. Once a numeral is converted to a value, the C standard describes how that value is added, multiplied, and even represented as bits (with some latitude regarding signed values). Generally, the standard does not specify how those bits are stored in memory, which is where endianness in machine representations comes into play. When the bits representing a value are grouped into bytes and those bytes are stored in memory, we may see those bytes written in different orders on different machines.
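To make the memory-layout point concrete, here is a minimal C sketch (the classic byte-order check, added here for illustration) that prints the bytes of a 32-bit value in the order they are stored; the value is the same everywhere, but the printed order differs between big-endian and little-endian machines:

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t value = 0x01020304;                            /* one value, four bytes */
    const unsigned char *p = (const unsigned char *)&value;

    /* A big-endian machine prints "01 02 03 04";
       a little-endian machine prints "04 03 02 01". */
    for (size_t i = 0; i < sizeof value; i++)
        printf("%02x ", (unsigned)p[i]);
    printf("\n");
    return 0;
}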

However, the C standard specifies a common way of interpreting numerals in source code, and that interpretation is always big-endian in the sense that the most significant digits appear first.
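As a final illustration (assuming a compiler that accepts 0b binary literals, e.g. C23 or the widespread GCC/Clang extension), the following assertion holds on both big-endian and little-endian machines, because the literal names a value rather than a memory layout:

#include <assert.h>

int main(void) {
    int x = 0b00000001;   /* binary literal: most significant digit written first */
    assert(x == 1);       /* holds regardless of the machine's byte order */
    return 0;
}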
