Counterintuitive design of addition/subtraction between numbers and nullable numbers
Question
Let's see the following code:
int a = 1;
int? b = null;
var addition = a + b;
var subtraction = a - b;
I expect both addition and subtraction to be 1. But in fact both of them are null. The generated code without syntax sugar looks like this:
int? addition = b.HasValue ? new int?(a + b.GetValueOrDefault()) : new int?();
int? subtraction = b.HasValue ? new int?(a - b.GetValueOrDefault()) : new int?();
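The null-propagation behavior the compiler generates can be sketched in JavaScript. Note that liftedAdd and liftedSub are made-up helper names, not any real API; they merely mimic what C#'s lifted operators do for nullable operands:

```javascript
// Hypothetical helpers mimicking C#'s "lifted" arithmetic operators:
// if either operand is null, the result is null; otherwise do plain arithmetic.
function liftedAdd(a, b) {
  return (a === null || b === null) ? null : a + b;
}

function liftedSub(a, b) {
  return (a === null || b === null) ? null : a - b;
}

const a = 1;
const b = null;
console.log(liftedAdd(a, b)); // null, matching C#'s a + b when int? b = null
console.log(liftedSub(a, b)); // null, matching C#'s a - b
```

This is exactly the "if HasValue then compute, else empty" branch in the desugared C# above, just written as a plain function.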
From VB.NET's point of view this is even more counterintuitive: the result of 1 + Nothing is Nothing.
I think they could do the addition in this way easily:
int addition = a + b.GetValueOrDefault();
//or int? addition = a + b.GetValueOrDefault();
But to be honest, the current design doesn't break its associativity. My question is: why did they design the operator like this? What is the disadvantage of my expected behavior?
EDIT A few comments mentioned "because null is not 0", and that is absolutely correct. But it is not the reason that 1 + null == null; I didn't say null is 0. Why can't 1 + null == 1? Note that even if 1 + null == 1 and 1 + 0 == 1, we can't infer null == 0. By the same logic, from 1 + null == null and 2 + null == null I could infer 1 == 2, which makes no sense. The operation rules on null are defined by the compiler team, so why did they choose this rule, which differs between adding null to an int and adding null to a string?
EDIT If you consider null as "I have no idea", then how do you explain that adding null to the string "1" gives the original "1" instead of null ("I have no idea")? Remember, null is not string.Empty!
EDIT I do believe there are no mathematical concerns here, because in some languages 1 + null returns 1 (try JavaScript). So it's just a (maybe personal) choice by the language designer(s). IMO 1 + null == 1 is easier to use than the current design. It's probably a primarily opinion-based question.
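The JavaScript behavior referenced here is easy to check: in an arithmetic context JavaScript coerces null to 0, so addition and subtraction simply ignore it:

```javascript
// JavaScript coerces null to 0 in numeric operations,
// so null behaves like an identity element for + and - here.
const addition = 1 + null;
const subtraction = 1 - null;
console.log(addition);    // 1
console.log(subtraction); // 1
```

This is the semantics the question wishes C# had chosen for nullable numbers.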
Answer
If 1 + null == 1, then consider the following:
int a = 1;
int? b = null;
int result = a + b;
int difference = (a + b) - a;
bool zeroEqualsNull = (difference == b);
result would be an int, not an int?, as we know int + null == int.
Clearly, difference can be re-arranged to simply b. We're then comparing b == b, which should give true.
Yet b is null, not 0. And if 0 == null we'd be in a terrible situation. Why even use int? at all in that case?
Unless, of course, difference turned out to be int?, which means either the addition or subtraction operation must, in some cases, return null.
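The answer's algebra can be replayed with null-propagating helpers (liftedAdd and liftedSub are hypothetical stand-ins for C#'s lifted operators, not real functions): because the addition already returned null, the subtraction also returns null, so difference agrees with b and no 0 == null contradiction arises:

```javascript
// Stand-ins for C#'s lifted operators: null in, null out.
function liftedAdd(a, b) {
  return (a === null || b === null) ? null : a + b;
}
function liftedSub(a, b) {
  return (a === null || b === null) ? null : a - b;
}

const a = 1;
const b = null;
// (a + b) - a: the null propagates through both steps.
const difference = liftedSub(liftedAdd(a, b), a);
console.log(difference);       // null
console.log(difference === b); // true: difference and b are both null
```

Under the question's proposed 1 + null == 1 rule, difference would instead be 0 while b stays null, which is exactly the inconsistency the answer points out.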
P.S. Pointing to JavaScript as a reference for how a language should behave is a bad idea ;)