Explanation for Timespan Differences Between C# and JavaScript
Question
This is based on the questions Calculating milliseconds since 1970 in C# yields different date than JavaScript and C# version of JavaScript's Date.getTime().
For all of these calculations, assume they are being done in Central Standard Time, so 6 hours behind UTC (this offset will come up again later).
I understand that JavaScript Date objects are based on the Unix Epoch (midnight on Jan 1, 1970). So, if I do:
//remember that JS months are 0-indexed, so February == 1
var d = new Date(2014,1,28);
d.getTime();
My output will be:
1393567200000
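A quick way to see where that number comes from is to compare it against the explicit-UTC constructor. This is a sketch: the exact getTime() value depends on the machine's time zone, but the relationship shown in the last line always holds.

```javascript
// new Date(y, m, d) interprets the fields in the LOCAL time zone;
// Date.UTC(y, m, d) interprets the same fields as UTC.
var d = new Date(2014, 1, 28);      // local midnight, Feb 28 2014
var utcMs = Date.UTC(2014, 1, 28);  // always 1393545600000, regardless of zone

// getTime() differs from the UTC value by exactly the local offset
// from UTC at that instant (getTimezoneOffset is in minutes, UTC - local).
var offsetMs = d.getTimezoneOffset() * 60 * 1000;

console.log(utcMs);                            // 1393545600000
console.log(d.getTime() === utcMs + offsetMs); // true
```

In CST, offsetMs is 21,600,000 (6 hours), which is exactly the gap between the two results discussed below.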
Which represents the number of milliseconds since the Unix Epoch. That's all well and good. In the linked questions, people were asking about translating this functionality into C# and the "naive" implementation usually looks something like this:
//the date of interest in UTC
DateTime e = new DateTime(2014, 2, 28, 0, 0, 0, DateTimeKind.Utc);
//the Unix Epoch
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
//the difference between the two
TimeSpan t = (e - s);
var x = t.TotalMilliseconds;
Console.WriteLine(x);
Which generates the output:
1393545600000
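For comparison, this "naive" C# number is exactly what JavaScript itself produces when the date fields are interpreted as UTC rather than local time. A minimal, timezone-independent sketch:

```javascript
// Date.UTC treats the fields as UTC, so no local offset is involved.
// Months are still 0-indexed, so February == 1.
var utcMs = Date.UTC(2014, 1, 28);
console.log(utcMs); // 1393545600000 — same as the naive C# result
```

So the naive C# code is not wrong in an absolute sense; it simply answers the UTC question while getTime() answers the local-construction one.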
That's a difference of 21,600,000 milliseconds, or 6 hours: the exact offset from UTC for the time zone in which these calculations were done.
To get the C# implementation to match the JavaScript, this is the implementation:
//DateTimeKind.Unspecified
DateTime st = new DateTime(1970, 1, 1);
//DateTimeKind.Unspecified
DateTime e = new DateTime(2014, 2, 28);
//translate e to UTC, but leave st as is
TimeSpan t = (e.ToUniversalTime() - st);
var x = t.TotalMilliseconds;
Console.WriteLine(x);
Which will give me output matching the JavaScript output:
1393567200000
What I have yet to find is an explanation for why we leave the DateTime representing the Unix Epoch with a DateTimeKind of Unspecified to be able to match JavaScript. Shouldn't we get the correct result using DateTimeKind.Utc? What detail am I not understanding? This is a purely academic question for me; I'm just curious about why this works this way.
Answer
As you said, .getTime() returns the number of milliseconds since 1 January 1970 00:00:00 UTC.
Which means that .getTime is (as you noticed) including the offset from UTC in the calculation.
In order to make the C# code reflect this, the time you're subtracting from must include time zone information, while 1 January 1970 00:00:00 must be a UTC time.
This might be easier to understand with a few examples. Given:
DateTime e = new DateTime(2014, 2, 28, 0, 0, 0);
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0);
- e - s is incorrect because s is not a UTC time.
- e.ToUniversalTime() - s.ToUniversalTime() is incorrect because e no longer includes the offset from UTC (like the calculation in JavaScript does).
- e.ToUniversalTime() - s is correct because we're using the UTC time and the time we're subtracting includes the offset from UTC.
This may be clearer if we deal with DateTime.Ticks directly:
e.Ticks // 635291424000000000
s.Ticks // 621355968000000000
e.Ticks - s.Ticks // 13935456000000000 ("naive" implementation)
e.ToUniversalTime().Ticks - s.Ticks // 13935636000000000 (correct output)
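The tick arithmetic can be sanity-checked outside of C#: .NET ticks are 100-nanosecond units counted from 0001-01-01, the Unix epoch falls at 621355968000000000 ticks (s.Ticks above), and 10,000 ticks make one millisecond. A small sketch converting a tick count to Unix milliseconds (the two constants are the documented .NET values):

```javascript
// Convert a .NET tick count (100 ns units since 0001-01-01) to Unix milliseconds.
var EPOCH_TICKS = 621355968000000000; // the Unix epoch expressed in .NET ticks
var TICKS_PER_MS = 10000;             // 100 ns per tick => 10,000 ticks per ms

function ticksToUnixMs(ticks) {
  return (ticks - EPOCH_TICKS) / TICKS_PER_MS;
}

// e.Ticks from the "naive" example maps back to the naive millisecond result:
console.log(ticksToUnixMs(635291424000000000)); // 1393545600000
```

Note that the answer's "correct output" tick difference (13935636000000000, i.e. 1393563600000 ms) reflects the answerer's own local offset, which is why it differs slightly from the question's CST figure.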
Again, the last example meets all of our requirements. The Unix epoch is in UTC, while the time we're dealing with still has its original offset.