Computing milliseconds since 1970 in C# yields different date than JavaScript
Problem description
I need to compute, in C#, the same value that JavaScript's getTime method returns.
For simplicity, I chose a fixed date in UTC and compared the C# result:
C#
DateTime e = new DateTime(2011, 12, 31, 0, 0, 0, DateTimeKind.Utc); // target date (UTC)
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);   // Unix epoch (UTC)
TimeSpan t = (e - s);
var x = t.TotalMilliseconds.ToString();
=> 1325289600000
and the JavaScript result:
JavaScript
var d = new Date(2011, 12, 31, 0, 0, 0);
var utcDate = new Date(d.getUTCFullYear(), d.getUTCMonth(), d.getUTCDate(), d.getUTCHours(), d.getUTCMinutes(), d.getUTCSeconds());
utcDate.getTime()
=> 1327960800000
Any hints on what I'm doing wrong?

Thanks!
Recommended answer
If you meant for the input to be at UTC, you should do this instead:
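// Date.UTC months are 0-indexed, so 11 means December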
var ts = Date.UTC(2011, 11, 31, 0, 0, 0);
As SLaks pointed out, months run 0-11, but even then you must initialize the date as UTC if you want the response in UTC. In your code, you were initializing a local date and then converting it to UTC, so the result would differ depending on the time zone of the computer running the code. With Date.UTC, you get back a timestamp, not a Date object, and it will be the same result regardless of where it runs.
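The same pitfall can be reproduced on the C# side. A minimal sketch (the variable names here are illustrative, not from the original question):

C#
DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
DateTime local = new DateTime(2011, 12, 31, 0, 0, 0, DateTimeKind.Local); // local time
DateTime utc = local.ToUniversalTime(); // shifted by this machine's UTC offset
double ms = (utc - epoch).TotalMilliseconds;
// ms varies with the machine's time zone, just like initializing new Date(...)
// locally and converting via the getUTC* accessors in JavaScript.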
From Chrome's debugging console:
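Date.UTC(2011, 11, 31, 0, 0, 0)
=> 1325289600000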
This is the same value returned from your .NET code, which looks just fine, except that I would return a long, not a string.
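A minimal C# sketch of that suggestion (the JsGetTime helper name is hypothetical; on .NET 4.6 and later, DateTimeOffset.ToUnixTimeMilliseconds() gives the same value):

C#
using System;

static class JsTime
{
    static readonly DateTime Epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

    // Milliseconds since the Unix epoch, matching JavaScript's getTime() for a UTC date.
    public static long JsGetTime(DateTime utc)
    {
        return (long)(utc - Epoch).TotalMilliseconds;
    }
}

// Usage:
// long ms = JsTime.JsGetTime(new DateTime(2011, 12, 31, 0, 0, 0, DateTimeKind.Utc));
// => 1325289600000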