My data structure consists of different intervals, with the first column representing time in Unix format and subsequent columns showing corresponding values. Here is a snippet of the data:
a1521207300,555.45
1,554.53
2,554.07
3,553.9
4,552.67
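For context, this is roughly how rows like these end up in the data array that the snippet below reads from (only the interval field is used later; the value field name is just my own label for this sketch):

// Illustrative parsing of the raw rows into objects.
// Only `interval` is referenced below; `value` is a placeholder name.
var rows = ['a1521207300,555.45', '1,554.53', '2,554.07', '3,553.9', '4,552.67'];
var data = rows.map(function (row) {
  var parts = row.split(',');
  return { interval: parts[0], value: parseFloat(parts[1]) };
});
console.log(data[0].interval); // "a1521207300"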
To convert this Unix time to a Date object, I removed the ornamental 'a' using slice():
var rawTime = data[0].interval;               // e.g. "a1521207300"
var timeValue = Math.round(rawTime.slice(1)); // 1521207300
console.log(timeValue);
console.log(new Date(timeValue));             // logs a date in January 1970, not March 2018
I also attempted using parseInt() instead of Math.round() (sketched below), but it still unexpectedly produced a January 18, 1970 date. This puzzled me because the expected date is March 16, 2018. Upon further investigation, I found that JavaScript is capable of accepting Unix dates directly, as seen in this answer.
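For reference, the parseInt() variant I tried looked roughly like this (same rawTime as above; a sketch, not the exact code):

var timeValue = parseInt(rawTime.slice(1), 10); // 1521207300
console.log(new Date(timeValue));               // still prints a date in January 1970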
I cross-referenced the Unix time on an online conversion site (www.onlineconversion.com/unix_time.htm), which confirmed that it is indeed a timestamp for March 16, 2018.
Question: Despite the correct Unix date for my March 2018 data, why does it show up as a date from the 1970s? Is the 'a' causing this discrepancy? How should I accurately handle this timestamp, given that it's only 10 digits long and well within Date's capability range?