Seconds vs. Milliseconds: The Great Divide
One of the most common sources of bugs in date handling is the confusion between Unix timestamps in seconds and milliseconds. While the original Unix definition uses seconds, many modern systems (like JavaScript and Java) use milliseconds to capture greater precision.
The Quick Check
If you have a raw number and don't know the unit, count the digits (for current dates):
- 10 Digits (~1.7 billion): Seconds. (e.g., Python, PHP, Go, Ruby, Postgres)
- 13 Digits (~1.7 trillion): Milliseconds. (e.g., JavaScript, Java, Elasticsearch)
- 16 Digits (~1.7 quadrillion): Microseconds. (e.g., Python time.time() float precision)
- 19 Digits (~1.7 quintillion): Nanoseconds. (e.g., Go time.Now().UnixNano())
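The digit-count heuristic above can be sketched as a small helper. This is a rough guess, only reliable for timestamps near the present day; the function name and unit labels are illustrative:

```python
def guess_unit(timestamp: int) -> str:
    """Guess the unit of a raw epoch timestamp by its digit count.

    Only reliable for roughly current dates, where second-precision
    timestamps have 10 digits.
    """
    digits = len(str(abs(timestamp)))
    if digits <= 10:
        return "seconds"
    elif digits <= 13:
        return "milliseconds"
    elif digits <= 16:
        return "microseconds"
    return "nanoseconds"

print(guess_unit(1704067200))     # seconds
print(guess_unit(1704067200000))  # milliseconds
```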
Ecosystem Breakdown
| Language/System | Default Unit | Notes |
|---|---|---|
| JavaScript / TypeScript | Milliseconds | Date.now() returns ms. |
| Python | Seconds (Float) | time.time() returns float seconds. |
| Java | Milliseconds | System.currentTimeMillis(). |
| Go | Seconds | time.Now().Unix(). UnixMilli() and UnixNano() also available. |
| PHP | Seconds | time(). |
| PostgreSQL | Seconds | EXTRACT(EPOCH FROM now()) returns float seconds. |
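Because the default unit differs per ecosystem, it is safest to normalize at system boundaries: convert to one internal representation as soon as a timestamp enters your code. A minimal Python sketch (the helper names are illustrative):

```python
import datetime


def from_epoch_ms(ms: int) -> datetime.datetime:
    """Convert a millisecond epoch timestamp (e.g. from JavaScript's
    Date.now() or Java's System.currentTimeMillis()) to an aware UTC datetime."""
    return datetime.datetime.fromtimestamp(ms / 1000.0, tz=datetime.timezone.utc)


def to_epoch_ms(dt: datetime.datetime) -> int:
    """Convert a datetime back to milliseconds for ms-based consumers."""
    return int(dt.timestamp() * 1000)


dt = from_epoch_ms(1704067200000)
print(dt.isoformat())  # 2024-01-01T00:00:00+00:00
```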
Conversion Patterns
```javascript
// ❌ Wrong: passing seconds directly
const wrongDate = new Date(1704067200);
// Result: 1970-01-20 (just after the epoch)

// ✅ Correct: multiply by 1000
const date = new Date(1704067200 * 1000);
// Result: 2024-01-01
```
```python
import datetime

timestamp_ms = 1704067200000

# ❌ Wrong: treating milliseconds as seconds
# dt = datetime.datetime.fromtimestamp(timestamp_ms)
# Result: year 55963 (OSError on some systems)

# ✅ Correct: divide by 1000
dt = datetime.datetime.fromtimestamp(timestamp_ms / 1000.0)
```
The "Year 55963" Problem
If you see a date that looks like it belongs in a sci-fi novel (e.g., year 50,000+), you likely treated milliseconds as seconds.
1.7 trillion seconds is roughly 54,000 years. Conversely, if your date is in January 1970, you treated seconds as milliseconds.
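One hedge against both failure modes is a range check at ingestion: reject any supposed seconds-since-epoch value that falls absurdly far from the present. A sketch (the 2001-2100 bounds are illustrative; pick ones that fit your domain):

```python
import datetime


def sanity_check_seconds(ts: float) -> float:
    """Raise if a supposed seconds-since-epoch value is implausible,
    which usually means the wrong unit was used upstream."""
    lower = datetime.datetime(2001, 1, 1, tzinfo=datetime.timezone.utc).timestamp()
    upper = datetime.datetime(2100, 1, 1, tzinfo=datetime.timezone.utc).timestamp()
    if not (lower <= ts <= upper):
        raise ValueError(f"timestamp {ts} out of plausible range; wrong unit?")
    return ts


sanity_check_seconds(1704067200)        # OK: 2024-01-01
# sanity_check_seconds(1704067200000)   # raises: milliseconds mistaken for seconds
```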
Frequently Asked Questions
Why do some systems use milliseconds?
JavaScript and Java use milliseconds to support higher precision than standard Unix time, which was originally defined in seconds.
How can I tell if a timestamp is seconds or milliseconds?
Look at the magnitude. Current timestamps in seconds are 10 digits (e.g., 1700000000), while milliseconds are 13 digits (e.g., 1700000000000).
What happens if I mix them up?
Treating seconds as milliseconds results in a date in 1970 (just after the epoch). Treating milliseconds as seconds results in a date thousands of years in the future (around year 55000).