One of the most common timestamp bugs is also one of the simplest: mixing up seconds and milliseconds. A Unix timestamp like 1710000000 is usually in seconds; a value like 1710000000000 is usually in milliseconds.
If you need a quick way to verify the format and convert it correctly, use this Unix Timestamp Converter.
Why this bug happens
Different languages and platforms represent time at different resolutions. PHP's time() and most Unix tools return seconds, which gives 10-digit values for current dates. JavaScript's Date.now() returns milliseconds, which gives 13-digit values. Treat one as the other and the converted date is off by a factor of 1000, so it lands wildly wrong.
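A minimal JavaScript sketch of the bug, using the 10-digit example value from above (the Date constructor expects milliseconds, so passing seconds directly collapses the date into January 1970):

```javascript
// A 10-digit Unix timestamp, in seconds.
const ts = 1710000000;

// JavaScript's Date constructor expects MILLISECONDS.
// Passing seconds directly lands less than 20 days after the epoch:
const wrong = new Date(ts);
console.log(wrong.toISOString()); // 1970-01-20T19:00:00.000Z

// Multiplying by 1000 converts seconds to milliseconds:
const right = new Date(ts * 1000);
console.log(right.toISOString()); // 2024-03-09T16:00:00.000Z
```

The same factor-of-1000 mistake in PHP or any other language produces an equally distorted date; only the direction of the error differs.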
Common symptoms
- A date appears decades in the future or past
- An API returns an unexpected year
- Frontend and backend times do not match
- Debug logs show a valid number but the displayed date is incorrect
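One way to catch the last symptom early is a range sanity check near your logging or parsing code. Here is a sketch, assuming your application only handles dates between 2000 and 2100 (the function name and bounds are illustrative, not from any library):

```javascript
// Sketch: reject timestamps outside a plausible window, which is
// usually a sign of a seconds/milliseconds mix-up.
function assertPlausibleMillis(ms) {
  const min = Date.UTC(2000, 0, 1);  // assumed lower bound
  const max = Date.UTC(2100, 0, 1);  // assumed upper bound
  if (ms < min || ms > max) {
    throw new RangeError(
      `timestamp ${ms} ms is outside 2000-2100; ` +
      'check for a seconds/milliseconds mix-up'
    );
  }
  return ms;
}

assertPlausibleMillis(1710000000000); // ok: a 2024 date
// assertPlausibleMillis(1710000000); // throws: reads as January 1970
```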
How to check it fast
- Paste the value into the Unix Timestamp Converter.
- Check whether it is treated as seconds or milliseconds.
- Compare the UTC and local output.
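The same check can be done in code. A sketch that guesses the unit from digit count and normalizes to milliseconds before building a Date (assuming values fall in the modern range where seconds are 10 digits and milliseconds are 13):

```javascript
// Guess the unit from the number of digits.
function guessUnit(ts) {
  const digits = String(Math.trunc(Math.abs(ts))).length;
  if (digits <= 10) return 'seconds';
  if (digits === 13) return 'milliseconds';
  return 'unknown';
}

// Normalize either unit to milliseconds for the Date constructor.
function toMillis(ts) {
  return guessUnit(ts) === 'seconds' ? ts * 1000 : ts;
}

const d = new Date(toMillis(1710000000));
console.log(d.toISOString()); // UTC: 2024-03-09T16:00:00.000Z
console.log(d.toString());    // local time, machine-dependent
```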
For timezone-display issues, see UTC vs local time: why your timestamp looks wrong even when the math is right.
Unix Timestamp FAQ
What is the difference between timestamp seconds and milliseconds?
Seconds-based timestamps are 10 digits long for dates from 2001 through 2286, while milliseconds-based timestamps are 13 digits over roughly the same range.
Why does this bug cause wildly wrong dates?
Because the two units differ by a factor of 1000, misreading one as the other shifts the interpreted date enormously: seconds read as milliseconds collapse to January 1970, while milliseconds read as seconds land tens of thousands of years in the future.
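A quick sketch of the scale of the error, in both directions:

```javascript
const seconds = 1710000000; // a 2024 date, in seconds

// Seconds mistakenly read as milliseconds: January 1970.
console.log(new Date(seconds).getUTCFullYear()); // 1970

// Milliseconds mistakenly scaled up again by 1000
// (the ms-read-as-seconds case): tens of thousands of years ahead.
console.log(new Date(seconds * 1000 * 1000).getUTCFullYear());
```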
How can you check timestamp format quickly?
Paste the value into a Unix timestamp converter and compare the interpreted output.