Convert Unix timestamps (seconds or milliseconds) to human-readable dates and vice versa. See the current timestamp live. Supports multiple date format outputs.
A Unix timestamp is the number of seconds (or milliseconds) that have elapsed since January 1, 1970, 00:00:00 UTC (the Unix epoch). It's widely used in programming for storing and comparing dates.
Unix timestamps in seconds are 10 digits for contemporary dates (e.g., 1700000000); in milliseconds they are 13 digits (e.g., 1700000000000). JavaScript's Date.now() returns milliseconds, while most APIs use seconds.
Yes. The tool displays both your local time zone and UTC. Unix timestamps themselves are always UTC-based.
32-bit systems store timestamps as a signed 32-bit integer, which overflows at 03:14:07 UTC on January 19, 2038 (the "Year 2038 problem"). Most modern systems use signed 64-bit timestamps, which won't overflow for roughly 292 billion years.