Convert any Unix timestamp (seconds or milliseconds) to a readable date and time in your local timezone or UTC.
Convert a human-readable date and time back to its Unix timestamp in both seconds and milliseconds.
Watch the current Unix timestamp update in real time — useful for debugging time-sensitive code.
Handles both second-precision (10-digit) and millisecond-precision (13-digit) Unix timestamps automatically.
A Unix timestamp is the number of seconds (or milliseconds) elapsed since January 1, 1970, 00:00:00 UTC, a reference point known as the Unix epoch. It is the standard way to represent a point in time in programming.
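The conversion described above can be sketched with Python's standard library; the timestamp value here is just an illustrative example:

```python
from datetime import datetime, timezone

ts = 1700000000  # example timestamp in seconds since the Unix epoch

# Timestamp -> readable date, in UTC and in the machine's local timezone.
utc_dt = datetime.fromtimestamp(ts, tz=timezone.utc)
local_dt = datetime.fromtimestamp(ts)  # interpreted in the local timezone

# Readable date -> timestamp, in both seconds and milliseconds.
back_seconds = int(utc_dt.timestamp())
back_millis = back_seconds * 1000

print(utc_dt.isoformat())               # 2023-11-14T22:13:20+00:00
print(back_seconds, back_millis)        # 1700000000 1700000000000
```

`datetime.fromtimestamp` with an explicit `tz` argument avoids the ambiguity of naive datetimes, which is why the UTC conversion passes `timezone.utc` rather than relying on the system default.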
Second-precision timestamps are currently 10 digits (e.g. 1700000000) and millisecond-precision timestamps are 13 digits (e.g. 1700000000000). This tool automatically detects which format you are using.
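A minimal sketch of this digit-count heuristic, assuming (as the description implies) that 13-digit values are milliseconds and shorter values are seconds:

```python
def parse_unix_timestamp(value: int) -> float:
    """Return the timestamp in seconds, auto-detecting precision.

    Heuristic: values with 13 or more digits are treated as
    milliseconds; shorter values are treated as seconds.
    """
    if len(str(abs(value))) >= 13:
        return value / 1000.0
    return float(value)

print(parse_unix_timestamp(1700000000))     # 1700000000.0 (seconds)
print(parse_unix_timestamp(1700000000000))  # 1700000000.0 (milliseconds)
```

Both example inputs resolve to the same instant, which is exactly what automatic detection should produce.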
On systems that store time as a signed 32-bit integer, the Unix timestamp overflows at 03:14:07 UTC on January 19, 2038 — the Year 2038 problem. Modern 64-bit systems extend the range to roughly 292 billion years, so it is not a practical concern for most software today.
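The overflow moment follows directly from the largest signed 32-bit value, which can be checked in a couple of lines:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # largest signed 32-bit integer: 2147483647

# The last second representable by a signed 32-bit Unix timestamp.
overflow_moment = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(overflow_moment.isoformat())  # 2038-01-19T03:14:07+00:00
```

One second later, a signed 32-bit counter wraps to a negative value, which naive code interprets as a date in 1901.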