Convert Unix timestamps to human-readable dates or dates to timestamps. See UTC, local, ISO 8601, and RFC 2822 formats with copy buttons.
A Unix timestamp is the number of seconds elapsed since January 1, 1970 00:00:00 UTC (the Unix epoch). It is widely used in programming, databases, and APIs because it is timezone-independent and easy to compare. Our converter shows the current timestamp (updating every second), lets you convert any timestamp to UTC, local time, ISO 8601, and RFC 2822 formats, and converts a chosen date and time back to a Unix timestamp.
Use timestamps when storing event times in databases, scheduling cron jobs, or debugging API responses. The relative time display (e.g., "2 hours ago" or "in 3 days") helps you quickly understand when an event occurred or will occur.
The Unix epoch (January 1, 1970 00:00:00 UTC) serves as the universal reference point for time in computing. Unix timestamps count the number of seconds from that moment, making them timezone-agnostic and trivially comparable. A timestamp of 0 represents the epoch itself, while 1704067200 represents January 1, 2024 00:00:00 UTC.
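To illustrate the conversion, here is a minimal JavaScript sketch that turns the example timestamp into a UTC date string. Note that JavaScript's `Date` constructor expects milliseconds, so the seconds-based value is multiplied by 1,000:

```javascript
// A Unix timestamp counts seconds since 1970-01-01T00:00:00Z.
const ts = 1704067200;

// Date takes milliseconds, so scale the seconds value up.
const date = new Date(ts * 1000);

console.log(date.toISOString()); // "2024-01-01T00:00:00.000Z"
console.log(new Date(0).toISOString()); // the epoch itself: "1970-01-01T00:00:00.000Z"
```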
Millisecond timestamps (13 digits instead of 10) are common in JavaScript and Java. If your timestamp looks unusually large, it may be in milliseconds—divide by 1,000 to get the standard seconds-based value. Some systems also use microsecond or nanosecond precision for high-frequency event logging.
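A small normalization helper can apply the digit-count heuristic automatically. The function name `toSeconds` and the 1e12 threshold (the smallest 13-digit value) are illustrative choices, not part of any standard API:

```javascript
// Heuristic: 13-digit timestamps (>= 1e12) are milliseconds.
// Divide by 1,000 to normalize them to seconds.
function toSeconds(ts) {
  return ts >= 1e12 ? Math.floor(ts / 1000) : ts;
}

console.log(toSeconds(1704067200000)); // milliseconds in  -> 1704067200
console.log(toSeconds(1704067200));    // already seconds  -> 1704067200
```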
Always store timestamps in UTC to avoid ambiguity. Convert to the user's local timezone only at the display layer. Be aware of daylight saving time transitions, which can cause the same local time to map to two different UTC values or skip an hour entirely. Libraries like date-fns-tz and the native Intl.DateTimeFormat API help with reliable timezone conversions.
Systems that store Unix timestamps as 32-bit signed integers will overflow at 03:14:07 UTC on January 19, 2038, when the counter reaches its maximum value of 2,147,483,647. One second later, the timestamp wraps to a negative number, which is interpreted as December 13, 1901. Modern 64-bit systems are unaffected and can represent dates billions of years into the future.
If your timestamp has 13 digits (e.g., 1704067200000), it is in milliseconds. Divide by 1,000 to get the standard seconds-based Unix timestamp before converting. JavaScript's Date.now() returns milliseconds by default.
The converter handles dates from 1970 onward accurately. Dates before the Unix epoch are represented as negative timestamps and are supported by most modern platforms. However, historical calendar changes (such as the Julian-to-Gregorian transition) are outside the scope of Unix time.