Timestamp Converter — Unix Epoch & Human Date

Convert between Unix timestamps (seconds or milliseconds) and human-readable dates in any timezone. Runs entirely in your browser — no uploads, no tracking, works offline.

Current Unix time

Unix timestamp → Human date

ISO 8601 (UTC)
UTC
Local ()
Relative

Human date → Unix timestamp

Unix seconds
Unix milliseconds
ISO 8601 (UTC)

Free Unix Timestamp Converter — Epoch ↔ Human Date

This is a free online timestamp converter that translates between Unix timestamps and human-readable dates in both directions, without ever sending your input to a server. Paste an epoch integer and get it back as ISO 8601, UTC, your local timezone, and a relative “3 minutes ago” phrase. Pick a date in the calendar and get Unix seconds, Unix milliseconds, and an ISO 8601 string — ready to paste into a cookie, a JWT, a database query, or a log filter.

A live ticker at the top shows the current Unix time so you can grab “now” with one click. The tool auto-detects whether you pasted seconds (10 digits today) or milliseconds (13 digits today) by digit count, and you can override that with the Seconds / Milliseconds toggle if the value is ambiguous. On the reverse side, flip the Local / UTC toggle to control how the date picker is interpreted — the same clock-time resolves to two different Unix timestamps depending on the timezone, and getting this wrong is the single most common source of “off by one hour” bugs.

Unix timestamp, in one paragraph

A Unix timestamp is the number of seconds that have passed since 00:00:00 UTC on 1 January 1970 — a moment known as the Unix epoch. It’s a single integer that pins down any point in time, independent of timezone or calendar. Because it’s just an integer, it’s trivial to store in a database column, subtract to compute durations, compare for sorting, and transmit over the wire. Almost every operating system, filesystem, API, database, logging pipeline, and authentication token uses Unix timestamps as the canonical “when did this happen” value.
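
Since the tool itself runs on JavaScript Date objects, the idea fits in a few lines (a sketch, not the tool's actual code; the `deploy` value is an arbitrary example):

```javascript
// The current instant, as milliseconds and as classic Unix seconds.
const nowMs = Date.now();
const nowSec = Math.floor(nowMs / 1000);

// Timestamps are plain integers, so durations are just subtraction:
const deploy = 1704067200;        // 2024-01-01T00:00:00Z, for example
const ageSec = nowSec - deploy;   // seconds elapsed since then

// And the epoch itself is simply zero:
console.log(new Date(0).toISOString()); // "1970-01-01T00:00:00.000Z"
```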

Seconds or milliseconds?

Both formats point to the same instant — they just differ in precision. Pick the one your target system expects:

  • Unix seconds (10 digits in the current era): the classic Unix convention. Standard in HTTP Set-Cookie expires, JWT exp / iat / nbf claims, OAuth tokens, crontab, most Linux tooling, and PostgreSQL’s to_timestamp().
  • Unix milliseconds (13 digits in the current era): the JavaScript and Java default because Date.now() and System.currentTimeMillis() both return milliseconds. You’ll also see it in Kafka timestamps, Elasticsearch’s @timestamp, and most log aggregators.

If you’re unsure which one a value is, count the digits: 10 → seconds, 13 → milliseconds, 16 → microseconds, 19 → nanoseconds. This tool’s Auto unit toggle does exactly that by default.
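
The digit-count heuristic is easy to replicate. Here is a JavaScript sketch (`guessUnit` is a hypothetical helper, not the tool's actual implementation, and it assumes current-era values):

```javascript
// Classify an epoch value by digit count, as the Auto toggle does.
function guessUnit(value) {
  const digits = String(Math.trunc(Math.abs(value))).length;
  if (digits <= 10) return "seconds";
  if (digits <= 13) return "milliseconds";
  if (digits <= 16) return "microseconds";
  return "nanoseconds";
}

guessUnit(1713800000);       // "seconds"      (10 digits)
guessUnit(1713800000000);    // "milliseconds" (13 digits)
guessUnit(1713800000000000); // "microseconds" (16 digits)
```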

How to use the timestamp converter

Converting a Unix timestamp to a human date

  1. Paste the integer into the Unix timestamp → Human date field (or click Use now to load the current epoch).
  2. Leave the unit on Auto unless you want to force seconds or milliseconds.
  3. Copy any of the four output rows — ISO 8601 (UTC), UTC, Local, or Relative — with one click.
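
In JavaScript terms, the four output rows correspond roughly to the following (a sketch with an assumed example epoch; the relative phrase is simplified to whole minutes):

```javascript
const ts = 1713800000;            // example epoch in Unix seconds
const d = new Date(ts * 1000);    // the Date constructor wants milliseconds

const iso = d.toISOString();      // ISO 8601 (UTC)
const utc = d.toUTCString();      // e.g. "Mon, 22 Apr 2024 15:33:20 GMT"
const local = d.toLocaleString(); // rendered in the browser's timezone

// A relative phrase via the standard Intl API:
const rtf = new Intl.RelativeTimeFormat("en", { numeric: "auto" });
const minutes = Math.round((d - Date.now()) / 60000); // negative = past
const relative = rtf.format(minutes, "minute");       // e.g. "3 minutes ago"
```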

Converting a date to a Unix timestamp

  1. Pick a date and time in the Human date → Unix timestamp picker (or click Use now).
  2. Choose how to interpret the clock-time: Local (your browser’s timezone) or UTC.
  3. Copy Unix seconds, Unix milliseconds, or the ISO 8601 string — whichever your target system expects.
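
The Local / UTC toggle maps directly onto two different JavaScript constructions (a sketch with a hard-coded example date):

```javascript
// "2026-04-22 15:30" from a datetime-local input, split into fields.
// Note: JavaScript months are 0-based, so April is 3.
const y = 2026, mon = 3, day = 22, hh = 15, mm = 30;

// Interpreted as UTC:
const utcSec = Math.floor(Date.UTC(y, mon, day, hh, mm) / 1000);

// Interpreted in the browser's local timezone:
const localSec = Math.floor(new Date(y, mon, day, hh, mm).getTime() / 1000);

// The two differ by exactly your UTC offset at that date (0 in UTC zones):
console.log((localSec - utcSec) / 60, "minutes of offset");
```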

Timezones and daylight saving time

Unix timestamps are always in UTC — an epoch value has no timezone baked into it. Confusion enters the picture when you convert in either direction:

  • When converting Unix → human, the Local row uses your browser’s configured timezone (detected via Intl.DateTimeFormat().resolvedOptions().timeZone, shown in parentheses on the label) and correctly applies whatever DST offset was in effect at that specific moment — not the current one.
  • When converting human → Unix, the Local / UTC toggle controls how the clock-time you picked is interpreted. “2026-03-14 02:30” means two different Unix timestamps depending on whether you mean that wall-clock in New York, in London, or in UTC — and on DST transition days it can even mean a time that doesn’t exist or exists twice.
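
The point is easy to demonstrate: rendering one epoch through Intl with different timeZone values changes only the wall-clock, never the instant (a sketch with an assumed example timestamp):

```javascript
// One instant, three wall-clocks.
const instant = new Date(1776871800 * 1000); // 2026-04-22T15:30:00Z

for (const timeZone of ["UTC", "America/New_York", "Europe/London"]) {
  const wall = new Intl.DateTimeFormat("en-GB", {
    timeZone, dateStyle: "short", timeStyle: "short",
  }).format(instant);
  console.log(timeZone, wall);
}
// On that date New York is on EDT (UTC-4, so 11:30) and London on BST
// (UTC+1, so 16:30); the epoch value itself never changes.
```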

When you’re filing a bug report, writing a migration, or pasting a value into someone else’s system, prefer ISO 8601 with an explicit offset (for example 2026-04-22T15:30:00Z): like a raw epoch it pins down one exact instant, and unlike a raw epoch it stays human-readable across teams and timezones.

The Year 2038 problem

Older systems store Unix timestamps as signed 32-bit integers, which overflow at 03:14:07 UTC on 19 January 2038 — the largest value a signed 32-bit integer can hold is 2,147,483,647 seconds. One second later, the counter wraps to a large negative number and jumps back to 1901. Modern 64-bit operating systems, mainstream programming languages, and every recent database use wider integer types (a 64-bit time_t on Linux, BIGINT columns in SQL databases) and won’t overflow for another 292 billion years. Legacy embedded hardware, old C code, and some file formats still need migration — if you’re auditing a system, test with timestamps above 2,147,483,647 to flush out the bugs.
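
JavaScript numbers are 64-bit floats, so the boundary is harmless in this tool; these are the probe values to feed a system you are auditing:

```javascript
const INT32_MAX = 2147483647; // 2^31 - 1

// The last second a signed 32-bit time_t can represent:
console.log(new Date(INT32_MAX * 1000).toISOString());
// "2038-01-19T03:14:07.000Z"

// One second later: fine here, but the overflow point on legacy systems.
console.log(new Date((INT32_MAX + 1) * 1000).toISOString());
// "2038-01-19T03:14:08.000Z"
```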

Common use cases

  • Reading logs — convert a raw epoch in a server log or a journalctl entry into a human date.
  • Writing database queries — paste a human date and grab the Unix integer for a WHERE created_at > … filter.
  • Debugging JWTs — check whether a token’s exp or iat claim is in the future or the past.
  • Setting cookies — compute the exact seconds-since-epoch for a Set-Cookie expiry.
  • Comparing events across timezones — convert two local times to UTC seconds and subtract for the real gap.
  • Seeding test data — paste arbitrary epoch values into fixtures to simulate historical events.
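
For the JWT case, a minimal Node sketch of the check (no signature verification, debugging only; `jwtStatus` is a hypothetical helper, and in a browser you would swap Buffer for atob plus a base64url-to-base64 fixup):

```javascript
// Decode a JWT's payload and report whether its exp claim is in the past.
function jwtStatus(token, nowSec = Math.floor(Date.now() / 1000)) {
  const payloadB64 = token.split(".")[1];
  const payload = JSON.parse(Buffer.from(payloadB64, "base64url").toString("utf8"));
  if (payload.exp === undefined) return "no exp claim";
  return payload.exp > nowSec ? `valid for ${payload.exp - nowSec}s` : "expired";
}
```

For example, a token whose exp claim is 2147483647 reports as valid until January 2038.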

Privacy and offline use

All conversion happens locally in your browser with plain JavaScript Date objects — no value is ever sent to a server, logged, or cached. Once the page has loaded, you can disconnect from the internet and the tool keeps working. That matters when you’re converting timestamps from a confidential log file or a production database and don’t want the values leaving your machine.

FAQ


What is a Unix timestamp?

A Unix timestamp is the number of seconds that have elapsed since 00:00:00 UTC on January 1, 1970 — a moment known as the Unix epoch. It is a single integer that uniquely identifies any point in time, independent of timezone or calendar formatting. Operating systems, file systems, databases, logs, and APIs rely on Unix timestamps because they are compact, unambiguous, and easy to compare or subtract.

Should I use Unix seconds or Unix milliseconds?

Both are common. Unix seconds (10 digits today) are the classic Unix convention and standard in HTTP cookies, JWT exp claims, and most Linux tooling. Unix milliseconds (13 digits today) are the JavaScript and Java default because Date.now() returns ms. Pick the format your target system expects — they represent the same instant, just at different precision. This tool auto-detects which you pasted by digit count and lets you override with the Seconds / Milliseconds toggle.

What is the Year 2038 problem?

Older systems stored Unix timestamps as 32-bit signed integers, which can only represent seconds up to 03:14:07 UTC on 19 January 2038 (2^31 - 1 seconds after the epoch). After that, the counter overflows to a negative number and jumps back to 1901. Modern 64-bit systems use wider integers and will not overflow for another 292 billion years, but legacy embedded systems, old C code, and some databases still need to be migrated.

Why does the Unix epoch start on 1 January 1970?

The Unix operating system was being designed in 1969-1971 at Bell Labs, and the team picked 1 January 1970 at midnight UTC as a convenient round date close to the system's first deployment. It stuck. Every Unix timestamp in existence today is counted from that moment, regardless of OS or programming language.

Does daylight saving time affect Unix timestamps?

Unix timestamps are always in UTC — DST does not affect them, because UTC does not observe DST. When you convert a Unix timestamp to "Local" time, this tool uses your browser's timezone and correctly accounts for whatever DST rules apply at that moment (not the current DST offset). When you enter a human date in "Local" mode, we convert it using the local DST rules at that specific date.

Is my data private? Does the tool work offline?

Yes. All conversion happens locally in your browser with pure JavaScript. Nothing is sent to a server, stored, or logged. The arithmetic uses standard JavaScript Date objects, which are accurate to the millisecond for any date between 20 April 271,821 BC and 13 September 275,760 AD. You can disconnect from the internet after the page loads and the tool keeps working.

Which timezone does the Local output use?

Your browser's timezone, detected via Intl.DateTimeFormat().resolvedOptions().timeZone. That is whatever you (or your OS) have configured — for example, America/Los_Angeles or Europe/London. The timezone is shown in parentheses on the Local row so you can confirm it. If you need a specific timezone, convert to ISO 8601 or UTC instead.
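
The detection is a single standard call:

```javascript
// The IANA timezone name the browser (or Node) is configured with:
const tz = Intl.DateTimeFormat().resolvedOptions().timeZone;
console.log(tz); // e.g. "America/Los_Angeles"
```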

Can I convert a date string like "April 22, 2026 3:30 PM"?

This tool is designed for two specific inputs: Unix integers (seconds or milliseconds) and HTML datetime-local values. For arbitrary date strings like "April 22, 2026 3:30 PM" or "22/04/2026", re-enter the value through the date picker or parse it in your language of choice — free-form date parsing is ambiguous and full of timezone traps, which is exactly what this tool helps you avoid.