Unix Timestamp Converter

Current Unix Timestamp

The current Unix timestamp, shown live in seconds with its milliseconds and ISO 8601 equivalents, updating every second.

Convert Timestamp to Date

Enter a Unix timestamp (seconds or milliseconds) to convert to a human-readable date.
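As a sketch of what this conversion does under the hood, in plain JavaScript (the example value is illustrative, not taken from this page):

    // Convert a Unix timestamp in seconds to a readable date.
    const ts = 1700000000;               // seconds (10 digits)
    const date = new Date(ts * 1000);    // the Date constructor takes milliseconds
    console.log(date.toISOString());     // "2023-11-14T22:13:20.000Z" (UTC)
    console.log(date.toLocaleString());  // the same instant in your local timezone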

Convert Date to Timestamp

Select a date and time to convert to Unix timestamp.
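The reverse direction looks like this in JavaScript (the picked date here is an arbitrary example):

    // Convert a chosen date and time to a Unix timestamp.
    const picked = new Date("2033-05-18T03:33:20Z");      // a date-picker value, in UTC
    const seconds = Math.floor(picked.getTime() / 1000);  // 2000000000 (standard Unix time)
    const millis = picked.getTime();                      // 2000000000000 (JavaScript style)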

About Unix Timestamps

What is Unix Time?

Unix time (also known as Epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC.

Seconds vs Milliseconds

Unix timestamps are typically in seconds (10 digits), but JavaScript and some APIs use milliseconds (13 digits).
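One plausible way to auto-detect the unit from the digit count; a minimal sketch, not necessarily this tool's exact logic:

    // Normalize a timestamp to milliseconds: 10 digits = seconds, 13 = milliseconds.
    // Illustrative helper, not this page's actual code.
    function toMilliseconds(ts) {
      return String(Math.trunc(Math.abs(ts))).length >= 13 ? ts : ts * 1000;
    }
    toMilliseconds(1700000000);    // 1700000000000 (input was seconds)
    toMilliseconds(1700000000000); // 1700000000000 (already milliseconds)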

Year 2038 Problem

Systems storing timestamps as signed 32-bit integers overflow on January 19, 2038 at 03:14:07 UTC. Modern systems use 64-bit integers.
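The overflow is easy to demonstrate in JavaScript, whose bitwise operators coerce numbers to signed 32-bit integers:

    const max32 = 2147483647;              // 2^31 - 1, the largest signed 32-bit value
    new Date(max32 * 1000).toISOString();  // "2038-01-19T03:14:07.000Z"
    (max32 + 1) | 0;                       // -2147483648: one second later, the counter wraps negative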

Common Values

0 = Jan 1, 1970
1000000000 = Sep 9, 2001
2000000000 = May 18, 2033
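These are easy to verify in JavaScript (Date takes milliseconds, hence the * 1000):

    new Date(0).toISOString();                  // "1970-01-01T00:00:00.000Z"
    new Date(1000000000 * 1000).toISOString();  // "2001-09-09T01:46:40.000Z"
    new Date(2000000000 * 1000).toISOString();  // "2033-05-18T03:33:20.000Z"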


How to Use This Tool

  1. View Current Timestamp: The current Unix timestamp displays automatically at the top, updating every second. Copy it directly for use in your code or API calls.
  2. Convert Timestamp to Date: Enter a Unix timestamp (10 digits for seconds, 13 digits for milliseconds) in the converter field. The tool auto-detects the format and displays the human-readable date in both UTC and your local timezone.
  3. Convert Date to Timestamp: Select a date and time using the date picker, then click convert. Choose whether you need the result in seconds (standard Unix time) or milliseconds (JavaScript style).
  4. Copy Results: Click the copy button next to any value to copy it to your clipboard (a sketch of how such a button might work follows this list). Use the timestamp in database queries, API requests, log analysis, or anywhere a consistent time representation is needed.
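A sketch of the copy action in step 4, assuming a browser context (navigator.clipboard is the standard asynchronous Clipboard API):

    // Copy the current Unix timestamp, in seconds, to the clipboard.
    // Illustrative helper, not this page's actual code.
    async function copyTimestamp() {
      const seconds = Math.floor(Date.now() / 1000);
      await navigator.clipboard.writeText(String(seconds));
    }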

Technical Details

Unix time (POSIX time, Epoch time) counts seconds since January 1, 1970, 00:00:00 UTC—the "Unix Epoch." This provides a timezone-agnostic, monotonically increasing integer for time comparison and storage. The format originated in early Unix systems and became the de facto standard for computer timekeeping.

Classic Unix timestamps are signed 32-bit integers, supporting dates from December 13, 1901 to January 19, 2038; the Year 2038 problem occurs when this range overflows. Modern systems use 64-bit integers, extending the range to roughly 292 billion years. JavaScript and many APIs use millisecond timestamps (13 digits for current dates rather than 10). This tool auto-detects the format: 10-digit input is treated as seconds, 13-digit as milliseconds. Leap seconds are ignored in Unix time, so each day is exactly 86,400 seconds.
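Because every Unix day is exactly 86,400 seconds, whole-day arithmetic needs no calendar library; for example, in JavaScript (the date is illustrative):

    const DAY = 86400;                       // seconds per Unix day, always
    const t = Date.UTC(2033, 4, 18) / 1000;  // seconds for 2033-05-18T00:00:00Z (months are 0-indexed)
    const nextDay = t + DAY;                 // exactly one UTC calendar day later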

Common Mistakes to Avoid

  • Mixing Seconds and Milliseconds: 1700000000 interpreted as milliseconds lands in January 1970, while 1700000000000 interpreted as seconds lands tens of thousands of years in the future. Check the digit count: 10 digits = seconds, 13 digits = milliseconds. Many bugs stem from this confusion.
  • Ignoring Timezone in Display: Unix timestamps are always UTC-based. When displaying to users, convert to their local timezone. A timestamp of 1700000000 is the same instant worldwide, but shows different clock times per timezone (see the sketch after this list).
  • 32-bit Overflow After 2038: Systems still using 32-bit signed integers for timestamps will overflow on January 19, 2038. Ensure your database columns, programming language types, and APIs support 64-bit timestamps for future dates.
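A quick illustration of the timezone point in JavaScript (the zone names are standard IANA identifiers):

    const d = new Date(1700000000 * 1000);
    d.toLocaleString("en-US", { timeZone: "UTC" });              // "11/14/2023, 10:13:20 PM"
    d.toLocaleString("en-US", { timeZone: "America/New_York" }); // "11/14/2023, 5:13:20 PM"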

Related Tools

Need to convert times across different timezones? Use our Timezone Converter for worldwide time calculations. For standardized date strings in APIs, try the ISO 8601 Formatter.

Frequently Asked Questions

What is a Unix timestamp?

A Unix timestamp (also called Epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC. It provides a universal way to represent time that works across all programming languages and systems.

Why do some timestamps have 10 digits and others have 13?

10-digit timestamps are in seconds (standard Unix time), while 13-digit timestamps are in milliseconds (commonly used in JavaScript and some APIs). Our tool auto-detects which format you're using and converts accordingly.

What is the Year 2038 problem?

On 32-bit systems, Unix time is stored as a signed 32-bit integer, which will overflow on January 19, 2038 at 03:14:07 UTC. Modern 64-bit systems use 64-bit integers, which won't overflow for billions of years.

How do I get the current Unix timestamp in code?

In JavaScript use Date.now() for milliseconds or Math.floor(Date.now()/1000) for seconds. In Python use time.time(). In PHP use time(). In Bash use date +%s.
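For reference, the JavaScript variant spelled out as a runnable snippet (the other languages follow the same pattern):

    const millis = Date.now();                     // milliseconds since the epoch (13 digits)
    const seconds = Math.floor(Date.now() / 1000); // whole seconds (10 digits)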