What is a Unix Timestamp? Complete Guide [2026]

What Is a Unix Timestamp?

A Unix timestamp — also called Unix time, POSIX time, or epoch time — is a single integer that represents a specific moment in time. More precisely, it is the number of seconds that have elapsed since midnight UTC on January 1, 1970. That reference point is known as the Unix epoch.

As of 2026, the current Unix timestamp is approximately 1,770,000,000 — meaning roughly 1.77 billion seconds have passed since the epoch. That number increases by one every second, without interruption, in every timezone on Earth simultaneously.

Quick definition: 1745000000 means "1,745,000,000 seconds after midnight UTC on January 1, 1970." Converting it back gives you a precise date and time accurate to the second.

The simplicity is the point. Instead of storing a string like "April 19, 2026 14:32:00 UTC", you store an integer. Integers are compact, universally sortable, and require no timezone parsing logic. You can compare two timestamps by subtraction. You can calculate the duration between two events as trivially as subtracting one number from another.
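In Python, for instance, those operations are plain integer arithmetic (the timestamp values below are illustrative):

```python
# Two events recorded as Unix timestamps (illustrative values)
start = 1745000000
end = 1745003600

# Duration between events is a single subtraction, in seconds
elapsed = end - start    # 3600 seconds, i.e. one hour
later = max(start, end)  # the larger integer is the later moment

# Sorting events chronologically is just integer sorting
events = [1745003600, 1745000000, 1744999999]
events.sort()            # ascending order == chronological order
```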

Why January 1, 1970? The Epoch Origin

The choice of January 1, 1970 is somewhat arbitrary. Early Unix developers at Bell Labs needed a reference point that was recent enough to keep integer values small, while also being in the past so that all valid timestamps would be positive numbers. The exact date emerged from practical constraints of 1970s hardware and was later formalized in the POSIX standard.

It was not chosen for any mathematical, astronomical, or cultural significance. It simply happened to be a convenient round date close to when the Unix operating system was being developed. Different time systems use different epochs entirely — GPS time, for example, starts from January 6, 1980, while the .NET DateTime type counts 100-nanosecond intervals from January 1, 0001.

Unix time became dominant not because 1970 is special, but because Unix became dominant. When systems needed to exchange time data, they converged on this common standard, and that convergence made it self-reinforcing. Today, virtually every programming language, database, and operating system natively understands Unix timestamps.

How Unix Timestamps Work

A Unix timestamp is timezone-neutral. The value 1745000000 represents exactly the same instant whether you are in New York, London, or Tokyo. When you display that timestamp to a user, you apply their local timezone offset at presentation time — but the stored value never changes.

This is a crucial distinction. Storing timestamps as UTC integers sidesteps an entire class of bugs caused by daylight saving time transitions, timezone abbreviation ambiguity (is "CST" Central Standard Time or China Standard Time?), and locale-dependent date parsing. The timestamp is always unambiguous.
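A short Python sketch makes the presentation-time conversion concrete (the fixed +09:00 offset stands in for a full timezone database lookup):

```python
from datetime import datetime, timezone, timedelta

ts = 1745000000  # one stored instant, timezone-neutral

# Apply an offset only when displaying; the stored integer never changes
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
tokyo = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=9)))

print(utc.isoformat())    # 2025-04-18T18:13:20+00:00
print(tokyo.isoformat())  # 2025-04-19T03:13:20+09:00
print(utc == tokyo)       # True: both represent the same instant
```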

Negative timestamps are valid and represent moments before the epoch. The timestamp -86400 represents exactly one day before midnight January 1, 1970 — which is December 31, 1969. Most modern systems handle negative timestamps correctly, though some older software assumed timestamps would always be positive.
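In Python, pre-epoch values can be converted portably by adding a negative offset to the epoch itself, since `fromtimestamp` rejects negative values on some platforms:

```python
from datetime import datetime, timezone, timedelta

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# -86400 seconds is exactly one day before the epoch
dt = EPOCH + timedelta(seconds=-86400)
print(dt.isoformat())  # 1969-12-31T00:00:00+00:00
```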

Comparing two timestamps is straightforward: a higher number always means a later point in time. Sorting a list of events by timestamp is identical to sorting by integer value. This makes timestamps ideal for database ordering, log file analysis, and event sequencing.

Timestamp Formats and Precision

Standard Unix timestamps count whole seconds, giving one-second resolution. But many modern systems require finer granularity, and several extended formats exist:

| Format | Unit | Example Value | Common Usage |
| --- | --- | --- | --- |
| Unix timestamp | Seconds | 1745000000 | Databases, POSIX APIs, JWT tokens |
| Millisecond timestamp | Milliseconds | 1745000000000 | JavaScript, browser APIs, logging |
| Microsecond timestamp | Microseconds | 1745000000000000 | High-resolution profiling, databases |
| Nanosecond timestamp | Nanoseconds | 1745000000000000000 | System clocks, distributed systems |

The most common source of confusion is JavaScript's Date.now(), which returns milliseconds rather than seconds. A JavaScript timestamp is exactly 1,000 times larger than the equivalent Unix timestamp in seconds. If you receive a 13-digit number where you expected a 10-digit one, you are almost certainly looking at milliseconds.

The rule of thumb: a 10-digit number is a second-precision Unix timestamp. A 13-digit number is milliseconds. Always check which unit an API or library uses before doing arithmetic on timestamp values.
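That rule of thumb can be encoded as a small normalization helper. The cutoff below is a heuristic that assumes contemporary dates; a genuine second-precision timestamp near 100 billion would not occur until roughly the year 5138:

```python
def to_seconds(ts: int) -> int:
    """Normalize a timestamp that may be in seconds or milliseconds.

    Heuristic: values at or above 100 billion are treated as
    milliseconds, since second-precision values that large are
    millennia away. Not safe for far-future or pre-epoch edge cases.
    """
    return ts // 1000 if ts >= 100_000_000_000 else ts

print(to_seconds(1745000000))     # 1745000000 (already seconds)
print(to_seconds(1745000000000))  # 1745000000 (milliseconds, divided down)
```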

The Year 2038 Problem (Y2K38)

The Year 2038 problem is a real engineering concern, analogous in structure to the Y2K bug of 2000. It stems from how older systems store Unix timestamps.

A 32-bit signed integer can hold values from roughly -2.1 billion to +2.1 billion. The maximum value is exactly 2,147,483,647. Add one second to that, and the integer overflows — wrapping around to the most negative 32-bit value, which represents a date in December 1901. On January 19, 2038 at 03:14:07 UTC, any system still using a 32-bit signed integer to store Unix time will experience this overflow.
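The wraparound can be simulated in Python with two's-complement arithmetic (a sketch of the overflow behavior, not how C actually stores the value):

```python
from datetime import datetime, timezone, timedelta

MAX_INT32 = 2_147_483_647

def wrap_int32(n: int) -> int:
    """Emulate two's-complement wraparound of a 32-bit signed integer."""
    return (n + 2**31) % 2**32 - 2**31

# One second past the maximum wraps to the most negative value
wrapped = wrap_int32(MAX_INT32 + 1)
print(wrapped)  # -2147483648

# Decoded as a timestamp, that value lands in December 1901
EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
overflow_date = EPOCH + timedelta(seconds=wrapped)
print(overflow_date.isoformat())  # 1901-12-13T20:45:52+00:00
```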

Y2K38 date: January 19, 2038 at 03:14:07 UTC — the moment a 32-bit signed Unix timestamp overflows to negative. Systems still using 32-bit time_t fields will malfunction at this point.

The impact depends entirely on what type is used to store the timestamp. Modern 64-bit systems are unaffected — a 64-bit signed integer can represent dates roughly 292 billion years into the future, making overflow a non-issue in practice. The risk is concentrated in embedded devices with long service lives, file formats and network protocols that serialize time as a 32-bit field, and legacy database schemas with 32-bit integer timestamp columns.

For most web developers working with modern languages and 64-bit systems, Y2K38 is not an immediate concern. But if you are maintaining embedded systems or legacy database schemas, auditing your timestamp storage types before 2038 is worthwhile.

Common Use Cases

Unix timestamps appear throughout modern software because they solve the "what time did this happen?" problem simply and portably. Here are the most common places you will encounter them:

Database Records

The created_at and updated_at columns you see in virtually every database table are often stored as Unix timestamps or database-native timestamp types that convert to them. Storing as an integer rather than a formatted string avoids locale-dependent parsing and makes range queries fast (WHERE created_at > 1740000000 is a simple integer comparison).
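A minimal sketch with Python's built-in sqlite3 module shows the integer range query in action (table name and values are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, created_at INTEGER)")
conn.executemany(
    "INSERT INTO events (created_at) VALUES (?)",
    [(1735689600,), (1745000000,), (1767225600,)],
)

# The range query is a plain integer comparison; an index on
# created_at would make this a fast B-tree range scan
rows = conn.execute(
    "SELECT created_at FROM events WHERE created_at > ? ORDER BY created_at",
    (1740000000,),
).fetchall()
print(rows)  # [(1745000000,), (1767225600,)]
```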

API Responses

REST APIs frequently return timestamps as integers rather than formatted strings. The GitHub API, Stripe API, and many others use Unix timestamps in their JSON payloads. This avoids ambiguity about date formats — there is no question of whether 04/05/2026 is April 5th or May 4th when the value is 1775347200.

JWT Tokens

JSON Web Tokens use Unix timestamps for their exp (expiration), iat (issued at), and nbf (not before) claims. The JWT standard specifies these as NumericDate values — which are Unix timestamps in seconds. If you have ever decoded a JWT and seen a 10-digit number next to "exp", that is a Unix timestamp.
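Those claims can be inspected with nothing but base64 and JSON. The sketch below builds and decodes an unsigned payload to show the claim shape; real JWTs also carry a header and signature and should be verified with a proper library:

```python
import base64
import json
import time

# Hypothetical payload illustrating the NumericDate claims
payload = {"sub": "user123", "iat": 1745000000, "exp": 1745003600}
encoded = base64.urlsafe_b64encode(json.dumps(payload).encode()).rstrip(b"=")

# Decode: restore base64 padding, parse JSON, compare exp to "now"
padded = encoded + b"=" * (-len(encoded) % 4)
claims = json.loads(base64.urlsafe_b64decode(padded))
expired = claims["exp"] < int(time.time())  # True once the expiry passes
```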

Log Files

System and application logs often use Unix timestamps for event times because they are compact and sort naturally as text. A log line starting with 1745000000 is unambiguous and sortable without date parsing.

Cache Expiration

Cache systems like Redis allow you to set expiry times as Unix timestamps. The cache layer compares the stored expiry against the current time and evicts entries whose timestamp has passed. The math is as simple as if (expiry_ts < current_ts) { evict(); }.
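A toy in-memory version of that logic, with the clock injected so the behavior is deterministic (not Redis itself, just the same expiry arithmetic):

```python
import time

cache = {}  # key -> (value, expiry_ts)

def set_with_ttl(key, value, ttl_seconds, now=None):
    """Store a value with an absolute Unix-timestamp expiry."""
    now = int(time.time()) if now is None else now
    cache[key] = (value, now + ttl_seconds)

def get(key, now=None):
    """Return the value, evicting it if its expiry timestamp has passed."""
    now = int(time.time()) if now is None else now
    entry = cache.get(key)
    if entry is None:
        return None
    value, expiry_ts = entry
    if expiry_ts < now:  # the core comparison from the text
        del cache[key]
        return None
    return value

set_with_ttl("session", "abc123", ttl_seconds=60, now=1745000000)
print(get("session", now=1745000030))  # abc123 (still fresh)
print(get("session", now=1745000100))  # None (expired and evicted)
```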

Convert Timestamps Instantly

Need to convert a Unix timestamp to a human-readable date, or find the current epoch time? The SnapUtils Unix Timestamp Converter handles seconds, milliseconds, and custom formats in your browser — no install required.

Open Unix Timestamp Converter

Working with Timestamps in Code

Here is how to get, convert, and format Unix timestamps in the three most common backend languages.

JavaScript

JavaScript's Date object works in milliseconds, so you need to divide or multiply by 1000 when interoperating with second-precision Unix timestamps.

// Get current timestamp in milliseconds (JavaScript native)
const ms = Date.now();                          // e.g. 1745000000000

// Convert to seconds (standard Unix timestamp)
const seconds = Math.floor(Date.now() / 1000); // e.g. 1745000000

// Convert a Unix timestamp (seconds) back to a Date object
const date = new Date(1745000000 * 1000);

// Format as ISO 8601 string
console.log(date.toISOString()); // "2025-04-18T18:13:20.000Z"

// Format as local date string
console.log(date.toLocaleString()); // varies by locale

// Get timestamp from a specific date
const specificDate = new Date('2026-01-01T00:00:00Z');
const ts = Math.floor(specificDate.getTime() / 1000); // 1767225600

Python

Python's time.time() returns a float with fractional seconds. Use int() to get a clean integer, or use the datetime module for more control.

import time
from datetime import datetime, timezone

# Get current Unix timestamp (float, fractional seconds)
ts_float = time.time()          # e.g. 1745000000.123456

# Integer seconds
ts_int = int(time.time())       # e.g. 1745000000

# Convert timestamp to datetime object (UTC)
dt = datetime.fromtimestamp(1745000000, tz=timezone.utc)
print(dt)  # 2025-04-18 18:13:20+00:00

# Format as string
print(dt.strftime('%Y-%m-%d %H:%M:%S'))  # 2025-04-18 18:13:20

# Convert datetime to timestamp
dt2 = datetime(2026, 1, 1, 0, 0, 0, tzinfo=timezone.utc)
print(int(dt2.timestamp()))  # 1767225600

PHP

PHP's time() function returns the current Unix timestamp as an integer. The date() function and the DateTime class provide formatting and conversion.

<?php
// Get current Unix timestamp
$ts = time();                           // e.g. 1745000000

// Format timestamp as human-readable date
echo date('Y-m-d', $ts);                // 2025-04-18
echo date('Y-m-d H:i:s', $ts);         // 2025-04-18 18:13:20

// Convert a date string to a Unix timestamp
$ts2 = strtotime('2026-01-01 00:00:00 UTC');  // 1767225600

// Using DateTime for timezone-safe conversion
$dt = new DateTime('@1745000000');
$dt->setTimezone(new DateTimeZone('America/New_York'));
echo $dt->format('Y-m-d H:i:s T');  // 2025-04-18 14:13:20 EDT
?>

Unix Timestamp vs ISO 8601

ISO 8601 strings like 2026-04-19T14:32:00Z are human-readable and self-documenting. Unix timestamps are compact and fast to compare. The right choice depends on context:

| Consideration | Unix Timestamp | ISO 8601 String |
| --- | --- | --- |
| Storage size | 4–8 bytes (integer) | 20–25 bytes (string) |
| Human readability | Not readable without conversion | Immediately readable |
| Comparison speed | Direct integer comparison | Requires parsing first |
| Timezone safety | Always UTC, no ambiguity | Depends on offset suffix |
| Sub-second precision | Requires millisecond variant | Supports fractional seconds natively |
| Database indexing | Excellent (integer index) | Good (timestamp index) |

A common pattern: store timestamps as integers in databases and APIs for compactness and comparability, and convert to ISO 8601 only at the presentation layer when displaying to users.
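In Python, that boundary conversion is a one-liner in each direction (illustrative value):

```python
from datetime import datetime, timezone

stored = 1745000000  # compact integer in the database or API payload

# Presentation layer: integer -> ISO 8601 string
iso = datetime.fromtimestamp(stored, tz=timezone.utc).isoformat()
print(iso)  # 2025-04-18T18:13:20+00:00

# Ingestion layer: ISO 8601 string -> integer
round_trip = int(datetime.fromisoformat(iso).timestamp())
print(round_trip == stored)  # True
```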

Frequently Asked Questions

What is a Unix timestamp?

A Unix timestamp is a single integer representing the number of seconds elapsed since midnight UTC on January 1, 1970 (the Unix epoch). It is a timezone-neutral, universally comparable way to represent any moment in time. The current Unix timestamp is approximately 1.77 billion and increases by one each second.

What is the current Unix timestamp?

The current Unix timestamp is approximately 1.77 billion seconds (as of 2026) and increases by one every second. Because it is always increasing, the precise value depends on when you read it. Use Date.now() in JavaScript, time.time() in Python, or time() in PHP to get the current value programmatically. You can also use the SnapUtils Unix Timestamp Converter to see the live current timestamp.

What is the difference between Unix time and UTC?

Unix time is a count of seconds elapsed since the epoch — it is a number, not a timezone. UTC (Coordinated Universal Time) is a time standard that defines what "midnight on January 1, 1970" actually means. Unix time is anchored to UTC, meaning the epoch is defined in UTC. But while UTC is a timezone standard used to express times of day, Unix time is simply an integer that represents a point in time without any timezone notation attached.

What causes the Year 2038 problem?

The Year 2038 problem is caused by 32-bit signed integers reaching their maximum value. A 32-bit signed integer tops out at 2,147,483,647. Unix timestamp 2,147,483,647 corresponds to January 19, 2038 at 03:14:07 UTC. One second later, the integer overflows and wraps to the most negative 32-bit value, causing affected systems to misinterpret the time as December 13, 1901. Systems using 64-bit integers are not affected — 64-bit timestamps can represent dates hundreds of billions of years in the future.

How do I convert a Unix timestamp to a date in JavaScript?

Use new Date(timestamp * 1000).toISOString(). The multiplication by 1000 converts from seconds to milliseconds, which is what JavaScript's Date constructor expects. For example, new Date(1745000000 * 1000).toISOString() returns "2025-04-18T18:13:20.000Z". If your timestamp is already in milliseconds (13 digits), omit the multiplication: new Date(timestampMs).toISOString().

Try the Unix Timestamp Converter

Convert any Unix timestamp to a readable date, find the current epoch time, or convert a date back to a timestamp. Works in your browser instantly — no signup, no install.

Open Unix Timestamp Converter

Related Tools