
Unix Timestamps Explained: How Computers Track Time

NumberConvert Team · 10 min read

A comprehensive guide to Unix timestamps, the epoch, and how computers measure time. Learn about timestamp conversions, the Year 2038 problem, and practical programming applications.


Introduction: The Hidden Clock Behind Every Digital System

Every time you send a message, make a purchase online, or check a social media post, there is a silent counter running in the background. This counter does not care about time zones, daylight saving changes, or whether it is Tuesday or Saturday. It simply counts seconds. This is the Unix timestamp, and it is one of the most elegant solutions to a surprisingly complex problem: how do computers agree on what time it is?

If you have ever wondered why a website shows 'posted 3 hours ago' accurately regardless of where you are, or how databases keep events in perfect chronological order, the answer lies in this single number. In this guide, we will demystify Unix timestamps, explore their origins, and show you exactly how to work with them in your code.

What is a Unix Timestamp?

A Unix timestamp (also called Unix time, POSIX time, or Epoch time) is a way of representing a specific moment in time as a single integer. Specifically, it is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC.

For example:

  • The timestamp 0 represents January 1, 1970, 00:00:00 UTC
  • The timestamp 1000000000 represents September 9, 2001, 01:46:40 UTC
  • The timestamp 1704067200 represents January 1, 2024, 00:00:00 UTC

This seemingly simple concept solves an enormous problem in computing. Instead of storing dates as strings like 'December 31, 2024' (which varies by language, culture, and format), every system can agree on a single number that means exactly the same thing everywhere.
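These equivalences are easy to verify in code. A quick sketch in JavaScript, whose Date constructor counts milliseconds since the same epoch (hence the * 1000):

```javascript
// Verify the example timestamps against JavaScript's Date
const epoch = new Date(0);
const billennium = new Date(1000000000 * 1000);
const y2024 = new Date(1704067200 * 1000);

console.log(epoch.toISOString());      // 1970-01-01T00:00:00.000Z
console.log(billennium.toISOString()); // 2001-09-09T01:46:40.000Z
console.log(y2024.toISOString());      // 2024-01-01T00:00:00.000Z
```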

The Unix Epoch: Why January 1, 1970?

The choice of January 1, 1970, as the starting point (called the 'epoch') was not arbitrary. When Unix was being developed at Bell Labs in the late 1960s and early 1970s, the engineers needed to pick a reference date. They chose a date that was:

  1. Recent enough to minimize storage requirements for typical use cases
  2. Round and memorable (the start of a new decade)
  3. Aligned with UTC to provide a universal reference point

The original Unix system used a 32-bit signed integer to store this value, which seemed more than adequate at the time. After all, 32 bits could count over 2 billion seconds from the epoch—enough to last until the year 2038. (More on this later.)

Interestingly, the earliest editions of Unix measured time in sixtieths of a second since January 1, 1971. At that resolution a 32-bit counter would overflow in roughly two and a half years, so the definition was changed to whole seconds and the epoch moved back to 1970, giving the timestamp far more headroom.

Why Computers Use Seconds Since Epoch

You might wonder why we do not just store dates in a human-readable format. There are several compelling reasons for using Unix timestamps:

1. Simplicity and Consistency

A single integer is straightforward to store, compare, and transmit. There is no ambiguity about date formats (is 01/02/03 January 2nd or February 1st?), no issues with different calendars, and no localization concerns.

2. Easy Arithmetic

Calculating the difference between two timestamps is a simple subtraction. Want to know how many seconds between two events? Just subtract one timestamp from another. Need to add 24 hours to a time? Add 86400 (the number of seconds in a day).

// Calculate time difference
const start = 1704067200; // Jan 1, 2024 00:00:00 UTC
const end = 1704153600;   // Jan 2, 2024 00:00:00 UTC
const difference = end - start; // 86400 seconds (1 day)

3. Time Zone Independence

Unix timestamps are always in UTC. This means a timestamp generated in Tokyo represents the exact same moment as one generated in New York or London. Local time conversions happen only when displaying the time to users.
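A minimal sketch of this idea: the same timestamp rendered for two different audiences. (The exact output strings depend on the runtime's locale data, so they are described rather than shown.)

```javascript
// One timestamp identifies one instant; only the display differs.
const ts = 1704067200; // 2024-01-01T00:00:00 UTC
const moment = new Date(ts * 1000);

// Rendered for New York (UTC-5 in January): the evening of Dec 31, 2023
console.log(moment.toLocaleString('en-US', { timeZone: 'America/New_York' }));

// Rendered for Tokyo (UTC+9): the morning of Jan 1, 2024
console.log(moment.toLocaleString('en-US', { timeZone: 'Asia/Tokyo' }));
```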

4. Efficient Storage and Comparison

Databases can index timestamps efficiently, making queries like 'find all records from the last 24 hours' extremely fast. Comparing integers is one of the most optimized operations in computing.
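Because timestamps are plain integers, chronological ordering reduces to numeric sorting. A small illustration (the event names are invented for the example):

```javascript
// Sorting events chronologically is a plain numeric sort.
const events = [
  { name: 'checkout',    created_at: 1704153600 },
  { name: 'page_view',   created_at: 1704067200 },
  { name: 'add_to_cart', created_at: 1704110400 },
];

events.sort((a, b) => a.created_at - b.created_at);
console.log(events.map(e => e.name)); // page_view, add_to_cart, checkout
```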

Converting Timestamps to Human-Readable Dates

While timestamps are perfect for computers, humans prefer to see dates in familiar formats. Here is how to convert between them in various programming languages:

JavaScript

// Timestamp to Date
const timestamp = 1704067200;
const date = new Date(timestamp * 1000); // JS uses milliseconds
console.log(date.toISOString()); // 2024-01-01T00:00:00.000Z
console.log(date.toLocaleString()); // Localized format

// Date to Timestamp
const now = new Date();
const currentTimestamp = Math.floor(now.getTime() / 1000);
console.log(currentTimestamp);

Python

import datetime

# Timestamp to Date
timestamp = 1704067200
date = datetime.datetime.fromtimestamp(timestamp, tz=datetime.timezone.utc)
print(date.isoformat())  # 2024-01-01T00:00:00+00:00

# Date to Timestamp
now = datetime.datetime.now(datetime.timezone.utc)
current_timestamp = int(now.timestamp())
print(current_timestamp)

PHP

// Timestamp to Date (gmdate formats in UTC, regardless of the server's timezone setting)
$timestamp = 1704067200;
$date = gmdate('Y-m-d H:i:s', $timestamp);
echo $date; // 2024-01-01 00:00:00

// Date to Timestamp
$currentTimestamp = time();
echo $currentTimestamp;

SQL (PostgreSQL)

-- Timestamp to Date
SELECT to_timestamp(1704067200);
-- Result: 2024-01-01 00:00:00+00

-- Date to Timestamp
SELECT EXTRACT(EPOCH FROM NOW())::integer;

The Year 2038 Problem (Y2K38)

Remember how the original Unix system used a 32-bit signed integer? Here is where things get interesting—and potentially concerning.

A 32-bit signed integer can store values from -2,147,483,648 to 2,147,483,647. Since Unix timestamps start from January 1, 1970, the maximum positive value represents January 19, 2038, at 03:14:07 UTC.

One second later, the counter overflows. On systems that have not been updated, the timestamp will wrap around to -2,147,483,648, which would be interpreted as December 13, 1901. This is known as the Year 2038 Problem or Y2K38.
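You can demonstrate the wraparound directly. JavaScript's bitwise operators coerce numbers to 32-bit signed integers, which mimics what an overflowing 32-bit time_t does:

```javascript
// 2^31 - 1 is the last second a 32-bit signed time_t can represent.
const max32 = 2147483647;
console.log(new Date(max32 * 1000).toISOString()); // 2038-01-19T03:14:07.000Z

// One second later, the value wraps to the most negative 32-bit integer.
// The `| 0` coerces to 32-bit signed, reproducing the overflow:
const wrapped = (max32 + 1) | 0;
console.log(wrapped); // -2147483648
console.log(new Date(wrapped * 1000).toISOString()); // 1901-12-13T20:45:52.000Z
```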

Why This Matters

Unlike the Y2K bug (which was largely about display formatting), Y2K38 affects the fundamental way many systems store and process time. Systems at risk include:

  • Embedded systems (medical devices, industrial controllers, automotive systems)
  • Legacy databases still using 32-bit timestamp fields
  • Older programming languages or libraries with 32-bit time functions
  • File systems with 32-bit timestamps for file modification times

The Solution

The solution is straightforward: use 64-bit integers instead. A 64-bit signed integer can count seconds for approximately 292 billion years in both directions from the epoch—far longer than our sun will exist.
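The arithmetic behind that figure, sketched with BigInt to keep the numbers exact (the average-year length here is an assumption used only for the estimate):

```javascript
// Range of a 64-bit signed counter, expressed in years.
const maxSeconds = 2n ** 63n - 1n;   // 9,223,372,036,854,775,807
const secondsPerYear = 31556952n;    // average Gregorian year
const years = maxSeconds / secondsPerYear;
console.log(years); // roughly 292 billion years in each direction
```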

Most modern systems have already made this transition:

  • Linux added 64-bit time support for 32-bit architectures in kernel 5.6 (2020); 64-bit builds already used a 64-bit time_t
  • Modern programming languages (Python 3, modern JavaScript engines) use 64-bit or floating-point representations
  • Databases such as PostgreSQL store timestamps internally as 64-bit values

However, the challenge lies in the countless embedded systems and legacy applications that cannot easily be updated.

Millisecond Timestamps: JavaScript and Java

While Unix traditionally uses seconds, many modern platforms prefer millisecond precision. JavaScript and Java are the most notable examples.

JavaScript Timestamps

JavaScript's Date.getTime() returns the number of milliseconds since the epoch:

const now = new Date();
console.log(now.getTime()); // e.g., 1704067200000

// Converting between seconds and milliseconds
const unixSeconds = 1704067200;
const jsMilliseconds = unixSeconds * 1000;

const jsMs = Date.now();
const backToUnix = Math.floor(jsMs / 1000);

Java Timestamps

Similarly, Java's System.currentTimeMillis() returns milliseconds:

// Get current time in milliseconds
long milliseconds = System.currentTimeMillis();

// Convert to seconds
long seconds = milliseconds / 1000;

// Convert seconds to milliseconds
long backToMs = seconds * 1000;

When to Use Which

  • Seconds are sufficient for most date/time operations (events, logs, schedules)
  • Milliseconds are needed for high-precision timing (performance measurement, animations)
  • Microseconds or nanoseconds are used in specialized applications (scientific computing, high-frequency trading)

Practical Uses for Timestamps in Programming

Unix timestamps appear throughout modern software development. Here are some common applications:

1. Database Records

Storing created_at and updated_at fields as timestamps ensures consistent ordering and easy querying:

CREATE TABLE events (
    id SERIAL PRIMARY KEY,
    name VARCHAR(255),
    created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
);

-- Find events from the last hour
SELECT * FROM events 
WHERE created_at > EXTRACT(EPOCH FROM NOW()) - 3600;

2. API Rate Limiting

Track when users make requests to enforce rate limits:

const RATE_LIMIT = 100; // requests per hour
const windowStart = Math.floor(Date.now() / 1000) - 3600;

// requestCount = number of this client's requests recorded since windowStart
if (requestCount >= RATE_LIMIT) {
    return { error: 'Rate limit exceeded' };
}

3. Cache Invalidation

Set expiration times for cached data:

const CACHE_TTL = 300; // 5 minutes in seconds
const cache = new Map(); // key -> { data, expires }

function getCachedData(key) {
    const cached = cache.get(key);
    const now = Math.floor(Date.now() / 1000);
    
    if (cached && cached.expires > now) {
        return cached.data;
    }
    return null;
}

function setCachedData(key, data) {
    const now = Math.floor(Date.now() / 1000);
    cache.set(key, {
        data: data,
        expires: now + CACHE_TTL
    });
}

4. JWT Token Expiration

JSON Web Tokens use timestamps for the exp (expiration) claim:

const jwt = require('jsonwebtoken');

const token = jwt.sign(
    { userId: 123 },
    secretKey,
    { expiresIn: '1h' } // Sets exp claim to now + 3600
);

5. Event Scheduling

Schedule future events with precise timing:

const eventTime = Math.floor(Date.now() / 1000) + 86400; // Tomorrow
scheduleEvent({
    action: 'send_reminder',
    executeAt: eventTime
});

Common Pitfalls and Best Practices

Always Specify the Unit

When working with timestamps, always be explicit about whether you are using seconds or milliseconds. A common bug is mixing the two:

// BUG: Mixing seconds and milliseconds
const unixTimestamp = 1704067200; // seconds
const wrongDate = new Date(unixTimestamp); // Wrong! JS expects milliseconds
const correctDate = new Date(unixTimestamp * 1000); // Correct

Store in UTC, Display Locally

Always store timestamps in UTC (which is what Unix timestamps inherently are). Convert to local time only when displaying to users:

const timestamp = 1704067200;
const utcDate = new Date(timestamp * 1000);

// Display in user's local timezone
console.log(utcDate.toLocaleString('en-US', { 
    timeZone: 'America/New_York' 
}));

Handle Time Zone Conversions Carefully

When accepting dates from users, convert to UTC immediately:

// User selects a local date/time
const localDateString = '2024-01-15T10:00:00';

// The built-in Date cannot parse IANA zone names like 'Europe/London'.
// If the input carries an explicit UTC offset, Date parses it reliably:
const date = new Date(localDateString + '+00:00');
const timestamp = Math.floor(date.getTime() / 1000);

// For arbitrary zone names, reach for a library (e.g. Luxon):
// DateTime.fromISO(localDateString, { zone: 'Europe/London' }).toUnixInteger()

Conclusion

Unix timestamps are a fundamental building block of modern computing. Their simplicity—just counting seconds from a fixed point—belies their power and ubiquity. From database records to API authentication, from cache expiration to event scheduling, timestamps make it possible for systems worldwide to agree on exactly when something happened.

Understanding how timestamps work will make you a better developer, helping you avoid common bugs related to time zones, precision, and format conversion. Whether you are debugging a mysterious date issue or designing a new system from scratch, this knowledge is invaluable.

Ready to start working with timestamps? Try our Unix Timestamp Converter to convert between timestamps and human-readable dates instantly. For converting dates to timestamps, check out our Date to Unix Timestamp tool. And if you are working with JavaScript or Java millisecond timestamps, our Unix Milliseconds Converter has you covered.

Explore all our time conversion tools to simplify your date and time calculations.
