Advanced · 10 min read

The BigInt Problem: Why JSON Breaks Numbers Greater Than 2^53

Learn why JSON silently corrupts large numbers and how to fix it. Essential reading for financial apps, crypto, and any system handling IDs > 9 quadrillion.

#bigint #precision #javascript #debugging

TL;DR

  • Problem: JavaScript numbers lose precision above 2^53 (9,007,199,254,740,992)
  • Symptom: 9007199254740993 becomes 9007199254740992
  • Affected: Snowflake IDs, blockchain block numbers and token amounts, money stored as cents, 64-bit database PKs
  • Solution: Serialize big numbers as strings in JSON
  • Alternative: Use json-bigint library for transparent handling

The Bug That Costs Companies Millions

Pop quiz: What does this code print?

the-bug.js
javascript
const json = '{"id": 9007199254740993}';
const data = JSON.parse(json);

console.log(data.id);
// Expected: 9007199254740993
// Actual:   9007199254740992  ← WRONG!

If you said "9007199254740993", you'd be wrong. JavaScript silently changes it to 9007199254740992. No error. No warning. Just silent data corruption.

Real-world impact: This bug has caused:
  • Wrong transactions in financial systems
  • Duplicate records from ID collisions
  • Failed blockchain transaction lookups
  • Twitter/X API integration failures (Snowflake IDs)

A $50,000 Bug: My First Encounter

Let me tell you about the first time this bug bit me—hard. It was 2019, and I was building a payment reconciliation system. Transaction IDs from our payment processor were 64-bit integers, and everything worked fine in testing.

Then we hit production. After processing a few million transactions, we started seeing "duplicate" records. Turns out, two different transaction IDs were being parsed to the same JavaScript number. The reconciliation system couldn't match payments to orders, and we had $50,000 in limbo for 48 hours while we debugged.

The fix took 10 minutes once we understood the problem: serialize all IDs as strings. The debugging took two days. That's why I'm writing this guide—so you don't have to learn this the hard way.

Why Does This Happen?

JavaScript uses IEEE 754 double-precision floating-point for all numbers. This format has 53 bits for the significand (mantissa), which means it can only safely represent integers up to 2^53 - 1.

max-safe-integer.js
javascript
console.log(Number.MAX_SAFE_INTEGER);
// 9007199254740991

console.log(Number.MAX_SAFE_INTEGER + 1);
// 9007199254740992

console.log(Number.MAX_SAFE_INTEGER + 2);
// 9007199254740992  ← Same as +1!

console.log(Number.MAX_SAFE_INTEGER + 3);
// 9007199254740994  ← Skipped 9007199254740993!

Between 2^53 and 2^54, JavaScript can only represent even integers, so odd values get rounded to a neighboring even one; above 2^54 the gap grows to 4, then 8, doubling at every power of two. This is precision loss.
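
You can watch the gap double at each power of two:

precision-gaps.js
javascript
// The spacing between representable doubles doubles at each power of two:
// 2 above 2^53, 4 above 2^54, and so on.
console.log(2 ** 53 + 1);  // 9007199254740992   ← rounded (gap is 2 here)
console.log(2 ** 53 + 2);  // 9007199254740994   ← representable
console.log(2 ** 54 + 2);  // 18014398509481984  ← rounded (gap is 4 here)
console.log(2 ** 54 + 4);  // 18014398509481988  ← representable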

The Math Behind It

A 64-bit double has this structure:

  • 1 bit for sign
  • 11 bits for exponent
  • 52 bits for mantissa (plus 1 implicit bit = 53 total)

With 53 bits, you can represent integers from 0 to 2^53 - 1 = 9,007,199,254,740,991 exactly. Anything larger requires rounding.
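
JavaScript's built-in constants line up with this layout:

safe-integer-layout.js
javascript
// MAX_SAFE_INTEGER is exactly 2^53 - 1: fifty-three 1-bits.
console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1);     // true
console.log(Number.MAX_SAFE_INTEGER.toString(2).length);  // 53
console.log(Number.isSafeInteger(2 ** 53 - 1));           // true
console.log(Number.isSafeInteger(2 ** 53));               // false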

Where You'll Encounter This

1. Twitter/X Snowflake IDs

Twitter uses 64-bit Snowflake IDs that regularly exceed MAX_SAFE_INTEGER:

twitter-ids.js
javascript
// Real Twitter tweet ID
const tweetId = 1234567890123456789n;  // Using BigInt

// If you receive this as JSON number, it's corrupted:
const badJson = '{"tweet_id": 1234567890123456789}';
console.log(JSON.parse(badJson).tweet_id);
// 1234567890123456800  ← Wrong! Last digits corrupted

Twitter's solution: The Twitter API returns IDs as both numbers AND strings: {"id": 1234567890123456789, "id_str": "1234567890123456789"}
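
When an API ships both forms, always read the string one. A small sketch using the shape above:

prefer-id-str.js
javascript
// Prefer the string field; the numeric one may already be rounded.
const responseBody = '{"id": 1234567890123456789, "id_str": "1234567890123456789"}';
const tweet = JSON.parse(responseBody);

console.log(BigInt(tweet.id_str));  // 1234567890123456789n  ← exact
console.log(tweet.id);              // 1234567890123456800   ← rounded during parsing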

2. Database Primary Keys

Many databases use 64-bit integers for primary keys:

schema.sql
sql
-- PostgreSQL BIGSERIAL
CREATE TABLE orders (
    id BIGSERIAL PRIMARY KEY,  -- Can exceed 2^53
    total_cents BIGINT         -- Postgres is fine, but JS numbers are only safe up to ~$90 trillion (2^53 cents)
);

-- A high-volume system might hit this:
-- id = 9007199254740993  ← JavaScript will corrupt this
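
Driver behavior matters here too. node-postgres (pg), for instance, returns BIGINT (int8) columns as strings by default for exactly this reason. A sketch of keeping them that way end to end, reusing the table from the schema above:

fetch-order.js
javascript
// Sketch assuming node-postgres (pg), which returns BIGINT (int8) columns
// as strings by default precisely because of this precision issue.
const { Pool } = require('pg');
const pool = new Pool();  // connection details come from PG* environment variables

async function getOrder(id /* string, e.g. "9007199254740993" */) {
    const { rows } = await pool.query(
        'SELECT id, total_cents FROM orders WHERE id = $1',
        [id]  // bind the ID as a string, never as a Number
    );
    if (rows.length === 0) return null;
    return {
        id: rows[0].id,                          // stays a string
        totalCents: BigInt(rows[0].total_cents)  // convert only when doing math
    };
}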

3. Financial Calculations

Storing money as cents (integers) is best practice, but large amounts break:

financial.js
javascript
// $90,071,992,547,409.93 in cents, exactly as the server sent it
const json = '{"amount": 9007199254740993}';
const parsed = JSON.parse(json);

console.log(parsed.amount);
// 9007199254740992  ← You just lost $0.01

// (A number literal that large is rounded the moment the script is parsed,
// too; the corruption can happen before JSON even gets involved.)

// At scale, this adds up to real money lost
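
If you need to do arithmetic on amounts this large, keep them in BigInt and only convert to strings at the JSON boundary. A minimal sketch:

bigint-cents.js
javascript
// Do the math in BigInt; serialize as a string at the boundary.
const balanceCents = 9007199254740993n;
const chargeCents = 250n;  // $2.50

const newBalance = balanceCents - chargeCents;  // exact
console.log(newBalance);  // 9007199254740743n

const payload = JSON.stringify({ balance: newBalance.toString() });
console.log(payload);  // {"balance":"9007199254740743"}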

4. Blockchain & Crypto

Blockchain transaction IDs, block numbers, and token amounts often exceed 2^53:

blockchain.js
javascript
// Ethereum block number (will exceed 2^53 eventually)
// Token amounts in wei (18 decimals)
const weiAmount = 1000000000000000001n;  // 1 ETH + 1 wei

// If the same amount arrives as a JSON number, JSON.parse() corrupts it:
console.log(JSON.parse('{"wei": 1000000000000000001}').wei);
// 1000000000000000000  ← Lost 1 wei
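
This is one reason Ethereum's JSON-RPC interface encodes quantities as 0x-prefixed hex strings rather than JSON numbers; BigInt parses them directly. The response below is a simplified example payload:

eth-hex-quantity.js
javascript
// Ethereum JSON-RPC encodes quantities as hex strings, sidestepping the issue.
const rpcResponse = '{"jsonrpc":"2.0","id":1,"result":"0xde0b6b3a7640001"}';

const wei = BigInt(JSON.parse(rpcResponse).result);
console.log(wei);  // 1000000000000000001n  ← exact (1 ETH + 1 wei)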

Solutions

Solution 1: Use Strings (Recommended)

The simplest and most reliable solution: serialize big numbers as strings.

string-solution.js
javascript
// Server-side: send big values as strings
const response = {
    id: "9007199254740993",  // String, not number
    amount: "1000000000000000001"
};
const jsonString = JSON.stringify(response);

// Client-side: parse as BigInt when math is needed
const data = JSON.parse(jsonString);
const id = BigInt(data.id);
const amount = BigInt(data.amount);

console.log(id);  // 9007199254740993n  ← Correct!

This is what major APIs do:
  • Twitter: id_str field
  • Stripe: All amounts as integers, IDs as strings
  • Discord: Snowflake IDs as strings
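
One caveat on the sending side: JSON.stringify throws a TypeError when it meets a BigInt, so if your in-memory values are BigInt you need to convert them first. A minimal replacer sketch:

bigint-replacer.js
javascript
// JSON.stringify cannot serialize BigInt directly; a replacer converts
// every BigInt it encounters into its decimal string form.
const payload = { id: 9007199254740993n, name: "test" };

const json = JSON.stringify(payload, (key, value) =>
    typeof value === 'bigint' ? value.toString() : value
);

console.log(json);  // {"id":"9007199254740993","name":"test"}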

Solution 2: json-bigint Library

For transparent BigInt handling without changing your JSON structure:

terminal
bash
npm install json-bigint

json-bigint-example.js
javascript
const JSONBig = require('json-bigint');

const json = '{"id": 9007199254740993, "name": "test"}';

// Native JSON.parse - CORRUPTS the number
const bad = JSON.parse(json);
console.log(bad.id);  // 9007199254740992 ← Wrong

// json-bigint - Preserves the number
const good = JSONBig.parse(json);
console.log(good.id.toString());  // "9007199254740993" ← Correct!

// Stringify back correctly
console.log(JSONBig.stringify(good));
// '{"id":9007199254740993,"name":"test"}'

Solution 3: Custom Reviver Function

For specific fields you know are big integers:

custom-reviver.js
javascript
const json = '{"id": 9007199254740993, "name": "test"}';

// Custom reviver that converts specific fields to BigInt
const data = JSON.parse(json, (key, value) => {
    // Check if this looks like a big integer
    if (key === 'id' && typeof value === 'number') {
        // WARNING: the value was already rounded during parsing.
        // A reviver can't recover it; the number must arrive as a string.
        return value;
    }
    return value;
});

// Better approach: expect strings for big numbers
const safeJson = '{"id": "9007199254740993", "name": "test"}';
const safeData = JSON.parse(safeJson, (key, value) => {
    if (key === 'id' && typeof value === 'string') {
        return BigInt(value);
    }
    return value;
});

console.log(safeData.id);  // 9007199254740993n
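
When several fields need the same treatment, an allow-list keeps the reviver readable. The field names below are hypothetical; substitute your own:

multi-field-reviver.js
javascript
// Hypothetical field names; adjust the set to match your payload.
const BIGINT_FIELDS = new Set(['id', 'amount', 'parent_id']);

const json = '{"id": "9007199254740993", "amount": "1000000000000000001", "name": "test"}';

const data = JSON.parse(json, (key, value) =>
    BIGINT_FIELDS.has(key) && typeof value === 'string' ? BigInt(value) : value
);

console.log(data.id);      // 9007199254740993n
console.log(data.amount);  // 1000000000000000001n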

Solution 4: Use a Schema with Validation

Combine with JSON Schema to enforce string representation:

schema.json
json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "id": {
      "type": "string",
      "pattern": "^[0-9]+$",
      "description": "64-bit integer as string to prevent precision loss"
    },
    "amount": {
      "type": "string",
      "pattern": "^-?[0-9]+$",
      "description": "Financial amount in cents as string"
    }
  }
}
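
At runtime you can enforce the same contract with a validator. A sketch assuming the Ajv library is available:

validate-schema.js
javascript
// Sketch: rejecting numeric IDs at the API boundary with Ajv.
const Ajv = require('ajv');
const ajv = new Ajv();

const schema = {
    type: 'object',
    properties: {
        id: { type: 'string', pattern: '^[0-9]+$' },
        amount: { type: 'string', pattern: '^-?[0-9]+$' }
    },
    required: ['id', 'amount']
};

const validate = ajv.compile(schema);

console.log(validate(JSON.parse('{"id": "9007199254740993", "amount": "-125"}')));  // true
console.log(validate({ id: 9007199254740993 }));  // false (numeric id, missing amount)
console.log(validate.errors);                     // explains what failed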

Detecting the Problem

Add this check to your codebase to catch precision issues early:

precision-check.js
javascript
function validateJsonNumbers(obj, path = '') {
    const issues = [];
    
    for (const [key, value] of Object.entries(obj)) {
        const currentPath = path ? `${path}.${key}` : key;
        
        if (typeof value === 'number' && !Number.isSafeInteger(value)) {
            issues.push({
                path: currentPath,
                value: value,
                message: 'Number exceeds safe integer range'
            });
        } else if (typeof value === 'object' && value !== null) {
            issues.push(...validateJsonNumbers(value, currentPath));
        }
    }
    
    return issues;
}

// Usage: the nested id below parses to 9007199254740992, which gets flagged
const data = JSON.parse('{"order": {"id": 9007199254740993, "qty": 2}}');
const issues = validateJsonNumbers(data);

if (issues.length > 0) {
    console.warn('Precision issues detected:', issues);
}

How Other Languages Handle This

  • JavaScript: corrupts numbers above 2^53; use strings or json-bigint
  • Python: integers have arbitrary precision, so it works out of the box
  • Java: use long (64-bit) or BigInteger; Jackson handles this correctly
  • Go: decoding into interface{} yields float64; use json.Number or int64 struct fields for safety
  • Rust: i64/u64 round-trip correctly; serde handles it

python_example.py
python
import json

# Python handles big integers correctly
data = json.loads('{"id": 9007199254740993}')
print(data['id'])  # 9007199254740993 ← Correct!

# Python integers have arbitrary precision
big = 99999999999999999999999999999999
print(json.dumps({"big": big}))
# {"big": 99999999999999999999999999999999}

Best Practices

Rules to live by:
  1. Always use strings for IDs, regardless of size
  2. Document your API — specify which fields are strings
  3. Validate on receipt — check for precision loss
  4. Use BigInt in JavaScript when doing math on large numbers
  5. Test with edge cases — include MAX_SAFE_INTEGER + 1 in tests

Test Cases to Add

bigint-tests.js
javascript
describe('BigInt JSON handling', () => {
    it('should preserve numbers at MAX_SAFE_INTEGER', () => {
        const id = Number.MAX_SAFE_INTEGER;
        const json = JSON.stringify({ id: String(id) });
        const parsed = JSON.parse(json);
        expect(BigInt(parsed.id)).toBe(BigInt(id));
    });
    
    it('should preserve numbers above MAX_SAFE_INTEGER', () => {
        const id = "9007199254740993";  // MAX_SAFE_INTEGER + 2
        const json = JSON.stringify({ id });
        const parsed = JSON.parse(json);
        expect(parsed.id).toBe(id);
    });
    
    it('should detect precision loss', () => {
        const unsafe = 9007199254740993;
        expect(Number.isSafeInteger(unsafe)).toBe(false);
    });
});

What's Next?

Now you understand one of the most insidious bugs in web development. Go audit your codebase: there's probably a BigInt bug waiting to bite you.

About the Author


Adam Tse

Founder & Lead Developer · 10+ years experience

Full-stack engineer with 10+ years of experience building developer tools and APIs. Previously worked on data infrastructure at scale, processing billions of JSON documents daily. Passionate about creating privacy-first tools that don't compromise on functionality.

JavaScript/TypeScript · Web Performance · Developer Tools · Data Processing