Storage · 14 min read

JSON at the Edge in 2026: Local-First Storage, Sync, and the Rise of 'Sovereign Data'

Learn the local-first JSON stack: SQLite JSONB, DuckDB-Wasm, CRDTs for sync, and the 'sovereign data' security pattern. The 2026 guide to offline-first apps.

#local-first #sqlite #duckdb #crdt #offline #edge-computing #wasm

The "Always Online" assumption is dying. By 2026, users expect apps to work perfectly offline and sync instantly when back online. This shift has turned JSON from a simple "API response format" into a primary "Local Storage format." Here's the stack that actually works.

TL;DR

  • Local-first is the new standard: Store the "Source of Truth" locally; treat the cloud as sync/backup
  • DuckDB-Wasm + JSON: Run analytical queries on JSON datasets directly in the browser
  • The Sync Problem: Use CRDTs (Automerge/Yjs) or sync engines (Replicache, PowerSync) for JSON state
  • Sovereign Edge: Store sensitive JSON locally to comply with privacy laws and reduce latency

Why Local-First Matters (2026)

With the maturity of DuckDB-Wasm and SQLite-Wasm, we're no longer limited to basic Key-Value stores like IndexedDB. We're running full analytical engines on JSON blobs inside the user's browser or on edge nodes (Cloudflare Workers, Deno Deploy).

The benefits are compelling:

  • Instant reads: No network latency for local data
  • Offline capability: Full functionality without connectivity
  • Privacy by default: Sensitive data never leaves the device
  • Reduced server costs: Less traffic, smaller backends
Local-first architecture: Device holds the source of truth, cloud is just a sync/backup layer

The Local-First JSON Stack

Storage: The Binary Shift

While the developer sees JSON, the storage layer is increasingly binary for performance:

  • SQLite JSONB: Standard on mobile/desktop for storing semi-structured local state
  • DuckDB: Used for heavy-duty local analysis of JSON logs or large data exports
local-storage-setup.ts
typescript
import initSqlJs from 'sql.js';

// Initialize SQLite in the browser
const SQL = await initSqlJs({
  locateFile: file => `https://sql.js.org/dist/${file}`
});

const db = new SQL.Database();

// Create a table for JSON state (stored as TEXT; SQLite's binary JSONB needs 3.45+)
db.run(`
  CREATE TABLE user_data (
    id TEXT PRIMARY KEY,
    data TEXT NOT NULL,  -- JSON stored as text
    updated_at TEXT DEFAULT (datetime('now'))
  )
`);

// Store JSON data locally
function saveUserData(id: string, data: object) {
  db.run(
    'INSERT OR REPLACE INTO user_data (id, data) VALUES (?, ?)',
    [id, JSON.stringify(data)]
  );
}

Synchronization: JSON as a Delta

In 2026, we don't sync the "whole file." We sync JSON Patches or CRDT Operations.

  • JSON Patch (RFC 6902): Great for simple client-server updates
  • Automerge / Yjs: The standard for collaborative JSON editing and complex offline sync
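If you don't need full CRDT machinery, the JSON Patch path is small enough to sketch by hand. The following is a minimal, dependency-free subset of RFC 6902 (add/replace/remove only, no test/copy/move ops); reach for a maintained library like fast-json-patch in production:

```typescript
// Minimal RFC 6902 subset: add / replace / remove (no copy, move, or test ops)
type PatchOp = { op: 'add' | 'replace' | 'remove'; path: string; value?: unknown };

function applyPatch<T>(doc: T, patch: PatchOp[]): T {
  const result: any = structuredClone(doc); // never mutate the input document
  for (const { op, path, value } of patch) {
    // Decode a JSON Pointer: strip leading '/', unescape ~1 → / and ~0 → ~
    const keys = path.slice(1).split('/').map(k => k.replace(/~1/g, '/').replace(/~0/g, '~'));
    const last = keys.pop()!;
    const parent = keys.reduce((obj, k) => obj[k], result);
    if (op === 'remove') {
      Array.isArray(parent) ? parent.splice(Number(last), 1) : delete parent[last];
    } else if (op === 'add' && Array.isArray(parent)) {
      parent.splice(last === '-' ? parent.length : Number(last), 0, value);
    } else {
      parent[last] = value; // 'replace', or 'add' on an object key
    }
  }
  return result;
}

// Example: two ops arriving from a peer
const next = applyPatch(
  { user: { name: 'Ada' }, tags: ['a'] },
  [
    { op: 'replace', path: '/user/name', value: 'Grace' },
    { op: 'add', path: '/tags/-', value: 'b' },
  ]
);
```

Because each patch is just a small JSON array, it compresses well and can sit in an offline queue until connectivity returns.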
crdt-sync.ts
typescript
import * as Automerge from '@automerge/automerge';

// Initialize a CRDT document
let doc = Automerge.init<{ todos: Array<{ id: string; text: string; done: boolean }> }>();

// Make local changes
doc = Automerge.change(doc, 'Add todo', d => {
  d.todos.push({ id: crypto.randomUUID(), text: 'Learn CRDTs', done: false });
});

// Sync with another device
function syncWithPeer(localDoc: typeof doc, remoteChanges: Uint8Array) {
  // Apply remote changes - conflicts are automatically resolved
  const [newDoc] = Automerge.applyChanges(localDoc, [remoteChanges]);
  return newDoc;
}

// Get the full change history to send to server/peers
const changes = Automerge.getAllChanges(doc);

Architecting for "Sovereign Edge"

"Sovereign Data" means the user owns their data locally. This is a massive security and privacy win.

The Security Pattern

  1. Local Encryption: Encrypt the local JSON store with a key derived from user auth
  2. Selective Sync: Only sync non-sensitive metadata or encrypted blobs to the cloud
  3. Local Validation: Run JSON Schema validation locally to prevent corrupt state
  4. Auth-Bound Keys: Purge or re-key local data if the user logs out
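For item 3, production apps typically reach for a JSON Schema validator such as Ajv; as a dependency-free illustration, even a hand-rolled type guard run before every write keeps corrupt state out of the store (the `LocalUserData` shape below is hypothetical):

```typescript
// Hypothetical shape for locally stored user data
interface LocalUserData {
  id: string;
  settings: Record<string, unknown>;
}

// Type guard: reject anything that doesn't match before it reaches the store
function isLocalUserData(x: unknown): x is LocalUserData {
  return (
    typeof x === 'object' && x !== null &&
    typeof (x as any).id === 'string' &&
    typeof (x as any).settings === 'object' && (x as any).settings !== null
  );
}

function safeSave(raw: unknown): void {
  if (!isLocalUserData(raw)) {
    throw new Error('Refusing to persist corrupt local state');
  }
  // ...hand off to the local storage layer here
}
```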
sovereign-data-pattern.ts
typescript
// Derive encryption key from user credentials
async function deriveStorageKey(password: string, salt: Uint8Array): Promise<CryptoKey> {
  const keyMaterial = await crypto.subtle.importKey(
    'raw',
    new TextEncoder().encode(password),
    'PBKDF2',
    false,
    ['deriveKey']
  );
  
  return crypto.subtle.deriveKey(
    { name: 'PBKDF2', salt, iterations: 600_000, hash: 'SHA-256' }, // OWASP minimum for PBKDF2-SHA256
    keyMaterial,
    { name: 'AES-GCM', length: 256 },
    false,
    ['encrypt', 'decrypt']
  );
}

// Encrypt JSON before storing locally
async function encryptAndStore(key: CryptoKey, data: object): Promise<ArrayBuffer> {
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const encoded = new TextEncoder().encode(JSON.stringify(data));
  
  const encrypted = await crypto.subtle.encrypt(
    { name: 'AES-GCM', iv },
    key,
    encoded
  );
  
  // Prepend IV to encrypted data
  const result = new Uint8Array(iv.length + encrypted.byteLength);
  result.set(iv);
  result.set(new Uint8Array(encrypted), iv.length);
  
  return result.buffer;
}

Performance: Handling "Big JSON" in the Browser

If you're storing 100MB of JSON locally, standard JSON.parse() will freeze your UI.

The Rules

  • Streaming Parsers: Use libraries that handle JSON as a stream
  • Web Workers: Always move JSON storage and heavy querying to a background worker
  • IndexedDB as Cache: Use IndexedDB as the persistence layer for your binary chunks
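One dependency-free way to get streaming behavior is to store large datasets as newline-delimited JSON (NDJSON) and parse record by record, so no single JSON.parse call ever sees the whole payload. A sketch:

```typescript
// Parse NDJSON from a byte stream one record at a time,
// buffering partial lines across chunk boundaries
async function* parseNdjson(stream: ReadableStream<Uint8Array>): AsyncGenerator<unknown> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let buffer = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop()!; // last element may be an incomplete line
    for (const line of lines) {
      if (line.trim()) yield JSON.parse(line);
    }
  }
  if (buffer.trim()) yield JSON.parse(buffer); // trailing record without a newline
}
```

Each yielded record can be handed off in batches, keeping pauses bounded by record size rather than file size.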
web-worker-json.ts
typescript
// main.ts - Main thread
const worker = new Worker(new URL('./json-worker.ts', import.meta.url));

worker.postMessage({ type: 'QUERY', sql: 'SELECT * FROM events WHERE type = ?', params: ['click'] });

worker.onmessage = (e) => {
  if (e.data.type === 'RESULT') {
    updateUI(e.data.rows);
  }
};

// json-worker.ts - Worker thread (doesn't block UI)
import * as duckdb from '@duckdb/duckdb-wasm';

// Assumes the connection was opened during worker startup (bundle setup omitted)
let conn: duckdb.AsyncDuckDBConnection;

self.onmessage = async (e) => {
  if (e.data.type === 'QUERY') {
    // Parameterized queries go through a prepared statement
    const stmt = await conn.prepare(e.data.sql);
    const result = await stmt.query(...e.data.params);
    await stmt.close();
    self.postMessage({ type: 'RESULT', rows: result.toArray() });
  }
};

DuckDB-Wasm for Local Analytics

Instead of writing manual JS loops to filter/aggregate local JSON, use DuckDB-Wasm:

duckdb-local-analytics.ts
typescript
import * as duckdb from '@duckdb/duckdb-wasm';

// Load DuckDB in the browser
const JSDELIVR_BUNDLES = duckdb.getJsDelivrBundles();
const bundle = await duckdb.selectBundle(JSDELIVR_BUNDLES);
const worker = await duckdb.createWorker(bundle.mainWorker!);
const logger = new duckdb.ConsoleLogger();
const db = new duckdb.AsyncDuckDB(logger, worker);
await db.instantiate(bundle.mainModule, bundle.pthreadWorker);

// Query JSON data with SQL
const conn = await db.connect();

// Register in-memory JSON with DuckDB's virtual filesystem, then load it
// (assumes `events` is an array of plain objects)
await db.registerFileText('events.json', JSON.stringify(events));

await conn.query(`
  CREATE TABLE events AS 
  SELECT * FROM read_json_auto('events.json')
`);

// Run analytical queries locally
const result = await conn.query(`
  SELECT 
    event_type,
    COUNT(*) as count,
    AVG(duration_ms) as avg_duration
  FROM events
  WHERE timestamp > '2026-01-01'
  GROUP BY event_type
  ORDER BY count DESC
`);

console.log(result.toArray());

The "Senior Pro" Checklist: Local-First JSON

  • โ˜ Background Sync: Implement a robust retry/backoff strategy for JSON state syncing
  • โ˜ Conflict Resolution: Have a deterministic policy (Last-Writer-Wins vs. CRDT) for when the same JSON key changes on two devices
  • โ˜ Schema Migration: Local data lives forever. You need a plan to migrate local JSON schemas when your app updates
  • โ˜ Storage Quotas: Monitor browser storage limits; use compression (Zstd-Wasm) if your JSON payloads are large
  • โ˜ Wasm Offloading: Use DuckDB-Wasm for filtering/aggregating local JSON rather than writing manual JS loops

Pitfalls to Avoid

The "Large JSON" UI Freeze

Never block the main thread with JSON.stringify on a huge state object.

avoid-main-thread-block.ts
typescript
// โŒ Don't do this on the main thread
const hugeState = JSON.stringify(appState); // Freezes UI for seconds

// โœ… Do this instead
const worker = new Worker('./stringify-worker.js');
worker.postMessage(appState);
worker.onmessage = (e) => {
  const stringified = e.data;
  // Now safe to use
};

Sync Loops

Ensure your sync logic doesn't trigger an infinite update loop between client and server. Use vector clocks or version numbers.
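The version-number variant is small enough to show inline; the guard makes re-applying an echoed update a no-op, so it never triggers another push (the `Versioned` wrapper is a hypothetical type for illustration):

```typescript
// Wrap synced state with a monotonically increasing version
interface Versioned<T> {
  version: number;
  data: T;
}

// Only accept remote state that is strictly newer than what we have.
// An echo of our own update (same version) falls through unchanged,
// so it never causes another local write, another push, and a loop.
function mergeRemote<T>(local: Versioned<T>, remote: Versioned<T>): Versioned<T> {
  return remote.version > local.version ? remote : local;
}
```

Vector clocks generalize this with one counter per device, which is what CRDT libraries maintain internally when more than two replicas can write concurrently.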

Auth Drift

Ensure local data is purged or re-keyed if the user logs out or changes permissions.

auth-drift-handling.ts
typescript
// On logout: clear sensitive local data
// (assumes a module-level `encryptionKey: CryptoKey | null` holds the derived key)
async function handleLogout() {
  // Drop the encryption key from memory
  encryptionKey = null;
  
  // Clear sensitive tables
  db.run('DELETE FROM user_data');
  db.run('DELETE FROM offline_queue');
  
  // Delete the IndexedDB persistence layer entirely
  await indexedDB.deleteDatabase('app-local-storage');
}


About the Author


Adam Tse

Founder & Lead Developer ยท 10+ years experience

Full-stack engineer with 10+ years of experience building developer tools and APIs. Previously worked on data infrastructure at scale, processing billions of JSON documents daily. Passionate about creating privacy-first tools that don't compromise on functionality.
