Fix JSON Error: Duplicate Keys in Object

Duplicate keys in a JSON object won't always throw a parse error — but they cause silent data loss that's often harder to debug than a visible syntax error. Here's how to find and fix them.

JSONTech Team · March 1, 2025 · 4 min read

The Problem

Unlike most JSON errors, duplicate keys often don't produce a visible error message. The JSON spec (RFC 8259) says object keys "SHOULD be unique" but doesn't require parsers to reject duplicates. Most parsers silently accept them — and quietly discard data.

// ⚠️ Parses without error, but data is lost
{
  "name": "Alice",
  "role": "admin",
  "role": "viewer"
}

After parsing, role will be "viewer". The "admin" value is silently discarded. Some strict validators will flag this, but JavaScript's JSON.parse() won't.

What Happens With Duplicate Keys

The behavior varies by parser, which makes this especially dangerous in cross-platform systems:

  • JavaScript (JSON.parse): Last value wins — the second role overwrites the first.
  • Python (json.loads): Last value wins by default.
  • Go (encoding/json): Last value wins.
  • Some strict parsers: Throw an error or warning on duplicate keys.

The "last value wins" behavior means different parsers reading the same JSON may produce different results if they encounter duplicates — a recipe for subtle, hard-to-track bugs.
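You can see "last value wins" directly in Python, which mirrors the JavaScript and Go behavior described above:

```python
import json

# Duplicate "role" key: json.loads accepts it without complaint
raw = '{"name": "Alice", "role": "admin", "role": "viewer"}'
parsed = json.loads(raw)

print(parsed["role"])  # "viewer" — the "admin" value is silently gone
```

No warning is emitted; the only evidence that "admin" ever existed is in the raw text.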

Common Causes

1. Manual Editing Mistakes

When editing large JSON files by hand, it's easy to add a key that already exists elsewhere in the object, especially if the object spans many lines.

// ❌ Broken — "email" appears twice
{
  "name": "Alice",
  "email": "alice@example.com",
  "age": 30,
  "department": "engineering",
  "email": "alice.new@example.com"
}
// ✅ Fixed — keep the correct value, remove the duplicate
{
  "name": "Alice",
  "email": "alice.new@example.com",
  "age": 30,
  "department": "engineering"
}

2. Merging JSON Objects Incorrectly

When merging two JSON objects by concatenating their contents — for example, combining user data from two sources — duplicate keys are almost inevitable.
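A safer pattern is to parse both sources into dictionaries and merge those, so conflicts are resolved explicitly in code rather than left as duplicate keys in text. A minimal sketch (the sample data here is illustrative):

```python
import json

a = {"name": "Alice", "role": "admin"}
b = {"role": "viewer", "team": "core"}

# ❌ Splicing the two objects' text together would yield:
# '{"name": "Alice", "role": "admin", "role": "viewer", "team": "core"}'

# ✅ Merge as dicts first, then serialize once — later keys win visibly
merged = {**a, **b}
print(json.dumps(merged))  # {"name": "Alice", "role": "viewer", "team": "core"}
```

Because a Python dict cannot hold two entries with the same key, the serialized output is guaranteed to be duplicate-free.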

3. Code Generation Bugs

If your code builds JSON objects by appending key-value pairs without checking for existing keys, duplicates can sneak in.
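For example, code that assembles JSON by string concatenation has no guard against emitting the same key twice, while accumulating into a dict does (the `pairs` data below is hypothetical):

```python
import json

# ❌ Appending key/value text without checking — duplicates sneak in
pairs = [("id", 1), ("status", "draft"), ("status", "published")]
broken = "{" + ", ".join(f"{json.dumps(k)}: {json.dumps(v)}" for k, v in pairs) + "}"
# broken == '{"id": 1, "status": "draft", "status": "published"}'

# ✅ Build a dict instead: a repeated key overwrites the earlier entry,
# so the serialized output can never contain duplicates
obj = dict(pairs)
safe = json.dumps(obj)  # '{"id": 1, "status": "published"}'
```

If overwriting is itself a bug in your pipeline, check `k in obj` before assigning and fail loudly instead.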

How to Detect Duplicate Keys

  1. Use a strict JSON linter. Tools like jsonlint or ESLint with JSON plugins can flag duplicate keys.
  2. Use our JSON Validator. It detects duplicate keys and highlights exactly where they appear.
  3. Search the file. In your editor, search for the key name. If it appears more than once at the same nesting level, you've found a duplicate.
  4. Use Python with a custom hook:
import json

def check_duplicates(pairs):
    # object_pairs_hook receives the raw (key, value) pairs before
    # dict construction, so duplicates are still visible here
    keys = {}
    for key, value in pairs:
        if key in keys:
            raise ValueError(f"Duplicate key: {key}")
        keys[key] = value
    return keys

json.loads(data, object_pairs_hook=check_duplicates)

How to Fix It

  1. Identify which value is correct. When you find a duplicate, decide which value should be kept — usually the most recent or most complete one.
  2. Remove the duplicate entry. Delete the redundant key-value pair entirely.
  3. If both values matter, merge them. For example, if both values are arrays, combine them into a single array. If they represent different concepts, rename one of the keys.
// ❌ Two "tags" keys
{
  "tags": ["javascript"],
  "tags": ["json"]
}

// ✅ Merged into one
{
  "tags": ["javascript", "json"]
}
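If you need to apply this merge automatically rather than by hand, the same `object_pairs_hook` mechanism from the detection section can combine list values instead of raising. A sketch (the `merge_duplicates` helper is hypothetical, not a stdlib function):

```python
import json

def merge_duplicates(pairs):
    # Combine duplicate keys when both values are lists; otherwise last wins
    out = {}
    for key, value in pairs:
        if key in out and isinstance(out[key], list) and isinstance(value, list):
            out[key] = out[key] + value
        else:
            out[key] = value
    return out

data = json.loads('{"tags": ["javascript"], "tags": ["json"]}',
                  object_pairs_hook=merge_duplicates)
# data == {"tags": ["javascript", "json"]}
```

Only use a permissive hook like this when merging is genuinely the right semantics for your data; otherwise prefer the strict hook that raises.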

Prevention Tips

  • Always generate JSON programmatically using objects/dictionaries — these data structures inherently prevent duplicate keys.
  • When merging JSON data, use a proper deep-merge function rather than string concatenation.
  • Add a JSON lint step to your CI pipeline that checks for duplicate keys.
  • Use TypeScript or a schema validator (like JSON Schema) to enforce unique keys at the type level.
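The deep-merge approach mentioned above can be as simple as a short recursive function. A minimal sketch (`deep_merge` is a hypothetical helper; production code may prefer a well-tested library):

```python
def deep_merge(base, override):
    # Recursively merge two dicts; values from `override` win on conflict.
    merged = dict(base)
    for key, value in override.items():
        if key in merged and isinstance(merged[key], dict) and isinstance(value, dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

config = deep_merge(
    {"user": {"name": "Alice", "role": "admin"}, "active": True},
    {"user": {"role": "viewer"}},
)
# config == {"user": {"name": "Alice", "role": "viewer"}, "active": True}
```

Because the merge happens on parsed dictionaries, the result can be serialized back to JSON with no possibility of duplicate keys.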

Fix it automatically: Paste your broken JSON into our JSON Repair tool — it handles this error and dozens more. Or validate your JSON first with our JSON Validator.

Related Tools