object-fill-missing-keys 7.10.28

Add missing keys into plain objects, according to a reference object

§ Quick Take

import { strict as assert } from "assert";
import fillMissing from "object-fill-missing-keys";

// keys "a" and "c" are missing in the input, so they get filled
assert.deepEqual(
  fillMissing(
    {
      // input object that could have come from JSON
      b: "b",
    },
    {
      // schema reference object
      a: false,
      b: false,
      c: false,
    }
  ),
  {
    // patched result
    a: false,
    b: "b",
    c: false,
  }
);

§ Purpose

This library fills missing keys in a plain object according to a supplied reference object. It drives the json-comb-core method enforceKeyset().
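The core idea can be sketched in a few lines of plain JavaScript. This is an illustrative sketch only, not the library's actual implementation (which also handles arrays, options and input cloning):

```javascript
// Illustrative sketch of the key-filling idea: any key present in the
// schema but absent from the input is copied over, taking the schema's
// (placeholder) value; matching plain-object branches are recursed into.
function fillSketch(input, schema) {
  const out = { ...input };
  const isPlain = (v) => v && typeof v === "object" && !Array.isArray(v);
  for (const key of Object.keys(schema)) {
    const schemaVal = schema[key];
    if (!(key in out)) {
      // key missing entirely, take the schema's (placeholder) value
      out[key] = schemaVal;
    } else if (isPlain(schemaVal) && isPlain(out[key])) {
      // both sides are plain objects, recurse into the branch
      out[key] = fillSketch(out[key], schemaVal);
    }
  }
  return out;
}

console.log(fillSketch({ b: "b" }, { a: false, b: false, c: false }));
// → { b: 'b', a: false, c: false } (same key set as the Quick Take result)
```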

§ API

fillMissingKeys(incompleteObj, schemaObj, [opts])

In other words, it's a function which takes three input arguments, the third one being optional (marked by square brackets).

Input arguments are not mutated; they are cloned before use. That's important.
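To see what that guarantee means in practice, here is a sketch of the clone-before-use pattern, using a hypothetical stand-in function (the point is the pattern, not the library's internals):

```javascript
// Sketch of the clone-before-use pattern: work on a deep copy,
// so the caller's original object is never touched.
function fillWithoutMutating(input, schema) {
  const clone = structuredClone(input); // deep copy (Node 17+)
  for (const key of Object.keys(schema)) {
    if (!(key in clone)) clone[key] = schema[key];
  }
  return clone;
}

const original = { b: "b" };
const result = fillWithoutMutating(original, { a: false, b: false });
console.log(original); // { b: 'b' }, still exactly as it was
console.log(result); // { b: 'b', a: false }
```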

§ API - Input

| Input argument | Type | Obligatory? | Description |
| -------------- | ---- | ----------- | ----------- |
| incompleteObj | Plain object | yes | Plain object. Can have nested values. |
| schemaObj | Plain object | yes | Schema object which contains a desired set of values. Can be nested or hold arrays of things. |
| opts | Plain object | no | Optional Options Object, see below for its API |

§ An Optional Options Object

| Options object's key | Type of its value | Default | Description |
| -------------------- | ----------------- | ------- | ----------- |
| placeholder | Anything | Boolean false | Used only in combination with doNotFillThesePathsIfTheyContainPlaceholders, as a means to check whether all children keys contain placeholder values. It won't patch up your reference schema objects (for performance reasons). Always make sure your reference schema object has all values set to the desired placeholder (the default placeholder is Boolean false). |
| doNotFillThesePathsIfTheyContainPlaceholders | Array of zero or more strings | [] | Handy to activate for ad-hoc keys in data structures, to limit data bloat. |
| useNullAsExplicitFalse | Boolean | true | When this setting is on, an existing key with a null value keeps its null and does not get anything assigned over it, even if the reference object would otherwise set it to something nested. Under the bonnet, this sets the same-named options key of object-merge-advanced. |
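Spelled out as code, the defaults from the table above look like this (an illustration of the default values only; passing this object is equivalent to passing no options at all):

```javascript
// The default options object, per the table above.
const defaults = {
  placeholder: false, // the value that means "not filled in yet"
  doNotFillThesePathsIfTheyContainPlaceholders: [], // no paths are skipped
  useNullAsExplicitFalse: true, // null blocks incoming values (see below)
};
```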

§ opts.doNotFillThesePathsIfTheyContainPlaceholders

This setting is handy to limit the length of your JSON files. Sometimes you have ad-hoc keys that are very large nested trees of values and/or are rarely used. In those cases, you want to trigger the normalisation of such a key manually.

It's done this way.

Find out the path of the key you want to limit normalising on. The path notation follows the one used in object-path: if it's an object, put the key name; if it's an array, put that element's index. For example, orders.latest.0.first_name would be:

{
  orders: {
    latest: [ // <---- notice it's a nested array within a plain object
      {
        first_name: "Bob", // <------ this key is `orders.latest.0.first_name`
        last_name: "Smith"
      },
      {
        first_name: "John",
        last_name: "Doe"
      }
    ]
  }
}
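To see how such a path resolves, here is a minimal sketch of object-path style lookup (the real object-path library does considerably more, e.g. setting and deleting values):

```javascript
// Minimal sketch of object-path style resolution: each dot-separated
// segment is an object key or, for arrays, an element index.
function getByPath(obj, path) {
  return path.split(".").reduce((acc, seg) => acc?.[seg], obj);
}

const data = {
  orders: {
    latest: [
      { first_name: "Bob", last_name: "Smith" },
      { first_name: "John", last_name: "Doe" },
    ],
  },
};

console.log(getByPath(data, "orders.latest.0.first_name")); // "Bob"
console.log(getByPath(data, "orders.latest.1.last_name")); // "Doe"
```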

Put the path you want to skip normalising into opts.doNotFillThesePathsIfTheyContainPlaceholders array. For example:

const res = fillMissingKeys(
  {
    // <---- input
    a: {
      b: false, // <---- we don't want to automatically normalise this key
      x: "x",
    },
    z: "z",
  },
  {
    // <---- reference schema object
    a: {
      b: {
        c: false,
        d: false,
      },
      x: false,
    },
    z: false,
  },
  {
    doNotFillThesePathsIfTheyContainPlaceholders: ["a.b"],
  }
);
console.log(`res = ${JSON.stringify(res, null, 4)}`);
// res = {
//   a: {
//     b: false, // <---- observe, the keys were not added because it held a placeholder
//     x: 'x',
//   },
//   z: 'z',
// }

To trigger normalisation on an ignored path, set the value at that path to anything that is not the placeholder. If you are using the default placeholder false, just set the value at that path to true. If you're using a custom placeholder different from false, set it to false. Normalisation will see a non-placeholder value and proceed to compare and fill in the missing branches of your object.

For example, we want the value for a.b.c filled in, but we are not sure of the data structure. We want placeholders to be set under the path a.b during normalisation, so we set a.b to true:

const res = fillMissingKeys(
  {
    a: {
      b: true, // <-- not a placeholder, but lower in the data hierarchy (boolean)
      x: "x",
    },
    z: "z",
  },
  {
    a: {
      b: {
        c: false,
        d: false,
      },
      x: false,
    },
    z: false,
  },
  {
    doNotFillThesePathsIfTheyContainPlaceholders: ["a.b"],
  }
);
console.log(`res = ${JSON.stringify(res, null, 4)}`);
// res = {
//   a: {
//     b: {
//       c: false, // <---- values added!
//       d: false, // <---- values added!
//     },
//     x: 'x',
//   },
//   z: 'z',
// }

If any branch at a given doNotFillThesePathsIfTheyContainPlaceholders path contains only placeholders, it is truncated during normalisation (set to the placeholder you provide in the opts, or, if you don't supply one, the default false):

const res = fillMissingKeys(
  {
    // <--- input object
    a: {
      b: {
        // <--- this object in "b"'s value will be removed and set to placeholder "false"
        c: false,
        d: false,
      },
      x: {
        // <--- this too
        y: false,
      },
    },
    z: "z",
  },
  {
    // <--- schema object
    a: {
      b: {
        c: false,
        d: false,
      },
      x: false,
    },
    z: false,
  },
  {
    // <--- settings
    doNotFillThesePathsIfTheyContainPlaceholders: ["lalala", "a.b", "a.x"],
  }
);
console.log(`res = ${JSON.stringify(res, null, 4)}`);
// res = {
//   a: {
//     b: false,
//     x: false,
//   },
//   z: 'z',
// }

§ opts.useNullAsExplicitFalse

By default, a null value is treated as an explicit false, which completely defuses any incoming "truthy" values. It's the ultimate "falsy" value.

For example:

const res2 = fillMissingKeys(
  {
    // <--- object we're working on
    a: null,
  },
  {
    // <--- reference schema
    a: ["z"],
  },
  {
    // <--- options
    useNullAsExplicitFalse: true,
  }
);
console.log(`res2 = ${JSON.stringify(res2, null, 4)}`);
// => {
//   a: null,
// }

But if you turn it off, the usual rules of merging apply, and null, being near the bottom of the value priority scale, gets trumped by nearly every other type of value (not least the non-empty array ['z'] in the example below):

const res1 = fillMissingKeys(
  {
    // <--- object we're working on
    a: null,
  },
  {
    // <--- reference schema
    a: ["z"],
  },
  {
    // <--- options
    useNullAsExplicitFalse: false,
  }
);
console.log(`res1 = ${JSON.stringify(res1, null, 4)}`);
// => {
//   a: ['z'],
// }

§ How this works

This library performs the key-creation part of the JSON file normalisation operation. Normalising JSON files means making a set of JSON files all have the same key set.

Here's how it slots in the normalisation process:

First, you take two or more plain objects, normally originating from JSON files' contents.

Then, you calculate the schema reference out of them. It's a superset object of all possible keys used across the objects (your JSON files).

Finally, you go through your plain objects a second time, one by one, and fill the missing keys using this library. It takes the plain object and your generated schema reference (and optionally a custom placeholder, if you don't like Boolean false) and creates the missing keys/arrays in that plain object.
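The schema-derivation step in the middle can be sketched like this. The helper name buildSchema is hypothetical, purely for illustration; it is not part of this library's API:

```javascript
// Sketch of the middle step above: derive a superset schema from a set
// of plain objects, i.e. every key seen anywhere, with values set to
// the placeholder (false by default).
function buildSchema(objects, placeholder = false) {
  const schema = {};
  for (const obj of objects) {
    for (const [key, val] of Object.entries(obj)) {
      const isPlain = val && typeof val === "object" && !Array.isArray(val);
      if (isPlain) {
        // merge this branch's keys with any keys seen so far
        const seen =
          schema[key] && typeof schema[key] === "object" ? [schema[key]] : [];
        schema[key] = buildSchema([...seen, val], placeholder);
      } else if (!(key in schema)) {
        schema[key] = placeholder;
      }
    }
  }
  return schema;
}

console.log(buildSchema([{ a: 1 }, { b: { c: 2 } }, { a: 3, d: 4 }]));
// → { a: false, b: { c: false }, d: false }
```

Feeding the resulting superset object to fillMissingKeys() as the schema argument then gives every object the same key set.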

Alternatively, you can use this library just to add missing keys. Mind you, for performance reasons, the schema is expected to have all key values equal to placeholders. This way, when keys are created, the schema can be merged over and the placeholder values land in the right places as placeholders. It also means that if you provide a schema with some non-placeholder values, those values will be written onto your objects.

Previously we kept an "insurance" function which took a schema reference object and overwrote all its values to opts.placeholder, but then we realised that "normal" reference schemas always come with the right key values anyway, and such an operation would waste resources.

§ Licence

MIT

Copyright © 2010–2020 Roy Revelt and other contributors

Related packages:

📦 object-merge-advanced 10.11.29
Recursive, deep merge of anything (objects, arrays, strings or nested thereof), which weighs contents by type hierarchy to ensure the maximum content is retained
📦 object-boolean-combinations 2.11.66
Consumes a defaults object with booleans, generates all possible variations of it
📦 object-no-new-keys 2.9.11
Check whether a plain object (AST/JSON) has any unique keys not present in a reference object (another AST/JSON)
📦 object-flatten-referencing 4.11.27
Flatten complex nested objects according to a reference object
📦 object-delete-key 1.9.38
Delete keys from all arrays or plain objects, nested within anything, by key or by value or by both, and clean up afterwards. Accepts wildcards.
📦 object-all-values-equal-to 1.8.25
Does the AST/nested-plain-object/array/whatever contain only one kind of value?
📦 object-set-all-values-to 3.9.67
Recursively walk the input and set all found values in plain objects to something