
begin reducing circular deps #2095


Closed

Conversation

iambumblehead

iambumblehead commented Apr 16, 2025

the first commit does this: imports a CollectionProtoAssign function; reduces Seq -> Collection -> Seq by allowing Seq to import this rather than inherit from Collection

@iambumblehead
Author

no substantive changes to this point; feel free to ignore

@iambumblehead
Author

I'm interested in trying to remove the CollectionImpl reference from the Seq definition, using something like this and calling functions that do not depend on CollectionImpl:

-export class SeqImpl extends CollectionImpl {
+export class SeqImpl {
     /* ... */
+   fromEntrySeq () { return collectionFromEntrySeq(this) }
}

+CollectionProtoAssign(SeqImpl.prototype)
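For context, here is a minimal sketch of what a CollectionProtoAssign helper could look like; the helper name comes from the commit, but this body is an assumption, not the PR's actual code,

// hypothetical sketch: assign standalone collection functions onto a
// prototype as thin method wrappers, so SeqImpl need not extend CollectionImpl
const CollectionProtoAssign = (proto) => {
  Object.assign(proto, {
    fromEntrySeq() { return collectionFromEntrySeq(this); },
    filter(predicate, context) { return collectionFilter(this, predicate, context); },
    // ...one wrapper per standalone collection function
  });
  return proto;
};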

@iambumblehead
Author

iambumblehead commented Apr 17, 2025

collectionImpl <- seq <- collection
collectionImpl <- collection

@iambumblehead
Author

class ToKeyedSequence extends KeyedSeqImpl
class FromEntriesSequence extends KeyedSeqImpl
class KeyedSeqImpl extends SeqImpl
class ObjectSeq extends KeyedSeqImpl
mixin(KeyedSeqImpl, KeyedCollectionPrototype)

@bdurrer
Contributor

bdurrer commented Apr 17, 2025

In my failed attempt to do this, we noted that a few functions produce dependencies between ordered and unordered collections.
My suggestion here would be to remove these functions and/or replace them with generic functions that live outside of both (ordered and unordered) classes. This would be a breaking change, obviously, but IMO the migration is easy for the users of the library. See the sketch below.
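As a hedged illustration of that suggestion (the function and file names here are hypothetical), a conversion that currently couples the classes could become a free function that the caller imports explicitly,

// hypothetical: a free function replacing a cross-coupling method,
// so ordered and unordered collection modules stop importing each other
import { List } from './List';

export const toList = (collection) => List(collection.valueSeq());

// before: map.toList()  -- Map's class pulls in List
// after:  toList(map)   -- the caller imports only what it uses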

@iambumblehead
Author

iambumblehead commented Apr 17, 2025

I should probably refrain from making any more statements here as I continue to gain more understanding... I see there are two types of "dependency tree" that compete with each other. One comes from class inheritance chains, which strongly influence the import tree to look one way. The other comes from instance methods that reference external class constructors and related utilities.

My intuition is that flattening the inheritance chains will relieve significant confusion.
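For instance, a flattened shape might replace the chain with one explicit assembly step (hypothetical names, not this PR's code),

// before: ObjectSeq -> KeyedSeqImpl -> SeqImpl -> CollectionImpl, plus mixins
// after (flattened): methods assembled once, in one visible order
class ObjectSeq {}
Object.assign(
  ObjectSeq.prototype,
  collectionMethods,
  seqMethods,
  keyedSeqMethods
);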

@iambumblehead
Author

I don't feel (yet) that the interface needs to be changed

@iambumblehead
Author

thanks for the input @bdurrer

@jdeniau
Member

jdeniau commented Apr 17, 2025

Thank you for your effort. I do not know if you will manage to do this, but it's nice to see your motivation 😃

For the record, previous attempts failed because of the toXXX methods. To implement Map.toList, you need List, and List can be converted with toSeq, so it needs Seq, etc.

You can check #1961 where we discussed it.

A possible way to fix that may be to do what zod did in its latest release: two different packages, the "full" default one, and a "mini" one that removes the chaining possibility. See #1961 (comment)

It makes the API more complex, as we have two packages to maintain and users have to make that choice, but it seems simple to implement, and it fixes the size issue for users who just want to use Maps without the full dependencies.

Be aware too that the objective of immutable v6 will be to migrate the whole codebase to TypeScript.
I have not started this move yet for Map, List, Record, etc.
I don't know if it conflicts with a possible implementation of yours, but I think you should keep that in mind 😉

If you still need to work on that, can you convert this PR to a draft, and convert it back when it's ready to review?

iambumblehead marked this pull request as draft April 17, 2025 12:21
iambumblehead changed the title from "WIP: begin reducing circular deps" to "begin reducing circular deps" Apr 17, 2025
@iambumblehead
Author

I can't think of a way to programmatically generate methods from functions, or functions from methods, that is practical here. For example, take this collection filter function and method:

// standalone function form
const collectionFilter = (collection, predicate, context) => {
  return reify(collection, filterFactory(collection, predicate, context, true));
};
// method form, delegating to the standalone function
const collectionInheritingObjectOrClassWithMethods = {
  filter(predicate, context) {
    return collectionFilter(this, predicate, context);
  }
};

There are at least 8 classes defining this 'filter' method. Even if we define each twice, it seems like there could be a better way, where each is declared only once and somehow transformed for the other use case.

Any good ideas?

@iambumblehead
Author

anything I try looks clumsy

obj[narrow(methodfn.name)] = function () {
  return methodfn.apply(null, [this, ...arguments])
}

@jdeniau
Member

jdeniau commented Apr 17, 2025

That's what CollectionImpl is for, but I think you want to get rid of that, don't you?

I think there is a mixin helper in immutable that does things like that too, but it doesn't seem to be really "TypeScript-strictness" compliant.

@jdeniau
Member

jdeniau commented Apr 17, 2025

anything I try looks clumsy

obj[narrow(methodfn.name)] = function () {
  return methodfn.apply(null, [this, ...arguments])
}

Wow, yes, it's a huge loss in readability 😱

@iambumblehead
Author

iambumblehead commented Apr 17, 2025

probably not compatible w/ TypeScript and perhaps not acceptable to you, but sharing anyway, just a sketch:

const transformToMethod = (obj, fn) => {
  const methodname = fn.name

  // names ending with '_' receive their trailing arguments as an array
  if (methodname.endsWith('_')) {
    switch (fn.length) {
      case 1: obj[methodname] = function (...a1) { return fn(this, a1); }; break;
      case 2: obj[methodname] = function (a1, ...a2) { return fn(this, a1, a2); }; break;
      case 3: obj[methodname] = function (a1, a2, ...a3) { return fn(this, a1, a2, a3); }; break;
      case 4: obj[methodname] = function (a1, a2, a3, ...a4) { return fn(this, a1, a2, a3, a4); }; break;
    }
  } else {
    // fn.length counts the leading "self" param, so a method takes one fewer arg
    switch (fn.length) {
      case 0:
      case 1: obj[methodname] = function () { return fn(this); }; break;
      case 2: obj[methodname] = function (a1) { return fn(this, a1); }; break;
      case 3: obj[methodname] = function (a1, a2) { return fn(this, a1, a2); }; break;
      case 4: obj[methodname] = function (a1, a2, a3) { return fn(this, a1, a2, a3); }; break;
    }
  }

  return obj
}

The above transformation may be unconventional, but it would return the necessary method expressions without the downside of adding ...spread or concat to each one. Furthermore, the results could be constructed once and cached.
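For example, usage under the sketch above might look like this; collectionFilter is the function from the earlier comment, and the generated method keeps the function's full name,

// builds a { collectionFilter } method that forwards `this`
const proto = transformToMethod({}, collectionFilter);

// collectionFilter.length is 3, so the generated method takes
// (a1, a2) and calls collectionFilter(this, a1, a2):
// proto.collectionFilter(predicate, context)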

@iambumblehead
Author

iambumblehead commented Apr 17, 2025

ToKeyedSequence extends KeyedSeqImpl to get this inheritance chain

  • KeyedSeqImpl extends SeqImpl
  • SeqImpl extends CollectionImpl

And during or after that sequence, methods are "mixed in" to both

  • mixin(CollectionImpl, { ...variousMethods })
  • mixin(KeyedSeqImpl, KeyedCollectionPrototype)

The mixins obfuscate the order of definition, and results may differ in cjs vs esm, because esm resolves modules in the reverse order relative to cjs.

Assigning all methods at once in a single location would simplify things, e.g.,

assignProto(ToKeyedSequence, [
  collectionImplFunctions,
  seqImplFunctions,
  keyedSeqImplFunctions,
  collectionImplFunctionsFromMixin,
  keyedSeqImplFunctionsFromMixin
])
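A minimal assignProto could be as simple as this sketch (hypothetical; the real helper would live wherever the prototypes are assembled),

// hypothetical: merge every method source onto one prototype,
// in a single explicit order, instead of scattered mixin calls
const assignProto = (Ctor, methodObjects) => {
  for (const methods of methodObjects) {
    Object.assign(Ctor.prototype, methods);
  }
  return Ctor;
};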

@iambumblehead
Author

I don't have a strong feeling about how to proceed. Maybe a different approach should be used altogether.

I understand better why the interface would need to change as well, and I'm not sure if that's awesome or not.

@iambumblehead
Author

Planning to continue developing changes here, but I want to avoid sending unwanted notifications each time something is pushed up. Closing.

@iambumblehead
Author

this patch might be useful for anyone attempting this later; it removes the dependency between merge.js and Seq.js

diff --git a/src/functional/merge.js b/src/functional/merge.js
index b2d231a0..1bbe1213 100644
--- a/src/functional/merge.js
+++ b/src/functional/merge.js
@@ -1,8 +1,6 @@
 import { isImmutable } from '../predicates/isImmutable';
-import { isIndexed } from '../predicates/isIndexed';
-import { isKeyed } from '../predicates/isKeyed';
 import { IndexedCollection, KeyedCollection } from '../Collection';
-import { Seq } from '../Seq';
+import { IS_RECORD_SYMBOL, IS_INDEXED_SYMBOL, IS_KEYED_SYMBOL } from '../const';
 import hasOwnProperty from '../utils/hasOwnProperty';
 import isDataStructure from '../utils/isDataStructure';
 import shallowCopy from '../utils/shallowCopy';
@@ -82,18 +80,29 @@ function deepMergerWith(merger) {
   return deepMerger;
 }
 
+const anyIsKeyed = (any) => {
+  return Boolean(
+    any &&
+      typeof any === 'object' &&
+      !Array.isArray(any) &&
+      (!isImmutable(any) || any[IS_KEYED_SYMBOL] || any[IS_RECORD_SYMBOL])
+  );
+};
+
+const anyIsIndexed = (any) => {
+  return Boolean(any && (any[IS_INDEXED_SYMBOL] || Array.isArray(any)));
+};
+
 /**
  * It's unclear what the desired behavior is for merging two collections that
  * fall into separate categories between keyed, indexed, or set-like, so we only
  * consider them mergeable if they fall into the same category.
  */
 function areMergeable(oldDataStructure, newDataStructure) {
-  const oldSeq = Seq(oldDataStructure);
-  const newSeq = Seq(newDataStructure);
   // This logic assumes that a sequence can only fall into one of the three
   // categories mentioned above (since there's no `isSetLike()` method).
   return (
-    isIndexed(oldSeq) === isIndexed(newSeq) &&
-    isKeyed(oldSeq) === isKeyed(newSeq)
+    anyIsIndexed(oldDataStructure) === anyIsIndexed(newDataStructure) &&
+    anyIsKeyed(oldDataStructure) === anyIsKeyed(newDataStructure)
   );
 }

@iambumblehead
Author

diff --git a/src/methods/merge.js b/src/methods/merge.js
index c89dad4c..c29da373 100644
--- a/src/methods/merge.js
+++ b/src/methods/merge.js
@@ -2,6 +2,10 @@ import { KeyedCollection } from '../Collection';
 import { NOT_SET } from '../TrieUtils';
 import { update } from '../functional/update';
 
+const collectionPlainSize = (cx) => {
+  return cx._values.size
+}
+
 export function merge(...iters) {
   return mergeIntoKeyedWith(this, iters);
 }
@@ -25,7 +29,7 @@ function mergeIntoKeyedWith(collection, collections, merger) {
     return collection;
   }
   if (
-    collection.toSeq().size === 0 &&
+    collectionPlainSize(collection) === 0 &&
     !collection.__ownerID &&
     iters.length === 1
   ) {

@iambumblehead
Author

An experimental module is working locally here. It is made from the lower-level node manipulations found in Map.js and has no dependence on the other data types.

Node.js (the experimental module):
import { hash } from './Hash';

import { probeIsSame } from './probe';

import {
  utilArrCopy,
  utilArrSetAt,
  utilArrSpliceIn,
  utilArrSpliceOut,
} from './util';

import transformToMethods from './utils/transformToMethods';

import { SHIFT, SIZE, MASK, OwnerID, SetRef } from './TrieUtils';

import {
  NOT_SET,
  SHAPE_NODEVALUE,
  SHAPE_NODEARRAYMAP,
  SHAPE_NODEHASHARRAYMAP,
  SHAPE_NODEHASHCOLLISION,
  SHAPE_NODEBITMAPINDEXED
} from './const';

const MAX_ARRAY_MAP_SIZE = SIZE / 4;
const MAX_BITMAP_INDEXED_SIZE = SIZE / 2;
const MIN_HASH_ARRAY_MAP_SIZE = SIZE / 4;

const nodeHashArrayMapGet = (nham, shift, keyHash, key, notSetValue) => {
  if (keyHash === undefined) {
    keyHash = hash(key);
  }
  const idx = (shift === 0 ? keyHash : keyHash >>> shift) & MASK;
  const node = nham.nodes[idx];
  return node
    ? node.get(shift + SHIFT, keyHash, key, notSetValue)
    : notSetValue;
};

const nodeHashArrayMapUpdate = (
  nham,
  ownerID,
  shift,
  keyHash,
  key,
  value,
  didChangeSize,
  didAlter
) => {
  if (keyHash === undefined) {
    keyHash = hash(key);
  }
  const idx = (shift === 0 ? keyHash : keyHash >>> shift) & MASK;
  const removed = value === NOT_SET;
  const nodes = nham.nodes;
  const node = nodes[idx];

  if (removed && !node) {
    return nham;
  }

  const newNode = nodeUpdate(
    node,
    ownerID,
    shift + SHIFT,
    keyHash,
    key,
    value,
    didChangeSize,
    didAlter
  );
  if (newNode === node) {
    return nham;
  }

  let newCount = nham.count;
  if (!node) {
    newCount++;
  } else if (!newNode) {
    newCount--;
    if (newCount < MIN_HASH_ARRAY_MAP_SIZE) {
      return nodesPack(ownerID, nodes, newCount, idx);
    }
  }

  const isEditable = ownerID && ownerID === nham.ownerID;
  const newNodes = utilArrSetAt(nodes, idx, newNode, isEditable);

  if (isEditable) {
    nham.count = newCount;
    nham.nodes = newNodes;
    return nham;
  }

  return nodeHashArrayMap(ownerID, newCount, newNodes);
};

const nodeHashCollisionUpdate = (
  nhc,
  ownerID,
  shift,
  keyHash,
  key,
  value,
  didChangeSize,
  didAlter
) => {
  if (keyHash === undefined) {
    keyHash = hash(key);
  }

  const removed = value === NOT_SET;

  if (keyHash !== nhc.keyHash) {
    if (removed) {
      return nhc;
    }
    SetRef(didAlter);
    SetRef(didChangeSize);
    return nodeMergeInto(nhc, ownerID, shift, keyHash, [key, value]);
  }

  const entries = nhc.entries;
  let idx = 0;
  const len = entries.length;
  for (; idx < len; idx++) {
    if (probeIsSame(key, entries[idx][0])) {
      break;
    }
  }
  const exists = idx < len;

  if (exists ? entries[idx][1] === value : removed) {
    return nhc;
  }

  SetRef(didAlter);
  // eslint-disable-next-line @typescript-eslint/no-unused-expressions -- TODO enable eslint here
  (removed || !exists) && SetRef(didChangeSize);

  if (removed && len === 2) {
    return nodeValue(ownerID, nhc.keyHash, entries[idx ^ 1]);
  }

  const isEditable = ownerID && ownerID === nhc.ownerID;
  const newEntries = isEditable ? entries : utilArrCopy(entries);

  if (exists) {
    if (removed) {
      // eslint-disable-next-line @typescript-eslint/no-unused-expressions -- TODO enable eslint here
      idx === len - 1 ? newEntries.pop() : (newEntries[idx] = newEntries.pop());
    } else {
      newEntries[idx] = [key, value];
    }
  } else {
    newEntries.push([key, value]);
  }

  if (isEditable) {
    nhc.entries = newEntries;
    return nhc;
  }

  return nodeHashCollision(ownerID, nhc.keyHash, newEntries);
};

const nodeHashCollision = (cache => (ownerID, keyHash, entries) => {
  const nhc = Object.create(cache || (cache = transformToMethods({
    shape: SHAPE_NODEHASHCOLLISION,
    get: nodeEntryGet,
    iterate: nodeEntriesIterate,
    update: nodeHashCollisionUpdate,
  })));

  nhc.ownerID = ownerID;
  nhc.keyHash = keyHash;
  nhc.entries = entries;

  return nhc
})();

const nodesExpand = (ownerID, nodes, bitmap, including, node) => {
  let count = 0;
  const expandedNodes = new Array(SIZE);
  for (let ii = 0; bitmap !== 0; ii++, bitmap >>>= 1) {
    expandedNodes[ii] = bitmap & 1 ? nodes[count++] : undefined;
  }
  expandedNodes[including] = node;

  return nodeHashArrayMap(ownerID, count + 1, expandedNodes);
};

const nodeMergeInto = (node, ownerID, shift, keyHash, entry) => {
  if (node.keyHash === keyHash) {
    return nodeHashCollision(ownerID, keyHash, [node.entry, entry]);
  }

  const idx1 = (shift === 0 ? node.keyHash : node.keyHash >>> shift) & MASK;
  const idx2 = (shift === 0 ? keyHash : keyHash >>> shift) & MASK;

  let newNode;
  const nodes =
    idx1 === idx2
      ? [nodeMergeInto(node, ownerID, shift + SHIFT, keyHash, entry)]
      : ((newNode = nodeValue(ownerID, keyHash, entry)),
        idx1 < idx2 ? [node, newNode] : [newNode, node]);

  return nodeBitmapIndexed(ownerID, (1 << idx1) | (1 << idx2), nodes);
};

const nodeValueGet = (nv, shift, keyHash, key, notSetValue) => {
  return probeIsSame(key, nv.entry[0]) ? nv.entry[1] : notSetValue;
};

const nodeValueUpdate = (
  nv,
  ownerID,
  shift,
  keyHash,
  key,
  value,
  didChangeSize,
  didAlter
) => {
  const removed = value === NOT_SET;
  const keyMatch = probeIsSame(key, nv.entry[0]);
  if (keyMatch ? value === nv.entry[1] : removed) {
    return nv;
  }

  SetRef(didAlter);

  if (removed) {
    SetRef(didChangeSize);
    return; // undefined
  }

  if (keyMatch) {
    if (ownerID && ownerID === nv.ownerID) {
      nv.entry[1] = value;
      return nv;
    }

    return nodeValue(ownerID, nv.keyHash, [key, value]);
  }

  SetRef(didChangeSize);
  return nodeMergeInto(nv, ownerID, shift, hash(key), [key, value]);
};

const nodeValueIterate = (nv, fn) => {
  return fn(nv.entry);
};

const nodeValue = (cache => (ownerID, keyHash, entry) => {
  const nv = Object.create(cache || (cache = transformToMethods({
    shape: SHAPE_NODEVALUE,
    get: nodeValueGet,
    iterate: nodeValueIterate,
    update: nodeValueUpdate,
  })));

  nv.ownerID = ownerID;
  nv.keyHash = keyHash;
  nv.entry = entry;

  return nv
})();

const nodeIsLeaf = (node) => {
  return (
    node.shape === SHAPE_NODEVALUE || node.shape === SHAPE_NODEHASHCOLLISION
  );
};

const nodesPack = (ownerID, nodes, count, excluding) => {
  let bitmap = 0;
  let packedII = 0;
  const packedNodes = new Array(count);
  for (let ii = 0, bit = 1, len = nodes.length; ii < len; ii++, bit <<= 1) {
    const node = nodes[ii];
    if (node !== undefined && ii !== excluding) {
      bitmap |= bit;
      packedNodes[packedII++] = node;
    }
  }

  return nodeBitmapIndexed(ownerID, bitmap, packedNodes);
};

const nodeUpdate = (
  node,
  ownerID,
  shift,
  keyHash,
  key,
  value,
  didChangeSize,
  didAlter
) => {
  if (!node) {
    if (value === NOT_SET) {
      return node;
    }
    SetRef(didAlter);
    SetRef(didChangeSize);

    return nodeValue(ownerID, keyHash, [key, value]);
  }
  return node.update(
    ownerID,
    shift,
    keyHash,
    key,
    value,
    didChangeSize,
    didAlter
  );
};

const popCount = (x) => {
  x -= (x >> 1) & 0x55555555;
  x = (x & 0x33333333) + ((x >> 2) & 0x33333333);
  x = (x + (x >> 4)) & 0x0f0f0f0f;
  x += x >> 8;
  x += x >> 16;
  return x & 0x7f;
};

const nodeBitmapIndexedGet = (nbi, shift, keyHash, key, notSetValue) => {
  if (keyHash === undefined) {
    keyHash = hash(key);
  }
  const bit = 1 << ((shift === 0 ? keyHash : keyHash >>> shift) & MASK);
  const bitmap = nbi.bitmap;
  return (bitmap & bit) === 0
    ? notSetValue
    : nbi.nodes[popCount(bitmap & (bit - 1))].get(
        shift + SHIFT,
        keyHash,
        key,
        notSetValue
      );
};

const nodeBitmapIndexedUpdate = (
  nbi,
  ownerID,
  shift,
  keyHash,
  key,
  value,
  didChangeSize,
  didAlter
) => {
  if (keyHash === undefined) {
    keyHash = hash(key);
  }
  const keyHashFrag = (shift === 0 ? keyHash : keyHash >>> shift) & MASK;
  const bit = 1 << keyHashFrag;
  const bitmap = nbi.bitmap;
  const exists = (bitmap & bit) !== 0;

  if (!exists && value === NOT_SET) {
    return nbi;
  }

  const idx = popCount(bitmap & (bit - 1));
  const nodes = nbi.nodes;
  const node = exists ? nodes[idx] : undefined;
  const newNode = nodeUpdate(
    node,
    ownerID,
    shift + SHIFT,
    keyHash,
    key,
    value,
    didChangeSize,
    didAlter
  );

  if (newNode === node) {
    return nbi;
  }

  if (!exists && newNode && nodes.length >= MAX_BITMAP_INDEXED_SIZE) {
    return nodesExpand(ownerID, nodes, bitmap, keyHashFrag, newNode);
  }

  if (exists && !newNode && nodes.length === 2 && nodeIsLeaf(nodes[idx ^ 1])) {
    return nodes[idx ^ 1];
  }

  if (exists && newNode && nodes.length === 1 && nodeIsLeaf(newNode)) {
    return newNode;
  }

  const isEditable = ownerID && ownerID === nbi.ownerID;
  const newBitmap = exists ? (newNode ? bitmap : bitmap ^ bit) : bitmap | bit;
  const newNodes = exists
    ? newNode
      ? utilArrSetAt(nodes, idx, newNode, isEditable)
      : utilArrSpliceOut(nodes, idx, isEditable)
    : utilArrSpliceIn(nodes, idx, newNode, isEditable);

  if (isEditable) {
    nbi.bitmap = newBitmap;
    nbi.nodes = newNodes;
    return nbi;
  }

  return nodeBitmapIndexed(ownerID, newBitmap, newNodes);
};

const nodeBitmapIndexed = (cache => (ownerID, bitmap, nodes) => {
  const nbi = Object.create(cache || (cache = transformToMethods({
    shape: SHAPE_NODEBITMAPINDEXED,
    get: nodeBitmapIndexedGet,
    iterate: nodesIterate,
    update: nodeBitmapIndexedUpdate,
  })));

  nbi.ownerID = ownerID;
  nbi.bitmap = bitmap;
  nbi.nodes = nodes;

  return nbi
})();

const nodeEntriesIterate = (node, fn, reverse) => {
  const entries = node.entries;
  for (let ii = 0, maxIndex = entries.length - 1; ii <= maxIndex; ii++) {
    if (fn(entries[reverse ? maxIndex - ii : ii]) === false) {
      return false;
    }
  }
};

const nodeEntryGet = (nam, shift, keyHash, key, notSetValue) => {
  const entries = nam.entries;
  for (let ii = 0, len = entries.length; ii < len; ii++) {
    if (probeIsSame(key, entries[ii][0])) {
      return entries[ii][1];
    }
  }
  return notSetValue;
};

const nodeArrayMapUpdate = (
  nam,
  ownerID,
  shift,
  keyHash,
  key,
  value,
  didChangeSize,
  didAlter
) => {
  const removed = value === NOT_SET;

  const entries = nam.entries;
  let idx = 0;
  const len = entries.length;
  for (; idx < len; idx++) {
    if (probeIsSame(key, entries[idx][0])) {
      break;
    }
  }
  const exists = idx < len;

  if (exists ? entries[idx][1] === value : removed) {
    return nam;
  }

  SetRef(didAlter);
  // eslint-disable-next-line @typescript-eslint/no-unused-expressions -- TODO enable eslint here
  (removed || !exists) && SetRef(didChangeSize);

  if (removed && entries.length === 1) {
    return; // undefined
  }

  if (!exists && !removed && entries.length >= MAX_ARRAY_MAP_SIZE) {
    return nodesCreate(ownerID, entries, key, value);
  }

  const isEditable = ownerID && ownerID === nam.ownerID;
  const newEntries = isEditable ? entries : utilArrCopy(entries);

  if (exists) {
    if (removed) {
      // eslint-disable-next-line @typescript-eslint/no-unused-expressions -- TODO enable eslint here
      idx === len - 1 ? newEntries.pop() : (newEntries[idx] = newEntries.pop());
    } else {
      newEntries[idx] = [key, value];
    }
  } else {
    newEntries.push([key, value]);
  }

  if (isEditable) {
    nam.entries = newEntries;
    return nam;
  }

  return nodeArrayMap(ownerID, newEntries);
};

const nodeArrayMap = (cache => (ownerID, entries) => {
  const nam = Object.create(cache || (cache = transformToMethods({
    shape: SHAPE_NODEARRAYMAP,
    get: nodeEntryGet,
    iterate: nodeEntriesIterate,
    update: nodeArrayMapUpdate,
  })));

  nam.ownerID = ownerID;
  nam.entries = entries;

  return nam
})();

const nodesCreate = (ownerID, entries, key, value) => {
  if (!ownerID) {
    ownerID = new OwnerID();
  }

  let node = nodeValue(ownerID, hash(key), [key, value]);
  for (let ii = 0; ii < entries.length; ii++) {
    const entry = entries[ii];
    node = node.update(ownerID, 0, undefined, entry[0], entry[1]);
  }
  return node;
};

const nodesIterate = (collection, fn, reverse) => {
  const nodes = collection.nodes;
  for (let ii = 0, maxIndex = nodes.length - 1; ii <= maxIndex; ii++) {
    const node = nodes[reverse ? maxIndex - ii : ii];
    if (node && node.iterate(fn, reverse) === false) {
      return false;
    }
  }
};

const nodeHashArrayMap = (cache => (ownerID, count, nodes) => {
  const nham = Object.create(cache || (cache = transformToMethods({
    shape: SHAPE_NODEHASHARRAYMAP,
    iterate: nodesIterate,
    get: nodeHashArrayMapGet,
    update: nodeHashArrayMapUpdate,
  })));

  nham.ownerID = ownerID;
  nham.count = count;
  nham.nodes = nodes;

  return nham
})();

export {
  nodeValue,
  nodeValueIterate,
  nodeArrayMap,
  nodeArrayMapUpdate,
  nodeHashArrayMap,
  nodeHashArrayMapGet,
  nodeHashArrayMapUpdate,
  nodeHashCollision,
  nodeHashCollisionUpdate,
  nodeBitmapIndexed,
  nodeBitmapIndexedGet,
  nodeBitmapIndexedUpdate,
  nodesIterate,
  nodesExpand,
  nodesPack,
  nodesCreate,
  nodeEntryGet,
  nodeEntriesIterate,
  nodeMergeInto,
  nodeIsLeaf,
  nodeUpdate,
};

From this file, I experimented with dynamic-method definitions and am using this pattern,

const nodeHashArrayMap = (cache => (ownerID, count, nodes) => {
  const nham = Object.create(cache || (cache = transformToMethods({
    shape: SHAPE_NODEHASHARRAYMAP,
    iterate: nodesIterate,
    get: nodeHashArrayMapGet,
    update: nodeHashArrayMapUpdate,
  })));

  nham.ownerID = ownerID;
  nham.count = count;
  nham.nodes = nodes;

  return nham
})();

class extends doesn't allow programmatic method definitions the way Object.create does, so Object.create is more flexible for experimenting. If inheritance can be flattened to remove extends usage, the difficulties of using class will go away. These nodes are constructed without using new, so new nodeHashArrayMap(options) becomes nodeHashArrayMap(options).
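A compact contrast of the two construction styles, illustrative only,

// with class, the method set is fixed by the class body at definition time
class NodeClassStyle {
  get(shift, keyHash, key, notSetValue) { /* ... */ }
}
const a = new NodeClassStyle();

// with Object.create, the prototype is assembled programmatically
// and construction needs no `new`
const proto = transformToMethods({ get: nodeEntryGet });
const b = Object.create(proto);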

All tests are passing here in ~5-6 seconds, the same as the main branch.

Probably going to take a break for at least a few days.

@iambumblehead
Author

A likely performance improvement would be to replace the for loop in arrCopy.ts with just arr.slice() everywhere.
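A sketch of the change; the "before" is paraphrased from arrCopy's current shape, so treat it as approximate,

// before: manual element-by-element copy
function arrCopy(arr, offset = 0) {
  const len = Math.max(0, arr.length - offset);
  const newArr = new Array(len);
  for (let ii = 0; ii < len; ii++) {
    newArr[ii] = arr[ii + offset];
  }
  return newArr;
}

// after: defer to the engine's optimized built-in
const arrCopySliced = (arr, offset = 0) => arr.slice(offset);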

@jdeniau
Member

jdeniau commented Apr 26, 2025

That seems counterintuitive 🤔

@iambumblehead
Author

That seems counterintuitive 🤔

Really? Please explain :) I don't understand.

@jdeniau
Member

jdeniau commented Apr 27, 2025

My bad, I thought you wanted to slice on each iteration.
I do not know why we use a for loop here.
Maybe it was more performant in the past?
Or maybe slice wasn't available in older browsers, like IE.

It's been available across browsers since July 2015.
https://developer.mozilla.org/fr/docs/Web/JavaScript/Reference/Global_Objects/Array/slice#compatibilit%C3%A9_des_navigateurs

@jdeniau
Member

jdeniau commented Apr 27, 2025

I opened #2098 to address that in 6.0

@iambumblehead
Author

iambumblehead commented Apr 27, 2025

If the interface were to change this way, I wish 'to' had already been defined with tests. It would be nice if "fromJS" and other behaviours were already migrated away from toXxx in advance.

-collection.toMap()
+collection.to(Map)
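A hedged sketch of what such a 'to' method could look like; this is entirely an assumption, not an agreed design,

// hypothetical: one conversion entry point that takes the target
// constructor, so applications import only the types they use
const to = function (Ctor) {
  return Ctor(this); // e.g. collection.to(Map), collection.to(List)
};
// mixed onto collection prototypes in place of toMap(), toList(), ...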

@iambumblehead
Author

I might succeed in removing circular deps here and, if so, it might be smoother to introduce such changes in a subsequent 7.x release. Due to the high difficulty, the changes are non-trivial and would become the subject of discussion.

In the 6.x branch, it would be nice to have a 'to(Xxx)' behaviour with tests; with that in place, a 7.x branch could continue verifying 'to(Xxx)' while removing 'toXxx()'. Politely: if you added that, it would be great.

I did not preserve the inline TypeScript annotations from the ts files. The reason is that I'm using a fresh desktop without a TypeScript setup and wanted to just experiment first. If/when I succeed, and if you then basically like the changes, I can help with re-adding the types, or could PR from a different branch that strategically applies commits and preserves types.

Also, this patch should enable import file extensions (if that suits you):

-import { Seq, KeyedSeq, IndexedSeq, SetSeq } from './Seq';
+import { Seq, KeyedSeq, IndexedSeq, SetSeq } from './Seq.js';
eslint.config.mjs (patch)
diff --git a/eslint.config.mjs b/eslint.config.mjs
index 5f9dcd48..48e5804b 100644
--- a/eslint.config.mjs
+++ b/eslint.config.mjs
@@ -62,6 +62,14 @@ export default tseslint.config(
     files: ['src/*'],
     rules: {
       'no-console': 'error',
+      "import/no-unresolved": "off",
+      'import/extensions': ['error', 'always', { js: 'always', ts: 'always' }],
+    },
+    settings: {
+        "import/no-unresolved": "off",
+        'import/resolver': {
+            typescript: {},
+        },
     },
   },

@iambumblehead
Author

It would be nice if toJS and fromJS could somehow be migrated to a to(Xxx) approach that is granular. Say an application only uses Map data; its toJS and fromJS behaviours could use Map without importing List.

The technical implementation would not be challenging, but designing the interface for it would be. A sketch follows.
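Purely as a hypothetical shape of that direction (every name here is invented for illustration),

// hypothetical: a fromJS variant parameterized by the single
// constructor the application needs, so Map users never load List
const fromJSWith = (Ctor, raw) =>
  Ctor(
    Object.entries(raw).map(([k, v]) => [
      k,
      v !== null && typeof v === 'object' && !Array.isArray(v)
        ? fromJSWith(Ctor, v)
        : v,
    ])
  );

// const data = fromJSWith(Map, { a: { b: 1 } }); // no List import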

@iambumblehead
Author

Having spent more time with this... probably the initial PR should remove circular dependencies and keep all existing tests and interfaces that convert to/from different types of data.

Probably no new messages from me for a week or so.

@iambumblehead
Author

An acyclic import tree here passes many tests. It is not finished and there are still many things to do.

Basically, the way it's working is: collections share a basic collection definition like this,

let cache
const collection = { ...collectionBasicDefinitions }
const collectionIndexed = () => cache || (
  cache = Object.assign(Object.create(collection), { ...collectionIndexedDefinitions }))

When "Map" or other collection types are imported from a "Universe" module, that module assigns interdependent definitions to the "basic", "keyed", "indexed" and other definitions

Object.assign(collection, { ...interdependentMethods })
Object.assign(collectionIndexed(), { ...interdependentMethodsIndexed })
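A condensed sketch of that wiring, with file names assumed,

// hypothetical Universe.js: the only module that sees every type,
// so the per-type modules themselves stay acyclic
import { collection, collectionIndexed } from './collection.js';
import {
  interdependentMethods,
  interdependentMethodsIndexed,
} from './interdependent.js';

Object.assign(collection, interdependentMethods);
Object.assign(collectionIndexed(), interdependentMethodsIndexed);

export { Map } from './Map.js';
export { List } from './List.js';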

Most of the "factory" functions rely on dynamically-resolved constructors so list-control behaviours using the factory functions have cross-dependent relationships. To quickly get around this now, they and their callers are updated to use additional params with whatever is needed and sometimes this is ugly,

sort: function (comparator) {
  return collectionXOpSort(this, SeqKeyed, SeqIndexed, SeqSet, comparator)
},
concat: function (...values) {
  return collectionXOpConcat(
    this,
    SeqKeyed,
    seqKeyedSeqFromValue,
    seqIndexedSeqFromValue,
    values)
},

Names of functions that are cross-dependent are prefixed "collectionX" so they are easily found and can maybe be factored away later. Based on experiments here, factory functions for indexed collections usually need indexed-related things, keyed collections need keyed-related things, and so on, so different collection types could use collection-specific list functions to reduce cross-dependency.

Probably no new messages again for another week or so.

@bdurrer
Contributor

bdurrer commented May 5, 2025

It would be nice if toJS and fromJS could somehow be migrated to a to(Xxx) approach that is granular. Say an application only uses Map data; its toJS and fromJS behaviours could use Map without importing List.

The technical implementation would not be challenging, but designing the interface for it would be.

Personally, I would hate this as a user. Either the user would have to list all the data types of the structure, or there would be dynamic imports. These methods convert data deeply; you have no way to know what data is in there.

You could make it mandatory to pass in the reviver, which does the actual work. The default one doesn't even use ordered types. But as a dev on an existing project, I would not like that much either (a lot of work, zero gain).

PS: It's cool that you keep working on this, I am looking forward to see the path and what the end result is :)

@iambumblehead
Author

Thanks for sharing your position. It is good this PR won't change the interface in any way.

To remove circular imports, it became necessary to make vast changes to the way immutable is composed. Attempts at smaller changes did not work out, but the flow and pieces of functionality are unchanged.

@iambumblehead
Author

npm run benchmark prints a long stack trace. How does one run the benchmarks?

ugh TypeError [ERR_INVALID_ARG_TYPE]: The "chunk" argument must be of type string or an instance of Buffer, TypedArray, or DataView. Received undefined
    at _write (node:internal/streams/writable:482:13)
    at Writable.write (node:internal/streams/writable:510:10)
    at Suite.onStart (/home/bumble/soft/immutable-js/resources/benchmark.js:151:26)
    at /home/bumble/soft/immutable-js/node_modules/benchmark/benchmark.js:1215:40
    at arrayEach (/home/bumble/soft/immutable-js/node_modules/lodash/lodash.js:530:11)
    at Function.forEach (/home/bumble/soft/immutable-js/node_modules/lodash/lodash.js:9410:14)
    at Suite.emit (/home/bumble/soft/immutable-js/node_modules/benchmark/benchmark.js:1214:11)
    at Suite.onStart (/home/bumble/soft/immutable-js/node_modules/benchmark/benchmark.js:1174:17)
    at invoke (/home/bumble/soft/immutable-js/node_modules/benchmark/benchmark.js:954:25)
    at Suite.runSuite [as run] (/home/bumble/soft/immutable-js/node_modules/benchmark/benchmark.js:1169:7)

@iambumblehead
Author

maybe it is unmaintained and not working with the latest versions of node

@iambumblehead
Author

all tests are passing; I would like to verify everything possible before sharing
