chore!: bump lru-cache to 10.x & increase node engine #695
Conversation
This would need to be part of #501 since changing engines is a semver major.

Node 16 is not supported anymore, so maybe this doesn't need to be a breaking change?

Changing the engines to drop support for any node version, supported or not, is a breaking change for a package.

Okay, a major version bump makes sense then.
This package doesn't use any fancy features of `lru-cache`. How about we get rid of the dependency here and replace it with a simple class based on `Map`?

```js
class LRUCache {
  constructor(max = 0) {
    if (!Number.isInteger(max) || max < 0) {
      throw new TypeError('max must be a nonnegative integer');
    }
    this.max = max;
    this.map = new Map();
  }

  get(key) {
    const value = this.map.get(key);
    if (value === undefined) {
      return undefined;
    }
    // Remove the key from the map and re-add it at the end,
    // marking it as most recently used
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  has(key) {
    return this.map.has(key);
  }

  delete(key) {
    // Map#delete already returns whether the key was present
    return this.map.delete(key);
  }

  set(key, value) {
    // Delete first so an existing key is re-inserted at the end;
    // setting undefined acts as a delete
    this.delete(key);
    if (value !== undefined) {
      // If the cache is full, evict the least recently used item
      if (this.map.size >= this.max) {
        const firstKey = this.map.keys().next().value;
        this.delete(firstKey);
      }
      this.map.set(key, value);
    }
    return this;
  }

  clear() {
    this.map.clear();
  }

  capacity() {
    return this.max;
  }

  size() {
    return this.map.size;
  }

  *entries() {
    yield* this.map.entries();
  }
}

export default LRUCache;
```

I'm happy to provide a PR with 100% test coverage if you like.

PS: You could also argue whether LRU is actually the best caching strategy. FIFO might be much faster for the majority of cases. But that's a different story.
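The trick that makes a plain `Map` work as an LRU cache is that `Map` preserves insertion order: deleting a key and re-inserting it moves it to the most-recently-used end, so the first key in iteration order is always the least recently used. A minimal standalone sketch of just that trick:

```javascript
// Map iterates in insertion order, so "delete then re-insert" moves a
// key to the most-recently-used position; the first key is always the
// least recently used one.
const map = new Map([['a', 1], ['b', 2], ['c', 3]]);

// Touch 'a': delete + set moves it to the end
const v = map.get('a');
map.delete('a');
map.set('a', v);

// The least recently used key is now 'b'
const lru = map.keys().next().value;
console.log(lru);             // 'b'
console.log([...map.keys()]); // ['b', 'c', 'a']
```

Because eviction only ever touches the first key and re-ordering is delete + set, every operation stays O(1) without any linked-list bookkeeping.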
On one hand it does seem like a simple Map cache is all we need and should be easy to implement. On the other hand, "why don't you just" and "how hard can it be?" are usually hard lessons once one goes down the path of answering them. If it's not too much trouble, you can make that PR. I think that will inform our decision better than an academic discussion would.

Closing in favor of #697
Trying to improve/solve the issue with dedupe on `cli`: npm/cli#7350. I don't know if it's a good idea, since `semver` is used by a lot of projects and bumping the version of `node-lru-cache` requires bumping `engines.node` to `>=16`, dropping a lot of node versions at once.
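For concreteness, the change being debated is the `engines` field in `package.json`; the bump described above would look like this fragment (illustrative of the discussion, not the actual diff):

```json
{
  "engines": {
    "node": ">=16"
  }
}
```

With npm, consumers on older node versions would then see an `EBADENGINE` warning on install (or a hard failure when `engine-strict` is set), which is why an engines change counts as semver-major.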