
feat: base caching #55

Merged
merged 4 commits into dev from feat/base-caching
Aug 23, 2024

Conversation

0xkenj1
Collaborator

@0xkenj1 0xkenj1 commented Aug 22, 2024

🤖 Linear

Closes ZKS-208 Closes ZKS-199

Description

  • Added caching with a 60-second default TTL to the ecosystem and zkchain endpoints
  • Moved the core contract addresses from hardcoded config to environment variables

We decided to move forward with NodeCache because cacheable is Promise-based: even cache hits return Promises, which causes problems when overriding the res.json hook in Express, since res.json is expected to be synchronous.

For next steps we will:

  • keep using NodeCache for high-level API caching
  • keep using cacheable for advanced module-level caching
  • take more time to analyze whether we can effectively use cacheable in the cacheMiddleware
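To make the sync-vs-async point concrete, here is a minimal sketch of the middleware pattern described above. The `SyncCache` class and the `Req`/`Res` interfaces are simplified, hypothetical stand-ins for NodeCache and Express's types, just enough to show why a synchronous `get` lets us hook `res.json` without making it async:

```typescript
// Minimal synchronous TTL cache standing in for NodeCache (illustrative only).
type Entry = { value: unknown; expiresAt: number };

class SyncCache {
    private store = new Map<string, Entry>();
    set(key: string, value: unknown, ttlSeconds: number): void {
        this.store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
    }
    get(key: string): unknown | undefined {
        const entry = this.store.get(key);
        if (!entry) return undefined;
        if (Date.now() > entry.expiresAt) {
            this.store.delete(key);
            return undefined;
        }
        return entry.value;
    }
}

// Simplified stand-ins for Express's Request/Response, enough to show the hook.
interface Req { originalUrl: string; }
interface Res { json(body: unknown): Res; }

const cache = new SyncCache();
const DEFAULT_TTL = 60; // seconds

// Because cache.get is synchronous, both the hit path and the res.json
// override stay synchronous, which is what Express expects.
function cacheMiddleware(ttl = DEFAULT_TTL) {
    return (req: Req, res: Res, next: () => void): void => {
        const key = req.originalUrl;
        const hit = cache.get(key);
        if (hit !== undefined) {
            res.json(hit); // cache hit: respond immediately, no await needed
            return;
        }
        const originalJson = res.json.bind(res);
        res.json = (body: unknown): Res => {
            cache.set(key, body, ttl); // store the body before sending it
            return originalJson(body);
        };
        next();
    };
}
```

With a Promise-based cache like cacheable, the hit check would require an await, forcing the middleware (and effectively `res.json`) into async territory.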


linear bot commented Aug 22, 2024

@0xkenj1 0xkenj1 requested review from 0xnigir1 and 0xyaco August 22, 2024 21:23

Collaborator

@0xnigir1 0xnigir1 left a comment


nice, again weird paths xd but lgtm. caching is everywhere in our app ⚡

@@ -1,11 +1,11 @@
import { inspect } from "util";
import { caching } from "cache-manager";

import { EvmProvider } from "@zkchainhub/chain-providers/dist/src/index.js";
Collaborator


Suggested change
import { EvmProvider } from "@zkchainhub/chain-providers/dist/src/index.js";
import { EvmProvider } from "@zkchainhub/chain-providers";

@@ -1,6 +1,7 @@
import { Address, encodeFunctionData, erc20Abi, parseEther, zeroAddress } from "viem";
import { afterEach, describe, expect, it, Mocked, vi } from "vitest";

import { EvmProvider, MulticallNotFound } from "@zkchainhub/chain-providers/dist/src/index.js";
Collaborator


Suggested change
import { EvmProvider, MulticallNotFound } from "@zkchainhub/chain-providers/dist/src/index.js";
import { EvmProvider, MulticallNotFound } from "@zkchainhub/chain-providers";

import { NextFunction, Request, Response } from "express";
import NodeCache from "node-cache";

const DEFAULT_TTL = 60; // 1 minute
Collaborator


should we add this as env config too? maybe add it to tech debt to not forget
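If the TTL does move to env config, a minimal sketch could look like the following. The `CACHE_TTL_SECONDS` variable name is hypothetical, and the guard against non-numeric or non-positive values is an assumption about how we'd want misconfiguration handled:

```typescript
const DEFAULT_TTL = 60; // 1 minute, used as the fallback

// Hypothetical env override for the cache TTL.
// Usage: const ttl = ttlFromEnv(process.env);
function ttlFromEnv(env: Record<string, string | undefined>): number {
    const raw = env.CACHE_TTL_SECONDS;
    if (raw === undefined) return DEFAULT_TTL;
    const parsed = Number(raw);
    // Guard against NaN, zero, and negative values from a misconfigured env.
    return Number.isFinite(parsed) && parsed > 0 ? parsed : DEFAULT_TTL;
}
```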


res.json = (body): Response => {
// Cache the response body
cache.set(key, body, args.ttl);
Collaborator


Do NodeCache instances have some kind of limit on how many items they can cache? I understand this is a v0.1 solution, so no worries if you've already taken into account that this could grow a lot.

Collaborator Author


This is just a temporary solution; I have no idea how many items NodeCache can handle. In this case we are only going to have 2 items.
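For reference, node-cache does expose a `maxKeys` option that makes `set()` fail once the limit is reached (worth double-checking its docs before relying on it). If we ever want eviction instead of rejection, a simple bounded cache could be sketched like this; the eviction policy shown (drop the oldest insertion) is purely illustrative and not what NodeCache does:

```typescript
// Illustrative bounded cache: evicts the oldest entry once maxEntries is
// reached. Not NodeCache's actual behavior (its maxKeys option rejects new
// writes instead of evicting).
class BoundedCache<V> {
    private store = new Map<string, V>();
    constructor(private maxEntries: number) {}

    set(key: string, value: V): void {
        if (!this.store.has(key) && this.store.size >= this.maxEntries) {
            // Map preserves insertion order, so the first key is the oldest.
            const oldest = this.store.keys().next().value as string;
            this.store.delete(oldest);
        }
        this.store.delete(key); // refresh insertion order on overwrite
        this.store.set(key, value);
    }

    get(key: string): V | undefined {
        return this.store.get(key);
    }

    get size(): number {
        return this.store.size;
    }
}
```

With only 2 keys in play today this is moot, but it shows the shape of a cap if the key space ever grows with request parameters.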

@0xkenj1 0xkenj1 merged commit 66359db into dev Aug 23, 2024
6 checks passed
@0xkenj1 0xkenj1 deleted the feat/base-caching branch August 23, 2024 18:00