
No summary from AI model if there are too many logs #14

Open

zanknaf opened this issue Apr 24, 2024 · 1 comment

Comments

zanknaf (Collaborator) commented Apr 24, 2024

Tx explain hits the token limit on transactions with 70+ logs (estimate based on testing).

Possible solutions: compressing the call trace data, or using only the asset changes object.

Example transaction:
https://etherscan.io/tx/0x3212df955f1ff00f04cab390ad2cd8c21982a6d4cbc8db8805c2aa892975cfb5

Response from Claude:
Error streaming explanation: Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': 'prompt is too long: 214668 tokens > 199999 maximum'}}
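One way to avoid this error is to estimate the prompt size up front and fall back to the smaller asset-changes-only prompt when the full call trace would exceed the model's context window. A minimal sketch below, assuming a ~4-characters-per-token heuristic; the function and variable names (`build` inputs, `MAX_TOKENS`) are hypothetical, not this repo's API:

```python
# Sketch: pick a prompt that fits Claude's context window.
# MAX_TOKENS comes from the error message above (199,999-token limit);
# CHARS_PER_TOKEN is a rough heuristic, not an exact tokenizer.

MAX_TOKENS = 199_999
CHARS_PER_TOKEN = 4


def estimate_tokens(text: str) -> int:
    """Cheap upper-bound estimate: roughly 4 characters per token."""
    return len(text) // CHARS_PER_TOKEN + 1


def choose_prompt(call_trace: str, asset_changes: str) -> str:
    """Prefer the full call trace; drop to asset changes only when too long."""
    full = f"Explain this transaction:\n{call_trace}\n{asset_changes}"
    if estimate_tokens(full) <= MAX_TOKENS:
        return full
    # Fallback: the asset changes object alone is far smaller.
    return f"Explain this transaction:\n{asset_changes}"
```

A real implementation would ideally use the provider's own token counting instead of a character heuristic, since JSON-heavy call traces can tokenize less densely than plain English.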

zanknaf (Collaborator, Author) commented Apr 24, 2024

About 0.05% of transactions will not work because of the token limit.

Calculation: anything over 4M gas used hits the limit (transactions with 3.8M gas still work, so 4M is most likely the cutoff).

Over the past 8 days only 5,016 transactions reached this limit; of those, 30% are scam USDC/USDT transfers with 1,000 logs each, some are smart contract deployments, and some are complex swaps/MEV transactions.
