Lightning Terminal Crash on start up from Loop DB Error #706

Open
gkrizek opened this issue Feb 29, 2024 · 7 comments

gkrizek commented Feb 29, 2024

I'm running the latest version of litd (0.12.3). We've had several nodes hit a problem where they fail to start up and the entire litd binary shuts down. The stack trace indicates it's an issue with Loop and database access. We have made no changes since previous versions. Here is a stack trace:

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x8 pc=0x11c488c]

goroutine 1 [running]:
github.com/lightninglabs/loop/loopd.openDatabase(0x3fc55e0, 0x1aee188?)
	github.com/lightninglabs/[email protected]/loopd/utils.go:62 +0x29c
github.com/lightninglabs/loop/loopd.(*Daemon).initialize(0x40000d83c0, 0x0)
	github.com/lightninglabs/[email protected]/loopd/daemon.go:411 +0x518
github.com/lightninglabs/loop/loopd.(*Daemon).StartAsSubserver(0xfffcefcc8c10?, 0xffff972f3f18?, 0x18?)
	github.com/lightninglabs/[email protected]/loopd/daemon.go:194 +0x7c
github.com/lightninglabs/lightning-terminal/subservers.(*loopSubServer).Start(0x40009539b8?, {0x14b31cc?, 0x18?}, 0x185a4a0?, 0x1?)
	github.com/lightninglabs/lightning-terminal/subservers/loop.go:74 +0x28
github.com/lightninglabs/lightning-terminal/subservers.(*subServerWrapper).startIntegrated(0x4000600420, {0x2e9f8e0?, 0x40008a0350?}, 0x1b1c683?, 0x8?, 0x40064c3230)
	github.com/lightninglabs/lightning-terminal/subservers/subserver.go:106 +0x3c
github.com/lightninglabs/lightning-terminal/subservers.(*Manager).StartIntegratedServers(0x4000618540, {0x2e9f8e0, 0x40008a0350}, 0x1ad8c89?, 0xc?)
	github.com/lightninglabs/lightning-terminal/subservers/manager.go:94 +0x16c
github.com/lightninglabs/lightning-terminal.(*LightningTerminal).start(0x4000000900)
	github.com/lightninglabs/lightning-terminal/terminal.go:666 +0x18a8
github.com/lightninglabs/lightning-terminal.(*LightningTerminal).Run(0x4000000900)
	github.com/lightninglabs/lightning-terminal/terminal.go:305 +0x658
main.main()
	github.com/lightninglabs/lightning-terminal/cmd/litd/main.go:14 +0x88

To resolve the issue, we delete the loop directory.
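Rather than deleting the directory outright, it may be worth moving it aside first so the corrupted DB can still be inspected (or shared for debugging) later. A minimal Go sketch, using the path from the log output below:

```go
// Minimal sketch: move the loop data directory aside with a timestamped
// suffix instead of deleting it, so the corrupted DB stays available for
// later inspection. The path below is the one from this report.
package main

import (
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	loopDir := "/root/loop/mainnet"
	backupDir := fmt.Sprintf("%s.corrupt-%s",
		loopDir, time.Now().Format("20060102-150405"))

	if err := os.Rename(loopDir, backupDir); err != nil {
		log.Fatalf("unable to move loop dir aside: %v", err)
	}
	log.Printf("moved %s to %s for later inspection", loopDir, backupDir)
}
```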

Expected behavior

litd runs without crashing

Actual behavior

litd crashes on startup

bhandras (Member) commented

Could you please provide a log from a faulty node? Feel free to cut out the non-relevant parts. Or, if the logs are sensitive, could you perhaps grep for "Failed to fix faulty timestamps"?

gkrizek (Author) commented Feb 29, 2024

Here's the latest log output before the crash, but it's not very helpful (newest on top). It basically just crashes after trying to open the database.


2024-02-29 15:03:39.949 [INF] LOOPD: Opening sqlite3 database at: /root/loop/mainnet/loop_sqlite.db
2024-02-29 15:03:39.949 [INF] LOOPD: Found sqlite db at /root/loop/mainnet/loop_sqlite.db, skipping migration
2024-02-29 15:03:39.949 [INF] LOOPD: Swap server address: swap.lightning.today:11010
2024-02-29 15:03:39.948 [INF] LOOPD: Protocol version: MuSig2
2024-02-29 15:03:39.939 [INF] LITD: Baking internal super macaroon
2024-02-29 15:03:39.939 [INF] LITD: Full lnd client connected
2024-02-29 15:03:39.939 [INF] LNDC: lnd is now fully synced to its chain backend
2024-02-29 15:03:35.366 [INF] CRTR: Syncing channel graph from height=832477 (hash=000000000000000000025271f9b0f4b66a69baf3df5e9de9e9048bb3e553596f) to height=832552 (hash=00000000000000000001b0bd5836859a4f58f495a94d8f0ea97ca2704b869b67)
2024-02-29 15:03:35.364 [INF] CRTR: Prune tip for Channel Graph: height=832477, hash=000000000000000000025271f9b0f4b66a69baf3df5e9de9e9048bb3e553596f
2024-02-29 15:03:35.361 [INF] CRTR: Filtering chain using 17599 channels active
2024-02-29 15:03:34.764 [INF] CRTR: FilteredChainView starting
2024-02-29 15:03:34.761 [INF] CRTR: Channel Router starting
2024-02-29 15:03:34.760 [INF] NTFN: New block epoch subscription
2024-02-29 15:03:34.760 [INF] NTFN: New block epoch subscription
2024-02-29 15:03:34.760 [INF] DISC: Authenticated Gossiper starting
2024-02-29 15:03:34.760 [INF] CNCT: ChannelArbitrator(<chanid>:1): starting state=StateDefault, trigger=chainTrigger, triggerHeight=832552
2024-02-29 15:03:34.759 [INF] CNCT: ChannelArbitrator(<chanid>:0): starting state=StateDefault, trigger=chainTrigger, triggerHeight=832552

bhandras (Member) commented

Would you mind sending me a faulty node's db if you still have it? You can find me on Slack or Keybase too.

bhandras self-assigned this Feb 29, 2024
gkrizek (Author) commented Mar 2, 2024

Just sent a loop dir backup from an affected node to you in Slack.

bhandras (Member) commented Mar 4, 2024

Looks like the DB is corrupted: if I try to open it with sqlite3, I get an error message saying it is malformed. I assume your filesystem wouldn't corrupt files, but I have to ask: did you experience any outages or disk issues that could have led to the corrupted DB?
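For anyone who wants to run the same check on their own node, here is a minimal Go sketch that runs SQLite's built-in `PRAGMA integrity_check` against the loop DB. It assumes the mattn/go-sqlite3 driver (not necessarily the driver loopd itself uses) and the DB path from the logs above; a healthy database prints a single `ok` row.

```go
// Minimal sketch, assuming the mattn/go-sqlite3 driver and the DB path from
// the logs above: run SQLite's built-in integrity check. A healthy database
// returns one row containing "ok"; a malformed one returns error rows or
// fails outright.
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/mattn/go-sqlite3"
)

func main() {
	db, err := sql.Open("sqlite3", "/root/loop/mainnet/loop_sqlite.db")
	if err != nil {
		log.Fatalf("open: %v", err)
	}
	defer db.Close()

	rows, err := db.Query("PRAGMA integrity_check;")
	if err != nil {
		log.Fatalf("integrity check failed to run: %v", err)
	}
	defer rows.Close()

	for rows.Next() {
		var result string
		if err := rows.Scan(&result); err != nil {
			log.Fatalf("scan: %v", err)
		}
		fmt.Println(result)
	}
	if err := rows.Err(); err != nil {
		log.Fatalf("rows: %v", err)
	}
}
```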

gkrizek (Author) commented Mar 4, 2024

No, we've never had an issue with that. This just started popping up on the latest litd release, and it only surfaces in the loop database. There were also no outages or storage issues.

If there were a bigger storage or corruption issue, I would expect it to show up more sporadically across files.

bhandras (Member) commented Mar 4, 2024

I suspect that a loopd/terminal crash or abrupt shutdown without the changes in the linked PR may have caused the corruption. Let's keep this issue open until you get a chance to deploy it, so we can see whether the behavior continues.
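For reference, here is a general-purpose Go sketch of the kind of hardening that helps in this situation (not necessarily what the linked PR changes, and assuming the mattn/go-sqlite3 driver rather than whatever loopd uses internally): opening SQLite with WAL journaling, full synchronous writes, and a busy timeout, settings commonly used to reduce the chance that an abrupt shutdown leaves the file malformed.

```go
// General-purpose sketch (not loop's actual configuration): open a SQLite
// database with WAL journaling, full synchronous writes and a busy timeout,
// which make corruption from abrupt shutdowns less likely. Assumes the
// mattn/go-sqlite3 driver, which accepts these pragmas as DSN parameters.
package main

import (
	"database/sql"
	"log"

	_ "github.com/mattn/go-sqlite3"
)

func main() {
	dsn := "file:/root/loop/mainnet/loop_sqlite.db" +
		"?_journal_mode=WAL&_synchronous=FULL&_busy_timeout=5000"

	db, err := sql.Open("sqlite3", dsn)
	if err != nil {
		log.Fatalf("open: %v", err)
	}
	defer db.Close()

	// sql.Open is lazy; Ping forces an actual connection so pragma or file
	// errors surface immediately.
	if err := db.Ping(); err != nil {
		log.Fatalf("ping: %v", err)
	}
	log.Println("database opened with WAL journaling enabled")
}
```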
