
fix: process full server response #278

Merged
merged 1 commit into from
Oct 10, 2024

Conversation

st3v3nmw
Collaborator

@st3v3nmw st3v3nmw commented Oct 7, 2024

After this change, we're not fully processing the server's response, leading to some weird behavior: the client and server get stuck since the client isn't providing the expected exchange token or sequence, and we end up not processing any messages hours after registration:

[Screenshot from 2024-10-07 20-31-02]

This behavior can be seen with the latest snap revisions from the edge channel (probably all revisions after revision 244).

Cause

Server responses look like:

{'client-accepted-types-hash': b'N\x82(WO\x9a\x11}E\xda\x91\xbb\xcf\xc6\\.',
 'messages': [],
 'next-exchange-token': b'3274f7dd-3e6a-4cfd-ae66-a43ed5845e42',
 'next-expected-sequence': 53,
 'server-api': '3.3',
 'server-uuid': b'f1a65772-c771-11ea-927a-0017a48f36ae'}

yet, the ServerResponse class looks like this:

from dataclasses import dataclass
from typing import Any, Dict, List


@dataclass
class ServerResponse:
    """The HTTP response from the server after a message exchange."""

    server_uuid: str
    messages: List[Dict[str, Any]]

which means that after the asdict call, the response passed to MessageExchange._handle_result looks like this 👇, while that function expects the original response as-is:

{'server_uuid': b'f1a65772-c771-11ea-927a-0017a48f36ae', 'messages': []}
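For illustration, here is a minimal self-contained sketch (payload values copied from the example above; the types-hash value is shortened as a placeholder) showing how building a ServerResponse from only two fields and then calling dataclasses.asdict drops the rest of the payload and renames the keys to snake_case:

```python
from dataclasses import asdict, dataclass
from typing import Any, Dict, List


@dataclass
class ServerResponse:
    """The HTTP response from the server after a message exchange."""

    server_uuid: str
    messages: List[Dict[str, Any]]


# Example server payload as shown above (types-hash shortened for brevity).
payload = {
    "client-accepted-types-hash": b"...",
    "messages": [],
    "next-exchange-token": b"3274f7dd-3e6a-4cfd-ae66-a43ed5845e42",
    "next-expected-sequence": 53,
    "server-api": "3.3",
    "server-uuid": b"f1a65772-c771-11ea-927a-0017a48f36ae",
}

# Only two fields survive the dataclass construction; asdict then emits
# snake_case keys, so next-exchange-token and next-expected-sequence are lost.
response = ServerResponse(
    server_uuid=payload["server-uuid"],
    messages=payload["messages"],
)
print(asdict(response))
# {'server_uuid': b'f1a65772-c771-11ea-927a-0017a48f36ae', 'messages': []}
```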

Fix

Process the full server response while retaining the original field types (i.e., keeping bytes instead of str, as before) and the kebab-case keys while processing the message.

The server provides next-exchange-token as bytes and the client has been storing it that way. We need to convert it to str before using it as a header value in landscape.client.exchange.exchange_messages. After this change, we dropped that conversion, which creates a bad header in landscape.lib.fetch.fetch (line 102):

>>> [f"{key}: {value}" for (key, value) in {"X-Exchange-Token": "token"}.items()]
['X-Exchange-Token: token']
>>> [f"{key}: {value}" for (key, value) in {"X-Exchange-Token": b"token"}.items()]
["X-Exchange-Token: b'token'"]

When the server receives this bad exchange token, it forces the client to re-register, creating a clone :(.
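As a minimal sketch of the conversion that was dropped (the helper name here is hypothetical, not the actual function in landscape.client.exchange): decode the stored bytes token to str before it lands in the headers dict, so the rendered header no longer contains a b'...' repr:

```python
# Hypothetical helper sketching the needed conversion; the real code lives in
# landscape.client.exchange.exchange_messages.
def exchange_token_header(token):
    # The client stores the token as bytes, matching what the server sent;
    # decode it before using it as an HTTP header value.
    if isinstance(token, bytes):
        token = token.decode("ascii")
    return {"X-Exchange-Token": token}


headers = exchange_token_header(b"token")
lines = [f"{key}: {value}" for key, value in headers.items()]
# lines == ['X-Exchange-Token: token'], not ["X-Exchange-Token: b'token'"]
```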

@st3v3nmw st3v3nmw force-pushed the what-is-happening branch 3 times, most recently from de65703 to 6871045 Compare October 7, 2024 21:37
@st3v3nmw st3v3nmw marked this pull request as ready for review October 7, 2024 21:54
Contributor

@Perfect5th Perfect5th left a comment


I don't see any issues with the code. Good find!

@Perfect5th Perfect5th merged commit e29b279 into canonical:main Oct 10, 2024
5 checks passed
@st3v3nmw st3v3nmw deleted the what-is-happening branch October 11, 2024 03:17