Add ephemeral block headers to the history network spec
pipermerriam committed Sep 25, 2024
1 parent 3508500 commit 912bafe
Showing 1 changed file with 32 additions and 3 deletions.
35 changes: 32 additions & 3 deletions history/history-network.md
@@ -64,10 +64,13 @@ The history network supports the following protocol messages:
In the history network the `custom_payload` field of the `Ping` and `Pong` messages is the serialization of an SSZ Container specified as `custom_data`:

```python
custom_data = Container(data_radius: uint256)
custom_data = Container(data_radius: uint256, ephemeral_header_count: uint16)
custom_payload = SSZ.serialize(custom_data)
```

* The `data_radius` value defines the *distance* from the node's node-id for which other clients may assume the node would be interested in content.
* The `ephemeral_header_count` value defines the number of *recent* headers that this node stores. The maximum effective value for this is 8192.
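
For illustration only, a minimal sketch of producing the `custom_payload` bytes without an SSZ library: a container made up of fixed-size fields serializes to the little-endian concatenation of those fields. The function name and example values below are not part of this specification.

```python
def encode_custom_payload(data_radius: int, ephemeral_header_count: int) -> bytes:
    # SSZ: a container of fixed-size fields serializes to the concatenation of
    # the little-endian encodings of its fields, in field order:
    #   data_radius (uint256, 32 bytes) || ephemeral_header_count (uint16, 2 bytes)
    return data_radius.to_bytes(32, "little") + ephemeral_header_count.to_bytes(2, "little")

# Example: a node advertising half of the keyspace and 4096 recent headers.
payload = encode_custom_payload(data_radius=2**255, ephemeral_header_count=4096)
assert len(payload) == 34
```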

### Routing Table

The history network uses the standard routing table structure from the Portal Wire Protocol.
@@ -79,7 +82,7 @@ The history network uses the standard routing table structure from the Portal Wire Protocol.
The history network includes one additional piece of node state that should be tracked. Nodes must track the `data_radius` from the Ping and Pong messages for other nodes in the network. This value is a 256-bit integer and represents the data that a node is "interested" in. We define the following function to determine whether a node in the network should be interested in a piece of content.

```python
interested(node, content) = distance(node.id, content.id) <= node.radius
interested(node, content) = distance(node.id, content.id) <= node.data_radius
```

A node is expected to maintain `data_radius` information for each node in its local node table. A node's `data_radius` value may fluctuate as the contents of its local key-value store change.
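
As a concrete illustration of the check above, here is a minimal sketch assuming the XOR distance metric from the Portal Wire Protocol; the standalone helper functions are illustrative, not part of this specification.

```python
def distance(node_id: int, content_id: int) -> int:
    # XOR metric over the 256-bit keyspace, per the Portal Wire Protocol.
    return node_id ^ content_id

def interested(node_id: int, data_radius: int, content_id: int) -> bool:
    # A node is assumed to want content whose id lies within its advertised
    # data_radius of its own node id.
    return distance(node_id, content_id) <= data_radius
```
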
@@ -157,7 +160,7 @@ each receipt/transaction and re-rlp-encode it, but only if it is a legacy transaction

HistoricalHashesAccumulatorProof = Vector[Bytes32, 15]

BlockHeaderProof = Union[None, HistoricalHashesAccumulatorProof]
BlockHeaderProof = Union[HistoricalHashesAccumulatorProof]

BlockHeaderWithProof = Container(
header: ByteList[MAX_HEADER_LENGTH], # RLP encoded header in SSZ ByteList
@@ -200,6 +203,32 @@ content = SSZ.serialize(block_header_with_proof)
content_key = selector + SSZ.serialize(block_number_key)
```
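
For illustration, a content key for a header by block number could be assembled as follows. The selector value used here is an assumption (the actual selector and `block_number_key` definition sit in the collapsed portion of the diff above); `block_number_key` is assumed to be a container holding a single `uint64`.

```python
HEADER_BY_NUMBER_SELECTOR = bytes([0x03])  # assumed value; see the collapsed key definitions above

def header_by_number_content_key(block_number: int) -> bytes:
    # SSZ serialization of a container with a single uint64 field is just the
    # 8-byte little-endian encoding of that field.
    return HEADER_BY_NUMBER_SELECTOR + block_number.to_bytes(8, "little")

# Example: content key for the header of block 1,000,000.
assert len(header_by_number_content_key(1_000_000)) == 9
```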

##### Ephemeral Block Headers

This content type represents block headers *near* the HEAD of the chain. They are provable by tracing through the chain of `header.parent_hash` values. All nodes in the network are assumed to have these available. The `Ping.custom_data` and `Pong.custom_data` fields can be used to learn the number of recent headers that a client makes available.

> Note: The history network does not provide a mechanism for knowing the HEAD of the chain. Clients of this network **must** have an external oracle for this information. The Portal Beacon Network is able to provide this information.

> Note: The content-id for this data type is not meaningful. All nodes in the network are assumed to store this content.

> Note: This message is not valid for Gossip. Clients should not send or accept gossip messages for this content type.

```python
# The maximum number of ephemeral headers that can be requested or transferred
# in a single request.
MAX_EPHEMERAL_HEADER_PAYLOAD = 256

# Content and content key

recent_headers = Container(block_hash: Bytes32, ancestor_count: uint8)
selector = 0x04

block_header_list = List(ByteList[1024], limit=256)

content = SSZ.serialize(block_header_list)
content_key = selector + SSZ.serialize(recent_headers)
```
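
A minimal sketch of how a client might build this content key and validate a response, assuming the returned headers are ordered from the block identified by `block_hash` back through its ancestors (the natural reading of `ancestor_count`), and that the SSZ `block_header_list` has already been decoded into a Python list of raw header byte strings. The helper names are illustrative, not part of this specification.

```python
import rlp                        # pip install rlp
from eth_hash.auto import keccak  # pip install eth-hash[pycryptodome]

def ephemeral_headers_content_key(block_hash: bytes, ancestor_count: int) -> bytes:
    # selector (0x04) ++ SSZ.serialize(recent_headers); for a container of
    # fixed-size fields this is simply the concatenation of the fields.
    assert len(block_hash) == 32 and 0 <= ancestor_count <= 255
    return bytes([0x04]) + block_hash + bytes([ancestor_count])

def validate_ephemeral_headers(block_hash: bytes, headers: list[bytes]) -> None:
    # Each RLP-encoded header is proven by tracing the parent_hash chain,
    # anchored at the trusted block hash from the content key (assumed ordering:
    # requested block first, then its ancestors).
    expected_hash = block_hash
    for encoded in headers:
        assert keccak(encoded) == expected_hash
        expected_hash = rlp.decode(encoded)[0]  # parent_hash is the first header field
```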

#### Block Body

After the addition of `withdrawals` to the block body in the [EIP-4895](https://eips.ethereum.org/EIPS/eip-4895),