
[EPIC] Sequencing API #1649

Open
5 tasks
Manav-Aggarwal opened this issue Apr 26, 2024 · 2 comments

Comments


Manav-Aggarwal commented Apr 26, 2024

  • EPIC: Separate SignedHeader and Data into 2 different blobs #829
  • Separate Sequencer and Executor into different functions
  • Refactor the sequencer into an endpoint so that it can:
    • Accept a tx
    • Create a batch and post it to DA
    • Serve pull/subscribe requests for batches
  • Refactor the executor to:
    • Subscribe to/pull batches
    • Talk to the verifier to validate the batch
    • Pull block time from the sequencer header
    • Execute the batch to produce a header
  • Implement a batch verifier that:
    • Runs a Shared Sequencer light node
    • Runs a DA light node
    • Verifies that:
      • The batch is part of the canonical SS chain
      • The batch is published to DA
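The sequencer/executor split described above can be sketched as a pair of interfaces. This is a hedged illustration of the shape the EPIC describes, not Rollkit's actual API; all names (`Sequencer`, `Executor`, `Batch`, `inMemorySequencer`) are hypothetical, and posting to DA is elided:

```go
package main

import (
	"fmt"
	"sync"
)

// Batch is an ordered set of transactions, as produced by the sequencer.
type Batch struct {
	Height uint64
	Txs    [][]byte
}

// Sequencer mirrors the endpoint described above: accept txs, batch them,
// and let executors pull batches. (Posting the batch to DA is elided here.)
type Sequencer interface {
	SubmitTx(tx []byte) error
	GetNextBatch(afterHeight uint64) (*Batch, error)
}

// Executor pulls batches, verifies them, and executes them into headers.
type Executor interface {
	ExecuteBatch(b *Batch) (header []byte, err error)
}

// inMemorySequencer is a toy Sequencer: pending txs are sealed into batches.
type inMemorySequencer struct {
	mu      sync.Mutex
	pending [][]byte
	batches []*Batch
}

func (s *inMemorySequencer) SubmitTx(tx []byte) error {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.pending = append(s.pending, tx)
	return nil
}

// sealBatch moves all pending txs into a new batch (the point where a real
// sequencer would also post the batch to DA).
func (s *inMemorySequencer) sealBatch() {
	s.mu.Lock()
	defer s.mu.Unlock()
	b := &Batch{Height: uint64(len(s.batches) + 1), Txs: s.pending}
	s.pending = nil
	s.batches = append(s.batches, b)
}

func (s *inMemorySequencer) GetNextBatch(afterHeight uint64) (*Batch, error) {
	s.mu.Lock()
	defer s.mu.Unlock()
	if int(afterHeight) >= len(s.batches) {
		return nil, fmt.Errorf("no batch after height %d", afterHeight)
	}
	return s.batches[afterHeight], nil
}

func main() {
	var s inMemorySequencer
	s.SubmitTx([]byte("tx1"))
	s.SubmitTx([]byte("tx2"))
	s.sealBatch()
	b, err := s.GetNextBatch(0)
	if err != nil {
		panic(err)
	}
	fmt.Printf("batch %d with %d txs\n", b.Height, len(b.Txs))
}
```

The executor side would loop on `GetNextBatch`, hand each batch to the verifier, and only then execute it, matching the subscribe → verify → execute flow in the task list.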

Manav-Aggarwal commented May 16, 2024

Needs from Astria:

  • API links for:
    • Pulling a batch from the SS
    • Code pointers from the conductor showing where this happens
  • Getting Merkle proofs from the SS namespace
  • Verification of Merkle proofs
  • Serialization/deserialization and decompression of batch data
  • Composer submission to the SS
  • Pulling block time from the sequencer header

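The "verification of Merkle proofs" item reduces to walking an audit path from a leaf hash up to a committed root. A minimal sketch, assuming an RFC 6962-style tree (0x00/0x01 domain-separation prefixes for leaves and inner nodes); the SS's actual tree layout may differ:

```go
package main

import (
	"bytes"
	"crypto/sha256"
	"fmt"
)

// hashLeaf and hashNode use RFC 6962-style domain separation (0x00 for
// leaves, 0x01 for inner nodes). Assumption: the real SS tree may use a
// different hashing scheme; only the verification shape is the point here.
func hashLeaf(data []byte) []byte {
	h := sha256.Sum256(append([]byte{0x00}, data...))
	return h[:]
}

func hashNode(l, r []byte) []byte {
	h := sha256.Sum256(append(append([]byte{0x01}, l...), r...))
	return h[:]
}

// VerifyInclusion recomputes the root from a leaf and its audit path.
// Bit i of index says whether the sibling at level i sits on the left.
func VerifyInclusion(root, leaf []byte, index uint64, path [][]byte) bool {
	h := hashLeaf(leaf)
	for _, sib := range path {
		if index&1 == 1 {
			h = hashNode(sib, h)
		} else {
			h = hashNode(h, sib)
		}
		index >>= 1
	}
	return bytes.Equal(h, root)
}

func main() {
	// Two-leaf tree: root = H(0x01 || H(0x00||a) || H(0x00||b))
	a, b := []byte("batch-a"), []byte("batch-b")
	root := hashNode(hashLeaf(a), hashLeaf(b))
	fmt.Println(VerifyInclusion(root, a, 0, [][]byte{hashLeaf(b)})) // true
	fmt.Println(VerifyInclusion(root, b, 1, [][]byte{hashLeaf(a)})) // true
}
```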
@joroshiba

joroshiba commented:

For grabbing data from the sequencer: we have a gRPC service with `GetFilteredSequencerBlock`, which returns the transactions and proofs required to validate data for a single rollup. Protobufs are posted to buf; a direct link to the RPC definition is here.

Data posted to Celestia is generally batched, serialized as protobuf, and compressed via Brotli with a compression level of 5. From the view of an individual rollup, there are two batches: Sequencer Metadata and Rollup Data.

This documentation needs some updating with new data shapes, but the gist is correct: https://github.com/astriaorg/astria/blob/main/specs/data-flow-and-verification.md

Transactions can be submitted to Composer (essentially a gas station) via gRPC: https://buf.build/astria/composer-apis/docs/main:astria.composer.v1alpha1. The "rollup id" here is also used by the conductor to know which rollup to read from the sequencer and which namespace to read on Celestia.

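Since the rollup id keys both the sequencer reads and the Celestia namespace lookup, it must be derived deterministically. A sketch of one plausible derivation — sha256 of the rollup name, which matches how Astria describes rollup ids elsewhere, though this thread does not confirm it; treat it as illustrative, not normative:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// RollupID derives a 32-byte rollup id from a human-readable rollup name.
// Assumption: sha256(name). The Celestia namespace is then derived from
// this id by the conductor (exact scheme not specified in this thread).
func RollupID(name string) [32]byte {
	return sha256.Sum256([]byte(name))
}

func main() {
	id := RollupID("example-rollup")
	// The same id keys both sequencer reads and the namespace lookup.
	fmt.Println(hex.EncodeToString(id[:]))
}
```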