Solana Validator 101: Transaction Processing

A high-level view of transaction flow in Solana:
  1. The dapp builds a transaction to buy some amount of tokens.
  2. The dapp sends the transaction to your wallet (Phantom, Sollet, etc.) to be signed.
  3. The wallet signs the transaction with your private key and sends it back to the dapp.
  4. The dapp takes the signed transaction and uses the sendTransaction HTTP API call to send it to the RPC provider configured in the dapp (illustrated below).
  5. The RPC server sends your transaction as a UDP packet to the current and next validator on the leader schedule.
  6. The validator’s TPU receives the transaction, verifies the signature using a CPU or GPU, executes it, and propagates it to other validators in the network.
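To make step 4 concrete, here is a minimal sketch of what a sendTransaction call looks like at the JSON-RPC level. The RPC URL and the base64-encoded signed transaction are placeholders; a real dapp would normally go through a client library such as @solana/web3.js rather than raw HTTP.

import requests

# Placeholder values: an RPC endpoint and a base64-encoded, already-signed transaction.
RPC_URL = "https://api.mainnet-beta.solana.com"
signed_tx_base64 = "..."  # produced by the wallet in step 3

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "sendTransaction",
    "params": [signed_tx_base64, {"encoding": "base64"}],
}

# The RPC node replies with the transaction signature, then forwards the
# transaction toward the current and upcoming leaders (steps 5 and 6).
response = requests.post(RPC_URL, json=payload).json()
print(response.get("result") or response.get("error"))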

Transaction Processing Unit

Validator & TPU Overview
A validator listens on several ports for incoming transactions:
  • tpu: normal transactions (Serum orders, NFT minting, token transfers, etc.)
  • tpu_vote: votes
  • tpu_forwards: if the current leader can’t process all transactions, it forwards unprocessed packets to the next leader on this port.
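As a rough illustration of how upstream nodes use these ports, the sketch below forwards an already-serialized transaction to a leader's tpu port over UDP, as described in step 5 above. The leader address and port are placeholders; in practice the sender discovers them from gossip and the leader schedule.

import socket

# Placeholder leader contact info; a real node learns this from gossip
# and the leader schedule rather than hard-coding it.
LEADER_TPU_ADDR = ("203.0.113.10", 8003)

serialized_tx = b"..."  # the wire-format, signed transaction

# Transactions are fire-and-forget UDP datagrams: no handshake, no ack.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(serialized_tx, LEADER_TPU_ADDR)
sock.close()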
Packets arriving on these ports are signature-verified (on CPU or GPU) and handed to the banking stage as three streams:
  • Verified gossip vote packets
  • Verified tpu_vote packets
  • Verified tpu packets (normal transactions)
The banking stage then processes these packets as follows:
  1. Deserialize each packet into a SanitizedTransaction.
  2. Run the transaction through a Quality of Service (QoS) model. This selects which transactions to execute based on a few properties: the number of signatures, the length of the instruction data in bytes, and a cost model based on access patterns for a given program id (sketched below).
  3. The pipeline then grabs a batch of transactions to be executed. This group of transactions is greedily selected to form a parallelizable entry (a group of transactions that can be executed in parallel). To do this, it uses the isWritable flag that clients set when building transactions, together with a per-account read-write lock, to ensure there are no data races (see the sketch below).
  4. Transactions are executed.
  5. The results are sent to the PohService and then forwarded to the broadcast stage to be shredded (packetized) and propagated to the rest of the network. They are also saved to the bank and accounts database.
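As a rough illustration of step 2, the sketch below scores a transaction from the properties listed above and admits transactions against a per-block cost budget. All names, weights, and the program-cost table are invented for illustration; the real cost model and cost tracker live in the validator's Rust code and are considerably more detailed.

# Hypothetical cost model: the constants and program table are invented,
# not the validator's real parameters.
SIGNATURE_COST = 720
BYTE_COST = 2
DEFAULT_PROGRAM_COST = 200_000
PROGRAM_COSTS = {"ExampleProgram11111111111111111111111111111": 4_000}

def transaction_cost(num_signatures, instruction_data_len, program_ids):
    cost = num_signatures * SIGNATURE_COST
    cost += instruction_data_len * BYTE_COST
    for program_id in program_ids:
        cost += PROGRAM_COSTS.get(program_id, DEFAULT_PROGRAM_COST)
    return cost

def select_within_budget(transactions, block_cost_limit):
    """Admit transactions until the block's cost budget is exhausted."""
    selected, total = [], 0
    for tx in transactions:
        cost = transaction_cost(tx.num_signatures, tx.data_len, tx.program_ids)
        if total + cost <= block_cost_limit:
            selected.append(tx)
            total += cost
    return selected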
Multiple batches can be processed at the same time. When a banking thread fails to grab the lock on an account that another thread already holds, it buffers that transaction and retries it on the next iteration. Executed batches also need to be parallelizable.
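Here is a minimal sketch of that greedy, lock-based batching from step 3, assuming each transaction carries the account keys it touches and a per-key writable flag. The names and data structures are invented for illustration; the validator implements this in Rust with per-account read-write locks.

def take_parallel_batch(transactions):
    """Greedily pick transactions whose account locks don't conflict.

    Two transactions conflict if either writes an account the other
    reads or writes. Conflicting transactions are left for a later batch.
    """
    read_locked, write_locked = set(), set()
    batch, deferred = [], []
    for tx in transactions:
        writes = {key for key, writable in tx.accounts if writable}
        reads = {key for key, writable in tx.accounts if not writable}
        conflict = (
            writes & (read_locked | write_locked)
            or reads & write_locked
        )
        if conflict:
            deferred.append(tx)  # buffered and retried on the next iteration
        else:
            batch.append(tx)
            read_locked |= reads
            write_locked |= writes
    return batch, deferred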
output = "solana summer"
while 1:
output = hash([output])
output = "solana summer"
record_queue = Queue()
while 1:
record = record_queue.maybe_pop()
if record:
output = hash([output, record])
else:
output = hash([output])
Turbine

Turbine is the protocol the broadcast stage uses to propagate shreds: the leader's block is split into shreds and fanned out through layers of validators, each of which retransmits to its peers.

Thank You
