Create witness and blob preloader #31

Merged · 47 commits · Feb 25, 2025
Changes from 18 commits

Commits
bdf0f34
add func without cargo dep
bxue-l2 Jan 13, 2025
bff6d13
working local proof computation
bxue-l2 Jan 13, 2025
298793d
add readme
bxue-l2 Jan 13, 2025
1c3f2c0
init
bxue-l2 Jan 13, 2025
a1c9374
fix comments
bxue-l2 Jan 15, 2025
755a5c8
fix lint
bxue-l2 Jan 23, 2025
be6e64e
add cryptography crate
bxue-l2 Jan 24, 2025
14aad44
working with new kzg lib interface
bxue-l2 Jan 30, 2025
4ec8403
WIP
Feb 14, 2025
99c1406
t1
bxue-l2 Feb 14, 2025
bb54f9d
cleanup
bxue-l2 Feb 15, 2025
967332c
add docs fix dep
bxue-l2 Feb 15, 2025
221d829
nit
bxue-l2 Feb 18, 2025
362cfb5
fix format issue
bxue-l2 Feb 18, 2025
53bf291
update eigenda-v2-struct lib
bxue-l2 Feb 18, 2025
d7f6ace
cleanup compute-kzg-proof
bxue-l2 Feb 18, 2025
90e93ff
fix lint
bxue-l2 Feb 18, 2025
4cdf6ca
fix lint
bxue-l2 Feb 18, 2025
38103d4
add trait and impl or preloader and oracle
bxue-l2 Feb 19, 2025
7e03e25
add cert version, make both v1 v2 share the identical interface
bxue-l2 Feb 19, 2025
c0e96d2
add cert version and multiplex cert handling
bxue-l2 Feb 19, 2025
24bd4f9
add zkvm verify logic, refactor validity code into its own struct
bxue-l2 Feb 20, 2025
4ffa71f
rename file
bxue-l2 Feb 20, 2025
641fb41
add hint type and refactor fetcher
bxue-l2 Feb 20, 2025
067b4c4
Update bin/host/src/eigenda_fetcher/mod.rs
bxue-l2 Feb 21, 2025
e847a16
rename hint
bxue-l2 Feb 21, 2025
27670ae
Merge branch 'impl-hint-type-for-v1' of https://github.com/Layr-Labs/…
bxue-l2 Feb 21, 2025
d90799b
rename blob_version to payload_encode_version
bxue-l2 Feb 24, 2025
e066250
fix lint
bxue-l2 Feb 24, 2025
fd01cfa
Update crates/proof/src/preloaded_eigenda_provider.rs
bxue-l2 Feb 20, 2025
6eac168
Update crates/compute-kzg-proof/README.md
bxue-l2 Feb 20, 2025
5c0be2d
Update crates/proof/src/eigenda_blob_witness.rs
bxue-l2 Feb 20, 2025
384ec0b
fix lint
bxue-l2 Feb 20, 2025
462285d
fix warn to info
bxue-l2 Feb 20, 2025
7793fb6
fix read me
bxue-l2 Feb 20, 2025
62c2668
make kzg proof typed
bxue-l2 Feb 20, 2025
108cea0
fix bug
bxue-l2 Feb 20, 2025
7382e45
had a terrible merge conflict, fix missing problems
bxue-l2 Feb 25, 2025
b6609ff
working v1
Feb 25, 2025
4854613
working v1
Feb 25, 2025
ab4b9ec
fix comm bug
Feb 25, 2025
f98c543
fix lint error
Feb 25, 2025
97dab19
add inputs args to run hokulea
Feb 25, 2025
0ee38dc
revert machete
Feb 25, 2025
bcfe850
fix lint
Feb 25, 2025
e5b0f67
change gitflow for machate
Feb 25, 2025
5bfbdf1
add nightly to machete to make it run
Feb 25, 2025
374 changes: 128 additions & 246 deletions Cargo.lock

Large diffs are not rendered by default.

12 changes: 11 additions & 1 deletion Cargo.toml
@@ -7,6 +7,7 @@ members = ["bin/*", "crates/*"]
hokulea-client = { path = "bin/client", version = "0.1.0", default-features = false }
hokulea-eigenda = { path = "crates/eigenda", version = "0.1.0", default-features = false }
hokulea-proof = { path = "crates/proof", version = "0.1.0", default-features = false }
hokulea-compute-kzg-proof = { path = "crates/compute-kzg-proof", version = "0.1.0", default-features = false }

# Kona
# We use git dependencies instead of version dependencies because Kona is moving very fast right now
@@ -63,13 +64,22 @@ reqwest = "0.12.12"
async-trait = "0.1.85"
linked_list_allocator = "0.10.5"
bytes = "1.9.0"
num = "0.4"

# General
sha2 = { version = "0.10.8", default-features = false }
c-kzg = { version = "2.0.0", default-features = false }
anyhow = { version = "1.0.95", default-features = false }
thiserror = { version = "2.0.9", default-features = false }
rust-kzg-bn254 = { version = "0.2.1", default-features = false }
rust-kzg-bn254-primitives = { git = "https://github.com/Layr-Labs/rust-kzg-bn254", rev = "b3e532e9aad533009849755d5ad7b9578a16bfb2", default-features = false }
rust-kzg-bn254-prover = { git = "https://github.com/Layr-Labs/rust-kzg-bn254", rev = "b3e532e9aad533009849755d5ad7b9578a16bfb2", default-features = false }
rust-kzg-bn254-verifier = { git = "https://github.com/Layr-Labs/rust-kzg-bn254", rev = "b3e532e9aad533009849755d5ad7b9578a16bfb2", default-features = false }

# EigenDA v2 struct
eigenda-v2-struct-rust = { git = "https://github.com/bxue-l2/eigenda-v2-struct-rust", rev = "cdeaf4e6ed7c3d55e70bf97c2c04a5e056b780ef" }

ark-bn254 = "0.5.0"
ark-ff = { version = "0.5.0", features = ["parallel"] }

# Tracing
tracing-loki = "0.2.5"
5 changes: 4 additions & 1 deletion README.md
@@ -2,6 +2,9 @@

Hokulea is a library to provide the altda providers for a derivation pipeline built with [kona](https://github.com/anton-rs/kona) to understand eigenDA blobs, following the [kona book](https://anton-rs.github.io/kona/sdk/pipeline/providers.html#implementing-a-custom-data-availability-provider) recommendation (also see this [comment](https://github.com/anton-rs/kona/pull/862#issuecomment-2515038089)).

### Download SRS points
The Hokulea host currently computes a challenge proof that validates the correctness of the EigenDA blob against the provided KZG commitment. This computation requires the host to have access to sufficient KZG SRS points.

### Running against devnet

First start the devnet:
@@ -17,4 +20,4 @@ cd bin/client
just run-client-native-against-devnet
```

![](./hokulea.jpeg)
![](./hokulea.jpeg)
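To make the SRS requirement described in the new README section concrete, here is a minimal sketch of how the host side loads the points. The path and arguments mirror the `SRS::new` call added in `compute_kzg_proof.rs` later in this PR; the exact signature is assumed from that diff rather than from the library documentation.

```rust
use rust_kzg_bn254_prover::srs::SRS;

/// Illustrative sketch: load the SRS file the host expects. Mirrors the call in
/// compute_kzg_proof.rs below; only the first 1024 G1 points are read from the
/// 32 MiB point file, and 268435456 is the total SRS order.
fn load_srs() -> SRS {
    SRS::new("resources/g1.32mb.point", 268435456, 1024)
        .expect("download the SRS points into resources/ before running the host")
}
```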
2 changes: 1 addition & 1 deletion bin/client/Cargo.toml
@@ -14,4 +14,4 @@ kona-executor.workspace = true

hokulea-proof.workspace = true

tracing.workspace = true
tracing.workspace = true
1 change: 1 addition & 0 deletions bin/client/src/lib.rs
@@ -8,6 +8,7 @@ use kona_preimage::{
};

use alloc::sync::Arc;

use core::fmt::Debug;
use kona_executor::TrieDBProvider;
use kona_proof::{
1 change: 1 addition & 0 deletions bin/host/Cargo.toml
@@ -8,6 +8,7 @@ edition = "2021"
hokulea-proof.workspace = true
hokulea-client.workspace = true
hokulea-eigenda.workspace = true
hokulea-compute-kzg-proof.workspace = true

# Kona
kona-preimage = { workspace = true, features = ["std"] }
41 changes: 22 additions & 19 deletions bin/host/src/eigenda_fetcher/mod.rs
@@ -7,6 +7,7 @@ use alloy_provider::ReqwestProvider;
use alloy_rlp::Decodable;
use anyhow::{anyhow, Result};
use core::panic;
use hokulea_compute_kzg_proof::compute_kzg_proof;
use hokulea_eigenda::BlobInfo;
use hokulea_eigenda::EigenDABlobData;
use hokulea_eigenda::BYTES_PER_FIELD_ELEMENT;
@@ -156,15 +157,15 @@
let cert_blob_info = BlobInfo::decode(&mut &item_slice[4..]).unwrap();

// Proxy should return a cert whose data_length measured in symbol (i.e. 32 Bytes)
let blob_length = cert_blob_info.blob_header.data_length as u64;
warn!("blob length: {:?}", blob_length);
let data_length = cert_blob_info.blob_header.data_length as u64;
warn!("data length: {:?}", data_length);

let eigenda_blob = EigenDABlobData::encode(rollup_data.as_ref());

if eigenda_blob.blob.len() != blob_length as usize * BYTES_PER_FIELD_ELEMENT {
if eigenda_blob.blob.len() != data_length as usize * BYTES_PER_FIELD_ELEMENT {
return Err(
anyhow!("data size from cert does not equal to reconstructed data codec_rollup_data_len {} blob size {}",
eigenda_blob.blob.len(), blob_length as usize * BYTES_PER_FIELD_ELEMENT));
eigenda_blob.blob.len(), data_length as usize * BYTES_PER_FIELD_ELEMENT));
}

// Write all the field elements to the key-value store.
@@ -176,9 +177,9 @@
blob_key[..32].copy_from_slice(cert_blob_info.blob_header.commitment.x.as_ref());
blob_key[32..64].copy_from_slice(cert_blob_info.blob_header.commitment.y.as_ref());

trace!("cert_blob_info blob_length {:?}", blob_length);
trace!("cert_blob_info data_length {:?}", data_length);

for i in 0..blob_length {
for i in 0..data_length {
blob_key[88..].copy_from_slice(i.to_be_bytes().as_ref());
let blob_key_hash = keccak256(blob_key.as_ref());

@@ -192,21 +193,23 @@
)?;
}

// TODO proof is at the random point, but we need to figure out where to generate
//
// Write the KZG Proof as the last element, needed for ZK
//blob_key[88..].copy_from_slice((blob_length).to_be_bytes().as_ref());
//let blob_key_hash = keccak256(blob_key.as_ref());
let kzg_proof = match compute_kzg_proof(&eigenda_blob.blob) {
Ok(p) => p,
Err(e) => return Err(anyhow!("cannot compute kzg proof {}", e)),
};

//kv_write_lock.set(
// PreimageKey::new(*blob_key_hash, PreimageKeyType::Keccak256).into(),
// blob_key.into(),
//)?;
// Write the KZG Proof as the last element, needed for ZK
blob_key[88..].copy_from_slice((data_length).to_be_bytes().as_ref());
let blob_key_hash = keccak256(blob_key.as_ref());
kv_write_lock.set(
PreimageKey::new(*blob_key_hash, PreimageKeyType::Keccak256).into(),
blob_key.into(),
)?;
// proof to be done
//kv_write_lock.set(
// PreimageKey::new(*blob_key_hash, PreimageKeyType::GlobalGeneric).into(),
// [1, 2, 3].to_vec(),
//)?;
kv_write_lock.set(
PreimageKey::new(*blob_key_hash, PreimageKeyType::GlobalGeneric).into(),
kzg_proof.to_vec(),
)?;
} else {
panic!("Invalid hint type: {hint_type}. FetcherWithEigenDASupport.prefetch only supports EigenDACommitment hints.");
}
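As a reading aid for the preimage writes in this hunk: every field element, and with this change also the KZG proof stored one slot past the last element, is keyed by the keccak256 hash of a `blob_key` whose first 64 bytes are the commitment's x and y coordinates and whose final 8 bytes are the big-endian element index. The sketch below is illustrative only; the 96-byte key length and the zeroed bytes 64..88 are assumptions, since that part of the layout is not visible in this hunk.

```rust
use alloy_primitives::{keccak256, B256};

/// Illustrative only: derive the preimage lookup hash for field element `index`
/// of a blob, following the key layout used in eigenda_fetcher/mod.rs.
/// Bytes 64..88 of the key are not shown in the hunk and are left zeroed here.
fn blob_element_key_hash(commitment_x: &[u8; 32], commitment_y: &[u8; 32], index: u64) -> B256 {
    let mut blob_key = [0u8; 96];
    blob_key[..32].copy_from_slice(commitment_x);
    blob_key[32..64].copy_from_slice(commitment_y);
    blob_key[88..].copy_from_slice(&index.to_be_bytes());
    keccak256(blob_key)
}
```

With this scheme, the KZG proof added by this PR simply reuses the same key derivation with `index = data_length`, i.e. one slot past the last field element.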
12 changes: 12 additions & 0 deletions crates/compute-kzg-proof/Cargo.toml
@@ -0,0 +1,12 @@
[package]
name = "hokulea-compute-kzg-proof"
version = "0.1.0"
edition = "2021"


[dependencies]
rust-kzg-bn254-prover.workspace = true
rust-kzg-bn254-primitives.workspace = true
num.workspace = true

alloy-primitives.workspace = true
5 changes: 5 additions & 0 deletions crates/compute-kzg-proof/README.md
@@ -0,0 +1,5 @@
# `compute-kzg-proof`

This is a temporary crate for generating a KZG proof for an EigenDA blob. In the future, such a proof will be carried inside the blob header, at which point this crate can be removed.
Collaborator:
What do you mean by "inside the blob header"?
Also, is there anything preventing this other implementation right now? Or is it just that it's harder and you didn't want to do it now?

Collaborator Author (@bxue-l2, Feb 20, 2025):
I believe @litt3 is working on it in proxy.

Reply:
@bxue-l2 Are you referring to the addition of pi to the BlobCommitment?
That's on my list of TODOs, but not at the top, FYI.

Collaborator Author:
Don't think it is urgent.

Collaborator:
Do we have an issue tracking that? Best to link to that issue than to leave vague comments, so people know exactly what's up and how to track the progress.

Reply:
There is an issue, but I suppose the question is whether it's appropriate to link private Linear issues in a public forum. Seems a bit odd.

Collaborator:
Definitely, which is why we should always be creating issues on GitHub first, and Linear will import them. Isn't this the issue? Layr-Labs/eigenda#1037

This crate accesses the filesystem, so it cannot be used in any fault proof or zkVM.
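A minimal usage sketch for this crate, assuming the `compute_kzg_proof` signature introduced in this PR; the blob bytes here are placeholders, and the SRS point file must be present under `resources/` for the call to succeed.

```rust
use hokulea_compute_kzg_proof::compute_kzg_proof;

fn main() {
    // Placeholder input: in practice this is the field-element-encoded EigenDA
    // blob (a multiple of 32 bytes), e.g. the output of EigenDABlobData::encode.
    let blob = vec![0u8; 4 * 32];
    match compute_kzg_proof(&blob) {
        // The proof comes back as 64 bytes: the x and y coordinates,
        // each left-padded to 32 bytes big-endian.
        Ok(proof) => println!("kzg proof: {} bytes", proof.len()),
        Err(e) => eprintln!("cannot compute kzg proof: {e}"),
    }
}
```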
59 changes: 59 additions & 0 deletions crates/compute-kzg-proof/src/compute_kzg_proof.rs
@@ -0,0 +1,59 @@
//! This is a temporary crate for generating a KZG proof for an EigenDA blob. In the future,
//! such a proof will be carried inside the blob header, at which point this crate can be removed.
//! This crate accesses the filesystem, so it cannot be used in any fault proof or zkVM.
extern crate alloc;
use alloc::vec::Vec;
use alloy_primitives::Bytes;
use num::BigUint;
use rust_kzg_bn254_primitives::blob::Blob;
use rust_kzg_bn254_primitives::errors::KzgError;
use rust_kzg_bn254_prover::kzg::KZG;
use rust_kzg_bn254_prover::srs::SRS;

/// Computes a KZG proof for an eigenDA blob.
/// In the future, the eigenda blob header will contain the proof, so that local computation is no longer required.
/// See the nitro code <https://github.com/Layr-Labs/nitro/blob/14f09745b74321f91d1f702c3e7bb5eb7d0e49ce/arbitrator/prover/src/kzgbn254.rs#L141>;
/// this could be refactored in the future so that both host and client can compute the proof.
pub fn compute_kzg_proof(blob: &[u8]) -> Result<Bytes, KzgError> {
// In the future, it might make sense to let the proxy return the kzg proof, instead of computing it locally
let srs = SRS::new("resources/g1.32mb.point", 268435456, 1024).unwrap();
Collaborator:
Why are you loading 256MiB?

Collaborator:
Also, you should try to get into the habit of not `.unwrap()`ing in functions that return a Result.

Collaborator Author:
268435456 is the number of SRS points; see the issue here: Layr-Labs/rust-kzg-bn254#46

Collaborator Author (@bxue-l2, Feb 20, 2025):
Then you have to convert your error into one of the kzg errors. Simpler to just abort and let the user fix the proof issue, as there is no point in continuing.

Collaborator:
Then why return an error at all? Just panic everywhere.

Collaborator:
Also commented on that issue. Still don't understand how we are loading 256MiB from a file that is supposed to be 32MiB large.

Collaborator Author:
Because some errors can be handled by the upper layer and are acceptable to the application, whereas in other cases it just has to break. I guess your point is that it should only break at the highest level, for a clean shutdown.

Collaborator Author:
We only load 1024 points in the code above. 268435456 is not the number of points we are going to load; it is just a number used to make sure 1024 is not greater than it.

Collaborator:
Ahhh, I see, I totally misunderstood SRS::new()'s signature then.
My question remains, though: why are you setting 268435456 and not 1024?
And if the order is not needed, can we just get rid of it?

Collaborator Author:
We should just get rid of it.

let mut kzg = KZG::new();
Collaborator:
Don't we have a constructor that doesn't require making kzg mutable and then modifying it in place?

Collaborator Author:
I don't think so.

Collaborator:
Would be good to add.

let input = Blob::new(blob);
let input_poly = input.to_polynomial_eval_form();

kzg.calculate_and_store_roots_of_unity(blob.len() as u64)
.unwrap();

let mut commitment_bytes = vec![0u8; 0];

let commitment = kzg.commit_eval_form(&input_poly, &srs)?;

// TODO: the library should return the bytes, or provide a helper for the
// conversion, for both the proof and the commitment
let commitment_x_bigint: BigUint = commitment.x.into();
let commitment_y_bigint: BigUint = commitment.y.into();

append_left_padded_biguint_be(&mut commitment_bytes, &commitment_x_bigint);
append_left_padded_biguint_be(&mut commitment_bytes, &commitment_y_bigint);

let mut proof_bytes = vec![0u8; 0];

let proof = kzg.compute_blob_proof(&input, &commitment, &srs)?;
let proof_x_bigint: BigUint = proof.x.into();
let proof_y_bigint: BigUint = proof.y.into();

append_left_padded_biguint_be(&mut proof_bytes, &proof_x_bigint);
append_left_padded_biguint_be(&mut proof_bytes, &proof_y_bigint);

// push data into witness
Ok(proof_bytes.into())
}

/// Converts a BigUint into a 32-byte big-endian word, left-padded with zeroes, and appends it to `vec`
pub fn append_left_padded_biguint_be(vec: &mut Vec<u8>, biguint: &BigUint) {
let bytes = biguint.to_bytes_be();
let padding = 32 - bytes.len();
vec.extend(std::iter::repeat_n(0, padding));
vec.extend_from_slice(&bytes);
}
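A quick illustrative check of the padding helper above, written as a hypothetical test rather than part of this diff:

```rust
#[test]
fn pads_small_values_to_32_bytes() {
    // A one-byte value is left-padded to a full 32-byte big-endian word.
    let mut out: Vec<u8> = Vec::new();
    append_left_padded_biguint_be(&mut out, &num::BigUint::from(7u64));
    assert_eq!(out.len(), 32);
    assert_eq!(out[31], 7);
    assert!(out[..31].iter().all(|&b| b == 0));
}
```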
14 changes: 14 additions & 0 deletions crates/compute-kzg-proof/src/lib.rs
@@ -0,0 +1,14 @@
#![doc = include_str!("../README.md")]
#![warn(
missing_debug_implementations,
missing_docs,
unreachable_pub,
rustdoc::all
)]
#![deny(unused_must_use, rust_2018_idioms)]
#![cfg_attr(docsrs, feature(doc_cfg, doc_auto_cfg))]
#![cfg_attr(not(test), warn(unused_crate_dependencies))]

pub mod compute_kzg_proof;

pub use compute_kzg_proof::compute_kzg_proof;
2 changes: 1 addition & 1 deletion crates/eigenda/Cargo.toml
@@ -12,7 +12,7 @@ alloy-rlp.workspace = true
tracing.workspace = true
async-trait.workspace = true
bytes.workspace = true
rust-kzg-bn254.workspace = true
rust-kzg-bn254-primitives.workspace = true
maili-protocol.workspace = true

[features]
2 changes: 1 addition & 1 deletion crates/eigenda/src/eigenda_data.rs
@@ -3,7 +3,7 @@ use alloc::vec;
use alloy_primitives::Bytes;
use bytes::buf::Buf;
use kona_derive::errors::BlobDecodingError;
use rust_kzg_bn254::helpers;
use rust_kzg_bn254_primitives::helpers;

#[derive(Default, Clone, Debug)]
/// Represents the data structure for EigenDA Blob.
3 changes: 0 additions & 3 deletions crates/eigenda/src/traits.rs
@@ -13,7 +13,4 @@ pub trait EigenDABlobProvider {

/// Fetches a blob.
async fn get_blob(&mut self, cert: &Bytes) -> Result<Bytes, Self::Error>;

/// Fetches an element from a blob.
async fn get_element(&mut self, cert: &Bytes, element: &Bytes) -> Result<Bytes, Self::Error>;
}
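After this change the trait is reduced to `get_blob`. As a rough sketch of what an implementor looks like, here is a hypothetical in-memory provider for tests; it assumes the trait has an `Error` associated type with no unusual bounds and is declared with `#[async_trait]` (async-trait is already a workspace dependency), which is an assumption rather than something shown in this hunk.

```rust
use alloy_primitives::Bytes;
use async_trait::async_trait;
use hokulea_eigenda::EigenDABlobProvider;

/// Hypothetical test double: always returns the same blob regardless of cert.
struct FixedBlobProvider {
    blob: Bytes,
}

#[async_trait]
impl EigenDABlobProvider for FixedBlobProvider {
    type Error = core::convert::Infallible;

    async fn get_blob(&mut self, _cert: &Bytes) -> Result<Bytes, Self::Error> {
        Ok(self.blob.clone())
    }
}
```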
7 changes: 7 additions & 0 deletions crates/proof/Cargo.toml
@@ -12,6 +12,13 @@ kona-derive.workspace = true

hokulea-eigenda.workspace = true

eigenda-v2-struct-rust.workspace = true
rust-kzg-bn254-primitives.workspace = true
rust-kzg-bn254-verifier.workspace = true

ark-bn254.workspace = true
ark-ff.workspace = true

# Alloy
alloy-primitives.workspace = true

33 changes: 33 additions & 0 deletions crates/proof/src/eigenda_blob_witness.rs
@@ -0,0 +1,33 @@
extern crate alloc;
use alloc::vec::Vec;
use alloy_primitives::Bytes;

use eigenda_v2_struct_rust::EigenDAV2Cert;
use rust_kzg_bn254_primitives::blob::Blob;

/// Witness data collected for EigenDA blobs: certs, blobs, KZG proofs, and cert validity proofs
#[derive(Debug, Clone, Default)]
pub struct EigenDABlobWitnessData {
/// eigenda v2 cert
pub eigenda_certs: Vec<EigenDAV2Cert>,
    /// blobs; a blob is empty if its cert is invalid
pub eigenda_blobs: Vec<Blob>,
/// kzg proof on Fiat Shamir points
pub kzg_proofs: Vec<Bytes>,
    /// a zk proof attesting that the cert is valid;
    /// each element is a tuple of
    /// (validity, proof of that validity), regardless of
    /// whether validity is true or false
pub validity_proofs: Vec<(bool, Bytes)>,
}

impl EigenDABlobWitnessData {
pub fn new() -> Self {
EigenDABlobWitnessData {
eigenda_certs: Vec::new(),
eigenda_blobs: Vec::new(),
kzg_proofs: Vec::new(),
validity_proofs: Vec::new(),
}
}
}
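One way the four parallel vectors above might be kept in lockstep is a small push helper. The sketch below is a hypothetical extension written against the struct as defined in this file; it is not part of this PR.

```rust
impl EigenDABlobWitnessData {
    /// Hypothetical helper: record the witness material for one cert, keeping
    /// all four vectors aligned index-for-index.
    pub fn push_entry(
        &mut self,
        cert: EigenDAV2Cert,
        blob: Blob,
        kzg_proof: Bytes,
        validity: bool,
        validity_proof: Bytes,
    ) {
        self.eigenda_certs.push(cert);
        self.eigenda_blobs.push(blob);
        self.kzg_proofs.push(kzg_proof);
        self.validity_proofs.push((validity, validity_proof));
    }
}
```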
23 changes: 0 additions & 23 deletions crates/proof/src/eigenda_provider.rs
@@ -100,27 +100,4 @@ impl<T: CommsClient + Sync + Send> EigenDABlobProvider for OracleEigenDAProvider

Ok(blob.into())
}

async fn get_element(&mut self, cert: &Bytes, element: &Bytes) -> Result<Bytes, Self::Error> {
self.oracle
.write(&ExtendedHintType::EigenDACommitment.encode_with(&[cert]))
.await
.map_err(OracleProviderError::Preimage)?;

let cert_point_key = Bytes::copy_from_slice(&[cert.to_vec(), element.to_vec()].concat());

self.oracle
.write(&ExtendedHintType::EigenDACommitment.encode_with(&[&cert_point_key]))
.await
.map_err(OracleProviderError::Preimage)?;
let data = self
.oracle
.get(PreimageKey::new(
*keccak256(cert_point_key),
PreimageKeyType::GlobalGeneric,
))
.await
.map_err(OracleProviderError::Preimage)?;
Ok(data.into())
}
}
4 changes: 4 additions & 0 deletions crates/proof/src/lib.rs
@@ -5,3 +5,7 @@ pub mod hint;
pub mod pipeline;

pub mod eigenda_provider;

pub mod preloaded_eigenda_provider;

pub mod eigenda_blob_witness;