Zk sdk #953
base: toccata
Changes from 9 commits
Diff hunk `@@ -1,6 +1,8 @@`:

```rust
mod error;
mod fields;
pub mod groth16;
mod points;
mod result;
pub mod risc0;
pub mod tags;
pub mod tests;
```
Diff hunk `@@ -0,0 +1,11 @@`:

```rust
use thiserror::Error;

#[derive(Debug, Error)]
pub enum PointError {
    #[error("Malformed G1 field element")]
    MalformedG1,
    #[error("Malformed G2 field element")]
    MalformedG2,
    #[error("Ark deserialization error: {0}")]
    ArkDeserialization(#[from] ark_serialize::SerializationError),
}
```
Diff hunk `@@ -0,0 +1,57 @@`:

```rust
mod error;
use ark_bn254::{G1Affine, G2Affine};
use ark_serialize::CanonicalDeserialize;

pub trait PointFromBytes<'input>: Sized {
    type Input: ?Sized;
    fn from_bytes(bytes: &'input Self::Input) -> Result<Self, PointError>;
}
pub use error::PointError;
pub struct G1(pub G1Affine);
pub struct G2(pub G2Affine);

impl<'input> PointFromBytes<'input> for G1 {
    type Input = Vec<Vec<u8>>;

    /// Deserialize an element over the G1 group from bytes in big-endian format.
    fn from_bytes(bytes: &Self::Input) -> Result<G1, PointError> {
        if bytes.len() != 2 {
            return Err(PointError::MalformedG1);
        }
        let g1_affine: Vec<u8> = bytes[0].iter().rev().chain(bytes[1].iter().rev()).cloned().collect();

        Ok(G1(G1Affine::deserialize_uncompressed(&*g1_affine)?))
    }
}

impl<'input> PointFromBytes<'input> for G2 {
    type Input = Vec<Vec<Vec<u8>>>;

    fn from_bytes(bytes: &Self::Input) -> Result<G2, PointError> {
        if bytes.len() != 2 || bytes[0].len() != 2 || bytes[1].len() != 2 {
            return Err(PointError::MalformedG2);
        }
        let g2_affine: Vec<u8> = bytes[0][1]
            .iter()
            .rev()
            .chain(bytes[0][0].iter().rev())
            .chain(bytes[1][1].iter().rev())
            .chain(bytes[1][0].iter().rev())
            .cloned()
            .collect();

        Ok(G2(G2Affine::deserialize_uncompressed(&*g2_affine)?))
    }
}

impl Into<G1Affine> for G1 {
    fn into(self) -> G1Affine {
        self.0
    }
}

impl Into<G2Affine> for G2 {
    fn into(self) -> G2Affine {
        self.0
    }
}
```

Comment on lines +14 to +21:

> **Collaborator:** Why should it be 2-dimensional reversed vectors?
>
> **Author:** r0 stores in BE order whilst ark expects LE, but I agree here that the fn name should be more specific to r0, not generic.

Comment on lines +31 to +41:

> **Collaborator:** Why should it be 2-dimensional reversed vectors?
>
> **Author:** See above.
Diff hunk `@@ -0,0 +1,3 @@`:

```rust
use crate::zk_precompiles::error::ZkIntegrityError;

pub type Result<T> = std::result::Result<T, ZkIntegrityError>;
```
Diff hunk `@@ -45,6 +45,19 @@ impl TryFrom<u8> for HashFnId`:

```rust
    }
}

impl TryFrom<&String> for HashFnId {
    type Error = R0Error;

    fn try_from(value: &String) -> Result<Self, Self::Error> {
        match value.as_str() {
            "blake2b" => Ok(HashFnId::Blake2b),
            "poseidon2" => Ok(HashFnId::Poseidon2),
            "sha-256" => Ok(HashFnId::Sha256),
            _ => Err(R0Error::InvalidHashFnId(value.as_bytes().get(0).copied().unwrap_or(255))),
        }
    }
}

impl From<HashFnId> for u8 {
    fn from(value: HashFnId) -> Self {
        value as u8
```

Comment on `impl TryFrom<&String> for HashFnId`:

> **Collaborator:** `&str`
>
> **Author:** Agreed.
>
> **Collaborator:** `FromStr` seems a better trait candidate than `TryFrom`.
>
> **Author:** Why? What if you pass an incorrect string? Then everything will panic.

Diff hunk `@@ -70,7 +83,6 @@ pub struct SuccinctReceipt`:

```rust
    /// The control ID of this receipt, identifying the recursion program that was run (e.g. lift,
    /// join, or resolve).
    control_id: Digest,

    /// Claim containing information about the computation that this receipt proves.
    ///
    /// The standard claim type is [ReceiptClaim][crate::ReceiptClaim], which represents a RISC-V
```
Diff hunk `@@ -0,0 +1,3 @@`:

```rust
use crate::zk_precompiles::risc0::R0Error;

pub type Result<T> = std::result::Result<T, R0Error>;
```
Diff hunk `@@ -0,0 +1,77 @@`:

```rust
pub mod vk;
use super::super::result::Result;
use crate::{
    opcodes::codes::OpZkPrecompile,
    script_builder::ScriptBuilder,
    zk_precompiles::{
        fields::Fr,
        points::{G1, G2, PointFromBytes},
        risc0::{
            R0Error,
            zk_to_script::{R0ScriptBuilder, groth16::vk::try_verifying_key},
        },
        tags::ZkTag,
    },
};
use ark_bn254::{Bn254, Config};
use ark_ec::bn::Bn;
use ark_groth16::{Proof, VerifyingKey};
use ark_serialize::{CanonicalDeserialize, CanonicalSerialize};
use risc0_binfmt::Digestible;
use risc0_groth16::Seal;
use risc0_zkvm::{Digest, Groth16Receipt, Groth16ReceiptVerifierParameters, MaybePruned, SuccinctReceipt, sha};

fn split_digest_bytes(d: Digest) -> ([u8; 32], [u8; 32]) {
    let bytes = d.as_bytes();
    let mut lo = [0u8; 32];
    let mut hi = [0u8; 32];
    lo[..16].copy_from_slice(&bytes[..16]);
    hi[..16].copy_from_slice(&bytes[16..32]);
    (lo, hi)
}

fn to_fixed_array(input: &[u8]) -> [u8; 32] {
    let mut fixed_array = [0u8; 32];
    let start = core::cmp::max(32, input.len()) - core::cmp::min(32, input.len());
    fixed_array[start..].copy_from_slice(&input[input.len().saturating_sub(32)..]);
    fixed_array
}

impl R0ScriptBuilder {
    /// Converts a Groth16Receipt into a Kaspa script.
    /// This script unlocks the UTXO if the verification of the receipt
    /// succeeds.
    pub fn from_groth<Claim: Digestible + Clone>(receipt: &Groth16Receipt<Claim>) -> Result<ScriptBuilder> {
        let mut params = Groth16ReceiptVerifierParameters::default();
        let seal = &receipt.seal;
        let digested_claim = receipt.claim.digest::<sha::Impl>();
        let (a0, a1) = split_digest_bytes(params.control_root);
        let (c0, c1) = split_digest_bytes(digested_claim);
        let id_bn254 = to_fixed_array(params.bn254_control_id.as_bytes());

        let seal = Seal::decode(seal).map_err(|e| R0Error::SealDecoding(e.to_string()))?;
        let verifying_key = try_verifying_key()?;

        let g1 = G1::from_bytes(&seal.a)?;
        let g1_c = G1::from_bytes(&seal.c)?;
        let g2 = G2::from_bytes(&seal.b)?;
        let mut encoded_proof = Vec::new();
        let proof: Proof<ark_ec::bn::Bn<ark_bn254::Config>> = Proof::<Bn254> { a: g1.0, b: g2.0, c: g1_c.0 };
        proof.serialize_compressed(&mut encoded_proof)?;
        // Serialize with the serde_ark feature, which under the hood is just
        // uncompressed serialization.
        // Re-serialize then deserialize to get the inner ark VK.
        let mut serialized_vk = Vec::new();
        verifying_key.serialize_compressed(&mut serialized_vk).map_err(|_| R0Error::BincodeVkSerialization)?;
        let mut builder = ScriptBuilder::new();
        builder.add_data(&id_bn254)?;
        builder.add_data(&c1)?;
        builder.add_data(&c0)?;
        builder.add_data(&a1)?;
        builder.add_data(&a0)?;
        builder.add_i64(5)?;
        builder.add_data(&encoded_proof)?;
        builder.add_data(&serialized_vk)?;
        builder.add_data(&[ZkTag::Groth16 as u8])?;
        builder.add_op(OpZkPrecompile)?;
        Ok(builder)
    }
}
// build_zk_script(&[seal, claim, hashfn, control_index, control_digests, journal, image_id, vec![stark_tag]]).unwrap()
```

Comment on the digest/params lines:

> **Collaborator:** digest impls
>
> **Author:** Fair point.
Diff hunk `@@ -0,0 +1,67 @@`:

```rust
use std::str::FromStr;

use ark_bn254::Bn254;
use num_bigint::BigInt;

use crate::zk_precompiles::{
    points::{G1, G2, PointFromBytes},
    risc0::{R0Error, zk_to_script::groth16::to_fixed_array},
};

// Constants from: risc0-ethereum/contracts/src/groth16/Groth16Verifier.sol
// When running a new ceremony, update them by running `cargo xtask bootstrap-groth16`
// after updating the new Groth16Verifier.sol on the risc0-ethereum repo.
const ALPHA_X: &str = "20491192805390485299153009773594534940189261866228447918068658471970481763042";
const ALPHA_Y: &str = "9383485363053290200918347156157836566562967994039712273449902621266178545958";
const BETA_X1: &str = "4252822878758300859123897981450591353533073413197771768651442665752259397132";
const BETA_X2: &str = "6375614351688725206403948262868962793625744043794305715222011528459656738731";
const BETA_Y1: &str = "21847035105528745403288232691147584728191162732299865338377159692350059136679";
const BETA_Y2: &str = "10505242626370262277552901082094356697409835680220590971873171140371331206856";
const GAMMA_X1: &str = "11559732032986387107991004021392285783925812861821192530917403151452391805634";
const GAMMA_X2: &str = "10857046999023057135944570762232829481370756359578518086990519993285655852781";
const GAMMA_Y1: &str = "4082367875863433681332203403145435568316851327593401208105741076214120093531";
const GAMMA_Y2: &str = "8495653923123431417604973247489272438418190587263600148770280649306958101930";
const DELTA_X1: &str = "1668323501672964604911431804142266013250380587483576094566949227275849579036";
const DELTA_X2: &str = "12043754404802191763554326994664886008979042643626290185762540825416902247219";
const DELTA_Y1: &str = "7710631539206257456743780535472368339139328733484942210876916214502466455394";
const DELTA_Y2: &str = "13740680757317479711909903993315946540841369848973133181051452051592786724563";

const IC0_X: &str = "8446592859352799428420270221449902464741693648963397251242447530457567083492";
const IC0_Y: &str = "1064796367193003797175961162477173481551615790032213185848276823815288302804";
const IC1_X: &str = "3179835575189816632597428042194253779818690147323192973511715175294048485951";
const IC1_Y: &str = "20895841676865356752879376687052266198216014795822152491318012491767775979074";
const IC2_X: &str = "5332723250224941161709478398807683311971555792614491788690328996478511465287";
const IC2_Y: &str = "21199491073419440416471372042641226693637837098357067793586556692319371762571";
const IC3_X: &str = "12457994489566736295787256452575216703923664299075106359829199968023158780583";
const IC3_Y: &str = "19706766271952591897761291684837117091856807401404423804318744964752784280790";
const IC4_X: &str = "19617808913178163826953378459323299110911217259216006187355745713323154132237";
const IC4_Y: &str = "21663537384585072695701846972542344484111393047775983928357046779215877070466";
const IC5_X: &str = "6834578911681792552110317589222010969491336870276623105249474534788043166867";
const IC5_Y: &str = "15060583660288623605191393599883223885678013570733629274538391874953353488393";

/// Convert a decimal U256 string to a 32-byte big-endian Vec.
fn from_u256(value: &str) -> Result<Vec<u8>, R0Error> {
    let bytes = BigInt::from_str(value)?.to_bytes_be().1;
    Ok(to_fixed_array(&bytes).to_vec())
}

pub fn try_verifying_key() -> Result<ark_groth16::VerifyingKey<Bn254>, R0Error> {
    let alpha_g1 = G1::from_bytes(&vec![from_u256(ALPHA_X)?, from_u256(ALPHA_Y)?])?.0;
    let beta_g2 =
        G2::from_bytes(&vec![vec![from_u256(BETA_X1)?, from_u256(BETA_X2)?], vec![from_u256(BETA_Y1)?, from_u256(BETA_Y2)?]])?.0;
    let gamma_g2 =
        G2::from_bytes(&vec![vec![from_u256(GAMMA_X1)?, from_u256(GAMMA_X2)?], vec![from_u256(GAMMA_Y1)?, from_u256(GAMMA_Y2)?]])?.0;
    let delta_g2 =
        G2::from_bytes(&vec![vec![from_u256(DELTA_X1)?, from_u256(DELTA_X2)?], vec![from_u256(DELTA_Y1)?, from_u256(DELTA_Y2)?]])?.0;

    let gamma_abc_g1 = vec![
        G1::from_bytes(&vec![from_u256(IC0_X)?, from_u256(IC0_Y)?])?.0,
        G1::from_bytes(&vec![from_u256(IC1_X)?, from_u256(IC1_Y)?])?.0,
        G1::from_bytes(&vec![from_u256(IC2_X)?, from_u256(IC2_Y)?])?.0,
        G1::from_bytes(&vec![from_u256(IC3_X)?, from_u256(IC3_Y)?])?.0,
        G1::from_bytes(&vec![from_u256(IC4_X)?, from_u256(IC4_Y)?])?.0,
        G1::from_bytes(&vec![from_u256(IC5_X)?, from_u256(IC5_Y)?])?.0,
    ];

    Ok(ark_groth16::VerifyingKey::<Bn254> { alpha_g1, beta_g2, gamma_g2, delta_g2, gamma_abc_g1 })
}
```

Comment on `try_verifying_key`:

> **Collaborator:** The key is constant; why is it calculated at runtime, and why does it require vector allocations and conversions from strings?
>
> **Author:** Agreed, let me fix this.
>
> **Collaborator:** There is no test that ensures it is the same calculation of the verifying key as the upstream crate has.
>
> **Author:** There is no way to extract the inner verifying key from the r0 repo; it is only public to the crate, see:
>
> **Author:** There is an indirect test: there is an r0 proof that verifies using this key.
>
> **Author:** But I'll make it better.
Diff hunk `@@ -0,0 +1,4 @@`:

```rust
mod groth16;
mod succinct;

pub struct R0ScriptBuilder;
```
> **Collaborator:** Is this intentional or a leftover? Why does that path need special treatment here?
>
> **Author:** You are right, it must be a leftover; fixing.