KZG | Composable and distributed systems group
Mon, 2026-01-19
Sharing our experimental call summaries.
AI-generated digests of Yak Collective study groups.
Key resources discussed
NotebookLM analysis: https://notebooklm.google.com/notebook/7694ac20-65ad-400f-8536-1be4386c631a
Dankrad Feist explainer: https://dankradfeist.de/ethereum/2020/06/16/kate-polynomial-commitments.html
Original 2010 paper: https://iacr.org/archive/asiacrypt2010/6477178/6477178.pdf
Ceremony explanation: https://0xhagen.medium.com/explain-kzg-ceremony-on-ethereum-like-i-am-5-lia5-4439a3950446
Street-Level Framing: Why KZG Commitments Matter for Ethereum
The group opened by trying to build an intuitive, non-specialist framing of KZG polynomial commitments in the Ethereum ecosystem. One participant described Ethereum as “a really bad computer”—slow, expensive, and unwieldy—but with the crucial property of decentralization, which is why people tolerate those costs. Scaling Ethereum, in this view, is about adding capacity without eroding that decentralization.
Within that framing, KZG polynomial commitments were positioned as a key cryptographic building block that enables scaling without a proportional increase in proof size. In other words, you can validate much more data without the proofs growing linearly with that data. This is particularly important for:
Data availability sampling (DAS) and blob storage in Ethereum rollups
Other systems that need to prove “this data was here” without shipping all the data
The group repeatedly emphasized that very few participants genuinely understand the underlying math (elliptic curves, finite fields, pairings), and that this is acceptable up to a point—similar to not fully understanding how cars or airplanes work while still using them. But that unease about cryptographic opacity remained a theme throughout the discussion.
What KZG Polynomial Commitments Are (At a High Level)
One participant summarized KZG polynomial commitments as an “envelope” for data: you store data in a structured mathematical object (a polynomial), commit to it, and later can verify that a particular piece of data was indeed inside that commitment.
Key points raised:
Constant-size proofs: The standout advantage of KZG commitments versus hash-based schemes is that proof size remains constant regardless of how much data is committed. This is central for Ethereum’s blob/DAS design.
Application scope:
Ethereum blob data and data availability sampling.
SNARK-based VMs (e.g., Plonk-based zkEVMs used by Polygon and others).
Possible general-purpose uses: e.g., sensors that periodically produce data, where you want to bundle large datasets and later verify that specific data points were included.
Participants noted that KZG is not “just an Ethereum thing”; it’s a general cryptographic primitive that Ethereum has adopted and operationalized at scale.
There was also appreciation for the timeline: KZG originated in a 2010 academic paper by three authors (Kate, Zaverucha, and Goldberg), and was later “appropriated” by Ethereum. The group found it interesting to see the arc from academic cryptography to large-scale production infrastructure.
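The "envelope" description above can be made concrete with a toy sketch. This is purely illustrative: it uses a tiny prime field and plain Lagrange interpolation to show how a fixed set of data points defines one polynomial, which is the object a real KZG scheme would then commit to using elliptic-curve pairings (omitted here).

```python
# Toy illustration of the "data in a polynomial" idea behind KZG.
# Real KZG commits to the polynomial with elliptic-curve pairings;
# here we only show how data points define a unique polynomial over
# a small prime field, and how evaluation recovers any data point.

P = 257  # tiny prime field modulus (illustrative only)

def lagrange_eval(points, x):
    """Evaluate the unique polynomial through `points` at `x`, mod P."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

# "Data" stored at positions 0..3
data = [42, 7, 99, 13]
points = [(i, v) for i, v in enumerate(data)]

# Any original data point can be recovered by evaluating the polynomial
assert lagrange_eval(points, 2) == 99

# The polynomial also has well-defined values outside the original
# positions; this redundancy is what data availability sampling exploits
extended = lagrange_eval(points, 5)
```

The point of the sketch is only the "envelope" intuition: once data is encoded as a polynomial, proving "this data was here" reduces to proving facts about a single algebraic object.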
Security Model and Quantum Concerns
The security of KZG commitments, as discussed, rests on standard hardness assumptions:
Discrete logarithm problem (and related assumptions in elliptic curve groups).
Finite field / elliptic curve structure underlying the pairing-based construction.
Points surfaced:
Not quantum-resistant: The notes and articles mentioned that KZG (in its current form) is not resistant to powerful quantum computers. The group did not delve into the exact attack models but speculated that a quantum adversary could potentially:
Recover the “toxic waste” (secret randomness) from the trusted setup.
Use that to forge proofs or otherwise break the integrity of the commitment scheme.
Security vs polynomial degree: One important technical nuance highlighted:
The security guarantee depends on the polynomial degree being much smaller than the size of the underlying elliptic curve group.
As blob sizes grow (i.e., higher-degree polynomials), the security decreases in principle, though the current parameters are chosen such that this decrease is negligible in practice.
Implication: blob size cannot grow arbitrarily without revisiting or reparameterizing the underlying curve.
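One way to build intuition for the degree bound (a toy illustration, not the actual security argument, which rests on pairing-based hardness assumptions): over a field of size p, a nonzero polynomial of degree p can evaluate to zero everywhere, so evaluations alone cannot distinguish it from the zero polynomial. Keeping the degree well below the field/group size avoids this kind of collapse.

```python
# Toy demonstration of why polynomial degree must stay well below the
# field/group size: over F_p, the *nonzero* polynomial x^p - x
# evaluates to 0 at every point (Fermat's little theorem), so
# evaluations cannot tell it apart from the zero polynomial.

p = 13  # tiny prime; real systems use ~256-bit curve orders

def f(x):
    return (pow(x, p, p) - x) % p  # x^p - x mod p

assert all(f(x) == 0 for x in range(p))
```

This is only an intuition pump for why "degree much smaller than group size" is a structural requirement, not a restatement of the precise KZG security reduction.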
Several participants noted that, for now, quantum threats are “future” issues, but that the horizon may be closer than previously assumed, given that formerly skeptical experts are now discussing them more seriously.
The Trusted Setup Ceremony: Lived Experience and Threats
A substantial part of the discussion focused on the “ceremony” aspect: how the trusted setup works in practice, how it feels to participate, and where it might fail.
Powers of Tau / KZG Ceremony Experience
First-hand experience was described for Ethereum’s Powers of Tau ceremony:
Participation flow:
Visit a designated website (run by the Ethereum community/Foundation).
Generate randomness by moving the mouse around, similar to wallet generation UX.
Wait 20–30 minutes while a progress indicator (e.g., a big circle) completes.
Log in or sign with an Ethereum wallet.
Receive a receipt proving contribution and a little “fireworks” style celebration UI.
Scale and redundancy:
More than 140,000 participants contributed.
Cryptographically, you only need one honest participant who properly destroys their contribution (“toxic waste”) for the entire setup to be secure.
This raised a natural question: Why so many participants if only one honest one is required? The implied answer is to make it overwhelmingly likely that at least one honest contribution exists, even under pessimistic assumptions.
This experience was compared to other ceremonies, like Zcash’s earlier, much smaller trusted setup with ~7–8 participants, and possibly to “Equilibrium” style ceremonies some participants had joined.
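The "one honest participant" property can be sketched with a toy powers-of-tau update, using modular integers in place of elliptic-curve points. This is purely illustrative and not secure; the prime, generator, and participant count are arbitrary assumptions, and a real ceremony also publishes proofs that each update was well-formed.

```python
# Toy sketch of the "one honest participant" property in a
# powers-of-tau style ceremony, using integers mod a prime in place
# of elliptic-curve points (illustration only, not secure).

import secrets

P = 2**61 - 1          # toy prime modulus (assumption: illustrative)
G = 3                  # toy generator

def contribute(powers, secret):
    """Each participant re-randomizes g^(tau^i) -> g^((tau*s)^i)."""
    return [pow(pt, pow(secret, i, P - 1), P)
            for i, pt in enumerate(powers)]

degree = 4
powers = [G] * (degree + 1)         # start with tau = 1, so g^(1^i) = g

for _ in range(3):                  # three participants in sequence
    s = secrets.randbelow(P - 2) + 1
    powers = contribute(powers, s)
    del s                           # "destroying the toxic waste"

# The final tau is the product of all secrets: as long as at least one
# secret was truly random and destroyed, nobody knows the combined tau,
# no matter how many other participants colluded.
```

This is where the defender-favoring asymmetry discussed below comes from: compromising the setup requires compromising *every* contribution, while defending it requires only one honest one.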
Ceremony Threat Model
The group discussed realistic vs theoretical attack vectors:
Man-in-the-middle / front-running:
A more mundane and plausible attack would be to intercept participants before they reach the legitimate ceremony endpoint:
E.g., a spoofed site that appears to be the official Ethereum ceremony.
Capture/modify contributions or bypass them altogether.
Such an attack operates at the “front door”—before any fancy cryptography—and could compromise the entire setup without any quantum capability.
Centralization risk:
A centralized website as the rendezvous point itself is an attack vector.
The system relies heavily on trust in the Ethereum Foundation’s infrastructure and communication channels (e.g., that you got the right URL from someone you trust).
Asymmetry in favor of defenders:
One conceptual point people liked: in contrast to many security scenarios, here the asymmetry favors defenders:
If any one contribution is honest and properly destroyed, the system is safe.
You do not need to secure every participant end-to-end; you just need at least one uncompromised path.
There was curiosity (but no concrete answers) about the operational hardening around the ceremony:
How they checked for spoofed sites.
How key material and transcripts were managed and published.
What guarantees participants could verify after the fact.
Complexity, “Priesthoods,” and Vitalik’s Trustlessness Vision
A major theme of the discussion was the tension between:
Cryptographic / protocol complexity
The ethos of decentralization and trustlessness
The reality of specialized expertise (“priesthoods”)
The Priesthood Problem
Participants noted that:
The number of people globally who can:
Read the KZG papers and code,
Understand the elliptic curve and pairing machinery, and
Audit client implementations for correctness
…is likely fewer than a hundred.
This effectively creates a priesthood: a small class of experts whose understanding underpins everyone else’s trust.
Concerns raised:
This conflicts with a particular reading of Vitalik’s vision for Ethereum:
In a recent post, he argued for protocol simplification and minimizing reliance on specialized, opaque “priesthoods.”
One criterion for “true trustlessness” is that you shouldn’t need a tiny group of high priests to understand and audit the system.
KZG and similar advanced cryptography arguably move in the opposite direction: more powerful, but more complex and more esoteric.
Counterargument: Complexity Is Inevitable at the Frontier
Others pushed back on the idea that we can avoid priesthoods:
Technological precedent:
75 years ago, programming was a priesthood; now many high school students can code.
But at the same time, a new priesthood has emerged around EUV lithography and advanced chip fabrication. Only a handful of PhDs and specialized engineers can build or deeply understand these tools.
Similarly, only a limited number of entities (e.g., SpaceX) can build certain classes of high-end rocket engines, whose internals are not widely understood or reproduced.
Moving frontier:
There is always a “bottleneck frontier” where understanding is concentrated and hard-won.
As some layers become commoditized and understandable by many, new advanced layers appear that require deep expertise.
Unless technological progress itself stops, a non-priesthood world is unrealistic.
The group largely converged on the view that priesthoods are inevitable, but their structure matters (open vs closed, see next section).
Open vs Closed Priesthoods, and the Social Layer
Beyond pure technical complexity, the group spent significant time on the social and governance aspects: how knowledge and control are structured and what that implies for safety and agency.
Open vs Closed Priesthoods
A useful distinction was drawn:
Open priesthood:
Knowledge is public (papers, code, standards).
There is a ladder—even if it takes years of study, a motivated person can climb into the expert set.
You don’t need initiation from a gatekeeping clique; you can self-study and verify.
Closed priesthood:
Knowledge is proprietary, obscured, or hidden.
Access depends on being initiated or employed by specific organizations.
Examples include:
Certain kinds of proprietary firmware, DRM schemes, or car diagnostic tools.
Vendor-only protocols and undocumented systems used in automotive and OT (operational technology).
Participants argued that modern cryptography generally aspires to the open priesthood model: open standards, proofs, and implementations that anyone can examine.
In contrast, many industrial systems (e.g., car computers) lean heavily on security by obscurity and proprietary ecosystems:
Digital rights management (DRM) that restricts what you can do with your own devices.
Diagnostic interfaces and parts pairing that only authorized dealers can access.
Business incentives to maintain exclusive control.
The group saw Ethereum and similar crypto projects as trying—at least philosophically—to push toward open priesthoods, even as the math grows more complex.
Safety, Control, and the “Nerfing” of Capabilities
A broader social critique emerged around current tech trends:
There is a growing desire—especially visible in AI safety, mobile OS security, and content moderation—to:
Systemically prevent users from doing “bad things” (e.g., removing speed limiters, generating harmful content, circumventing DRM).
Achieve safety by limiting capabilities at the systems level, rather than by empowering capable users with tools and information and relying on law + social norms.
This approach often “nerfs” legitimate uses:
You can’t distinguish in advance between a “good” and “bad” use of a powerful tool purely at the technical level.
System-wide constraints thus end up blocking both.
In contrast, some platforms (example mentioned: Bluesky via Jay Graber’s work) try to:
Empower users to filter and manage their own experience (e.g., feed curation) rather than enforce top-down controls.
Preserve more flexibility and user agency at the cost of some risk.
Applied back to KZG and Ethereum:
Advanced crypto pushes more capability to the edges (anyone can verify state cheaply, use zk-proofs, etc.).
But the cryptographic complexity and trusted setup steps create new forms of dependency on a small group of experts and infrastructural operators.
The question is whether this is an open or closed priesthood, and whether that distinction remains robust over time.
Learning Curves, Analogies, and the Need for Foundational Deep Dives
Several participants described their current relationship to the material:
“Music theory” analogy:
One person compared this to learning music:
At first, you just let the music “wash over you,” developing a feel for the tone, motifs, and emotional structure.
Later, learning formal music theory is like a jarring shift into “engineering brain”—it becomes about intervals, harmonic progressions, and structure.
For KZG and modern crypto, many participants are in that first phase: getting a feel for the space, vocabulary, and concerns, trusting that their brain is quietly building scaffolding.
“Spanish vs Portuguese” analogy:
Another analogy: someone who learned Spanish in high school hearing Portuguese.
It feels close enough that you think you understand—until you realize you really don’t.
Similarly, people with math or CS training can track parts of the KZG math (finite fields, polynomials, elliptic curves), but not reliably enough to avoid dangerous misunderstandings.
Multiple people expressed a desire for foundational deep dives on:
Elliptic curves and finite fields.
Pairings and how they are used in KZG and SNARKs.
Even more “basic” cryptographic primitives such as SHA-256, Merkle trees, etc., as a layered build-up.
There was agreement that current reading (papers, blog posts, notebook outputs) only goes so far without that foundation.
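As a sample of the layered build-up requested above, here is a minimal Merkle tree with SHA-256 inclusion proofs (assuming, for simplicity, a power-of-two number of leaves). Note that the proof length grows logarithmically with the number of leaves—exactly the cost that KZG's constant-size proofs avoid.

```python
# Minimal Merkle tree sketch (one of the "basic primitives" named
# above): an inclusion proof is one sibling hash per tree level,
# so it grows logarithmically with the amount of committed data.

import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Collect the sibling hash at each level, leaf to root."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        proof.append(level[index ^ 1])   # sibling differs in the low bit
        index //= 2
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return proof

def verify(root, leaf, index, proof):
    node = h(leaf)
    for sibling in proof:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root

leaves = [bytes([i]) for i in range(8)]     # 8 data items
root = merkle_root(leaves)
proof = merkle_proof(leaves, 5)
assert verify(root, leaves[5], 5, proof)
assert len(proof) == 3                      # log2(8) hashes; grows with n
```

Doubling the data adds one more hash to every proof, which is the hash-based baseline against which KZG's constant-size proofs are the standout improvement.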
At the same time, participants noted how intellectually stimulating this space is:
It’s the most exciting material some have engaged with since grad school.
But it is also distant, in both subject matter and time, from their original training, so it feels like catching up on a different discipline.
Trade-offs: Quantum-Resistant Alternatives and Defensive Downgrades
The group touched briefly on potential quantum-resistant replacements for KZG-style systems:
Quantum-safe alternatives would likely:
Provide similar functionality (polynomial commitments, succinct proofs) but with:
Larger proofs
Higher gas/compute costs
Stricter constraints on blob sizes or throughput
Outside reading referenced in the discussion described this as a “defensive downgrade”:
You pay more (in performance and cost) to keep the same feature set, gaining resistance to quantum adversaries in return.
Analogy: paying significantly more for health insurance to get equivalent or worse coverage.
This framing reinforced the core tension: security vs performance vs complexity, with Ethereum currently opting for a powerful but non-quantum-resistant scheme and betting on time horizons.
Wrap-Up
Key Takeaways
KZG polynomial commitments are a central piece of Ethereum’s scaling story, enabling constant-size proofs for large data (e.g., blobs for DAS), with applications beyond Ethereum.
The trusted setup ceremony (Powers of Tau) is both a deeply social and technical construct: 140k+ participants, but security formally requires only one honest contribution; real threats are as much about infrastructure (MITM) as about pure cryptography.
The scheme is not quantum-resistant and its security depends on the polynomial degree remaining well below the elliptic curve group size; current parameters make this safe, but it ties blob sizing to curve choices.
There is a clear “priesthood problem”: only a small global expert set can verify implementations or fully understand the math, which sits uneasily with some visions of trustless decentralization.
Most participants converged on the view that expert priesthoods are inevitable at the technological frontier; the important distinction is between open (documented, accessible, laddered) and closed (proprietary, gatekept) priesthoods.
The broader tech culture is shifting toward system-level restriction of capabilities for safety (AI, mobile OSs, DRM), in tension with crypto’s tradition of empowering users and relying on open designs.
Open Questions Raised
How exactly would a powerful quantum adversary attack KZG-based systems, and what are the real timelines and costs?
What concrete mechanisms did/does Ethereum use to defend the KZG ceremony against man-in-the-middle or spoofing attacks?
How large can blobs (and thus polynomial degrees) safely grow before current security assumptions become meaningfully weaker?
What would a practical quantum-resistant replacement for KZG look like in Ethereum, and how severe would the performance “defensive downgrade” be?
How can the community structurally encourage open priesthoods—lowering barriers to entry into the expert set—while still making forward progress on complex cryptography?
Yak Collective Discord call thread:
https://discord.com/channels/692111190851059762/1462003492024619215


