Interoperability's Product-Market Fit | Deep Dive

Primary Topic

This episode delves into the intricacies of blockchain interoperability and its implications for the future of multi-chain architectures in the cryptocurrency space.

Episode Summary

In this insightful episode of the "Bell Curve" podcast, hosted by Blockworks, Nikhil Suri from the Wormhole Foundation explores the evolving landscape of blockchain interoperability. Nikhil discusses how interoperability protocols like Wormhole are pivotal in connecting disparate blockchain networks, enhancing user experiences by allowing seamless interactions across various platforms. The conversation covers technical challenges, security concerns, and the future of multi-layer (L2, L3) solutions in blockchain infrastructure. Additionally, the role of zero-knowledge proofs (ZKPs) in improving the scalability and security of these networks is highlighted. The discussion also touches on the broader implications of blockchain interoperability for DeFi and the overarching blockchain ecosystem, emphasizing its necessity for the sustained growth and adoption of blockchain technologies.

Main Takeaways

  1. Blockchain interoperability is essential for a unified crypto ecosystem.
  2. Zero-knowledge proofs are key to enhancing security and scalability.
  3. Multi-layer solutions will significantly influence the future blockchain architecture.
  4. The role of social dynamics in technology adoption is as crucial as the technical aspects.
  5. Ongoing innovations and standardizations in blockchain are vital for its evolution.

Episode Chapters

1: Introduction to Interoperability

Nikhil Suri discusses the importance of interoperability in blockchain technology. He emphasizes how it enhances user experience by connecting different blockchain networks, allowing seamless interactions. Nikhil Suri: "Interoperability is at the core of a unified crypto experience."

2: Technical Challenges and Security

The challenges of implementing interoperability solutions and the new security considerations they introduce are explored. Nikhil Suri: "We need to ensure new solutions do not compromise on security."

3: The Future of Blockchain Layers

Discussion on the implications of L2 and L3 solutions and their potential to reduce costs and increase transaction speeds across networks. Nikhil Suri: "L2 and L3 solutions are game changers in blockchain scalability."

4: The Role of Zero-Knowledge Proofs

An in-depth look at how zero-knowledge proofs contribute to security and how they might shape the future of blockchain interoperability. Nikhil Suri: "Zero-knowledge proofs could revolutionize how blockchains communicate."

Actionable Advice

  1. Explore using interoperability protocols like Wormhole for blockchain projects to enhance user experience.
  2. Consider the implications of multi-layer solutions on your blockchain architecture.
  3. Keep abreast of developments in zero-knowledge proofs to enhance security measures.
  4. Engage in community discussions about standards to help shape the future of interoperability.
  5. Monitor advancements in blockchain technology to stay competitive.

About This Episode

In today's episode, Michael chats with Nikhil Suri from the Wormhole Foundation about the future of multichain and interoperability in crypto. They discuss the challenges of creating a seamless user experience across different chains, the role of general message passing protocols like Wormhole, and the potential impact of zero-knowledge proofs on scaling and security. Nikhil also shares his thoughts on the social challenges of interoperability standards adoption and the current product-market fit for cross-chain applications, particularly in the realm of token transfers and multichain governance.

People

Nikhil Suri, Michael Ippolito

Companies

Wormhole Foundation

Books

None

Guest Name(s):

Nikhil Suri

Content Warnings:

None

Transcript

Nikhil Suri

Wormhole contributors have been working with Tally and ScopeLift to build what we call MultiGov, which is a sort of first-of-its-kind multichain governance solution. And as more projects go multichain and expand out from the single chain they're deployed on to other chains that they view as having valuable user bases and valuable liquidity, the holders of their governance token are going to be distributed across all of these chains, and they're going to want to still be able to perform effective governance with that distributed set of token holders.

Michael Ippolito

Hey everyone, this episode is brought to you by Sei, the blazing-fast parallelized blockchain which is unlocking Solana-like performance for the vast ocean of ETH devs out there. Now, you're going to be hearing all about Sei and their new v2 upgrade. But if you take away one thing: the EVM is here to stay.

There are some problems with it, which we're going to get into later in the episode. But Sei, and especially their v2 upgrade, is helping solve that. So thank you very much, Sei, for making this episode possible. Hey everyone, this episode is brought to you by Kinto, the safety-first L2 which is accelerating the transition to the on-chain financial system. They've got some very cool features like user-owned KYC and native account abstraction, which solves a bunch of the onboarding and bottleneck problems that many L2s suffer from.

So you're going to be hearing all about them later in the program. But if you want to learn more, and better yet, if you want to join their launch program, Engen, you can click the link at the bottom of this episode. Go check them out. Tell them I sent you. If you're building anything in Web3, you likely need oracles and verifiable randomness.

That's why Bell Curve is partnering up with Supra, which offers the fastest oracles and dVRF free for listeners of Bell Curve for twelve months. And you can get that at supra.com/blockworks; the link is in the show notes. Or stick around, we're going to be talking about them later in the program. All right, everyone, welcome back to another episode of Bell Curve. Before we jump in, quick disclaimer.

The views expressed by my co-hosts today are their personal views, and they do not represent the views of any organization with which the co-hosts are associated. Nothing in the episode should be construed or relied upon as financial, technical, tax, legal, or other advice. You know the deal. Now let's jump into the episode. Hey, everyone, welcome back to another episode of Bell Curve. Today I'm joined by Nikhil Suri from the Wormhole Foundation.

Nikhil, welcome, man. Hey, Michael, thanks for having me on. Excited to be here. Yeah, man, I'm really pumped. It's great to chat with you.

I mean, I know we've officially closed down the season here on the multichain endgame, but Wormhole, you guys generously made that season possible. And also, you guys have basically one of the leading interoperability apps on the market today, and I figured there's no better place to start than the source, and to hear from you, who's deep in the weeds of building this stuff. So yeah, I'd love to frankly get your perspective, maybe ask you some of the questions that we asked a bunch of our guests this season as a little bit of a retrospective. But starting from a really high vantage point: what does the future of multichain look like? And maybe we could touch on the interoperability aspects there as well.

Nikhil Suri

Yeah, yeah, great question.

I guess the way I see it is: Wormhole is, at its core, an interoperability platform. Wormhole contributors believe that the future is going to be multichain. Today, the status quo is already multichain, and we think that the future is only going to become more multichain. There are only going to be more chains in the future, and there are going to be more requirements for interoperability between those chains.

And we see interoperability as really core to providing a good UX in crypto. If you have a bunch of siloed chains, the experience is really fragmented. Users are fragmented. Liquidity is fragmented. Applications are fragmented.

That's not a good future for crypto. A good future for crypto is where users can interact with all the applications in the entire crypto ecosystem, no matter what chain those applications are built on. There are different reasons to build applications on certain chains. Some chains are faster, some chains are more secure. So there are different trade-offs to every chain, which you already see today and will start seeing more of in the future.

And ultimately, it shouldn't matter to the user where an application is built. That matters to the developer. And the developer can build wherever they feel most comfortable, wherever they have the most expertise, and then they can use an interoperability solution, such as Wormhole, to bring users to them, to bring liquidity to them, and to access the rest of the broad crypto ecosystem.

Michael Ippolito

Hmm. Nikhil, let's get into some of the weeds on how that's possible. Because I think everyone's agreed with that, right? Which is: right now, if you're a user of crypto, especially a multichain user, it just feels crazy.

Like, if you're trying to go up to Base and speculate and go to one of their L3s or something like that, it's like, oh my God, I've got to route my ethereum maybe from my Coinbase down to ETH main chain. And forget about it if you have to go from Solana to Base or something like that. It just feels very difficult. We all agree that that's not the end state. But the question, at least when it comes to some of these interoperability protocols, is this: it would be really nice if users and liquidity were just 100% portable and fungible across different computing environments. The question is, what new security assumptions are we taking on by doing that?

So can you address just how we're actually solving this problem? And from a user standpoint, what's kind of going on behind the curtain, so to speak?

Nikhil Suri

Yeah, yeah, for sure. So, getting more into the weeds of how Wormhole works, which also informs how most other interoperability protocols today work: Wormhole is a general message passing protocol. That means that you have a source chain where you send a message from, and then that message can be delivered to a destination chain or multiple destination chains.

It depends on whether your model is bidirectional point-to-point or multicast. And that message can be any payload of data. That message can encode a token transfer, it can encode a governance message, it can encode an intent, it can encode an oracle price, it can encode anything. How that actually works today, for Wormhole and for most interoperability protocols, is that a message is emitted on the source chain with a payload of data. And then, in Wormhole's case, there are the 19 Wormhole guardians.

These are 19 very large, well-known validator companies that are fully doxxed, well-known actors who are trusted in the ecosystem, and they are all running full nodes for whichever source chain the message is emitted on in the Wormhole protocol. They each independently observe that message via their full nodes, so there's no additional trust assumption there. And then they each independently verify it and attest to it. In the case of Wormhole, when 13 out of 19 guardians have attested to a message that's been emitted on a source chain, that produces a valid attestation that can then be submitted to a destination chain. So that's a verified action in the Wormhole protocol.
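As a rough sketch of the 13-of-19 quorum rule described above (illustrative only, not Wormhole's actual implementation, which uses guardian signatures over signed messages; the names and structures here are invented):

```python
from dataclasses import dataclass, field

# Hypothetical names; the real guardian set is 19 known validator companies.
GUARDIANS = [f"guardian-{i}" for i in range(19)]
QUORUM = 13  # 13 of 19 attestations make a message verified

@dataclass
class Message:
    source_chain: str
    payload: bytes
    attestations: set = field(default_factory=set)

    def attest(self, guardian: str) -> None:
        # Each guardian independently observes the emitted message via its
        # own full node, then attests to it.
        if guardian in GUARDIANS:
            self.attestations.add(guardian)

    def is_verified(self) -> bool:
        # Once quorum is reached, the attestation can be submitted to a
        # destination chain as a verified action.
        return len(self.attestations) >= QUORUM

msg = Message(source_chain="ethereum", payload=b"transfer 100 tokens")
for g in GUARDIANS[:12]:
    msg.attest(g)
print(msg.is_verified())  # False: 12 of 19 is below quorum
msg.attest(GUARDIANS[12])
print(msg.is_verified())  # True: the 13th attestation reaches quorum
```

The point of the threshold is that no single guardian, and no minority of guardians, can produce a valid attestation on its own.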

And so this is the way that most interoperability protocols work today, because it's roughly the best that you can do. There's this huge push towards ZK and more trustless infrastructure, and we can talk about that more. That has a lot of engineering hurdles that need to be overcome: you need to reduce the cost, you need to make it faster. So there are a lot of things that need to happen to make ZK and trustless messaging possible. But today, Wormhole and most other interop protocols are backed by these trusted entities, as multisig bridges.

Michael Ippolito

Yeah, I think that's one of the challenges, even in a GMP, a general message passing type protocol, like Wormhole. We also talked to Bryan from LayerZero this season. There was a very interesting nuance and question that came up, which is basically, and my understanding here, I'm probably overgeneralizing, so correct me if I'm wrong, is that Wormhole provides the foundation or the framework for other entities to come in, leverage that neutral infrastructure, and then build their own security apparatus on top of that. And there was a very interesting debate this season with Hart about whether you should be, as the general message passing layer, opinionated. Because realistically, what people end up doing is they take this nice, credible, unopinionated infrastructure and they use it to run a multisig.

And this happens with a lot of the different protocols. There's nothing you can really do about that. I guess you could adopt the philosophy that actually, we don't even want to give our users the ability to do this, and we're going to be more opinionated about what that infrastructure looks like. How do you tackle that question? Because from an end-user standpoint, a lot of this stuff ends up being secured by multisigs. But from a Wormhole standpoint, I'm of the opinion that you should let your customers do what they want to do.

Nikhil Suri

Gotcha. Gotcha. Yeah, that's something that we think about a lot as Wormhole contributors, and that's one of the fine lines I think you have to tread when working on interoperability. Interoperability is, I think, a super complex space in crypto, because there are on-chain components, there are off-chain components, there are lots of components involved, there are lots of places where things can go wrong, and there are lots of different configurations that you can put those components in to interact together. And you need to figure out where to tread this fine line between ease of use versus flexibility for the integrator or flexibility for the user.

And so, at the core protocol level, I think it's best to give developers and integrators as much flexibility as possible. One thing I see time and time again from integrators I talk to, and from crypto projects generally, is that everyone likes to do things in similar, but very slightly different, ways. And so giving people and projects the flexibility to accommodate their specific use case is really important. That's on the developer side. When it comes to the user side, and actually building UIs and applications that end users are going to use, I think it's important to be a little bit more opinionated. Different UIs and different applications will approach these trade-offs in different ways, and that's why you have so many different kinds of applications that users can choose from, if that makes sense.

Michael Ippolito

It does. It does. Do you have any opinions yourself? I'd be curious, from your seat, on what the general architecture of Wormhole-powered applications is going to be. Here's the debate that's been happening a lot, especially within Ethereum circles right now: L2s versus L3s.

And there are these two sort of camps. On the one hand, there is a massive cost improvement going to L3s. You batch everything down to the L2; you don't actually need to touch Ethereum for those expensive settlement fees. And so there actually is, I don't know if it's a full order of magnitude, but there's a big cost reduction between being an L3 and an L2. Then on the other side of the camp, you get people that say, oh my God, L3s, what is it going to be next, L4s or L5s?
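The cost argument above is essentially amortization: an L3 batches many transactions into one settlement on a cheap L2 instead of paying expensive L1 settlement directly. With purely invented numbers (real fees vary by orders of magnitude with demand):

```python
# Invented numbers, for illustration only.
l1_settlement_cost = 10.00  # dollars to settle one batch on the L1
l2_settlement_cost = 0.10   # dollars to settle one batch on an L2
txs_per_batch = 1_000

cost_per_tx_on_l2 = l1_settlement_cost / txs_per_batch  # an L2 settles to the L1
cost_per_tx_on_l3 = l2_settlement_cost / txs_per_batch  # an L3 settles to the L2

print(f"{cost_per_tx_on_l2:.4f}")  # 0.0100
print(f"{cost_per_tx_on_l3:.4f}")  # 0.0001
print(f"{cost_per_tx_on_l2 / cost_per_tx_on_l3:.0f}x cheaper")
```

Whether the real gap is a full order of magnitude or more depends entirely on the fee levels of the layers involved, which is exactly what the two camps disagree about.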

And, this is maybe more of the Solana perspective, we should be putting this all on one chain. And I would just be curious: you guys are sitting at the absolute bottom of the stack, and I would be curious if you have an opinion about what's going on way up here.

Nikhil Suri

Yeah, I'm curious to see how the L2, L3, and, if we ever expand beyond L3s to L4s and L5s like you're talking about, how all of it turns out.

I personally still think that the L3 space is kind of nascent and emerging. It's really big in terms of narrative right now. But I'm still sort of working it out.

To me, it's not totally clear that L3s are going to win or that L3s are going to become super popular, because, for example, L2s are not at capacity right now. And there are L1 chains coming out, like Monad, which are planning to hyperscale and become very, very fast. So I personally am curious to see how this game turns out.

From a Wormhole protocol perspective, again, Wormhole sits at the absolute bottom of the stack, and Wormhole can work with any chain. So if there are exciting L3 chains, then Wormhole contributors would love to work with those chains as they come out. If there are exciting applications and lots of users going to those chains, then we're excited to work with them. But I personally am not convinced yet that L3s are going to win the day and everyone's going to move to L3s. I'm curious to see how this turns out.

Michael Ippolito

Yeah, I don't know. Well, okay, let me get your opinion on this, because this is another theme of the season that kept cropping up, which was ZK proofs. Popularized, I want to say, by Polygon. They're definitely not the only team working on this by any means, but I think they captured a lot of people's imagination with this idea of the AggLayer. And it's a simple concept, even if you just look at it from first principles. What does it mean to be a rollup?

It really has to do with the settlement contract and that extremely secure bridge from the L1 to the L2. And what a zero-knowledge proof says is that actually, if you take a signed message and you substitute that with a proof, suddenly you don't need a lot of the same apparatus from L2 to main chain. And especially once you start to batch and aggregate those proofs, the economics start to look quite a bit better as well. Looking through the lens of ZKPs, the rollup ecosystem and L2s versus L3s start to look pretty different.

So I'm not sure how much you've thought about this. I'm sure you probably have in some capacity as an interoperability-focused protocol. But how do zero-knowledge proofs change the game here, and do they change how you think about L2s versus L3s at all?

Nikhil Suri

Yeah, that's an interesting question. I guess I'll share my general thoughts on ZK first, and then we can talk about how it affects L2s, L3s, and other specific parts of the crypto ecosystem.

My general thoughts on ZK proofs are that there's a lot of work happening on ZK, and a lot of investment being put into ZK, specifically from a scaling perspective. And from an interop perspective, the part about ZK proofs that's most interesting and valuable is the succinctness.

So the ability to quickly verify a proof that something happened on another chain, that's really valuable from an interop perspective. That's also really valuable from an L2 scaling perspective.
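A Merkle inclusion proof is not a zero-knowledge proof, but it is a familiar example of the same succinctness property: a verifier checks that one transaction belongs to a large set using only a logarithmic number of hashes, rather than re-reading the whole set. A self-contained sketch:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate the last node on odd-sized levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Collect the sibling hashes needed to recompute the root from one leaf."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        proof.append(level[index ^ 1])  # sibling of the current node
        index //= 2
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return proof

def verify(root, leaf, index, proof):
    # The verifier only needs the leaf plus log2(n) sibling hashes,
    # not the full transaction set: that is the succinctness property.
    node = h(leaf)
    for sibling in proof:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root

leaves = [f"tx-{i}".encode() for i in range(8)]
root = merkle_root(leaves)
proof = merkle_proof(leaves, 5)
print(len(proof))                       # 3 sibling hashes for 8 leaves
print(verify(root, b"tx-5", 5, proof))  # True
print(verify(root, b"tx-6", 5, proof))  # False: wrong leaf fails
```

A ZK validity proof pushes the same idea much further: one short proof can attest to an entire state transition, not just membership of one element.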

So these ZK proofs are possible today; people have created ZK proofs. But there are a couple of things that need to be improved about them: they need to be made faster, and they need to be made cheaper.

Right now, it takes a lot of compute power to generate these ZK proofs, and it can take a long time. That's what a lot of research and work has been going into, and that's what a lot of Wormhole contributors have been putting work into: making ZK proofs faster and cheaper to generate. So those are my general thoughts on ZK. I think that things in the interop world are trending towards ZK.

Things will become ZK. You will see corridors getting more trustless over time, but it will take time, because you need to roll out ZK proofs for all these different kinds of source chains and different VMs and things like that. I think ZK proofs on Solana are also a pretty complicated thing that hasn't really been solved fully yet. When it comes to the impact on L2s and L3s, it'll help them scale from an interop perspective. Again, when Wormhole starts releasing ZK messaging, and when we start to build more standards around ZK proofs, perhaps that can help Wormhole scale to more chains. But I think that remains to be seen.

Michael Ippolito

Can you say a little bit more about why it's more complicated to generate ZK proofs for, say, an ecosystem like Solana versus an EVM-based ecosystem?

Nikhil Suri

Yeah, for sure.

My understanding here isn't super deep, but at least from when I've talked to contributors who work on the ZK side of Wormhole, it's that Solana has a bit more of a complex consensus mechanism than Ethereum, and one that also changes a lot more frequently than Ethereum's. The ETH consensus mechanism is pretty standardized; it doesn't change too often.

And light clients have already been developed for ETH; they've been in development for a long time, and there's a very active ongoing ZK effort there. On the Solana side, it seems like the consensus mechanism changes a lot more frequently, so it's harder to build ZK light clients that don't need to be updated.

Michael Ippolito

Okay. I mean, yeah, that does make sense.

Come on, Solana, standardize your stuff. No, just kidding. Yeah, it's a challenge, man. I mean, that's one of the things about these ecosystems: the balance of how much you want to upgrade stuff versus just letting sleeping dogs lie.

It's a balance. So, I interrupted you a little bit. Going back to maybe one of the challenges, at least from an interop standpoint, around ZK proofs: the cost and latency. Because whenever you talk to someone who's deep in ZK, it's incredible, right?

The benefits really are incredible. It's a cool example of very moon-math tech, where the concepts are very interesting and very complicated. The implementation is definitely very tough. It's still the wild west, although SP1 and RISC Zero's zkVM are helping a lot. But it's still very much on the edge.

And the question I always ask is, if this is so great, why aren't people using it? And it usually comes down to cost and latency. And that's where I think you're starting to see a lot of interest in these aggregation layers for ZK proofs. Do you see that as a layer which ends up accruing value? That seems like an economies-of-scale type game, where you're trying to amortize costs across a whole bunch of different proofs. Do you think that ends up being kind of a sticky value-accrual layer? What do you think about that?

Nikhil Suri

Yeah, you might have to remind me why agg layers are able to aggregate ZK proofs from a lot of different ecosystems.

I'm not sure that they actually can, from a bunch of different ecosystems.

Michael Ippolito

For instance, with Polygon, the dream scenario, and again, if anyone from Polygon is listening, feel free to push back if my understanding is incorrect, because I'm also venturing out into challenging territory here, would be to say: hey, especially for a bunch of the ZK rollups, there are a lot of proofs being generated. Provers are still, realistically, like one gigantic computer in a server room somewhere, churning out proofs in a very expensive way. Batching it down to Ethereum is expensive.

And wouldn't it be nice if, instead of doing that, you could aggregate their signatures, or whatever, and then batch that down one time? So it should be a cost-saving sort of win-win. And my sense is that the other rollups view this as competitive. It's like, because of that, Polygon has really shifted the way that they talk about this layer as being extremely neutral now. So my sense is that it's been a social challenge to try to get other rollups to opt into it.
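The aggregation pitch reduces to simple arithmetic: n rollups each verifying their own proof on the settlement layer pay for n verifications, while an aggregated proof pays for one verification plus a fixed combining overhead. The numbers below are invented purely for illustration:

```python
# Invented numbers, for illustration only.
verify_cost = 300_000           # gas to verify one proof on the settlement layer
aggregation_overhead = 400_000  # extra cost of the combined (aggregated) proof
n_rollups = 10

individual_total = n_rollups * verify_cost             # every rollup settles alone
aggregated_total = verify_cost + aggregation_overhead  # one shared verification

print(individual_total)  # 3000000
print(aggregated_total)  # 700000
```

The savings grow with the number of participants, which is why, as discussed here, getting rollups to actually opt in is the hard part: the economics only work if everyone shares the same layer.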

But I'm a little less clear that you could very easily do Solana-based ZK proofs and batch them with something happening on Polygon.

Nikhil Suri

Got it. Yeah, that makes sense. I would say I haven't dived too much into the weeds of Polygon's AggLayer. But generally, my thoughts on ZK proof generation and speeding up ZK proofs are that it's a big problem to solve. First, on the technical side: how do you generate ZK proofs faster and more cheaply, and how do you speed up generation while doing it cheaply?

And then, how do you create, and how do you incentivize, a marketplace of provers who will actually generate these proofs for you? I think that the protocol that can crack that is the protocol which will accrue a lot of value in ZK messaging and ZK interoperability. It's something that Wormhole contributors are working a lot on; that's why we've brought on a lot of ZK contributors to help us.
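One hypothetical shape for such a prover marketplace: provers bid a price and an estimated proving time, and the protocol selects the cheapest bid that meets its latency target. This is purely a sketch; the names and the selection rule are invented, not a description of any real protocol:

```python
from dataclasses import dataclass

@dataclass
class ProverBid:
    prover: str
    price: float         # fee demanded for generating the proof
    proving_time: float  # seconds the prover estimates it will take

def select_prover(bids, max_latency):
    """Pick the cheapest prover whose estimated proving time meets the deadline."""
    eligible = [b for b in bids if b.proving_time <= max_latency]
    if not eligible:
        return None  # no prover can meet the deadline: a reliability risk
    return min(eligible, key=lambda b: b.price)

bids = [
    ProverBid("alice", price=5.0, proving_time=120.0),
    ProverBid("bob",   price=2.0, proving_time=600.0),
    ProverBid("carol", price=3.0, proving_time=90.0),
]
winner = select_prover(bids, max_latency=300.0)
print(winner.prover)  # carol: bob is cheaper but too slow
```

The reliability point in the discussion maps directly onto the `None` branch: if the fee on offer attracts no prover who can meet the deadline, the corridor simply stalls.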

But yeah, I think a lot of value can accrue to the messaging protocol that does that, because it'll be the most trustless kind of messaging, hopefully the fastest kind of ZK messaging, and also the most reliable kind of interop protocol. I think a ZK protocol that isn't able to properly incentivize actors to generate proofs is not going to be as reliable, because if it costs a lot of money to generate these proofs, you need to find someone who's going to do it.

Michael Ippolito

Hey everyone, wanted to give a big shout-out to today's title sponsor, Sei. Now, I want to talk to you guys a little bit about why I think Sei is cool from a design standpoint, a big problem that it solves for ETH devs out there, and then some cool stuff that Sei has coming up. The reason I like Sei from an architecture perspective is, again, it's a very fast blockchain: parallelization, all of that stuff.

But Sei has essentially been custom-building block space for consumer apps and DEXs. Now, they have some very cool features which enable that: twin-turbo consensus, optimistic parallelization, SeiDB. All of this stuff allows you to reduce the time to finality and make for very, very fast transactions. If you're building a consumer app or DEX, this is basically the blockchain for you. Maybe you've been building on the EVM, you love the EVM, but there are some restrictions about it that don't support your app.

So maybe you can't do fast enough transactions, or it's not parallelized, whatever it is. You can now take all that stuff that you built, you don't have to start from scratch, and you can build it on Sei. Now, recently they've launched v2, but also a public devnet. So the way that you can follow that and keep up to date is to go follow Sei Network on Twitter.

All right, thanks, guys. Hi, everyone. This episode is brought to you by Kinto, the safety-first L2 with user-owned KYC and native account abstraction that is accelerating the transition to the on-chain financial system. Now, we've been talking about Larry Fink on this program, and how he's been shouting the good word about tokenization to all of his buddies on Wall Street. This is happening.

And the very cool part about Kinto is that they are built to accept a whole bunch of assets and a whole bunch of users from the traditional financial system, which current L2s have done an okay job at, but there's a lot of room for improvement in onboarding. So it features very nice things like user-owned KYC and native account abstraction that solve some of the biggest blockers around mainstream adoption, namely security and user experience. You can become a founding member at Engen if you believe in this on-chain financial system. Join Kinto's launch program and thank me later.

Cheers, guys. The link is at the bottom of this episode; click it now. Hey, everyone. This episode is brought to you by Supra, delivering the freshest oracle price feeds across 50-plus blockchains.

When it comes to crypto, speed and security are both critical. And luckily for y'all, Supra has you covered on both of those things. Be that for liquidation triggers or critical price levels, whatever it is, Supra's got your back. The other great thing about Supra: they love listeners of Bell Curve.

So they're going to hook you guys up with twelve months of free oracle services and verifiable randomness. They call it dVRF: decentralized verifiable randomness. So, twelve months free, you just have to go to supra.com/blockworks. Two more cool things about Supra: they're really secure and easy to integrate, and they run on twelve times lower gas fees than other oracles out there.

So you're going to save on that as well. If you're listening and you know builders that need services like Supra, you can actually refer them and get $1,500 in referrals. So to get all these goodies, you just have to go to supra.com/blockworks. And again, the link there is at the bottom of the show notes. Click that.

So they know that I sent you. Cheers, guys. So, final question on this. How much of the interoperability challenge do you see as a social challenge versus a technical challenge? And what I mean by that is: really, this should all be easy if we all just adopted the same set of standards.

The very next question everyone asks is, whose standards? And then someone proposes: I think it should be my standards. And that, it feels like, is where a bunch of these conversations end up getting shut down. So how would you weigh that, between technical challenge and social challenge? And if it's mostly social, how do we solve that?

Nikhil Suri

Yeah, that's a really good question. I frankly think ZK proofs are underway; we're going to have ZK messaging rolling out on various corridors for different kinds of VMs and chains.

I'm almost of the opinion that a lot of the technical challenges in interop are being thoroughly worked on and are on their way to being solved. ZK messaging, more flexibility for integrators, different kinds of cross-chain architectures: all these things have had a lot of work put into them, and mature existing solutions are out there. Wormhole is a very mature interop protocol that's battle-tested, and so are the other big interop protocols that are out there. They're pretty mature and battle-tested.

So I do think there is a big social problem to solve in terms of interop standards and the adoption of which protocols to use. And I go back and forth on this. Sometimes I think that it's a matter of, what's the right word, one chain adopting a single interop standard. And then other times I think it should always be up to applications to adopt whichever interop standard works best for their use case. Just like there are lots of chains, there are different interop standards, or interop protocols, and they have different trade-offs. And you as the developer should adopt whichever one works for your use case. And then, under the hood, the end user isn't going to realize which interop protocol is being used.

And there are ways to build applications that incorporate different kinds of interop protocols without fragmenting liquidity, ways that keep liquidity unified and keep users unified, if that makes sense. Yeah, I tend to agree with you on the application nuance. If I were a maximalist in one sense, it would be that I think the applications are going to end up driving the show at the end of the day. And if there are on-chain apps or on-chain products that have tens of millions of active users, the infrastructure is going to bend over backwards to support them in whatever shape or form they need.

Michael Ippolito

Yeah, that's the weird thing right now. Maybe that's part of the challenge: there aren't really those apps with product-market fit yet. So you've got a bunch of infrastructure companies jockeying, my standards versus your standards, and it's almost all theoretical at this point, because we don't know which apps are going to be successful or what they're going to need. There are a lot of circular arguments.

Maybe that's part of the reason. Yeah. It's a problem I think crypto generally needs to solve. It's interesting: within crypto, interop has tremendous product-market fit. A lot of protocols are going multi-chain, and a lot of protocols see going multi-chain as valuable to access more liquidity, access more users, and just generally grow their project.

Nikhil Suri

Then there are end applications, DEXs, all the DeFi applications, games, governance, et cetera, that have some product-market fit with the crypto superusers but are still trying to break into the moms and pops of the world. I think that's generally a problem crypto needs to solve. But I agree with you. In the interop world, I lean towards thinking that there are different interop solutions, just like there are different chains, and each has its own trade-offs.

And the developer ultimately is going to choose which one works for them, and they can use an interop solution without fragmenting liquidity while still being able to compose with other kinds of applications on the different chains they deploy to. So the interop solution is going to sit in the background and power the rails of cross-chain communication, and ultimately it's not really going to matter to the end user what interop solution is being used, as long as the developer architects their solution in a user-friendly way. Let's talk a little bit about where the product-market fit for interop exists today. You know, you sit in a very privileged seat from this particular standpoint. You see a lot of why applications want to build in a cross-chain way.

Michael Ippolito

You see why users might want to move from chain to chain. Can I push you a little bit more on that? When you say that interop has product-market fit, what do you mean exactly by that? Yeah, I guess the strongest product-market fit for interop is token transfers. There's a lot of cross-chain token transfer activity, and protocols also view taking their token cross-chain as important to growing their user base.

Nikhil Suri

So you see this, for example, with JitoSOL. JitoSOL recently went to Ethereum using Wormhole native token transfers. Jito is expanding their protocol to Ethereum, and as a result they want to move their governance token over as well, so that users on Ethereum are able to trade the governance token and get exposure to that price action. Ultimately that's going to evolve into cross-chain governance. For example, maybe Jito is going to launch a multi-chain DAO in the future, and maybe other protocols will launch multi-chain DAOs as they bring their governance tokens over and generally move their projects multi-chain.
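For context, native token transfers generally follow a burn-and-mint pattern: tokens are burned on the source chain, the burn is attested by the messaging layer, and the same amount is minted on the destination, so one canonical token keeps a constant total supply instead of spawning wrapped copies. A toy sketch under those assumptions (illustrative names only, not Wormhole's actual contracts):

```python
class ChainLedger:
    """Toy per-chain token ledger (hypothetical, for illustration only)."""

    def __init__(self, name: str):
        self.name = name
        self.balances: dict[str, int] = {}

    def mint(self, holder: str, amount: int) -> None:
        self.balances[holder] = self.balances.get(holder, 0) + amount

    def burn(self, holder: str, amount: int) -> None:
        if self.balances.get(holder, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[holder] -= amount

def native_token_transfer(src: ChainLedger, dst: ChainLedger,
                          holder: str, amount: int) -> None:
    # Burn on the source chain first; the attested burn message then
    # authorizes a mint on the destination. Total supply across chains
    # stays constant, so there is one canonical token, not a wrapped copy.
    src.burn(holder, amount)
    message = {"holder": holder, "amount": amount, "dest": dst.name}
    # ...in a real system the message is attested off-chain before minting...
    dst.mint(message["holder"], message["amount"])

solana = ChainLedger("solana")
ethereum = ChainLedger("ethereum")
solana.mint("alice", 100)
native_token_transfer(solana, ethereum, "alice", 40)
```

After the transfer, alice holds 60 on the source and 40 on the destination, and the combined supply is still 100.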

So from my perspective, it seems that a lot of projects see going multi-chain as a great way to grow their user base and grow their protocol. Yeah, I haven't thought about this in a while, but the multi-chain governance one is interesting. You'd think it would be fine and simple, but remember that these things have to happen with on-chain votes passing and, you know, bridging votes. It's tough, man. And you also run into some funny issues; basically, any logic that has to span from one chain to another is much tougher to bridge over.

Michael Ippolito

That's why NFTs in general are still pretty tough. I mean, I guess this is the bull case for Wormhole, for y'all to figure this stuff out, right? And yeah, I think multi-chain governance is a pretty exciting topic. That's something that we've been working on a lot. A lot, actually.

Nikhil Suri

So we recently announced a partnership with Tally and ScopeLift. Tally is a well-known team that runs the Tally UI, which hosts governance proposals and governance voting for a lot of popular protocols. And ScopeLift is a really talented smart contract development team that's worked on a lot of governance protocols in the past. Wormhole contributors have been working with Tally and ScopeLift to build what we call MultiGov, which is a sort of first-of-its-kind multi-chain governance solution. There are some more technical details in the announcement blog post, and I'm happy to dive into it more here, but I think that's an exciting use case. As more projects go multi-chain and expand out from the single chain they're deployed on to other chains that they view as having valuable user bases and valuable liquidity, they're going to extend their governance tokens to those chains as well.

And then their governance token holders are going to be distributed across all of these chains, and they're going to want to still be able to perform effective governance with that distributed set of token holders. Yeah, I'm pumped that you guys are doing this, because it feels like one of those things that's super necessary and, again, should be kind of easy. But just because of the realities of how blockchains work today, it's a little bit complicated. Yeah, for sure it's complicated. There are a bunch of edge cases that you need to consider.

You know, chains go down, and chains aren't always on the same timestamp; there's timestamp drift between chains. So there are a lot of different kinds of edge cases you need to consider. It's a fun architectural problem to work through, but it's one we view as really important to solve. How does timestamp drift between chains get resolved?

Michael Ippolito

I didn't realize that was a thing. Yeah, it's an interesting problem.

Nikhil Suri

Chains are never going to have the same timestamp; one is always going to be slightly ahead and one slightly behind. You hope that they stay roughly aligned, or within a certain range of alignment, but it is entirely possible that one chain really drags for a certain period of time. And there are different kinds of solutions that others have put out there in the past to try to mitigate this problem.

I don't think there's a way to 100% solve this problem. There's a way to solve it with, hopefully, 99% certainty, but there's not a way to fully eliminate the timestamp drift issue. I don't want to get too into the weeds here, but there's an interesting solution for multi-chain and cross-chain voting built by the PoolTogether team that uses checkpoints, and a range of checkpoints, to check that users aren't able to double vote because of timestamp drift in cross-chain governance designs. Damn, that's cool. I'll have to get one of those guys on the pod. That is definitely in the weeds, but also super cool.
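As a rough illustration of the checkpoint idea (my own toy sketch, not PoolTogether's actual design): each chain records timestamped balance checkpoints per holder, and voting power for a proposal is read at the latest checkpoint at or before the proposal's snapshot time. Tokens bridged after the snapshot only show up in later checkpoints, so drift between chain clocks can't turn one balance into two votes against the same snapshot:

```python
from bisect import bisect_right

class CheckpointedBalances:
    """One per chain: append-only (timestamp, balance) history per holder."""

    def __init__(self):
        self.history = {}  # holder -> list of (timestamp, balance), in time order

    def write(self, holder: str, ts: int, balance: int) -> None:
        self.history.setdefault(holder, []).append((ts, balance))

    def balance_at(self, holder: str, snapshot_ts: int) -> int:
        """Balance at the latest checkpoint with timestamp <= snapshot_ts."""
        cps = self.history.get(holder, [])
        i = bisect_right(cps, (snapshot_ts, float("inf")))
        return cps[i - 1][1] if i else 0

def cross_chain_voting_power(chains, holder: str, snapshot_ts: int) -> int:
    # Sum the holder's snapshot balance across chains. Tokens bridged after
    # snapshot_ts appear only in post-snapshot checkpoints, so a drifting
    # clock on one chain can't count the same tokens twice.
    return sum(chain.balance_at(holder, snapshot_ts) for chain in chains)

eth, sol = CheckpointedBalances(), CheckpointedBalances()
eth.write("alice", 100, 50)   # alice holds 50 on one chain
eth.write("alice", 120, 0)    # she bridges all of it out at t=120
sol.write("alice", 125, 50)   # it arrives; destination clock runs ahead
```

With a snapshot at t=110 (before the bridge) or t=130 (after), alice's total voting power is 50 either way; the drift between 120 and 125 never yields 100.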

Michael Ippolito

You know what it reminds me of a little bit? Two things, actually. When they first launched satellites into space, they kept noticing that the clocks wouldn't stay in sync; after a while up there, a satellite's clock would be slightly off from time on Earth. And it's because time actually moves differently up in space, which, if you're a space nerd at all, is one of the cooler things.

I just think that's so awesome. The other example is the Y2K bug. I'm going to date myself here, I suppose: I was alive for Y2K, but I wasn't paying attention to the business impact of it, the decision not to include the first two digits of the year, and the amount of time, effort, and fear that caused. These database architecture questions end up being, well, can't we just solve this?

But they're thorny, thorny problems, you know? Yeah, a lot of crypto development can be compared to database architecture and eventual consistency questions, which are problems that a lot of people have put a lot of time into solving on the web2 side. Then people come over to web3 and can bring that knowledge with them.

So here's a question that we asked our listeners this season. Maybe to set it up a little bit: I think most of the people who were either all-in on bitcoin or all-in on Ethereum are gradually moving away from that view and adopting more of a multi-chain ecosystem picture. How many L1s are there going to be? How many different VMs are there going to be? Is it the EVM, the SVM, and Move? Or is it just the EVM?

However many there end up being, what is the Pareto or power law distribution going to look like? We're less sure about that. But one of the challenges that you and I have been getting into a little bit is this: I think what has tempted people to say we should just put it all on one layer is that it would be a lot less complex. In a sense, you could actually make fewer trade-offs.

And it doesn't seem like that's what's going to happen. It seems like there are going to be multiple different ecosystems, and as you have these different ecosystems, maybe you end up making some trade-offs. And where people tend to land on this spectrum is, well, if we make these trade-offs, then we might as well just be doing tradfi, and they kind of throw their hands up. Personally, I believe there's a lot of gray area and a lot of benefit to crypto outside of some of the extremely early, very narrowly defined viewpoints.

So maybe with that qualification: what would a good outcome for crypto look like for you if we're having this conversation in ten years and a bunch of new stuff has happened? We've got many chains, some of the infrastructure has been solved, but maybe we also had to make some trade-offs along the way. What would make you say, you know what, this was a good experiment, it was worthwhile to run, and I would call crypto a success? And how would things have to play out for you to say it ultimately was not successful? I've gone back and forth on what I think the ultimate purpose and value of the crypto ecosystem is. From the perspective of improving global financial infrastructure: if in ten years we can look back and say that crypto was able to improve payments, let's say cross-border payments or remittances, right.

Nikhil Suri

Make those a lot cheaper, make those a lot more effective, and ultimately help a lot of people pay using USDC or USDT or what have you. That's one good outcome for crypto: it improves the experience of paying and the experience of cross-border payments and remittances for people across the world. I think another good outcome could be that you look back in ten years and see that a lot of institutions have onboarded onto DeFi and a lot of liquidity has come into the ecosystem.

DeFi is a burgeoning space. It's a way for people to hedge risk. If DeFi exchanges become very popular, I think that's another success for improving the global financial ecosystem.

At the same time, I think a lot of the value in crypto is not necessarily focused on improving the global financial ecosystem.

People enjoy the cultural side of crypto. People enjoy NFTs, people enjoy games, people enjoy meme coins. People enjoy being part of a project and having a say in the direction of a protocol. And if we look back in ten years and see that crypto provided a lot of cultural value, a lot of entertainment value, a lot of the feeling of being part of building something new, then I think that's also a success.

Does that make sense? Sure does. Yeah. You know, crypto is, in a lot of ways, a tough industry to work in for a long period of time. It's really volatile.

Michael Ippolito

It's kind of like that thing where what you love about someone is also what you don't like about them. That's how I feel about crypto. The volatility, the human emotions.

It's tough to go through. People make this joke: either it's so over or we're so back, and there's nothing in between. Your dopamine receptors are getting fried every couple of weeks, or you're in a pit of despair. That's tough. But on the other hand, there's no industry that is more dynamic, that pushes the boundaries more, that innovates harder, that is more open to new ideas. It's really addicting.

It'd be very difficult, at least for me at this point in my crypto journey, to even consider working in another industry, because it would just feel like a massive step back. Yeah, I agree with you. I love working in crypto because it's so exciting and because it changes so much. And maybe that's another thing that, if you asked me in ten years to look back on crypto, I would call a success.

Nikhil Suri

There's so much experimentation in crypto. People are trying things that haven't been tried before. They're trying new ways to organize governance. They're trying new ways of monetizing culture.

They're trying new kinds of exchanges. DEXs are a new kind of exchange; cross-chain borrow-lend protocols, and a lot of DeFi in general, are new creations. So I do think it's also a success if crypto just continues to experiment and continues to push the bounds of what's possible, and that ultimately informs the non-crypto world about what people find valuable and what can be done. I agree. I had a conversation with someone today about this.

Michael Ippolito

I think one of the pros and cons of crypto as it exists today, and I'm not actually sure there's a real solution for it, my attitude being that you have to take the good with the bad a little bit, is, again on this theme of trade-offs, that the thing that's best about something is also kind of the worst thing about it. In this case it's how fast crypto moves. Part of the reason I think tradfi is so ripe for disruption is that it's heavily regulated and not much innovation ends up happening.

There's a really good reason for that: the driving factor that lets most financial services or asset management businesses survive for the long term is safety. That is the value proposition when it comes to money. You don't actually want people pushing the boundaries a lot of the time. You want them saying, hey, I've got money, I'd like it to be safe.

That's the most important thing. On the flip side, though, that's why it's such an ossified industry that I would hate to work in. Crypto is the opposite of that: it moves extremely fast and it's really dynamic. On the other hand, look at how many billions of dollars get lost to hacks every year.

There is a trade-off. There are plenty of projects in crypto that thread that needle, and they push hard. And it's one of those things that, unfortunately, comes down to survivorship bias. If it works out, everyone's going to say you're a genius who pushed the boundaries; but if you lose a lot of people's money, guess what, no one's going to have any sympathy for you.

I don't know, I'm kind of just rambling here, but it's a unique challenge of the industry, and maybe you just have to take the good with the bad a little bit. Yeah, I guess I'm curious to see how it plays out. Crypto is a dark forest.

Nikhil Suri

There are a lot of hackers out there. The focus on open source is good in theory: it improves security, but it also invites hackers, black hat and white hat alike, to try to break protocols.

The focus on permissionlessness is kind of the same. And so I'm curious to see how it'll turn out. But I think generally, as an industry, we're getting better at it. We're getting better at making victims of hacks whole.

We're getting better at incentivizing white hat hackers to focus on security and find bugs. So I think, as an industry, we're getting better at it. I'm curious to see how it all turns out. Totally. And by the way, just a caveat.

Michael Ippolito

I don't want my comments to be misconstrued as saying it's fine if people scam other people. That is not what I'm talking about at all.

And frankly, a lot of the hacking that goes on in crypto is just straight-up illegal. Shout out to Avi and his highly profitable trading strategy. That's just market manipulation.

That's just illegal. I was more talking about the fine line between pushing the bounds of innovation and moving quickly, and all the pros and cons that go along with that. But I agree with you. And you know what else I was thinking about the other day? Remember when it used to be much more common to get computer viruses? You'd think, ah man, I don't really want a PC, because you'll get a computer virus. And if you were on LimeWire downloading songs back in the day, you could just crash your entire computer.

In some instances you just broke the computer, and then that kind of became a solved problem. I sort of feel like that's the way it's going to go with crypto eventually as well. Yeah, optimistically, I agree.

But realistically, I think it's good to be an optimist in the long run and pragmatic in the short run. So, yeah, Nikhil, man, this has been a ton of fun. Do you have any parting words for listeners who maybe aren't as familiar with Wormhole? Or can you give the folks a little bit of alpha, without betraying any state secrets, on anything coming up that users might be excited about? However you want to close things out. Yeah, for sure. Wormhole contributors are really focused on improving cross-chain UX: making the experience of going cross-chain cheaper and more effective, and building an entire ecosystem of applications that work cross-chain.

Nikhil Suri

So we have a lot of products coming out. Multi-chain governance is one such example. There are other products coming that will improve liquidity for cross-chain transfers, products that will improve the relaying infrastructure, and ZK messaging is on the way. So there's a lot that Wormhole contributors are working on that's super exciting, that will generally improve cross-chain UX and make the entire infrastructure more secure and easier to use.

So I'm super excited for that. Awesome, man. Well, congrats, and I appreciate all the good work you're doing at Wormhole. This was a lot of fun.

Michael Ippolito

We should do it again sometime soon. Yeah, thanks for having me on, Michael. Enjoyed it. Cheers, man.

Hey, everyone, want to give a final shout out to this episode's title sponsor, Sei. There are a whole bunch of really exciting reasons to be building on Sei v2 outside of just parallelization. Head over to sei.io to look into building on their public devnet. Again, click the link at the bottom of this episode, head over to sei.io, and start building something today.