
Gnosis and the future of web3

On a recent trip to Berlin, I was very excited and extremely appreciative to have the opportunity to meet with the COO of Gnosis, Friederike Ernst. The Gnosis team has led so many movements in web3, so it was great to hear her thoughts on how they see the future of web3.

Aug 31, 2022 • 16 mins

Author: Angus Mackay

Angus: My first question is about Futarchy and prediction markets. Prediction markets are easily explored where there are objective questions to be answered. Where the questions posed are more subjective it’s hard to boil an answer down to a single metric. Does Gnosis believe there are very broad applications for prediction markets?

Friederike: So, at the core, that's an oracle problem: how do you ascertain what is true and what is not? That's often a difficult question, and one we've never actually tried to solve ourselves. We've mostly used trustless oracle providers, where anyone can take a view on how something will turn out, and if other people don't agree, there's an escalation game. This process continues and attracts more money until a collective truth can be found. The idea is that you basically don't need the escalation game; just the fact that it's there keeps people honest.
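The escalation game Friederike describes can be sketched roughly as a bond-doubling oracle (in the spirit of systems like Reality.eth; the class, names and the doubling rule here are illustrative assumptions, not Gnosis's actual implementation):

```python
class EscalationOracle:
    """Toy bond-escalation oracle: anyone may post a new answer by at
    least doubling the previous bond; the last unchallenged answer wins.
    Illustrative sketch only, not a real oracle contract."""

    def __init__(self, question: str):
        self.question = question
        self.answer = None
        self.bond = 0.0
        self.finalised = False

    def submit(self, answer: str, bond: float):
        if self.finalised:
            raise RuntimeError("question already finalised")
        # A challenger must at least double the standing bond (min bond 1.0).
        if bond < max(2 * self.bond, 1.0):
            raise ValueError("bond must at least double the previous bond")
        self.answer, self.bond = answer, bond

    def finalise(self) -> str:
        # In practice this would happen after a timeout with no challenge.
        self.finalised = True
        return self.answer
```

Because each challenge must at least double the bond, disagreements escalate quickly and attract more money, which is what keeps honest answers cheap: most questions finalise after a single uncontested answer.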

Angus: Other than the sports and political use cases that you usually associate with prediction markets, what do you see are the opportunities to expand their application?

Friederike: Our core business model has pivoted away from prediction markets. I actually do think there’s a large arena in which we’ll see prediction markets happen and play out in the future. I just think there’s a paradigm shift that needs to happen first.

As prediction markets work today, you use a fiat currency as collateral (Euros, USD, AUD etc.) and you have a question. If the question turns out to be true then the people holding the YES token win the collective stake of those people who hold the NO token. If the question turns out to be false then the people holding the NO token win the collective stake of those people who hold the YES token.
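The settlement mechanics of a fiat-collateralised market like the one just described can be sketched in a few lines (a toy model for illustration only, not Omen's actual contract logic):

```python
from dataclasses import dataclass


@dataclass
class BinaryMarket:
    """Toy fiat-collateralised binary prediction market (illustrative only)."""
    yes_stakes: dict  # address -> collateral staked on YES
    no_stakes: dict   # address -> collateral staked on NO

    def settle(self, outcome_is_yes: bool) -> dict:
        """Winners get their stake back plus a pro-rata share of the losers' pool."""
        winners = self.yes_stakes if outcome_is_yes else self.no_stakes
        losers = self.no_stakes if outcome_is_yes else self.yes_stakes
        winning_total = sum(winners.values())
        losing_total = sum(losers.values())
        return {
            addr: stake + losing_total * stake / winning_total
            for addr, stake in winners.items()
        }
```

Note the zero-sum property Friederike comes back to: the payouts always sum to exactly the collateral staked, so one side's win is precisely the other side's loss.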

Typically prediction markets run over quite some time. An example would be: will Elon Musk actually buy Twitter? Let's say this plays out until the end of the year. That's the typical period you would give your prediction market to run, which means your collateral is locked up for six months. That's really bad for capital efficiency, because you could do other things with that collateral in the meantime.

Angus: Could you use a deposit bond or another type of promise to pay so you’re not tying up your capital?

Friederike: In principle that's possible, it just makes the trustless design a lot harder because you need to make sure that people don't overcommit. In a way this is what happens with insurance. Insurance markets are predictions on whether certain events will happen, like your house burning down. It pays out if the answer is yes and it doesn't pay out if the answer is no. The way that the insurance company gets around the capital efficiency problem here is by pooling a lot of similar risks. Actuaries then work out which risks are orthogonal to each other so their portfolios are not highly correlated.

Again, in principle you can do that, but it's very difficult to actually make these in a trustless manner, especially for prediction markets. There are types of insurance that run over a very short amount of time. Flight insurance is a good example. If you're flying out to London next week and you want insurance that pays out if your flight is delayed, the probability may be 10–20% of a delay occurring. You could probably run a prediction market on this risk and it would still be capital efficient, but for many other things this is less clear.

If you look at markets that have taken off in a big way in the past, they have tended to be markets that are not growth limited in the same way. The stock market is a good example, perhaps not right at the moment. If you were to invest in an index fund without knowing anything about stocks, you would still expect it to go up over the course of 5, 10 or 30 years. This is not the case for prediction markets. Prediction markets operating with fiat money as collateral are inherently zero sum. If I win someone else loses and that’s not a hallmark of markets that take off in a big way.

What I think will happen at some point, and you can mark my words, is that we will see prediction markets that aren't backed by fiat collateral.

If I come back to my Elon Musk and Twitter example, and you use Twitter stock rather than US dollars as collateral, you need to stake Twitter stock until the end of the year if you think the takeover will be successful, and you also need to stake Twitter stock if you don't. Either position gives you the same exposure to Twitter stock, and holding Twitter stock still exposes you to the eventual outcome of the takeover. If you have a view of what will eventually occur, you can hold onto one token and sell the other one; together they would give you a market value for Twitter. This unbundles the specific idiosyncratic risk of a Musk takeover from all the other risks that may affect the value of Twitter stock at any point in time. It becomes a pure vote on a Musk takeover, allowing you to hedge against this particular event whilst maintaining exposure to all other risks and events affecting Twitter.
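One plausible way to mechanise this stock-collateralised design (a hypothetical sketch, not a Gnosis product): depositing one tokenised share mints one YES and one NO token, and after resolution only the winning token redeems for the share. Selling the leg you don't believe in leaves a pure position on the takeover, while combined YES + NO holdings always track the underlying stock.

```python
class StockCollateralMarket:
    """Toy prediction market collateralised with tokenised stock rather than
    fiat. Illustrative only; the split/redeem rule is an assumption."""

    def __init__(self):
        self.positions = {}  # address -> (yes_tokens, no_tokens)

    def split(self, addr: str, shares: float):
        # Deposit `shares` of tokenised stock; mint equal YES and NO tokens.
        yes, no = self.positions.get(addr, (0.0, 0.0))
        self.positions[addr] = (yes + shares, no + shares)

    def transfer(self, sender: str, receiver: str, side: int, amount: float):
        # Selling one leg: move YES (side=0) or NO (side=1) tokens between holders.
        s = list(self.positions.get(sender, (0.0, 0.0)))
        r = list(self.positions.get(receiver, (0.0, 0.0)))
        s[side] -= amount
        r[side] += amount
        self.positions[sender], self.positions[receiver] = tuple(s), tuple(r)

    def redeem(self, addr: str, outcome_is_yes: bool) -> float:
        # After resolution, only the winning token redeems 1:1 for the stock.
        yes, no = self.positions.pop(addr, (0.0, 0.0))
        return yes if outcome_is_yes else no
```

In this sketch a holder who splits shares and sells the NO leg is left with a position that pays out in stock only if the takeover succeeds, exactly the unbundled exposure described above.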

This will open up an entirely new market. Obviously a lot of things need to fall into place first: you need tokenised stocks, liquid markets for these tokenised stocks, and market makers. I think this will happen in a big way at some point. That's not now, but perhaps it's in 5 years' time.

So we’ve built a decentralised prediction market platform. It’s out there, you can use it, it exists. It’s called Omen. We’ve now moved on to other things.

Angus: In Gnosis Protocol v2 and the CowSwap DEX, you’re using transaction batching and standardised pricing to maximise peer-to-peer (p2p) order flow, and you’re incentivising ‘solvers’ to flip the MEV problem. Is the vision of Gnosis to keep extending the powers of incentivisation to eventually minimise or eliminate MEV?

Friederike: Yes. I think MEV is an invisible tax on users. Charging fees for providing a service is completely fine but these fees should be transparent. 

Where fees are collected covertly it’s the ecosystem that suffers. It’s not a small fee either. I want to be very explicit that we don’t agree with this and we are doing our best to engineer our way out of it for the ecosystem.

Angus: I can’t believe it still happens in just about every traditional financial market.

Friederike: Yes, but on blockchains the extent to which it’s happening is much larger. There have been analyses of this and it amounts to just under 1% of all value transacted, and that’s an enormous covert fee. I know there’s an entire camp of people that defend this as necessary to keep the network stable. It’s very good framing but I don’t think it’s the truth of the matter.

Angus: With regulation of DeFi coming as early as 2024 in Europe (MiCA), how can the DeFi industry ensure that any legislation is soundly based and doesn’t restrict or destroy future opportunities?

Friederike: I think I would disagree with the premise of the question. The way you posed the question suggests that there’s going to be a large impact on DeFi and I don’t necessarily think that’s going to be the case. I think regulation, and this goes for any sort of law, is only as good as a state’s ability to enforce it. If society changes too fast and laws are seen as outdated, that’s not a great outcome for society. The powers of decentralisation are strong and they’re here to stay.

Angus: What have been some of the key lessons you’ve learned in relation to governance with GnosisDAO?

Friederike: That’s a really good question. When it comes to governance you can see we’re at the very beginning; these are the very early days. We as humanity have been experimenting with governance for 200,000 years or more. Ever since people have banded together in groups, there has been some form of rules and governance in play.

To assume web3 is going to be able to reinvent governance in the space of a couple of years is a very steep assumption. We’re seeing the same problems as everyone else: low voter turnout, a reluctance to engage in the governance process and so on, despite the fact that we’re one of the DAOs with the highest voter turnout. There are multiple initiatives being tested to encourage higher voter participation, like forcing people to delegate their vote, nomination schemes that involve liquid democracy, and others. We’re thinking about this very actively.

We have a project within Gnosis called Zodiac that builds DAO tooling. Lots of DAOs within the Ethereum ecosystem build on top of the Gnosis Safe as a multisig, and Zodiac builds modules that you can install for these Safes: features that traditional software companies would call granular permissions management. An example would be giving individual keys the ability to do some things without needing approval from an entire quorum of the multisig, or without having a Snapshot vote for something. It gives you the ability to customise what a particular key can be used for and under what conditions it can be revoked.

One of the things we’re using custom keys for is to delegate active treasury management, this includes short-term yield farming and strategic yield farming, to another DAO (Karpatkey). 

They don’t have custody of our funds. They have a whitelist of actions they can execute on the GnosisDAO wallet; they can’t withdraw our funds, for instance, and they can’t rebalance between different pools, and so on. You can tell from how coarse this tooling is that it needs to be improved significantly.

Most democratic societies have a representative democracy, like Germany, Australia, the US and others. There are a few societies that have a direct democracy, like Switzerland. Voting is not compulsory in Switzerland and they also suffer from low voter turnout. You basically vote on everything, and in some cantons you vote on whether citizenship should be granted for every single person that applies.

Angus: Wow, that’s granular!

Friederike: It’s very granular. Another example is the decision to build a playground two streets over for the kids at a school. When voting becomes this granular, not surprisingly you end up with low voter turnout. There’s a sweet spot in between though because most people don’t actually align with one party on every issue.

Societies are complex and there’s a variety of issues where you could delegate your vote to an individual who you feel aligns most closely with your view on that issue. For example, I feel very strongly about civil liberties, but at the same time I’m not a libertarian in the sense of believing that people who can’t provide for themselves shouldn’t be cared for. There should be a social security net, but that’s no reason to take civil liberties away from people. I would like to be able to delegate my vote to another person to decide on the science policy we have or the foreign policy that we have. And I think this is where we need to get to with governance in web3.

I think in terms of society at large this will take some time, but web3’s main advantage is that we can run many experiments in parallel and iterate way faster than traditional societies, and you don’t have to support just one system.

Angus: That’s a great call out. And you can have multiple systems that allow different tracks for different endeavours. Does GnosisDAO already have different ways of voting for different initiatives?

Friederike: Not yet. We’re talking about this right now. 

One type of voting we’re looking at is assent-based voting, where there’s a number of people trusted by the DAO who can propose an action that will be automatically approved unless the proposal is vetoed.
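This “approved unless vetoed” pattern can be sketched as follows (a minimal illustration; the three-day window and the role names are assumptions, not GnosisDAO parameters):

```python
VETO_WINDOW = 3 * 24 * 3600  # e.g. a 3-day challenge period (illustrative value)


class OptimisticProposal:
    """Assent-based ('optimistic') governance sketch: trusted proposers queue
    actions that execute automatically unless vetoed within the window."""

    def __init__(self, proposer: str, action: str, trusted: set, now: float):
        if proposer not in trusted:
            raise PermissionError("only trusted addresses may propose")
        self.action = action
        self.eta = now + VETO_WINDOW  # earliest execution time
        self.vetoed = False

    def veto(self):
        self.vetoed = True

    def execute(self, now: float) -> bool:
        # Executes only if the veto window has elapsed without a veto.
        return now >= self.eta and not self.vetoed
```

The point of the design is exactly what the next exchange describes: routine proposals pass silently by default, and a vote is only forced when someone objects.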

Often there are day-to-day proposals that are important for the DAO, like ensuring contributors are paid, that almost always go through. A significant volume of these small proposals, though, contributes to voter apathy. It becomes a part-time job just to keep up with governance for a single DAO.

Angus: So, you have low thresholds for proposals that are repetitive and not contentious and high thresholds for proposals that are significant and could be contentious?

Friederike: Yes. These kinds of changes to DAO governance are obviously very simple but they can be made arbitrarily complex because all these voting patterns need to be hard coded. It has to be decided upfront what’s contentious and what’s not. How will each of the voting patterns be changeable? Which ones will require a super majority vote to change and so on.

Angus: Is Gnosis planning for a multi-ecosystem future with cross chain composability or does your focus just extend to the broader Ethereum ecosystem?

Friederike: We’re firm believers in the future of the EVM and EVM chains. Interoperability is a lot easier between EVM-based chains and they have such a large part of the developer mind share. The Cosmos and Polkadot ecosystems, for example, obviously have smart developers, but there’s nowhere near the same depth of tooling for these ecosystems. I saw a graph recently in The Block about how much is spent across ecosystems in total and how much is spent per developer. For EVM the cost per developer was the lowest because there is such a large number of developers already building on EVM chains.

Angus: A large portion of the Polkadot ecosystem is building on EVM as well. They also have the option of offering WASM at the same time. Doesn’t that make them competitors?

Friederike: No, Substrate is different. It’s not a bad system; it’s well designed and in some respects it’s better than the Ethereum system. But it’s difficult to build on and it requires a steep learning curve. Developers transitioning across have a hard time, and we think that EVM is sticky. That’s kind of our core hypothesis within Gnosis and the Gnosis ecosystem.

Late last year Gnosis merged with xDAI which is currently a proof of authority chain, very close to Ethereum. It’s been around since 2018. It’s now known as Gnosis Chain. We also have another chain called Gnosis Beacon Chain which is a proof of stake chain, like Ethereum will become after the merge, and GNO is the staking token. Our value proposition centres around EVM and being truly decentralised. Gnosis Beacon Chain is the second most decentralised chain after Ethereum.

Ethereum has around 400,000 validators and Gnosis Beacon Chain has 100,000 validators. I believe the next closest ecosystem after that is Cardano with about 3,000 validators. It’s a large jump, and if you think about security and security assumptions, decentralisation is important because otherwise you run the risk of collusion attacks.

The idea behind Gnosis Beacon Chain is that it’s maximally close to Ethereum, so that you can just port things over. If you look at how transaction fees on Ethereum have changed over the last couple of years, it’s crazy. Almost everything that ever lived on Ethereum has been priced out.

Angus: That’s why I started looking at alternative chains originally because setting up a Gnosis Safe was going to cost me $500 in gas fees.

Friederike: Exactly. Everything that’s not of incredibly high value has been priced out, and I’m not just talking about non-financial interactions; I’m talking about lower-value transactions of less than $50,000. Gas fees may be only 50 basis points on that amount, but it’s still something you’d rather not spend. Gnosis Beacon Chain is a home for all these projects that have been priced out of Ethereum.

I’m not a huge fan of DeFi to be honest. I know that we’ve built things that would be classified as DeFi. I’m a huge proponent of open finance. 

Opening avenues to people who previously didn’t have access, lowering barriers to entry and so on. I’m all for that. But the speculative nature of DeFi, where it’s money legos on money legos on money legos and then everything topples, produces a couple of percentage points of profit for the same 500 people in the world. That’s not what I get up for in the morning and it’s not what motivates me. Opening up applications like money markets to the wider population of the world is a good goal, but this is not where DeFi is currently headed.

That’s why I like to differentiate between open finance and DeFi, because to me the motivation is different. I think you need those DeFi primitives in any ecosystem and they exist in the Gnosis ecosystem. The projects the Gnosis Beacon Chain currently centres around don’t need these to be sustainable. The projects where there’s 5% more yield to be gained from yield farming happen elsewhere. I would also argue that yield farming is not very sustainable, because the capital that it attracts is inherently mercenary. I don’t think it’s a good use of money. We intend to win on culture. 1,500 DAOs live on top of Gnosis Beacon Chain, as do all major Unconditional Basic Income (UBI) projects and payment networks. This is where we’re moving at Gnosis. In principle it’s a general-purpose direction. We’re absolutely not headed in the Fantom direction; we’re very much prioritising a social direction of grounded use cases.

Angus: One of the messages I heard repeatedly at Polkadot’s conference last week in Berlin was the need for the collective effort to shift to solving real world problems. What do you see as the key challenges to making this shift?

Friederike: There’s a lot of crypto founders out there that believe in society and making the world a better place. I do think this has been watered down a bit over the last bull market, which is why I’m looking forward to the bear market because it clears out some of those projects that are only chasing the money.

Angus: A lot of projects talk about banking the unbanked, but I don’t think we’re any closer to it. We may still be 5 or 10 years away from achieving that.

Friederike: I agree and that’s where Gnosis Beacon Chain comes in. We also want to bank the unbanked, and one of the core tenets of this is the UBI.

Angus: Do you have a UBI in Germany or in Europe?

Friederike: We don’t. We have lots of initiatives that kind of push for it. We have a social security net and a minimum wage, but it’s not unconditional. You have to prove that you go to job interviews and so on, and you also cease getting it when you get a job. The idea behind a UBI is that we don’t have a resource problem, we have a distribution problem. If you look at how much humanity produces, in principle, it’s enough for everyone. Somehow about 30% of what’s produced goes to waste for no good reason, which leaves you with a distribution problem.

I think that in a world that will change substantially over the next 20 to 30 years, a lot of existing jobs are going to be made redundant. I think this is necessary, and it’s wonderful that labour-intensive, repetitive jobs are being made redundant. It frees up the people that used to do these jobs to do more meaningful things. A UBI is the necessary precondition for this to happen.

Angus: What real world problems would you regard as the lowest hanging fruit for web3 entrepreneurs to consider?

Friederike: Payments. It’s funny because we have said that for 10 years. We’ve always said that you’ll pay for your coffee with Bitcoin, but no one actually does this because it’s too expensive. If you look at the current costs of transaction processing for everyday payments, like buying lunch for your seven-year-old, you’re charged 2–3% in card fees, even if it’s a debit card. This is the lowest hanging fruit that scales up in a meaningful way. We already see this with remittances: a significant amount of remittances and cross-border payments are actually done in stablecoins.

Angus: It’s a great onboarding process for people as well to web3. With your COO hat on, where do you find a need for new web3 software to fill gaps or completely replace web2 tools?

Friederike: This is a good question. The standard answer would be everything where you transact value. This is an area where the cost of transactions has in theory been lowered by web3, setting aside the skyrocketing transaction fees, because that’s a technical problem that can and will be solved. If you look at web 2.0, you can buy things on the internet, but it always hinges on the fact that there’s an intermediary involved that you pay with your money or your data. Not having to do that is one of the core goals of web3.

Angus: Much of DeFi, as it stands, is not very decentralised. Some of this may be explained by the particular evolution of a protocol. In other cases it could be explained by the difficulty of decentralising everything. Do you think this points to a future where hybrid models play a larger role for longer in the development of web3?

Friederike: I think hybrid models are hard because of regulation. I agree that many projects that say they’re completely decentralised are in reality two 20-something guys with a multisig. It’s still easier to be completely decentralised, because otherwise you fall under the purview of one regulator or another. The only way you avoid that is by building completely decentralised systems. It gets easier, though.

A couple of years ago, the idea that a DAO could own a domain name and have votes on things that are automatically executed on chain would have been preposterous. 

There’s no way a DAO would’ve been able to host something with an ENS and IPFS content. It’s all new. So much has happened and doing things in a truly decentralised way has become easier over the last couple of years from an engineering standpoint.

Angus: What’s the easiest way to onboard people to web3 and how do you think it’s most likely to happen?

Friederike: I don’t think it will be driven by consumers. It will be driven by businesses offering alternative payment systems that reduce payment fees by 2%. Consumers will simply vote with their feet. People don’t know how it works now on web 2.0 and I think it’s going to be the same for blockchain.

Angus: What are some of the pathways that you see for decentralised offerings to start to provide the infrastructure for centralised businesses?

Friederike: There will be decentralised payment rails and transaction settlement rails and everyone will use it. 

What people don’t realise is that when something is truly decentralised and belongs to no one, in principle it belongs to everyone. This is the magic of decentralisation.

This allows people who are in principle competitors to coordinate on a much more efficient layer, because there is this impartial layer that belongs to everyone. I think this will be the direction things head in.

Angus: Other than Gnosis, who do you regard as the leading innovators in DeFi right now?

Friederike: In terms of governance, I think Maker. They’ve been decentralised for such a long time, much longer than anyone else. I always have one eye on Maker governance. I think there’s tons of innovations happening in the DAO space, too many to keep up with.

We have good relationships with the DAOs on the Gnosis Beacon Chain, but we don’t even know all of them when there are more than 1,500 individual DAOs. That’s kind of the beauty of it as well. Everyone does their own thing.


The race to create a DEX that enables cross chain swaps

In a recent trip to Berlin for Polkadot Decoded, the Polkadot ecosystem’s 2022 conference, I made sure I sat down with the CTO of Chainflip, Tom Nash. I’m very appreciative that he chose to give up his time to share his thoughts with Auxilio.

Aug 8, 2022 • 14 mins

Author: Angus Mackay

To introduce Chainflip before we dive into the detail, they’re a decentralised, trustless protocol that enables cross chain swaps between different blockchains and ecosystems. To rephrase that in terms of the customer value proposition, they will potentially allow a user to instantly source the best deal, and the best customer experience to swap crypto assets across multiple ecosystems (Ethereum, Polkadot, Cosmos etc.). It may not sound like it, but it’s a seriously ambitious undertaking that requires some serious tech problem solving.

Angus: What makes Chainflip a standout project are the choices you’ve made to integrate technologies across the Ethereum and Polkadot ecosystems, and to not become a Parachain. Can you give us the benefit of your thinking?

Tom: We’ve amalgamated a bunch of different best-in-class technologies to limit what we need to build, in order to produce something that fulfils a specific use case we saw the opportunity to create.

There are a lot of choices that are really easy to make, and you’re encouraged to make them: build on Substrate, become a Parachain, enjoy shared security, don’t worry about building a validator community, don’t worry about how to incentivise people in an effective way. Many of these choices didn’t really make sense for us.

We don’t benefit from the shared security of Polkadot as we still require validators on the Chainflip network to be staked. You can’t run an exchange with $10bn worth of liquidity if you’ve only got $10m worth of collateral, because you immediately provide an incentive for people to buy up all of the collateral to take control of the funds. So Chainflip requires validators to be staked and collateralised.

Whilst you can do that with Polkadot and your own Parachain, Collators and Cumulus, it certainly doesn’t make things any simpler for us. In fact it adds a lot of complexity.

Angus: That’s interesting. As a non-technical person you can’t see that. Can you try to explain to us why?

Tom: Sure. There’s usually one aspect to blockchain security, and it’s effectively the security of the state transition process. A blockchain is a big database, and you want to make sure that all of the writes to that database are authenticated correctly and follow certain rules. The security of the state transition function is usually provided by the collective stake of the network. In Ethereum’s case this is a bunch of GPUs mining away. The same goes for Bitcoin.

In Polkadot’s case it’s a bunch of people sitting on loads of DOT who don’t want their DOT to devalue. Chainflip also has that task. We need security of our state transition function, but we also require the security of the collateral. Our validators collectively own Chainflip’s threshold signature wallets, and we require that these validators have no incentive to collude for the purposes of stealing those funds.

Now the shared security of Polkadot is not tied to the security of those funds. If Chainflip were to leverage the shared security of Polkadot, we’d be delegating the stake to all of those people who hold DOT, and the people that hold DOT are not necessarily the same people who are being incentivised to provide liquidity on the exchange. If we delegated all of our security to DOT validators, our validators would be a different set of people, like collators in the Polkadot ecosystem, and the collators themselves have no stake. The collators are just rolling up the blocks and posting them to Polkadot.

You can force collators to stake, but then we’re back to square one: why are we using collators and a Parachain if we get no benefit from XCM? Which we don’t, really. We’re building something at a different layer of abstraction, and if we want to support the long tail of XCM assets, we can just build a front-end integration. But the long tail of asset support for Polkadot is not a path that Chainflip wants to go down. You fragment liquidity on the exchange and you force more collateral to be deposited in order to support that liquidity. It doesn’t really make sense for us.

“The golden goose for Chainflip are the chains where there is no decentralised liquidity solution at the moment. Chains like Bitcoin, Monero, Zcash, some of the bigger ones that have been left behind by the whole DeFi movement.”

Angus: If I rephrase that in simple terms, a validator in the Chainflip network has two jobs to do – securing the network and securing the liquidity of the network.

Tom: That’s right. Anyone can provide liquidity to the exchange when they trade but the validators that run the network are the ones providing security for that liquidity in two ways. They are one of 150 validators with custodial access to the liquidity and they also secure the state transitions for the AMM.

As I mentioned earlier, if you have $10bn of liquidity, you need some function of that amount as collateral in order to ensure you’re not a honeypot or a target for an economic attack. If Chainflip were to support all of the Uniswap tokens, for example, as first-party tokens, and people could send those tokens to Chainflip, then you balloon the amount of liquidity that you’re able to accept, and you balloon the amount of liquidity the exchange and the validators are collectively responsible for.

If you do that, you also have to balloon the value of our FLIP token, otherwise the validators are holding onto much less FLIP than the liquidity they’re proportionately securing. If they have $1 of FLIP for every $10 of liquidity they’re securing, things start to look a little out of balance.
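The balance Tom describes is essentially a collateral-ratio invariant. A back-of-the-envelope check might look like this (the 1:1 minimum ratio is a made-up threshold for illustration; Chainflip's real economics are more involved):

```python
def is_economically_secure(validator_stake_usd: float,
                           secured_liquidity_usd: float,
                           min_ratio: float = 1.0) -> bool:
    """Toy honeypot check: if the total validator stake is worth less than
    the liquidity it controls (times a chosen ratio), an attacker could in
    principle profit by buying up the stake to seize the funds.
    The 1.0 default ratio is a hypothetical threshold, not Chainflip's."""
    return validator_stake_usd >= min_ratio * secured_liquidity_usd
```

By this rule, $10m of stake securing $10bn of liquidity fails the check, which is exactly the honeypot scenario described earlier in the interview.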

Angus: I read in your docs that the FROST signature scheme you’re using as a custody solution is limited to 150 signatories. Do you expect this to increase as you grow?

Tom: The long term goal would be to scale that number and it’s certainly not a theoretical maximum. It’s a threshold that’s been chosen because it will provide the performance the exchange needs. If you go lower you decrease the level of security, if you go higher you decrease the amount of throughput. So it’s roughly in the Goldilocks zone.

Angus: You’ve chosen not to use the Polkadot messaging protocol (XCM) to facilitate swaps on the Polkadot network or the IBC messaging protocol on the Cosmos chain. Can you tell us what you’re doing instead?

Tom: It doesn’t really feel like it makes sense for us to muck around with the XCMs and IBCs of the world at the moment. I’d love those ecosystems to mature and I’d love for it to be really easy for us to plug into them. You’ve only just started to see XCM channels opened up between Parachains on Polkadot, so it’s very early days.

We’ve been building Chainflip for a year and a bit now, and XCM has just come too late for us. It wasn’t available when we started and there were no plans for it that we were aware of. If we’d started around now, or in six months’ time, when it’s very clear how you could leverage XCM and Cosmos’ messaging protocol (IBC), and there’s a bunch of chains and projects that support them, maybe then it would make sense. Other projects will have to deal with this whole notion of cross-chain communication between something like IBC and XCM.

We’re kind of fundamentally solving a different problem. Chainflip aims to solve the problem of swapping Bitcoin for Ether trustlessly. Projects that use XCM solve the problem of swapping Bitcoin for Polkadot trustlessly. So we’re not really competing. Our competition are the centralised spot markets for these assets like Binance, Coinbase (Pro), Kraken and other central exchanges.

Angus: Ultimately you’re targeting a better user experience than centralised exchanges. Do you expect many to leverage XCM, IBC and other messaging protocols to compete with you?

Tom: Maybe there’ll be a bunch of exchanges that leverage XCM or IBC to do cross chain swaps. It will be very interesting to see if that happens. I suspect that the architecture of a DEX on top of XCM is extremely complicated. You’re going to need lots of oracles. You need lots of people to tell you the price. I’m skeptical and I haven’t seen it yet.

Layer Zero is a really interesting project and they’re most likely to hit our orbit first. They recently released their cross-chain messaging tech and it’s cool. It has its faults and its flaws. Again, it doesn’t solve the problem of swapping assets from chain to chain, but I think that’s the likely direction they’re headed, and I’d be surprised if it weren’t. One of the problems they don’t solve is custody. They claim their technology can be deployed across all these different types of crypto base layers, like Bitcoin for example. I’m certainly really interested to see what they produce next.

Angus: Your vault rotation and creation of new authority sets sounds like a computation heavy process. Is it happening a lot and how do you reduce the need for the process?

Tom: Yes, it’s quite inefficient. Producing a new aggregate public key that these 150 nodes have collectively derived and agreed upon takes about 90 seconds, depending on the cryptography that’s being used under the hood. It’s certainly not cheap, which is why we don’t want it happening all of the time. The process is initiated when one of two things happens: either a certain amount of time elapses, or a certain number of nodes go offline.

So every 28 days, which is probably the right amount of time, a new set of validators is chosen as auction winners. They generate a key, we perform the handover from the old key to the new key, and we have to do that across every single chain that we’re integrated with. Once that process is complete, control has been completely handed over to these new validators for another 28 days.

In the alternative scenario, where Chainflip notices that 20% of validators have dropped offline, we want to avoid a situation where we don’t have at least 100 nodes (66% of the 150) to reach the threshold needed to sign any transactions, which would leave all funds trapped forever. So we kick off another auction immediately to replace the offline nodes with new validators, ensuring we have a healthy set of nodes again.
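The quorum arithmetic above can be sketched in a few lines. The 150-node authority set, the 100-node (two-thirds) signing threshold and the 20% offline trigger are the figures from the interview; the helper names `can_sign` and `should_rotate` are hypothetical, chosen for illustration rather than taken from Chainflip’s actual code.

```python
import math

# Figures from the interview: 150 authority nodes, 2/3 signing threshold.
AUTHORITY_SET_SIZE = 150
SIGNING_THRESHOLD = math.ceil(AUTHORITY_SET_SIZE * 2 / 3)  # 100 nodes

def can_sign(online_nodes: int) -> bool:
    """True if enough nodes are online to reach the signing threshold."""
    return online_nodes >= SIGNING_THRESHOLD

def should_rotate(online_nodes: int, offline_trigger: float = 0.20) -> bool:
    """Trigger an emergency auction once 20% of nodes have dropped offline."""
    offline = AUTHORITY_SET_SIZE - online_nodes
    return offline / AUTHORITY_SET_SIZE >= offline_trigger

print(SIGNING_THRESHOLD)    # 100
print(can_sign(99))         # False: below the 100-node threshold
print(should_rotate(120))   # True: 30/150 = 20% offline
```

Note the safety margin this leaves: the emergency rotation fires at 20% offline, well before the set could fall below the 100 signers needed to move funds.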

Angus: You’ve said that there are a lot of opportunities for AMMs that run on a custom execution environment. I was wondering if you could explain that to us?

Tom: So Ethereum and other EVM-like chains are Turing-complete by design. They’re arbitrary computation platforms. As a result they’re not really efficient. They’re generalised computing machines, and you can’t really push them to their limits. It’s like working in the embedded-systems world: a general-purpose microcontroller versus a purpose-built integrated circuit.

A custom execution environment allows you to do a lot more to make the process efficient. Uniswap, for example, can’t write any code that says, let’s tally up all the swaps in a particular block. It can’t, because of the way transactions are executed on the AMM; it doesn’t control the underlying execution there.

Chainflip can do that. We have our own validator network, our own mempool, and our own way of sequencing blocks, so we can say we’re going to collect transactions for 10 minutes, then match them against each other and execute whatever delta there is on the exchange. We have a lot more flexibility in that context than any EVM-based exchange does.

That’s one of the reasons you’ve seen dYdX very recently come out and say they’re going to build their own blockchain. Everyone’s saying that rollups are the golden goose that’s going to fix everything in the Ethereum network, but they’ve realised that if they were to move to a rollup they still wouldn’t have much control over the underlying execution layer.

You still have to execute everything sequentially. You can do some funky stuff but ultimately you’re still at the whim of the EVM. And also they probably realise that even if they were to move to a rollup they still have to ask users to bridge their funds across to that rollup. And what’s the difference between bridging your funds to a rollup and bridging your funds to Cosmos – not much.

So why not give your users a very similar user experience and also have control over how trades are executed and sequenced. It just makes total sense.

Angus: Is your JIT (just-in-time) AMM using batching, like Gnosis does, to batch transactions to reduce MEV and standardise pricing?

Tom: Yes. We don’t execute everything sequentially as it comes in. We actually did this because when you’re working in a cross-chain or cross-ecosystem environment, some chains are slower than others. Bitcoin blocks appear every 10 minutes; Ethereum blocks appear every 15 seconds. If you have a pair between an Ethereum-based asset, say USDC, and Bitcoin, and you execute everything sequentially, you’re potentially receiving new USDC deposits every 15 seconds. If you receive a Bitcoin purchase every 15 seconds but a Bitcoin sale only every 10 minutes, you have a chart that looks very wonky, like a sawtooth wave: it goes up and to the right, then drops vertically, and the cycle repeats with a 10-minute frequency. That’s not particularly good for users, and it creates lots of weird incentives; for example, you want to be the first person to get your Bitcoin purchase in after the last Bitcoin block. So what we do is collect all of those swaps, amalgamate them, and then execute them all at the same clearing price.
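A toy sketch of that batching idea, assuming a simple constant-product pool rather than Chainflip’s actual JIT AMM logic: collect the interval’s buys and sells, net them against each other, and execute only the delta against the pool, so every order in the batch fills at one clearing price. The function name `clear_batch` and its parameters are hypothetical, used purely for illustration.

```python
def clear_batch(buys_usdc, sells_btc, pool_usdc, pool_btc):
    """Net a batch of orders and execute only the delta against the pool.

    buys_usdc: list of USDC amounts spent to buy BTC
    sells_btc: list of BTC amounts sold for USDC
    Returns (clearing price in USDC per BTC, new USDC reserve, new BTC reserve).
    """
    mid = pool_usdc / pool_btc                    # pre-batch mid price
    buy_btc_demand = sum(buys_usdc) / mid         # BTC demanded at mid
    sell_btc_supply = sum(sells_btc)              # BTC supplied by sellers
    delta_btc = buy_btc_demand - sell_btc_supply  # net BTC the pool must provide

    # Only the net delta moves the constant-product curve; matched buy/sell
    # volume crosses internally without price impact.
    k = pool_usdc * pool_btc
    new_btc = pool_btc - delta_btc
    new_usdc = k / new_btc
    clearing_price = (new_usdc - pool_usdc) / delta_btc if delta_btc else mid
    return clearing_price, new_usdc, new_btc
```

For example, with a 2,000,000 USDC / 100 BTC pool, a batch of 100,000 USDC of buys against 3 BTC of sells nets out to only 2 BTC of pool impact, and both sides of the batch share the one resulting clearing price, which removes the incentive to race to be first after a Bitcoin block.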

Angus: That obviously eliminates the sandwiching that can occur between blocks as well?

Tom: Yes. You also limit opportunities for people who are watching upcoming transactions in the mempool to benefit from trading at the right time, and other volume-based incentives.

Angus: Chainflip has said that there have been some examples of liquidity providers being incentivised on Uniswap v3 for their large orders. Is this concept largely untested outside of these use cases that you’re aware of?

Tom: It’s a good question. I think so. CowSwap does something very similar to what we want to do. It seems to be working pretty well, but not many people use it as a front end. CowSwap doesn’t work the same way as the JIT AMM, but it collects a bunch of orders and batches them up so everyone is given the same clearing price over a few blocks. In the context of just-in-time liquidity, I don’t believe we’ve seen it anywhere other than Uniswap v3. And that’s probably because v3 has a Business Source License, so no one’s been able to copy it on EVM chains. And no one’s yet had the time to build and release an equivalent on a non-EVM execution environment.

Angus: Are there any scale challenges involved initially with providing a minimum level of liquidity for each pair?

Tom: The plan, loosely, is for funds from the LBP (liquidity bootstrapping pool) to be used to provide liquidity to begin with. Obviously there needs to be liquidity on the exchange to make it useful. As builders of the exchange we will certainly be helping as many people as we can to become proficient liquidity providers. We have a bunch of people lined up to provide liquidity on the exchange. We’ll be helping them to make that profitable. We’ll be trying to drive as much volume as we can through swaps and so on. It remains to be seen exactly what the challenges will be, but given the nature of the exchange, I don’t think there will be too many, because of the amount of capital efficiency that we can provide.

I think the bigger challenge will be attracting the right volumes. I don’t think liquidity will be a problem to begin with. The challenge will be growing our volume so that liquidity providers feel Chainflip is a good home for their liquidity.

Angus: With the EU intending to introduce MiCA regulations as early as 2024, how do you anticipate this will impact the value proposition of DeFi?

Tom: If it affects the value proposition of DeFi, then that product is not DeFi. Maker, arguably one of the most successful products to exist in the DeFi ecosystem, is not going to fall victim to this problem. Anyone can build a front end for Chainflip and anyone can build an integration with it. User funds are not held custodially by any legal entity. Users don’t have to trust the bridge for any longer than their funds are passing through it. And it would be very, very hard to regulate the product.

Angus: What about if you have retail customers using the product, or if you’re domiciled outside of the regulatory net?

Tom: If Chainflip Labs, the company, is running a method of interfacing with the exchange, I’m sure there’s probably a bunch of arguments that you can make that Chainflip is providing financial services. If you’re that way inclined you can probably lobby for Chainflip to be caught in the regulatory net. Chainflip doesn’t have to run that front end. Chainflip can ask people to build it. It’s then up to them if they host it in Singapore or Dubai or another country that’s more crypto friendly.

Ultimately Chainflip Labs could end up interfacing with regulators, but the product itself has been released as an open source piece of software, and it can’t be stopped by regulation.

Angus: You said at the end of your roadmap that this is just the beginning. What are some of the ways you envisage expanding beyond the use cases of cross chain swaps?

Tom: That’s a good question. At the moment, I’m fascinated by the tech that we’re building. The threshold signature scheme and the multi-party computation protocol we’ve built, I’m sure, have use cases outside of cryptocurrency. I’m more interested in that than I am in figuring out how Chainflip could be used for NFTs.

Angus: Do you see it having applications for B2B relationships?

Tom: Business-to-business relationships are incredibly inefficient, or at least it seems that way. I think anywhere there’s a shared desire to produce common agreements between a set of untrusted parties is a potential use case for our technology.

It’s extremely efficient, pretty robust, lightweight, all things considered. Tackling the shared custody problem is easily one of the most interesting things about the problem we’re solving.

Angus: What problems do you see crying out to be solved?

Tom: Privacy. Privacy of the underlying history of the blockchain, the underlying transactions. The average web3 user has multiple wallets, and over a period of 10 years there are lots of transactions that have potentially been made with those wallets. If you’re still using these wallets, all the transactions from 10 years ago are still there. That might be desirable, but I think it probably isn’t for most people. If you could go and wipe your transaction history, or conceal it moving forward, that would be great. If you buy a questionable NFT, you might not want your grandkids to know. The right to be forgotten is really interesting.

Technology is moving in a direction where you don’t have that right. It wasn’t codified from the start. We don’t have a bill of digital rights, and so there’s a lot of information out there about each individual person that they probably don’t even know about. It will become more of a talking point for my kids and the next generation. I see that as a big greenfield opportunity and a big selling point for future technology companies.

You’re seeing it a little bit now with people wanting to shift towards greater privacy but it’s slow. Signal’s had its time in the sun over the past year. Email providers have as well, like Proton Mail, but even they’ve ended up helping law enforcement recently. What it comes down to though is it’s really difficult to solve this problem in the first place. So I think that zero knowledge technology is going to play a huge role in that. I hope the tech industry over the next 10 years tends towards incorporating more of this technology into its products and services.
