This week we explain what the Ethereum Merge is, using on-chain data.
Hello, hello, welcome to our third Arcana, where we explore Web3 through on-chain data. Today, we'll dive into a much-talked-about but perhaps often misunderstood topic: the Ethereum Merge. Along with me, I've got Boxer and Hildobby.
Hello, hello. All right, so today, we'll be covering the ABCs of the Ethereum Merge while looking at on-chain data. We will explain what the Merge is, proof of work versus proof of stake, why it happened, when it happened, how to track staked ETH, etc.
Without further ado, let's get into what the Merge is. The Merge refers to the joining of the proof-of-stake Beacon Chain with the proof-of-work mainnet chain. Effectively, the Merge transitioned Ethereum from a proof-of-work-powered chain to a proof-of-stake-powered chain.
To understand it, let's look at this visual a little closer. Before the Merge, you have two parallel chains running alongside each other: the proof-of-work chain, which is the mainnet that has been running since Ethereum launched in 2015.
And then you have the Beacon Chain, which launched in late 2020 on the side. The proof-of-work chain powers consensus and also holds the Ethereum state data, so all the transactions and the apps run on top of it. It keeps track of balances, like yours and my wallets. Meanwhile, in the other lane, you have the Beacon Chain, which is responsible only for reaching consensus through proof of stake.
We'll cover the exact difference between proof of work and proof of stake next. When the Merge happened, the proof-of-stake chain effectively merged into mainnet, taking over from proof of work as the consensus mechanism.
However, as you see on this graph, the Ethereum state is still the same — the state and transaction history remain unchanged. The Merge went so smoothly that nobody even noticed a difference.
So, that is the Merge. Another thing to note is that the Merge split Ethereum into the execution layer and the consensus layer. Basically, now you have two parts. The execution layer is responsible for processing all the transactions and keeping track of the state data.
That's the EVM – Ethereum Virtual Machine – that we often talk about. The other layer is the consensus layer, which is responsible for reaching consensus on what the next block should be.
All right, so since the Merge transitioned Ethereum from proof of work to proof of stake, let's further understand what proof of work and proof of stake are. They are basically Sybil resistance mechanisms.
Together with the longest-chain rule, they form the actual consensus mechanism. So oftentimes, we just refer to proof of work and proof of stake as the consensus mechanism.
But just note that a complete consensus mechanism also has this other part — it needs a chain selection mechanism, and for Ethereum, that's the longest-chain rule. Okay, so coming back to the Sybil resistance mechanism: what exactly is Sybil resistance?
It's basically solving the age-old problem of how to reach consensus in a decentralized, distributed system of parties without relying on a trusted central authority. Basically, how do you reach an agreement on something?
Here, it would be the next state of Ethereum, without really trusting each other. This is also referred to as the Byzantine generals' problem. Imagine you are a general in the Byzantine period, alongside a few other generals and their battalions who don't really know each other — there's no trust.
Picture this: you surround the castle and are ready to attack. But to secure a victory, more than half of the army has to attack at the same time; otherwise, you will not have enough manpower, and you will lose. So, how do you coordinate the attack with the other generals at the same time?
You can send a messenger to the other generals — a pigeon or whatnot — because you're too far away to shout to each other. And even if you could shout, you wouldn't want the enemy to hear.
But there's no way for you to know whether your messenger arrived safely or got killed on the way. And when someone comes back, you don't know if they're actually your messenger or a spy, right?
And furthermore, you can't guarantee that the general responding to your message is a loyal general rather than a spy. And from their perspective, they also don't know if they can trust you.
Basically, proof of work and proof of stake are both ways to reach consensus without trust. In the Ethereum case, we need to trust the next block that's added to the chain, and we need to trust that the state of the chain is what it says it is, without trusting all the decentralized nodes in this environment.
So, normally in life, if you need to trust the word of someone without really knowing that person, you need their skin in the game, right?
In the proof-of-work world, the miners are the people proposing the next block and attesting to its validity — saying the block contains valid information. Their skin in the game is mining power. On a proof-of-work chain, all the miners race to solve the same really complicated mathematical puzzle. To win, they need super powerful computers, so they can be the first to solve the puzzle and win the reward for adding the next block to the chain.
And yeah, this computing power is not cheap — it's actually very expensive. So if a malicious party wanted to consistently propose corrupt blocks to the chain and maintain that state, they would effectively need more than 50% of the chain's computing power.
That is really hard to achieve. That's the skin in the game in proof of work — and you can already see that the proof-of-work world is very energy-intensive, because all the miners are racing to solve the same problem toward one goal: adding the next block.
Now, let's talk about the proof-of-stake world. In proof of stake, the validators — the people who propose the next block — have their skin in the game in the form of protocol assets.
So, for Ethereum, it's the 32 ETH that they need to deposit and lock into a certain contract. This is also referred to as staking your ETH. Just a quick note: if you don't have 32 ETH, there are other ways to participate — there are services where you can pool your ETH.
And if you want your staked ETH to stay liquid, you can also use liquid staking services. But here, we're just talking about the most basic form of proof of stake, where you need 32 ETH.
And that's your skin in the game. If you act maliciously, you get slashed: the ETH you deposited and locked in — staked — is taken away as punishment. As you can already see, this is a much more energy-efficient approach than proof of work.
So that's, in essence, what proof of work and proof of stake are: both are ways to reach agreement on something in a distributed, decentralized environment without trusting each other.
As we already alluded to, one of the big reasons the Merge happened was a huge reduction in energy consumption. According to studies, energy consumption was reduced by about 99.95%.
This effectively makes the Ethereum chain very ESG-friendly, which is great for our Earth. A second big benefit of the Merge is a drastic reduction in the annual issuance of ETH — by estimates, close to 90%.
Why does this matter? Think back to what we're living through: the US dollar is highly inflated, the same with the pound, and the euro is having problems as well. You really don't want your currency to be super inflated.
Inflation is basically what happens when the supply of your currency grows. Each year your currency inflates by some percentage, and maybe a healthy amount is around 2%.
You know, in the US, that's the targeted amount, and that's healthy — but too much is something you really don't want. So, going back to the Ethereum world, the same thing applies.
Ethereum's supply growth is basically the issuance of ETH minus the burned ETH. When that equation has more ETH issued than burned, you have annual inflation.
And you really don't want that to be too high, because you want your currency to stay "ultrasound" money. So, the Merge really positions Ethereum in a good spot here.
So, now Ethereum can achieve an equilibrium: how much you issue versus how much you burn — how much you add versus how much you take away — can balance out, and your money can maintain neither too much inflation nor too much deflation.
And that is really good. So those are the two major benefits we have here: a drastic reduction in energy consumption and a drastic reduction in issuance.
Okay, so now we've gone over what the Merge is and some of its benefits. Let's take a look at how we can do some on-chain analysis regarding the Merge. First things first: if we wanted to find out when the Merge was going to happen, we could do that.
I'm going to switch over to the complementary dashboard for this Arcana. These dashboards contain some example analyses I've put together that we can do regarding the Merge on-chain, and we're going to go through them one by one.
So first, how do we find out when the Merge happened? Let me make this a little bigger, so we can all see better. The way the Merge was determined was this: when the Ethereum chain reaches a terminal total difficulty (TTD) of this gigantic number, 58750000000000000000000, proof of stake takes over from proof of work as the consensus mechanism, and the Merge happens.
Okay, so remember, we're talking about the proof-of-work world, where miners are racing to solve these really difficult problems. Each block they're trying to add or solve has a difficulty level associated with it.
And when you accumulate all the difficulty levels from the beginning until whenever proof of work ended, that is the total difficulty of the Ethereum chain. So it was decided that when the total difficulty reaches this terminal total difficulty, we switch over.
Armed with this knowledge, I'm going to just copy-paste this over, start a new query, and make it a little bigger. As always, I am going to switch over to engine v2.
Here, I'll paste the terminal total difficulty I need as a comment. Then I'm going to query the Ethereum chain, and, as always, I like to get my SELECT … FROM skeleton ready.
So, what are we analyzing? We want to know at which block the Merge happened — at what block Ethereum started running on the proof-of-stake chain. We also want to know the exact timing.
Let's query the Ethereum blocks table and go into the column selection. Okay, so first, I want to filter on my criterion: I'm looking for where the total difficulty reaches or passes over the TTD.
So, it's greater than or equal to the terminal total difficulty. With that, I'd like to know the block number as well as the exact timing. This will basically tell us the block number at which the Ethereum chain reached the terminal total difficulty.
The next block after this block number is the one we want. I'm going to order by the number or the time — either one is fine; they're basically the same, since as the number increases, the time also increases. So, ORDER BY 1, and then I'll take only the first block that matches my criteria.
Now, I'm getting the first block that reached the terminal total difficulty, and the block number I get here, plus one, is the first block of the new proof-of-stake chain. So, let's grab the number.
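For reference, a query along these lines would do it — this is a sketch against Dune's `ethereum.blocks` table, assuming it exposes `number`, `time`, and `total_difficulty` columns (column names can differ slightly by engine version):

```
-- Sketch: find the first block at or past the terminal total difficulty.
-- 58750000000000000000000 is the announced TTD for the Merge.
SELECT
    number,   -- last proof-of-work block; number + 1 is the first PoS block
    time      -- timestamp at which the TTD was reached
FROM ethereum.blocks
WHERE total_difficulty >= 58750000000000000000000
ORDER BY 1
LIMIT 1
```

Ordering ascending and taking `LIMIT 1` gives the first matching block, exactly as described above.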
And let's also go over to the Etherscan side. I'm pasting in this number plus one — the one ending in three becomes four — and looking it up. As we can see, this is indeed the Paris upgrade, aka the Merge. All right, so that's how we can find the block number.
And here we also see the exact timing when the terminal total difficulty was reached. So, the next block — the next timestamp — is the first time Ethereum ran on the proof-of-stake chain.
Alright, if we format this a little nicer, we can see the number clearly. Other graphs can also help illustrate the Merge. Remember, we're talking about total difficulty: when the total difficulty reaches the terminal total difficulty (TTD), the Merge happens.
So here, we already see it: there's a graph from September showing the total difficulty of the Ethereum chain, and we can see it gradually approaches the TTD until September 15, when it completely converges.
Something else we can see: in the proof-of-work world, a difficulty is basically set for the miners solving the puzzle. This is the difficulty per block — it varies a little — but when the Merge happened, it completely cliffs, because there's no longer a need for mining; we're securing the chain through staking.
The difficulty per block just drops to zero, as expected. Okay, any questions? Taking a pause here…
Okay, moving on. The second thing we can work out for ourselves is how much ETH is staked and how many validators are out there. This is important because we've now switched from proof of work to proof of stake, where the security of the chain basically comes from the ETH that's staked.
So, I'm interested in how much ETH is staked, and the total ETH staked over time and cumulatively. How do we get that? Remember, all the stakers — the people who want to validate blocks — lock up 32 ETH at a certain vault-like address, where it just sits locked.
They then become validators, qualified to attest to and propose blocks. What we need to do here is find that Eth2 deposit address — the vault where the ETH is locked. I pre-made a quick video so we can have it as a reference in the future. The gist is: you just Google the Eth2 staking address, grab it, and view it on Etherscan.
What I want to convey here is that once you know what to Google, a lot of the time you can just Google the information you want. Now, armed with the Eth2 staking address, we're going to switch back here. I'll leave a note for myself: this is the Eth2 staking address. What do I want? The value — the amount of ETH staked.
Now, I am interested in the traces table. And why is that? The traces table contains all the internal transactions that are programmatically triggered by top-level transactions, whereas the transactions table only records top-level transactions — the ones directly initiated by the signers. So the traces table is the one we want to look at. Then let's set up our filtering criteria with a WHERE 1 = 1, just to make the format a little nicer so we can comment things in and out faster in the upcoming parts of the query.
Okay, so what do I want? I want all the transactions where someone sent money to this particular staking address, right? So I'm going to filter on the to address. And as always, I'm going to apply a lower function to the string just to make sure the formats match properly.
Okay, so now back here, what am I looking at? I'm looking at the traces table, and the column I want to grab is called value. I want this value divided by 1e18, because ETH is an 18-decimal currency. Then, I'm interested in summing up the ETH that's been deposited — that's the staked ETH. Okay, awesome. And if you're simply interested in the total ETH staked to date…
So, pretty much if you're interested in this 14-million number, then this is all you need. But probably more meaningful than a single number is a graph of the deposited amount per day, week, or month — and the cumulative trend of ETH being deposited. For that, we need to do a little more work.
So here, we're going to select the time of the deposit — the block time. And as I was saying, we can decide on the granularity we want to look at. Let's group by date, applying a date_trunc function.
This function truncates the timestamp to whatever granularity we want — here, a day. I'm going to name it time. Now that I've introduced this grouping element — total ETH deposited per day — I need to apply the GROUP BY keyword, grouping by the first column, aka time.
That's how we get the ETH deposited per day. But how do we get the cumulative total that's been deposited? I like to separate this out into its own sub-query.
Then, select from the CTE, as always. What do we want to select? The time, and the daily total that's being staked.
Okay, this is the harder part. Let me also quickly fix the WHERE 1 = 1 up here. So: select time and staked. And the cumulative part — how do we do the cumulative part?
I'm going to apply a window function over an aggregation: we sum up the staked amount OVER something. When we say cumulative, what do we want? We want everything from before today up to today, right?
Everything that comes before the current row in your database records, up to and including the current row. So first, we need to make sure we order on time, so it doesn't get messed up.
And then next, we say ORDER BY time and RANGE UNBOUNDED PRECEDING for the cumulative stake. This achieves exactly what we said: a cumulative sum of the staked amount up until that particular date.
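Put together, the query being built here might look something like this sketch. The address is the well-known Eth2 deposit contract; the column names (`block_time`, `to`, `value`) follow Dune's `ethereum.traces` schema as I understand it and may differ slightly by engine version:

```
-- Sketch: ETH deposited to the staking contract, per day and cumulatively
WITH daily AS (
    SELECT
        date_trunc('day', block_time) AS time,
        SUM(value / 1e18) AS staked          -- ETH has 18 decimals
    FROM ethereum.traces
    WHERE 1 = 1
      AND to = lower('0x00000000219ab540356cbb839cbe05303d7705fa')
    GROUP BY 1
)
SELECT
    time,
    staked,
    -- running total of everything staked up to and including each day
    SUM(staked) OVER (ORDER BY time RANGE UNBOUNDED PRECEDING) AS cumulative_staked
FROM daily
ORDER BY 1
```

The inner CTE produces the per-day deposits; the outer query layers the cumulative window sum on top, as described above.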
All right, let's give this a run. Hmm, am I missing something…interesting, a syntax error at or near some line…ah, I'm missing a comma before FROM. Okay, let's try this one more time. All right.
Yeah, and you can also do this cumulative-sum part inside the same query. But personal-preference-wise, I like it a little cleaner, so I separate it out into a different sub-query.
And while that's running, let's just try to take a look at some other things you can do. And we're going to come back and see how we can compose a graph like this.
So, now we know how to view the staked ETH over time. If we're interested in validators over time, it's actually the exact same query we were just building, except we divide by 32.
Because, remember, each 32 ETH qualifies one validator. So that's simply what you do, and the two graphs look exactly the same, as expected.
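In other words, the only real change is the aggregation — something like this sketch (same assumed deposit-contract address and `ethereum.traces` columns as before):

```
-- Sketch: approximate validator count from total ETH deposited
-- (each validator corresponds to a 32 ETH deposit)
SELECT SUM(value / 1e18) / 32 AS validators
FROM ethereum.traces
WHERE to = lower('0x00000000219ab540356cbb839cbe05303d7705fa')
```

The same `/ 32` can be applied inside the daily/cumulative version to chart validators over time.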
What's more interesting is the distinct stakers — the unique addresses that have staked directly.
So, if we want to do that, we can get the same information from the same table: we query the Ethereum traces table and filter on the same criteria, the Eth2 staking address.
The money deposited is locked up for now. We also need to make sure it's a successful transaction — I actually forgot to add this part in the previous query, so let's add it back.
Someone's asking: are we not including the current row? This part will include everything up to and including the current row. Okay, let me just add this.
Here, we're just making sure the transactions are valid — sometimes when you initiate a deposit, it can fail, and I forgot to add this part. So actually, I'm going to run this once more.
Okay, switching back to the distinct staker addresses. Same thing: we filter on successful transactions that deposited money into this Eth2 staking address, and then we count the distinct from-addresses the deposits came from — those would be our unique stakers.
That's how many distinct stakers there are — about 84,000. And if we're more interested in what has happened over time, we can chart distinct staker addresses over time.
Very similar, but a little more complicated: we want to make sure we only count a staker at that staker's first occurrence. So what we're doing here is applying a row_number function.
So, pretty similar: grab the block time, grab the from-address of the staker, and then apply the row_number function over a window — partitioning by the from-address, ordering by the block time — to assign a row number.
This effectively numbers the occurrences: if an address has three occurrences, it appears in three rows of the result, numbered 1, 2, 3.
We just want to grab the first appearance of each staker, so we're not double counting. That's the slightly more complicated part. But once you construct this query to grab the time each staker first staked ETH, you can count the from-addresses and then aggregate up by time.
Here, I think the granularity is a day: per day, how many unique stakers entered this gigantic staking pool, in a sense. With very similar logic, you take that count per day and sum it up cumulatively from the beginning until now — and that's how you derive this graph.
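The steps just described could be sketched like this — again assuming Dune's `ethereum.traces` schema and the Eth2 deposit-contract address, with `"from"` quoted since it's a reserved word:

```
-- Sketch: new and cumulative unique staker addresses per day
WITH ranked AS (
    SELECT
        block_time,
        "from",
        -- 1 for an address's first deposit, 2 for its second, ...
        row_number() OVER (PARTITION BY "from" ORDER BY block_time) AS rn
    FROM ethereum.traces
    WHERE to = lower('0x00000000219ab540356cbb839cbe05303d7705fa')
),
daily AS (
    SELECT
        date_trunc('day', block_time) AS time,
        COUNT(*) AS new_stakers
    FROM ranked
    WHERE rn = 1          -- keep only each address's first deposit
    GROUP BY 1
)
SELECT
    time,
    new_stakers,
    SUM(new_stakers) OVER (ORDER BY time) AS cumulative_stakers
FROM daily
ORDER BY 1
```

Filtering on `rn = 1` is what prevents the double counting mentioned above.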
Okay, someone is asking how the results would differ if you didn't include the RANGE UNBOUNDED PRECEDING part — if you only had ORDER BY the block time in the window function.
I'm a little confused about the second part. Maybe you are on mute, I think. Boxer? Am I on mute? Or are you on mute?
Oh, I'm on mute. Okay. Can you quickly go to that part of the query so we can look at it? Yeah, let's make the query big. I think he's asking: what does the RANGE UNBOUNDED PRECEDING actually do here? And from my understanding, you could leave it out and get the same result.
Oh, I actually did not know that.
You can just do ORDER BY time ascending — and ascending is the default, so you can just do ORDER BY time and you'd get exactly the same result. So to that question: yeah, you're making it more complicated than it needs to be.
And we have another question here: what does the WHERE 1 = 1 mean? You can explain that, right?
Yeah, 1 equals 1 is always true. It effectively just helps with commenting things out. If you have a bunch of AND conditions, you don't want to do extra work when you comment one out.
It's just to help with our laziness — you make sure the query still runs, and formatting-wise, it's always nicer when you comment something out.
Yeah, it's basically a cheat to work with queries more easily. If you want to comment a condition out, the first condition would usually be the WHERE statement itself — e.g. WHERE lower(to) = that address.
And if you ever wanted to comment that out, you'd have to move your WHERE statement down. So, by default, a lot of advanced analysts just write WHERE 1 = 1, so that never happens.
You never have to move your WHERE around. It's always WHERE 1 = 1, and then you just use a bunch of AND statements. It just makes queries easier to edit. All right, glad we could clear that up. I'll disappear again.
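To illustrate the pattern being described, here's a minimal sketch — the filters are just placeholders for the sake of the example:

```
SELECT *
FROM ethereum.traces
WHERE 1 = 1                     -- always true; anchors the WHERE clause
--  AND to = lower('0x00000000219ab540356cbb839cbe05303d7705fa')
  AND value > 0
-- any AND line can be commented out without touching the WHERE itself
```

Because `1 = 1` carries the WHERE keyword, every real filter is an AND line that can be toggled independently.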
Thanks. Yeah, I actually didn't know you could just leave that out directly — great learning for me today as well. Okay, so now that we have the query results ready, we can create a visualization. Let's go with a bar chart.
By default, it shows the daily ETH being staked. We also want to visualize the cumulative stake, but if you just add it directly, it's going to tower over the daily stake — which makes sense, because it's adding everything up.
So, two things we can do here. Number one, we can change the cumulative series into a line so it's no longer covering the bars. But even then, the daily staked amount is too small to see, so we can split the y-axis into two axes. So here, we've got two axes.
Then we make sure to put one of the columns onto the second axis. So now you can see both: the daily stake and the cumulative stake. Also, the number shown when you hover over is a little ugly.
Let's format it a little nicer by applying a label format. Now when you hover over, it looks nice, reasonable, and readable. So, this is how we can create this graph on the dashboard.
And then once you're ready, you can save and then add to whatever dashboard you desire. Awesome.
Okay, so that's the part about staking, validators, and whatnot. Any questions before moving on to the last part?
Okay, awesome. Let's look at one more thing. Here, I want to compare the block rewards: how have the block rewards differed since the Merge? Just to caveat and set some ground rules: what we're doing here only looks at the execution layer.
So remember, we were talking about how the Merge split Ethereum into the execution layer — the EVM part — and the consensus layer — the proof-of-stake part. We're only looking at the execution layer here.
Also, this doesn't include everything relevant to mining in the proof-of-work era — in particular, it does not include uncle blocks. Uncle blocks, in brief, are blocks that are valid; remember, all the miners are racing to solve the same problem, and each can say, "Hey, this is something valid that I'd like to add to the chain." Sometimes, if you're not first, you end up in a situation where there's the actual block, but also other valid blocks that didn't make it. Those are called uncle blocks, and you could also get rewarded for mining those. We're not including them here. And also, this graph only looks at the timeline after the upgrade called Constantinople.
So effectively, in the Constantinople upgrade, the block reward for mining was reduced from three ETH to two ETH. At the very beginning it was five ETH, then it gradually dropped to two ETH, and now it's zero for mining.
Well, yeah, here is a graph tracking the block rewards on the execution layer. The blue area chart represents the net block reward each block has, and the orange line represents the mining rewards —
the reward the Ethereum chain issued to miners. The purple line here represents the transaction rewards. If you remember from the previous section, when you want to process a transaction, you need to pay a certain amount of gas, right?
That's the transaction fee you pay — and you can also tip miners to get your transactions in faster or whatnot — but in general, that's the transaction reward.
And the reddish line represents the burned ETH per block. We briefly alluded to this: since EIP-1559, a new rule was implemented where a certain base amount of gas gets burned.
When we calculate the block rewards, we need to subtract the burned amount. A few observations we can already make here: in terms of mining, we can see the mining rewards hum along and then all of a sudden drop to zero, right?
That's what we'd expect, because mining doesn't happen anymore. Another remark is the burn: it's obviously zero all the way until EIP-1559 was implemented. And in general, it seems like the transaction rewards roughly track the total block rewards. So this is how you can play with the numbers, visualize, and discover trends in general.
Yeah, so that's all for the live-coding, teaching-style session we're doing today. Any questions before we move on to the next part?
No query-related questions, but there are a bunch of high-level questions about Ethereum issuance and all that good stuff. So, I guess, stop sharing your screen, we'll bring in Hildobby, and we'll see if we can answer those questions.
Okay, let's do that. Let me hide my screen. Yeah, and awesome.
Yes, there we are. So, we're just going to go through them chronologically. The first question: is slashed ETH just burned forever? Or is it transferred somewhere?
I don't actually know about this one. I have a guess, but I'm not sure, so I don't want to say anything wrong.
Yeah, I'm also not sure. I would assume it's just burned, since who would be the logical beneficiary of that? But maybe there's some big-brain economic thesis around why that's not the case. So, Pablo will research and get back to you.
Because there's a lot of stuff — like Beacon Chain data — that isn't really accessible for now, and will be after the Shanghai upgrade. Hopefully, once you're able to withdraw your staked ETH, it should be fairly interesting to see all this.
Yep. All right. So, what's the difference between difficulty and TTD — that's terminal total difficulty, I think?
Yeah. I mean, I see them as basically the same thing — one is probably the current block difficulty. I don't really see a deep difference; it's just naming, I guess.
Yeah, I feel like it's just terminology: each block has its own difficulty, and cumulatively you have the total difficulty of the chain. The terminal total difficulty (TTD) is then just a pre-agreed number at which the total difficulty triggers the switch from proof of work to proof of stake. So it's a terminology difference, basically referring to the same thing-ish.
Okay, the next question was: can we tell the percentage of total ether staked? Maybe slide your screen back in quickly. I think he's asking how much ETH is staked as a percentage of all the ETH in existence.
And I remember there was a big debate about how nobody can even calculate the ETH supply exactly, since it's fairly complicated. So, I guess you could make an estimate on Dune, but actually getting the exact number could be kind of hard.
That's what I do to estimate the current supply: I use Etherscan to check the latest number every once in a while, and then I calculate what you can calculate — but some of it you can't see for now.
Staking rewards — issuance rewards — you can't really calculate those; that's Beacon Chain data. You can calculate how much is burned, but not how much new ETH is issued on the consensus layer.
So, that's something that, post the Shanghai upgrade, we should be able to have, going forward: a reliable way to see the ETH supply. But historically, it's still estimates for now.
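The estimation approach described here can be sketched in a few lines: start from a trusted snapshot (e.g. read off Etherscan), subtract the burn you can compute on-chain, and add an assumed issuance figure for the part that lives on the beacon chain. All the numbers below are illustrative, not real supply figures.

```python
def estimate_supply(snapshot_supply_eth, eth_burned_since, est_eth_issued_since):
    """Supply ≈ snapshot + issuance − burn.
    Burn is computable on-chain (EIP-1559); issuance is the estimated part,
    since beacon-chain data isn't directly queryable yet."""
    return snapshot_supply_eth + est_eth_issued_since - eth_burned_since

# Illustrative: a 120.5M ETH snapshot, 10k ETH burned, ~5.5k ETH issued since.
print(estimate_supply(120_500_000, 10_000, 5_500))  # → 120495500
```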
Can you expand on or double-click the Shanghai upgrade? Do you have any…is that when the execution layer and validator…I guess they don't merge, but the beacon chain data will actually be available on the main chain? Is that kind of the case?
Exactly. Yeah, I think that's it. And then it's also very awaited, because that's when people are going to be able to withdraw their staked ETH. So, it should be fairly interesting.
Yeah, there's a bunch of EIPs as well. Just like every upgrade, there's a bunch more EIPs, and I don't know all of them. But I think Tim Beiko is the guy to follow if you really want to look into all these upgrades and follow Ethereum development closely.
Yeah, I think that covers it already.
That's it for the questions from the chat for now. But I think Jackie has some more general questions about the merge as well.
Yeah, just some general announcements, I guess. There has been this Ethereum foundation merge data challenge that's happening. So, if you want to participate and dig more into what the merge means for the data community at large, go check it out and participate.
The link is on the slide; you can click it and go there. And yeah, in general, if you're interested in this kind of content, where you're learning about Web3 trends and news through on-chain data, follow the Dune YouTube channel for more.
Yeah, I think that's a beautiful piece of art.
Yeah, this is DALL·E-generated. Yeah.
Oh, nice. Yeah, it looks really cool. Yeah.
We have an important notification from the chat. Pablo actually looked at the slashing thing. The stake is slashed, but whoever reports the offense…actually, there's a correction, but okay. So, keep this in mind.
Sorry, so it is basically slashed, but then the reporter gets rewarded.
So, the slashed ETH is burned, and whoever reports it receives a nominal reward. The beacon chain experts would know for sure; yeah, we should have invited one of them. We're going to clarify it in the description of this video so that future viewers will be able to actually solve this riddle.
Yes, yes. And yeah, so I guess now, if you're ready, Hildobby, we can dive into a few questions about the merge. Sure. Yeah, so first off, what surprised you the most about the merge?
I kind of expected things to…I guess not expected, but I thought maybe stuff could go not as well as it did, and some validators would miss blocks more than they did. And I think almost instantly all the blocks got validated.
Really quickly, I was looking into it actually while you were presenting earlier, looking into block time before and after the merge. And I made a query, and basically, you can see how the block time is deterministic.
It's 12 seconds. And if a validator misses their slot, you can basically see who missed based on the block time. Block time here is not what's defined as block time in Dune, but actually the time between two blocks.
That's what I'm referring to. And yeah, so if you look at the chart…yeah, there's a lot of data, so it loads slowly. But you can see how before it wasn't deterministic at all, and now it's very deterministic.
So, the bottom line is at 12 seconds. That's whenever the block is instantly validated. A proposer is chosen at random out of the validators, and if they submit the block, then it's all good.
But if they don't, if they miss it, then it goes for another 12 seconds. That's the second line. And it's only ever happened twice that two slots in a row were missed, which is why you see the third line up top, I guess. Does that make sense?
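The block-time analysis described above is easy to reproduce: post-merge, the gap between consecutive blocks is always a multiple of 12 seconds, so any gap above 12s tells you exactly how many proposers missed their slot. The timestamps below are made up to show the shape of the calculation.

```python
SLOT_SECONDS = 12  # post-merge slot duration

def missed_slots(timestamps):
    """Given consecutive post-merge block timestamps, return the number of
    missed slots implied by each gap (24s gap => 1 missed slot, 36s => 2...)."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return [g // SLOT_SECONDS - 1 for g in gaps]

# Illustrative timestamps: the proposer for the slot between 24 and 48 missed.
ts = [0, 12, 24, 48, 60]
print(missed_slots(ts))  # → [0, 0, 1, 0]
```

Run over real block timestamps, the three horizontal lines on the chart correspond to values 0, 1, and 2 here.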
Yeah, I see. So yeah, I expected some validators to miss blocks right away, but it went pretty smoothly overall…extremely smoothly, which is really cool to see.
Actually, this has some influence on DeFi, right? Some protocols emit rewards based on the number of blocks mined, so now they're kind of giving out all the rewards faster.
The Sushi MasterChef is a very famous contract that does this, where their rewards are on a per-block basis.
Okay, I didn't know that. But yeah, I mean, that makes sense. Now, it's probably easier, though, going forward because it's deterministic. So, it's easier to work with.
At least it's closer to deterministic. Yeah, the calculation is way easier; it's way easier to project time into the future using the time between blocks.
But yeah, you can also estimate how often blocks are 24 seconds apart, like the missed ones. And then, based on this, you have an estimation of the average block time.
And the more data we get, the easier and more accurate that will be going forward. Yeah.
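The average-block-time estimate described here has a simple closed form: if each 12-second slot is missed independently with probability p, blocks arrive at a rate of (1 − p)/12 per second, so the expected time between blocks is 12/(1 − p). This is a sketch under that independence assumption.

```python
SLOT_SECONDS = 12

def avg_block_time(miss_rate):
    """Expected seconds between blocks if each 12s slot is missed
    independently with probability miss_rate."""
    return SLOT_SECONDS / (1.0 - miss_rate)

print(avg_block_time(0.0))             # → 12.0 (no misses: fully deterministic)
print(round(avg_block_time(0.01), 2))  # ~12.12 if 1% of slots are missed
```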
Do we know what those few outliers are? Probably not. But shout out to our friends at rated.network. They're doing validator analysis; they rate the machines.
So, if you want to dive into the data on validators, you can. I can post rated.network into the chat…I think that's the link. Yeah. So, if you actually want to look at the reputation of these machines, in this case being beacon chain validators, do visit that website.
Awesome. Next question. So, in this post-merge world, what are you most excited about?
Well, for me, I think the most exciting part is especially post-Shanghai-upgrade, because a lot more data is going to come, and validator data is pretty cool to look into. I'm really curious about that.
For now, I haven't looked into a lot of data. I think MEV has been what's trending most on Twitter post-merge. And then I have an Ethereum overview dashboard and staking dashboards where I have a supply calculation.
And on those, I need to change how I calculate it. Because before, it was deterministic: two ETH per block, plus some rewards and stuff. And now there's no longer this two ETH.
There's still the burn part, but we're missing the issuance and staking rewards from the beacon chain, which will come at a later point.
So for now, I just take snapshots of the number based on Etherscan, plus a small calculation. So, it's somewhat accurate going forward, and every once in a while, I update it to make sure the estimation is very close to the truth.
Nice, nice. Yeah, it'll be really nice when we can get the consensus layer and the beacon chain data on, and then we can do even more stuff.
Yeah, I think to be honest, as an analyst lately, I've been diving mostly into taking advantage of Dune v2, looking into cross-chain stuff, looking into more historical data, like higher timeframes and everything.
And reworking a bunch of my dashboards that I made before. So, that's been my main focus even pre-merge. Once I get this out of the way, I'll probably look more into the merge.
But yeah, I think there's a huge opportunity to look into it. And the Ethereum Foundation has pretty big bounties for it. So, I really encourage people to do this and submit it to them.
I won't be participating. I'd rather have smaller wizards, people who get to be showcased through this, I guess. That'd be really cool.
Yeah, yeah. Nice, nice. Well, I'm asking away all these questions. If anyone in the chat has any questions for Hildobby directly, feel free to type them in the chat, and then we will switch over.
But as I'm asking the question, you mentioned you've been working on some other projects. Anything that really excites you that you can share with us that you particularly want to highlight?
I mean, yeah, sure. There's this one I was working on this morning; I actually showed it to Boxer. I'm working on a Tornado Cash dashboard, a cross-chain Tornado Cash one. For now, it will cover everything but Polygon.
I really want to have a granular view, like macro but granular. It's kind of hard to explain everything about Tornado Cash and still have an easy dashboard.
So, that's all the pools, the TVL, the USD TVL of all the Tornado Cash pools on every chain, except Polygon, because it's not live yet on v2. And you have a toggle where you can switch between either individual chains, or L1s, or L2s, and stuff like this.
Yeah, I don't know, I'm very interested in looking into Tornado Cash data. And then my biggest focus lately has been reworking all the NFT trades data from the ground up and making it include more stuff. There's a lot to work on there, even things like royalties per trade, applied to all marketplaces.
Basically, I have a chart that a few people have used lately on Twitter and in articles. I'm working to include more data and be more accurate, because it's missing some marketplaces.
Lately, marketplaces like X2Y2 and sudoswap, for example, have done quite a bit of volume, which makes it seem like the overall volume is lower than it actually is.
Because those are missing, and they weren't there back in January when volumes were high. But they're there now, and they have a significant portion of the volume, so it kind of changes things a bit.
Yeah, yeah. I think Hildobby and Boxer are both super excited about this whole Spellbook, including, you know, nft.trades. For me, I felt like, for the first time,
you know, all the wizards, people in crypto, were working toward the same goal. I remember us talking about the wash trading stuff, and now it's implemented as part of this Spellbook.
So, now everyone can build from there. It's the same set of standards. And this, I don't know, really excites me as a data person in this space.
It's not implemented yet, but it's in the PR. It's almost there. It's an open PR that I'm working on. And I think it would be cool.
So, there's probably something similar to be done about dex trades, where there are some weird actions going on as well. I don't know, there's always some wash trading, because there's some token reward somewhere that most people don't know about.
But this specific project has a few whales taking advantage of it. And I think a similar thing could be applied to dex trades. But yeah, what I'm building is a table that will be available, where you can join it on NFT trades.
And for each of them, based on a few criteria I'll outline later on Twitter and stuff, it filters them out: is it a wash trade based on my criteria, or is it not? And it's available to everyone. And I think that should be fairly interesting.
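A wash-trade flag table like the one described could look something like the sketch below. The actual criteria are Hildobby's and are published separately; the two rules here (self-trades, and the same pair trading the same token back and forth) are illustrative toy heuristics, not the real methodology.

```python
def flag_wash_trades(trades):
    """trades: list of (buyer, seller, token_id) tuples in chronological order.
    Returns a parallel list of booleans marking trades that look like wash
    trades under two toy rules: self-trade, or a round trip between the same
    two addresses for the same token."""
    seen = set()
    flags = []
    for buyer, seller, token_id in trades:
        self_trade = buyer == seller
        # round trip: the reverse of this trade already happened
        round_trip = (seller, buyer, token_id) in seen
        flags.append(self_trade or round_trip)
        seen.add((buyer, seller, token_id))
    return flags

trades = [("a", "b", 1), ("b", "a", 1), ("c", "c", 2), ("d", "e", 3)]
print(flag_wash_trades(trades))  # → [False, True, True, False]
```

On Dune, the equivalent table would be joined against `nft.trades` so any query can filter flagged rows out.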
Yeah, yeah, definitely. And this is also something where you kind of learn in public: because the PRs are all publicly open, you can just go see how exactly it's implemented. And then you can learn and do something similar but different the next time.
That's also something that I'm really excited about. Speaking of which, there are just like so many things that you can look at in the space — learn or fall into the rabbit hole. There are so many holes to fall into.
What advice do you have for new Web3 data analysts who just came into the space and see how many directions they could go in?
I think that goes back to the panel we had at DuneCon, basically. The advice is very…it's always the same. At first, you have a smaller reach as an individual.
So, you want to grow that reach, probably starting with protocols that you're interested in. You see some stats missing that you can help with, and you share those within, I don't know, protocols or communities or whatever.
That's kind of a good way to start growing as an analyst. That's how I started, that's how Boxer started, and so did a lot of other people.
So yeah, I think it's about starting from the community, making sure you're providing really cool stats, and making sure as well that you're actually providing valuable info and not just rehashing stuff that's already out there.
Because it's nice for you to learn how to use the tool, but it's not gonna get you much attention, if that's what you're looking for.
And yeah, if you provide good stats that protocols had not seen before, they'll generally appreciate it. I've had retroactive rewards that way and stuff like this.
So, I think that goes a long way, even for recruiting. It could be a really good step to get you recruited within that project. I think that's why a lot of wizards have this kind of path into recruiting.
Nice, nice. Yeah.
It's probably slightly more complex if you're starting out, but there's the dex trades migration onto v2, which is live, and there are some bounties for it. But you guys know more about this than I do.
All the bounties are taken by people for now. Well, but yeah, there will be new ones in a couple of days. We first have to see how this round works out, whether the PRs actually add up, or whether it's a big review effort.
Yeah, that makes sense. That makes sense. Nice, nice. Yeah, I think there will be more bounties in the works soon. So, keep an eye out for that.
That's really cool, I think. There's a lot of work, starting with making specific abstractions, going from there, and making entire dashboards that just aren't possible natively on Dune, because it's too computationally complex to do within a query.
And yeah, there's definitely a lot of stuff that can be used to do deep dives into specific research pieces. There are really a lot of different avenues you can use Dune for, like the public-facing general dashboards on a protocol.
There's also the macro view of a general sector, or an L1 or an L2, and then there are research pieces. Or there are a lot of wizards who go specifically deep into protocols and don't really get as much recognition as other wizards, because it doesn't necessarily generate as many views.
But I think those are also really good and kind of go under the radar mostly, but if you search on Dune for any project or protocol or whatever, you can find a lot of this.
Yeah, yeah, I find myself some days just searching in the search bar and seeing what other great dashboards there are. And yeah, totally, it's also a great learning opportunity and a producing opportunity for yourself, even if it's not something super novel.
You can also look into others' work, fork it, and start from there. And that's always a great start.
Well, it looks like we are coming up on time. Thank you so much for having an AMA session with us today, Hildobby, and Boxer, thank you for helping with the chat. We'll see you guys again in two weeks, on Thursday at 14:00 UTC.
All right, that's it for now. Subscribe to our channel for more content, and see you next time. Bye, guys.