What Are Blobs? Improving Ethereum’s Scalability


Officially introduced to the Ethereum network on March 13, 2024, as part of the Dencun upgrade, blobs are a new type of data storage space designed to make rollups cheaper and more efficient.

The consensus within the Ethereum community is that the best way to scale Ethereum is through rollups, also known as Layer 2s or L2s, and that the best way to scale rollups is via the introduction of blobs.

Before blobs, if Ethereum went through a period of congestion, that would also affect the price of transactions on all of its Layer 2s. The introduction of blobs removes the correlation between Ethereum congestion and the cost of transactions on Layer 2s.

What are blobs?

Blobs are a new data structure introduced to Ethereum in EIP-4844, more colloquially referred to as “Proto-danksharding”. EIP stands for Ethereum Improvement Proposal—the process by which Ethereum core developers suggest improvements to Ethereum.

Proto-danksharding is the precursor to full danksharding, and sets the foundation for it by introducing blobs. It introduces them in the same format in which they'll be used once full danksharding is implemented, in order to simplify the transition.

Full-danksharding, or just Danksharding, is an upcoming Ethereum protocol update. It represents what Ethereum core devs believe will be the last step (for now) in making Ethereum a truly scalable blockchain, by making transactions faster and cheaper.

Proto-danksharding is a step towards implementing Danksharding—it introduces concepts from Danksharding, such as blobs, to Ethereum. Starting with proto-danksharding and the introduction of blobs, instead of diving straight into full-danksharding, reduces the risk created by introducing drastic changes to a network too quickly.

How do blobs work?

Before blobs, when a Layer 2 needed to verify its transactions, it would batch up transactions, and send them to Layer 1 (Ethereum) to verify. The problem was, after the data was verified, it was still stuck on the Ethereum blockchain, taking up blockspace, forever.

This contributed to state bloat and made Ethereum more congested—which, in turn, also made Layer 2s more congested.

With blobs, when the data is sent to the Layer 1 for verification, it's sent in a blob, short for "Binary Large Object." Each blob contains 4,096 field elements of 32 bytes each, which equates to 128 KB per blob. You can think of it as a giant table of data, or… a big blob of data.
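The sizing above is simple arithmetic from the two constants EIP-4844 defines, so it's easy to sanity-check:

```python
# Back-of-the-envelope blob sizing, using the constants from EIP-4844.
FIELD_ELEMENTS_PER_BLOB = 4096   # field elements in one blob
BYTES_PER_FIELD_ELEMENT = 32     # each field element carries 32 bytes

blob_size_bytes = FIELD_ELEMENTS_PER_BLOB * BYTES_PER_FIELD_ELEMENT
print(blob_size_bytes)                 # 131072 bytes
print(blob_size_bytes / 1024, "KiB")   # 128.0 KiB per blob
```

Multiplying the two constants gives 131,072 bytes, i.e. 128 KB per blob—kilobytes, not megabytes, which is why a block can carry several blobs without overwhelming nodes.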

Blobs improve on the previous method of verifying data in that once data has been verified from a blob, it can be deleted. This way, all the transaction data from every rollup built on top of Ethereum doesn’t have to live permanently on the Ethereum blockchain, taking up valuable space.
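"Deleted" here has a concrete window: under the consensus-layer specs accompanying EIP-4844, nodes are only required to serve blob sidecars for 4,096 epochs (the `MIN_EPOCHS_FOR_BLOB_SIDECARS_REQUESTS` constant), after which they may prune them. A rough calculation of that retention window, assuming mainnet's 32 slots per epoch and 12-second slots:

```python
# Approximate blob retention window on the consensus layer.
# MIN_EPOCHS_FOR_BLOB_SIDECARS_REQUESTS = 4096 per the Deneb specs.
RETENTION_EPOCHS = 4096
SLOTS_PER_EPOCH = 32      # mainnet value
SECONDS_PER_SLOT = 12     # mainnet value

retention_seconds = RETENTION_EPOCHS * SLOTS_PER_EPOCH * SECONDS_PER_SLOT
retention_days = retention_seconds / 86400
print(round(retention_days, 1), "days")  # ~18.2 days
```

So blob data stays available long enough for anyone to verify or challenge a rollup's state, but does not live on Layer 1 forever the way calldata does.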

Blobs also operate using a separate blob fee market—introducing “blob gas”. Blob gas is independent of gas on Ethereum mainnet, meaning the only things that use blob gas are blobs themselves.
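The blob fee market works like an EIP-1559-style mechanism of its own: the protocol tracks "excess blob gas" and derives the blob base fee from it with an integer exponential approximation. A sketch following the Python pseudocode in EIP-4844 (the constants are the EIP's published values):

```python
# Blob base-fee calculation, following EIP-4844's pseudocode.
MIN_BASE_FEE_PER_BLOB_GAS = 1           # floor price, in wei
BLOB_BASE_FEE_UPDATE_FRACTION = 3338477  # controls fee responsiveness

def fake_exponential(factor: int, numerator: int, denominator: int) -> int:
    """Integer Taylor-series approximation of factor * e^(numerator/denominator)."""
    i = 1
    output = 0
    numerator_accum = factor * denominator
    while numerator_accum > 0:
        output += numerator_accum
        numerator_accum = (numerator_accum * numerator) // (denominator * i)
        i += 1
    return output // denominator

def get_base_fee_per_blob_gas(excess_blob_gas: int) -> int:
    return fake_exponential(MIN_BASE_FEE_PER_BLOB_GAS,
                            excess_blob_gas,
                            BLOB_BASE_FEE_UPDATE_FRACTION)

# With zero excess blob gas, the fee sits at the 1-wei floor.
print(get_base_fee_per_blob_gas(0))  # 1
```

Because the fee responds exponentially only to blob demand, an NFT mint that spikes regular gas prices leaves the blob base fee untouched—which is exactly the decoupling the article describes.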

What’s so special about blobs?

Despite the funny names involved in Ethereum’s Dencun upgrade, it’s a serious improvement. Blobs help to make Ethereum less congested, and they make rollups cheaper and faster to use.

This is because the data that’s verified in blobs can be deleted after use. As such, it doesn’t cause state bloat on Ethereum mainnet. “State bloat” refers to the fact that the more data that is processed by Ethereum, the more data lives permanently on its blockchain—and the more intensive maintaining that network’s “state” becomes.

Separating the blob gas market from the existing gas market is also a marked improvement. Because blobs have a separate gas market, congestion on Ethereum doesn't affect them. In the past, a large event on Ethereum, such as a highly anticipated NFT mint, would leak congestion onto Layer 2s and make transactions there more expensive as well. Separating blob gas markets from traditional Ethereum gas markets removes the correlation between Ethereum network congestion and Layer 2 transaction costs and speed.

Blobs also make Layer 2s much more profitable. Before blobs, when gas fees on Layer 2s correlated with Ethereum network congestion, the cost of operation for Layer 2s, and for the Dapps built on top of them, was much higher. Lower transaction fees let builders and operators run complex smart contracts and products at a fraction of the cost.

Blobs at work

Looking at onchain data from rollup.wtf, we can see that the majority of Layer 2s with the highest transactions per second are already using blobs.

In a tweet, Jesse Pollak, founder of Layer 2 network Base, revealed that after Dencun, the cost of a simple swap transaction on Base dropped from $0.31 to $0.0005.

However, it hasn’t been all sunshine and rainblobs. Blobs have been slower than anticipated when it comes to actually posting transactions to Layer 1. Creating a new gas market for blobs succeeded in decoupling Ethereum congestion from rollup transaction costs, but the gas market itself needs some fine-tuning before it’s everything it’s been hyped up to be.

In one example from June 2024, blob transactions briefly became more expensive than the method they replaced. That can be read as a sign that blob adoption is still maturing: as L2s become more efficient at using blobs, more block builders accept blocks that include blobs, and blob capacity increases, those costs should come down.

Vitalik Buterin addressed both of these issues in a blog post published in March 2024, shortly after Dencun's release. In it, Buterin cites two core areas of focus needed to continue scaling blobs: "Progressively increasing blob capacity, eventually bringing to life the full vision of data availability sampling with 16 MB per slot of data space," and "Improving L2s to make better use of the data space that we have."

So while blobs have seen some growing pains, they were for the most part anticipated, and solutions are underway.
