2022 Personal Review Summary - Public Chain Edition

Cross-posted from my Mirror: https://mirror.xyz/jojonas1.eth

Blog site: https://jojonas.xyz

——————

Over the past year we've had a wave of alt-layer1s (or maybe that was two years ago?), then Celestia set off a wave of modular-blockchain concepts, and finally several Move-based public chains enjoyed a brief hype. Everyone in this industry seems to have an obsession with public chains: airdroppers (I just coined the term, hehe) and speculators love the land-grab opportunities a new chain brings; project teams are busy forking proven protocols onto them; builders hope to realize their ideal paradigms on new chains; and traders dislike ETH's dominance, because certainty usually means fewer opportunities.

Of course, the eternal theme running through public chains is the impossible triangle (the blockchain trilemma), the fortress every new chain has to storm to win the market. Whether they clear the obstacle head-on or route around it, they all have to claim to have "overcome it to some extent." Meanwhile, the most cutting-edge technical thinking still seems to come out of the Ethereum community, where ETH maxis wave the banner of "the ultimate public chain" while watching challengers like NEAR and Solana rise and fall.

2022 was also the year I personally started trying to understand public chains in depth. For about four months I set aside hours every weekend to learn the basics. My motivation was simple: public chains have always been the foundation of the whole ecosystem, and the next "breakthrough" innovation will most likely still come from public-chain architecture and consensus design.

This summary contains some fragmented thoughts of mine, and I am not a professional; please point out any errors, thank you 🙏

About Rollups and Application Chains

Back to the impossible triangle: decentralization, security, scalability. The current state of play is basically "hold two, fight for the third," and the one being fought for is almost always scalability—what everyone calls scaling. Scaling efforts have been around for a long time, from direct approaches (bigger blocks, faster block times) to state channels, off-chain computation, and so on. Each has its advantages and limitations; state channels, for example, put only state on-chain and offer good privacy, but unfortunately they require both parties to be online at the same time (Your Majesty, I truly cannot do it).

Mainstream scaling today centers on rollups and modularity. Everyone knows rollups by now; the mainstream view is that the optimistic-rollup ecosystem is moving faster, while zk rollups are the better long-term answer, since security guaranteed by math is more reliable. Modularity goes a step further: besides separating execution from settlement, it splits out DA as well. The main motivation for separating DA is state bloat—full nodes keep getting harder to run, while light nodes cannot detect certain kinds of misbehavior (they only check block headers, and headers alone are easy to be fooled by)—the so-called data availability problem. More on modularity later.
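To make the light-node point concrete, here is a minimal sketch under toy assumptions (a "header" is just a parent hash plus a transaction-data hash, SHA-256 for everything—not how any real chain hashes its headers): the headers can chain together perfectly while the data behind them was never published.

```python
import hashlib
from dataclasses import dataclass

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

@dataclass
class Header:
    parent: bytes    # hash committing to the previous header
    tx_root: bytes   # commitment to this block's transaction data

def header_hash(hd: Header) -> bytes:
    return h(hd.parent + hd.tx_root)

def light_client_check(chain: list[Header]) -> bool:
    # Header-level validity only: each header must commit to its parent header.
    return all(chain[i].parent == header_hash(chain[i - 1]) for i in range(1, len(chain)))

# A block producer can publish a perfectly valid-looking header...
genesis = Header(parent=b"\x00" * 32, tx_root=h(b"genesis txs"))
withheld = b"transactions the producer never actually publishes"
block1 = Header(parent=header_hash(genesis), tx_root=h(withheld))

print(light_client_check([genesis, block1]))  # True -- yet the data behind tx_root may be unavailable
```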

There are already plenty of rollups; TVL data for the major layer2s is at https://l2beat.com/scaling/tvl/. The TVL of Arbitrum and Optimism has far surpassed that of many alt-layer1s. Both are optimistic rollups that laid out their long-term strategy early, and Arbitrum in particular has gathered quality ecosystem projects such as GMX and Treasure DAO.

Let me share some points I have been thinking about.

(1) Rollups are not a panacea

Layer2 is cheaper because many transactions share the costs of the mainnet rollup contract and its calldata, but it still adds its own computation and storage costs. Layer2 being faster is like building a highway to relieve congestion—except the highway only has so many on-ramps and off-ramps, and you still have to drive local roads to get on and off.
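A rough sketch of that cost-sharing argument: the 16-gas-per-non-zero-calldata-byte figure is standard pre-EIP-4844 L1 pricing, but the batch overhead and per-transaction byte count below are illustrative assumptions, not measurements of any real rollup.

```python
# Back-of-the-envelope: many L2 transactions share one L1 batch, so the fixed
# overhead is amortized and each tx only pays for its (compressed) calldata bytes.
GAS_PER_CALLDATA_BYTE = 16           # non-zero calldata byte on L1 (pre-4844)
BATCH_FIXED_OVERHEAD_GAS = 200_000   # assumed cost of the rollup contract call itself
BYTES_PER_COMPRESSED_TX = 120        # assumed compressed size of one L2 transaction

def l1_gas_per_l2_tx(txs_in_batch: int) -> float:
    total = BATCH_FIXED_OVERHEAD_GAS + txs_in_batch * BYTES_PER_COMPRESSED_TX * GAS_PER_CALLDATA_BYTE
    return total / txs_in_batch

for n in (1, 100, 1_000):
    print(f"{n:>5} txs per batch -> ~{l1_gas_per_l2_tx(n):,.0f} L1 gas per tx")
```

The point is only the shape of the curve: the bigger the batch, the closer each transaction's cost gets to its own calldata footprint.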

And does building the highway actually eliminate congestion? That depends on how the road's capacity compares with daily traffic, and on how the traffic is planned (classification, diversion, and so on). Today rollup costs are a fraction of L1's and TPS is dozens to hundreds of times higher, but measured against the demands of traditional applications that is still negligible.

Intuitively, if congestion persists, just add another layer—layer3; non-general needs such as supply-chain traceability and logistics tracking could move entirely to layer3. But this nesting style of scaling always reminds me of financial Lego collapsing... One of the original aims of decentralization was to address the many risks of centralized data storage; yet on reflection, a single point of failure rarely causes systemic risk, whereas a public chain facing the whole world cannot afford a single mistake. (So sometimes I wonder: many commercial scenarios don't really need to sit on a public chain at all; they only need consensus within their own closed circle plus one-way contact with the outside world via cross-chain communication—the small circle's consensus cannot be trusted by outsiders, but the small circle can lean on the outside world for supervision.)

Unlike the EVM-compatibility issue everyone brings up, I think the biggest limitations of rollups lie in two things: 1⃣️ the performance of the underlying chain; 2⃣️ the "bridges" connecting rollups to it—the more bridges, the higher the risk for the layers on top.

Regarding 1⃣️, I think it's a genuinely interesting point. Everyone now seems to assume by default that rollups spell the end of the public-chain race, almost forgetting that rollups are not exclusive to ETH—they are essentially just contracts deployed on some layer1. Of course ETH's performance isn't great (at least until the many upgrades ahead), but it is the safest chain (second only to Bitcoin), and safety is a basic requirement for a layer1. If rollups are to be the ultimate path, ETH must push through major upgrades to lock in its position before blockchain reaches mass adoption. Hence the combo punch of "-rge" upgrades (Merge, Surge, Verge, Purge, Splurge), while most other chains still haven't figured out where to compete—or how to hitch a ride...

Regarding 2⃣️, in an environment where cross-chain bridge incidents keep happening, the collective neglect of the security of rollups' own cross-chain links is worth noting. The mechanisms differ, but back to first principles: a rollup submitting its state data to L1 is still a form of cross-chain communication. I don't know this area well, but I doubt its security guarantees come for free.

(2) Centralization of Sequencers

If I said that the consensus of today's rollups looks a lot like Solana's, many people might be shocked. In reality, current rollup sequencers are mostly centralized and run by the project teams themselves—far more centralized than Solana's rotating leaders. This lets the second layer focus purely on performance, because "the sequencer won't misbehave" is treated as a self-evident proposition (if the sequencer itself cheats, what's the point of the project?).

First, how can a sequencer misbehave? Much like full nodes in the DA problem, a sequencer can withhold legitimate transactions and order transactions however it likes (this is also where Optimism's MEVA—MEV auction—idea came from). Since layer2 finality relies on layer1, this hands sequencers a great deal of power. In theory, sequencers merely lack the incentive to misbehave, not the ability.
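A toy sketch of what that power looks like—nothing Optimism- or Arbitrum-specific, just the fact that whoever assembles the batch chooses what gets in and in what order:

```python
from dataclasses import dataclass

@dataclass
class Tx:
    sender: str
    fee: int

pending = [Tx("alice", 5), Tx("bob", 20), Tx("carol", 8)]

# An honest sequencer might order by arrival time or by fee...
fee_priority = [t.sender for t in sorted(pending, key=lambda t: t.fee, reverse=True)]

# ...but nothing stops it from censoring someone and putting its own trade first.
censored = [t for t in pending if t.sender != "bob"]
extracted = ["sequencer"] + [t.sender for t in censored]

print(fee_priority)  # ['bob', 'carol', 'alice']
print(extracted)     # ['sequencer', 'alice', 'carol']
```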

So how do we decentralize sequencers? Here comes the paradox: if layer2 can reach faster and safer consensus through some mechanism, why not apply that mechanism directly on layer1? And if it can't, a decentralized sequencer set just drags transaction speed back down.

Or should the entities behind rollups simply be handed over to a DAO? Compared with modern corporate structures, today's DAOs clearly aren't up to it, so pass.

A more workable approach is to bind layer one and layer two more tightly—say, an official ETH layer2 or an officially endorsed one. Very convenient, but it might also kill the potential for innovation outright.

(3) Interoperability and Competitive Landscape Among Layer2s

Should layer2s be general-purpose or specialized? The lack of composability and liquidity across the many layer2s is itself a form of fragmentation. If a layer2 positions itself as a specialized chain, that fragmentation may be acceptable; but for general-purpose layer2s it is plainly uncomfortable—imagine the same DeFi protocol having its TVL split across dozens of layer2s... So either the field consolidates into one winner or a small oligopoly, or an IBC-style interoperability protocol emerges among layer2s.

If nesting demand is strong, specialized layer3s will grow on top of general-purpose layer2s. DeFi protocols will stay on layer2—I even think DeFi will stay on layer1, with layer2 only reaching it through vAMMs and cross-chain communication. Applications like games, with lower security requirements, can migrate entirely to layer3; a specialized layer3 is essentially an application chain. If ETH's rollup ecosystem develops smoothly, more chains from Cosmos will probably come looking to hitch a ride.

As for whether layer2 itself evolves toward specialization, that depends on how the scaling headroom from nesting compares with actual demand.

(4) One-click Chain Launch and Customization

As modularity and application chains become the trend, the established public chains have all launched their own "raise-a-child" plans: Polygon's Edge, BNB Chain's BAS, Avalanche's subnets... These easier chain-launch services push the parent chains toward modularity and expand their developer circles; on the other hand, planning ahead to keep limited mainnet space for the most core applications may genuinely be necessary.

The earliest chains to embody "modularity" and "application chains" were actually Polkadot and Cosmos. In an earlier article I mentioned that I still prefer Cosmos's design philosophy: other chains want to be the father, while Cosmos seems to be looking for brothers. I'm slowly starting to understand the meaning of the name Cosmos—whether or not it exists, I am the universe.

(5) Public Chains Will Become Increasingly Focused on Operations

User experience and ecosystem development influence each other. Let me talk about my experiences with various public chains from the perspective of an ordinary user:

1⃣️Polygon: I stayed on Polygon for a long time because gas was cheap and the ecosystem was complete. I left because for a while sandwich bots were clogging the chain and transactions kept getting stuck, which made for a terrible experience.

2⃣️BSC: BSC is the chain I still haven't left. During the bull market there were plenty of meme coins and small projects, and I had a great time. Of course, the reason I "haven't left" is that the projects either went to zero or the teams ran away; I can't escape—the tombstones are already engraved.

3⃣️Avalanche: On my first day there I happened to hit congestion, with gas at a dollar or two per transaction, and I thought: for such an elaborate consensus design, it seems overrated...

4⃣️Solana: By the time I arrived it was already the desolate stretch after the Solana hype; the fashionable line back then was "Solana? Even a dog wouldn't play it." But I fell in love with it immediately—the unified UI design, fast and convenient operations, gas fees you could basically ignore... Later it kept going down, but I never left, telling myself I'd leave if the downtime problem wasn't fixed within six months. Then came the blowup...

5⃣️Others: I haven't really used them; either the experience was poor, or there was no ecosystem, or there were too few people.

In the above discussion, I tried to set aside any professional knowledge and purely wrote from the perspective of an ordinary user, which should clearly depict "what I believe a public chain should look like." TPS, gas, ecosystem, users, operations, and even UI design can all influence user choices.

Now looking back at the early native groups in crypto, they were not ordinary users; they were either gamblers and speculators or somewhat knowledgeable with a bit of faith. As early benefits diminish, blockchain should gradually move towards mass adoption, and most of the later users actually have significant behavioral differences from the early groups.

Users are lazy, which is also why traditional industries have "customer acquisition costs"; in the early days, some projects didn't need marketing because all users were hunters; of course, now it has changed. Hunters are more ruthless, while ordinary users, due to their laziness, often have good stickiness. The former can become the latter, but the latter often finds it hard to become the former; or, from an ethical perspective, the latter shouldn't become the former. Everyone has a gambling nature and profit-seeking mindset, but self-control and judgment are not universal.

About Modularity

Modularity splits up the work of a node, putting DA, consensus, and execution on different chains. The consensus layer can then lean on more light nodes, so more nodes can join the network; a larger consensus layer gives the execution layer more room; and growth of the execution layer in turn attracts more nodes, strengthening network security.

Modularity has been overly hyped, so I will just briefly mention the economic sustainability of modular blockchains.

Consider the single-chain case: as the network grows, the cost of running a node gradually rises, which means the fees users pay must rise too; so either subsidies or token inflation are needed, and neither is sustainable. In a modular design, however, the costs borne by consensus-layer validators don't rise much, while a larger network means the execution layer pays more fees in total, which incentivizes more nodes to join—a positive cycle. For users, paying more fees in aggregate to execute more transactions may even mean a lower average cost per transaction...

Overall, modular chains seem to beat single chains. But I keep feeling that modular chains face the same issue as rollups (a rollup can be read as an execution layer with layer1 as its consensus and settlement): how do you make the connections between the modules secure, fast, and decentralized all at once? That's the point 2⃣️ I raised earlier. And, echoing point 1⃣️, everyone seems to assume single chains and modular chains are two opposing camps, when in fact the components of a modular chain are themselves single chains, right? A single chain can perfectly well join the modular army.

Therefore, the behavior of kicking out a chain from the competitive arena simply because it is currently a single chain is undoubtedly very foolish; Polygon's keen strategic shift in technology has confirmed this. In any business activity, technology does not hold absolute dominance; a recent example I saw is Royole Technology (I won't elaborate, but you can search if you want to know). How to formulate a good long-term strategy and short-term direction, how to execute at lower costs and more efficiently, how to find one's advantages in competition, how to manage the organization well, how to deal with user attrition, and maintain long-term operations... These are all equally important issues as technology.

In summary, I am not a staunch believer in rollups, modularity, or any new public chains; in my view, all public chains have their respective issues to varying degrees. But currently, if I had to choose one, I would undoubtedly choose ETH, and my reasons are simple:

1⃣️ ETH is currently the most complete and decentralized chain, with the strongest consensus;

2⃣️ It also has the most geniuses building it and the most innovation birthplaces;

3⃣️ ETH is currently the most likely to become the underlying layer for future mass adoption;

But if, even after Danksharding, layer2 or layer3 still cannot deliver a good user experience (low transaction costs, high transaction speed) while competitors in the market do it better, I will seriously weigh those competitors' attractiveness and prospects.

That said, these were also the reasons I was optimistic about Solana at the time: ① Solana was near the top in ecosystem completeness and had a decent consensus design; ② in that early period the developer data consistently put it around second place (if I remember correctly); ③ most importantly, low gas and high speed convinced me of its potential for large-scale adoption—even though the repeated outages later wore me down...

I took heavy losses on Solana. The lesson: if a chain is mostly propped up by capital, enjoy the ride, but never get emotionally attached—the probability of FTX collapsing is far higher than the probability of Vitalik collapsing...

About Sharding

Sharding is actually the scaling solution I find most promising.

The original sharding idea comes from database sharding: divide the blockchain network into multiple sub-networks, let each shard reach consensus independently, and split the network's transactions across shards for parallel execution and higher throughput. The most serious problem with this approach is the dilution of validating power—each sub-network ends up with fewer validators, so attacking an individual shard becomes easier and the overall security of the network drops sharply.
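To put a number on the dilution argument, here is a quick binomial calculation under illustrative assumptions (a global adversary holding 20% of validators, committees sampled uniformly at random, a 1/3 capture threshold): the same adversary that is hopeless against the full validator set has a real shot at a small shard committee.

```python
from math import ceil, comb

def p_capture(committee: int, adversary_share: float, threshold: float = 1 / 3) -> float:
    """P(adversary controls at least `threshold` of a randomly sampled committee)."""
    need = ceil(committee * threshold)
    return sum(
        comb(committee, k) * adversary_share**k * (1 - adversary_share) ** (committee - k)
        for k in range(need, committee + 1)
    )

for size in (2000, 200, 50):
    # smaller committees -> dramatically higher capture probability
    print(f"committee of {size:>4}: capture probability ~ {p_capture(size, 0.20):.1e}")
```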

Another sharding approach is state sharding, such as ETH's sharding 1.0, which divides the ledger among various shards, achieving time synchronization and finality through a beacon chain.

The newest approach can be called "data sharding," as in ETH's Danksharding and NEAR's Nightshade. It splits the blocks themselves, and with DAS (data availability sampling) validators only need to verify individual fragments.
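A rough sketch of why DAS works at all: with 2D erasure coding, making a block unrecoverable requires withholding a sizeable fraction of the extended data (on the order of a quarter or more, depending on the scheme), so each random sample has a real chance of hitting a hole. The 25% figure below is an illustrative assumption, not a spec value.

```python
WITHHELD_FRACTION = 0.25   # assumed minimum fraction that must be withheld to block reconstruction

def p_undetected(samples: int, withheld: float = WITHHELD_FRACTION) -> float:
    """Probability that `samples` random queries all miss the withheld chunks."""
    return (1 - withheld) ** samples

for s in (10, 30, 75):
    print(f"{s:>3} samples -> withholding goes undetected with p ~ {p_undetected(s):.1e}")
```

A handful of samples per light client is enough to make data withholding a very risky bet.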

Before data sharding came along, sharding looked a lot like rollup scaling: it faced the same inter-chain communication problem, which is exactly what worries me—the fragility of "bridges." (Liquidity fragmentation and the like are minor issues by comparison.) But data sharding opens new possibilities, letting blocks, like butterflies, truly take root and sprout along the winding chain... (just as the recent update of Wu Liu Qi reminds me of the magical sword Qian Ren). If ETH's Danksharding can combine cleanly with rollups, each playing its role, it might really reach the "endgame."

The question is whether DAS can withstand real-world testing—any experts care to elaborate?

About Move-based New Public Chains

I really don't understand programming languages, so the following content is just parroting.

1⃣️ Blockchains today mainly use two accounting models: the UTXO model represented by Bitcoin and the account-balance model represented by ETH. The latter is essentially no different from bank bookkeeping. The former works like this: in Bitcoin, a transaction can have multiple inputs and outputs pointing to different addresses, and an address's balance is the sum of all UTXOs that address owns (a toy sketch follows after point 3⃣️). To avoid double-spending, conflicting transaction records have to be checked against each other, which is a drag; Move instead defines a new kind of object called a "resource," and transfers are expressed as moving resources.

2⃣️ Modular contracts. They bring more concise and flexible functions, safer calls, and more convenient upgrades.

3⃣️ More security considerations—for example static calls, no support for circular calls, a bytecode-level language, and so on.
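Here is the toy UTXO sketch promised in point 1⃣️ (no scripts, signatures, or real transaction IDs—just balances as sums of unspent outputs, and a spend that consumes inputs and creates change):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UTXO:
    txid: str
    index: int
    owner: str
    amount: int

utxo_set = {
    UTXO("tx1", 0, "alice", 30),
    UTXO("tx2", 1, "alice", 25),
    UTXO("tx3", 0, "bob", 40),
}

def balance(owner: str) -> int:
    # an address's "balance" is just the sum of unspent outputs locked to it
    return sum(u.amount for u in utxo_set if u.owner == owner)

def spend(owner: str, recipient: str, amount: int) -> None:
    picked, total = [], 0
    for u in sorted((u for u in utxo_set if u.owner == owner), key=lambda u: -u.amount):
        picked.append(u)
        total += u.amount
        if total >= amount:
            break
    assert total >= amount, "insufficient funds"
    for u in picked:                       # inputs are consumed (this is where double-spends are caught)
        utxo_set.remove(u)
    utxo_set.add(UTXO("tx_new", 0, recipient, amount))
    if total > amount:                     # change goes back to the sender as a new output
        utxo_set.add(UTXO("tx_new", 1, owner, total - amount))

print(balance("alice"))                   # 55
spend("alice", "bob", 40)
print(balance("alice"), balance("bob"))   # 15 80
```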

From the developer activity data, the current full-time developers in the Move ecosystem are already comparable to those in Arb, OP, etc., with a very high growth rate (one year's data is not very useful; monthly data or looking again next year will reveal more):

[Figure: full-time developer counts by ecosystem]

From code submissions, there has been a higher degree of fluctuation:

[Figure: code commits by ecosystem]

source: https://app.artemis.xyz/developers

As a new language, the extent to which Move can attract developers reflects its potential. The more developers there are, the stronger the language will grow, leading to more ecological applications, especially with a sprinkle of capital...

Aptos and Sui, while both being Move-based chains, have very different marketing styles. Aptos is all about "capital," with a sharp market-making style; Sui, in contrast, feels a bit like a quiet handsome guy, though I tried its UI and it wasn't very good-looking, and the experience didn't leave any special memories.

From the perspective of architecture and consensus, how are the two doing?

First, Aptos's consensus is still BFT-based (DiemBFT). BFT addresses the Byzantine Generals Problem: how a distributed system can coordinate transaction processing correctly through message exchange even when some participants misbehave. Classic BFT requires more than 2/3 of nodes to be honest and reaches consensus through a three-phase protocol (pre-prepare, prepare, commit). The bottleneck of standard BFT (PBFT) is the communication complexity between nodes; later improvements such as HotStuff, FBA, dBFT, and Tendermint attack this from different angles—HotStuff, for instance, has nodes communicate only with a rotating leader.
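The "more than 2/3 honest" rule is just arithmetic on quorum sizes; a quick sanity check (this is the classic n = 3f + 1 bound, nothing Aptos-specific):

```python
def quorum(n: int) -> int:
    f = (n - 1) // 3            # maximum Byzantine nodes tolerated
    return 2 * f + 1            # votes needed to form a quorum

for n in (4, 7, 100):
    f = (n - 1) // 3
    q = quorum(n)
    overlap = 2 * q - n         # minimum overlap of any two quorums
    # the overlap always exceeds f, so two conflicting quorums must share an honest node
    print(f"n={n}: tolerate f={f}, quorum={q}, min quorum overlap={overlap} (> f={f})")
```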

DiemBFT, built on HotStuff, pipelines the steps of finalizing a block, so the commit of block N, the prepare of block N+1, and the pre-prepare of block N+2 can happen at the same time (quite similar in spirit to NEAR's Doomslug). Aptos also manages transaction processing dynamically with Block-STM, its parallel execution engine—and so on and so forth; as for Sui, I'll have to find time to read its whitepaper...
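A toy sketch of the pipelining idea (in the spirit of chained HotStuff / DiemBFT, heavily simplified—no quorum certificates, leaders, or timeouts): every round proposes a new block, and the same round of votes advances every in-flight block by one phase.

```python
PHASES = ["proposed", "prepared", "committed"]

def run(rounds: int) -> None:
    in_flight: dict[int, int] = {}           # block number -> phase index
    for r in range(rounds):
        for b in in_flight:                   # one vote round advances every in-flight block
            in_flight[b] += 1
        in_flight[r] = 0                      # a new block is proposed this round
        done = [b for b, p in in_flight.items() if p == len(PHASES) - 1]
        view = ", ".join(f"block {b}:{PHASES[p]}" for b, p in sorted(in_flight.items()))
        print(f"round {r}: {view}")
        for b in done:                        # committed blocks leave the pipeline
            del in_flight[b]

run(5)
# From round 2 on, each round commits block N while preparing N+1 and proposing N+2.
```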

(Next is pure intuition time)

To sum up: judging by the waves its capital has pushed, Aptos is hard to distinguish from Solana in marketing terms—if anything, its backer lineup is even more luxurious than Solana's was. To a newcomer, even their technical directions look similar: optimized BFT, parallel execution, and so on. What route Aptos's ecosystem building will take is still unclear.

As mentioned earlier, I personally do not rule out the possibility of an endgame that is not ETH, nor do I believe that chains like Solana and Aptos have nothing left but capital bubbles. Whether Aptos can learn from Solana's lessons remains to be seen. As for Sui and Linera, which has been forgotten in the corner, I will judge when I understand more...

About Block Space

Behind the lively discussions of public chains, "block space" seems to be a term overlooked by the market. As the name suggests, block space refers to the space in a block used for storing data. Why do I think block space is important?

First, to be clear, block space is not a new term; it has long been proposed and studied. If we treat block space as a commodity, miners are the producers, mining pools are the auction houses, and users bid according to their needs. Users record their transactions in block space—that is its first kind of value: storage value.

At the same time, since a single block has a size cap, the number of transactions that can be processed in a given time is also limited. If a transaction lingers too long, it may finally execute at a price the user never expected, which is why users will pay a premium for immediate block space—its second kind of value: time value.
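A toy sketch of block space as a scarce, auctioned commodity (all numbers illustrative; real chains use richer fee mechanisms such as EIP-1559, but the scarcity logic is the same): pending transactions bid, the producer fills the limited block by fee priority, and whatever doesn't fit waits—which is exactly why immediacy commands a premium.

```python
from dataclasses import dataclass

@dataclass
class Tx:
    name: str
    gas: int
    fee_per_gas: int   # the bid

BLOCK_GAS_LIMIT = 100_000

pending = [Tx("urgent swap", 60_000, 50), Tx("transfer", 21_000, 30),
           Tx("nft mint", 80_000, 20), Tx("patient transfer", 21_000, 5)]

included, used = [], 0
for tx in sorted(pending, key=lambda t: t.fee_per_gas, reverse=True):
    if used + tx.gas <= BLOCK_GAS_LIMIT:   # fill the block greedily by fee priority
        included.append(tx.name)
        used += tx.gas

print("included:", included)   # ['urgent swap', 'transfer'] -- the rest wait for a later block
```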

Of course, the time value of block space has a more familiar term—MEV.

Time value involves a complex dynamic balance of supply and demand; relatively speaking, the issues pointed to by storage value are simpler and clearer—

1⃣️ What to store? Before rollups and modularity, most chains were open to data storage needs (of course, they still are; it's just that different places are storing now); you can store secrets or garbage, as long as you pay, nodes will do the work. But as costs rise, what is worth storing becomes a question worth pondering.

Is being open to everything (permissionless) really the best answer? If some people simply enjoy paying to store garbage, do all full nodes need to sync that garbage? I don't think so. Block space is scarce—especially under PoS, because asset value ➡️ network security ➡️ user demand ➡️ node burden ➡️ asset value is a positive cycle the network is passively locked into—and every piece of garbage means more redundancy and inefficiency.

Volition chooses to give users the right to decide what goes on-chain, which makes me wonder: could nodes have some autonomous, tiered review authority over transactions? Different types of transactions have different security needs and could carry tiered fees. It's fine to send garbage, but if you want history to remember that garbage, perhaps paying a hundred times the fee isn't too much to ask?

2⃣️ How to store it? Bitcoin's UTXO model only needs to record inputs and outputs—state is just a continuation of history—so its storage pressure is far smaller than that of ETH's account-balance model. In ETH, state is made up of accounts, while transactions are the information that triggers state changes; history and state are two different kinds of data, and ETH's history covers both historical transactions and historical state, with state growing the faster of the two.

3⃣️ Which brings us to the famous "state explosion": ETH will eventually face the day when full nodes are overwhelmed. Beyond the DA issue, state explosion exposes another big problem: full nodes are overwhelmed not only by hardware requirements but mainly by storage costs. Returning to question 1⃣️, users pay only a one-time fee to store data, while nodes must store that data forever (do you still think storing garbage is freedom and politically correct?). The tragedy of the commons tells us that a mismatch between who pays and who bears the cost inevitably leads to abuse (rough numbers in the sketch after point 4⃣️).

4⃣️ State explosion and space abuse are not new problems, and many chains are already working on them—EOS-like chains tried real-time rental via RAM, and ETH has floated storage proposals of its own. But the concept of "rent" fundamentally contradicts blockchain's original purpose of recording data: if history can be erased because a fee wasn't renewed, everything built on that history loses its meaning.
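A back-of-the-envelope for the mismatch described in point 3⃣️: the 20,000-gas cost for writing a fresh 32-byte storage slot is the standard EVM figure, while the node count and data size are illustrative assumptions.

```python
GAS_PER_NEW_SLOT = 20_000          # one-time gas for a fresh 32-byte storage slot
SLOT_BYTES = 32
FULL_NODES = 5_000                 # assumed number of full nodes replicating state
DATA_MB = 1                        # 1 MB of junk someone decides to put on-chain

slots = DATA_MB * 1_000_000 // SLOT_BYTES
one_time_gas = slots * GAS_PER_NEW_SLOT
replicated_bytes = DATA_MB * 1_000_000 * FULL_NODES

print(f"one-time cost: ~{one_time_gas / 1e6:.0f}M gas, paid once by the sender")
print(f"ongoing burden: ~{replicated_bytes / 1e9:.0f} GB replicated across the network, forever")
```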

This will be a long-term issue, so I'll just throw out some ideas. Personally I feel we need "tiered transaction review + different strategies per tier"; ETH's newly proposed blobs could be a good fit—in the month or so before a blob is about to be deleted, decentralized actors could run a crowdsourced curation market and buy storage for the history that is worth keeping. As for how to pay for it—issuing tokens, drawing traffic—there are plenty of ways.

Also: hearing about buying and selling block space makes my real estate arbitrage DNA stand straight...

Regarding MEV, I won't elaborate; I personally still like the current direction of "auction" and "democratizing profits."

About Linear Time Sequence

This is something I picked up while studying DAGs. The gist: smart contracts need a linear time sequence—a total ordering of transactions—to guarantee their Turing-complete execution, and that linear ordering is also a performance bottleneck for the EVM.

Of course, when choosing between performance and smart contracts, there is no doubt that the latter is more important; after all, a blockchain without smart contracts is really just a mascot. Although DAGs can execute transactions in parallel, they still need to simulate linear time sequences through some methods to achieve finality in settlement. So, will linear time sequence be the performance curse of blockchain? I don't understand, so I ask!

About Multi-chain Landscape

I still hold last year's view of "one superpower, many strong players," but I suspect this year will bring fewer new general-purpose public chains and more ctrl+V application chains (most likely one-click generated), which can be evaluated on the merits of the application itself. Layer2s will also multiply (there even seems to be one-click layer2 now, though I personally see no market for it). In the short term it's basically a fight over existing stock, and the battles will only get fiercer, ultimately leaving just a few winners.

In summary, any public chain, no matter how much it boasts about technology, can be explored if the ecosystem/market is doing well; if the ecosystem is not great, just pass. Public chains are a field that heavily relies on teams, and business and strategic capabilities will be crucial factors determining the development of public chains. Of course, if there is indeed a groundbreaking technological breakthrough, then just follow the research big shots.

Another ecosystem I am optimistic about is Cosmos; I have previously written an analysis on it, and I recognize its technical philosophy. Cosmos and ETH will not directly compete in the future and may even team up. However, Cosmos has always been lukewarm, and its future development is more influenced by external factors (for example, if a very impressive chain joins); also, the 2.0 proposal seems to have been rejected? Why was it rejected (I don't remember)?

Among layer2s, Polygon and StarkNet will be two strong contenders. As for Celestia's fate on the modularity front: although it cleverly launched Cevmos and Celestium to make moves during ETH's sharding gap, I still have doubts about its future. Growing a pure DA layer is hard in the early days—there's a critical-mass threshold to cross—and the heavy early marketing could easily end up sewing a wedding dress for ETH's sharding (who proposed DAS first, after all?).

I don't know whether Solana's downtime issues are still being worked on, or whether there's money left to push the mobile business. It has more or less shown its weaknesses to web3 users already, but if the mobile business shows signs of life, I still hold a sliver of hope.

Aptos looks very interesting at first glance, but I'm not clear on the details yet. To wrap up: I had a lot I wanted to say about public chains, but writing it out turned laborious, mainly because my skills aren't there yet and many of the threads didn't come out clearly. Let's keep learning in 2023!

References

https://mirror.xyz/mtyl.eth/TbLbRI1VcDZYxkHJOBhB319oaYxnV5_DmqXl6xLfcWM

https://mirror.xyz/jolestar.eth/sQ0nMCO3eNig6gCzqQO7xew1mn8oUi1-rKtfZKmGlNI

https://bixinventures.medium.com/portfolio-insights-aptos-db07ca9540d2

https://foresightnews.pro/article/detail/16541

https://medium.com/@state_xyz/mysten-labs-sui-vs-aptos-other-l1s-d046b598a914

https://talk.nervos.org/t/topic/1515

https://www.rob.tech/polkadot-blockspace-over-blockchains/

https://research.paradigm.xyz/ethereum-blockspace
