Title: The Need for a Fresh Approach to Data Handling in Blockchain
Authored by: Maxim Legg, founder of Pangea
The blockchain sector is grappling with a predicament of its own making. We celebrate theoretical transaction speeds and champion decentralization, yet our data infrastructure remains stuck in the 1970s. If a 20-second loading time would doom a Web2 application, why do we tolerate it in Web3?
With over half of users abandoning websites that take more than three seconds to load, our industry's tolerance of these delays is a serious threat to widespread adoption.
Slow transactions aren’t merely a user experience issue. High-speed chains like Aptos can handle thousands of transactions every second, yet our attempts to access their data through makeshift indexers – systems cobbled together from tools like Postgres and Kafka that were never designed for the unique demands of blockchain – fall short.
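To make "makeshift" concrete, here is a minimal sketch of the pattern many of these indexers follow: poll a node, diff the chain head, and write rows into Postgres. The endpoint, schema, and polling interval are hypothetical assumptions for illustration, not any particular product's design; the point is that every hop adds latency before an application ever sees the data.

```typescript
// Illustrative only: the polling loop most ad-hoc indexers are built around.
// The RPC URL, table schema, and poll interval are hypothetical.
import { Client } from "pg";

const RPC_URL = "https://fullnode.example.com/v1"; // hypothetical node endpoint
const POLL_INTERVAL_MS = 1_000;                    // typical polling cadence

async function latestBlockHeight(): Promise<number> {
  const res = await fetch(RPC_URL);                // ask the node for chain metadata
  const body = await res.json();
  return Number(body.block_height);
}

async function run() {
  const db = new Client({ connectionString: process.env.DATABASE_URL });
  await db.connect();
  let cursor = 0;                                  // last block written to Postgres

  // Poll, diff, write: each step adds delay before downstream apps see the data.
  while (true) {
    const head = await latestBlockHeight();
    for (let h = cursor + 1; h <= head; h++) {
      await db.query(
        "INSERT INTO blocks(height, seen_at) VALUES ($1, now()) ON CONFLICT DO NOTHING",
        [h],
      );
    }
    cursor = head;
    await new Promise((r) => setTimeout(r, POLL_INTERVAL_MS));
  }
}

run().catch(console.error);
```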
The Hidden Price of Technological Debt
The implications go beyond mere delays. Present indexing solutions force development teams into a dilemma: either build custom infrastructure (which can consume up to 90% of development resources) or live with the severe limitations of existing tools. This creates a performance paradox: the faster our blockchains become, the more glaring the data infrastructure bottleneck appears.
In real-world scenarios, when a market maker must execute a crosschain arbitrage trade, they end up battling their own infrastructure on top of competing against other traders. Every millisecond spent polling nodes or waiting for state updates is lost opportunity and lost revenue.
This isn’t just theoretical anymore. Major trading firms currently run hundreds of nodes to maintain competitive response times. The infrastructure bottleneck turns into a critical failure point when the market requires peak performance.
Conventional automated market makers might be sufficient for low-volume token pairs, but they fall short for institutional-scale trading.
Most of today’s blockchain indexers are more aptly described as data aggregators that construct simplified views of chain state. That works for basic applications but crumbles under heavy load. It may have been adequate for first-generation DeFi applications, but it is wholly insufficient for tracking real-time state changes across multiple high-performance chains.
Time to Rethink Data Architecture
The solution demands a fundamental overhaul of our approach to handling blockchain data. The systems of the future must deliver data directly to users rather than centralizing access via traditional database architectures. This will facilitate local processing for genuine low-latency performance. Every data point must have verifiable provenance, with timestamps and proofs ensuring reliability while minimizing risks of manipulation.
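As a rough sketch of what "verifiable provenance" could look like on the wire, consider the shape below. The field names and the hash-based integrity check are illustrative assumptions of my own, not a description of any specific protocol; the idea is simply that a consumer can verify each update locally instead of trusting an intermediary's copy.

```typescript
// Sketch only: one way to attach provenance to a streamed data point.
// Field names and the integrity check are assumptions for illustration.
import { createHash } from "node:crypto";

interface ProvenancedUpdate {
  chainId: string;     // which chain the state change came from
  blockHeight: number; // where in the chain it was observed
  timestamp: number;   // when the producer observed it (ms since epoch)
  payload: string;     // the state change itself, serialized
  digest: string;      // hash committing to all of the above
}

// Recompute the commitment locally so the consumer does not have to trust
// an intermediary's copy of the data.
function checkIntegrity(u: ProvenancedUpdate): boolean {
  const recomputed = createHash("sha256")
    .update(`${u.chainId}:${u.blockHeight}:${u.timestamp}:${u.payload}`)
    .digest("hex");
  return recomputed === u.digest;
}

// A consumer processes updates locally and drops anything it cannot verify.
function onUpdate(u: ProvenancedUpdate) {
  if (!checkIntegrity(u)) return;       // manipulated or corrupted in transit
  const ageMs = Date.now() - u.timestamp;
  console.log(`verified update from ${u.chainId}, ${ageMs} ms old`);
  // ...local processing here...
}
```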
A significant shift is on the horizon. With faster blockchains and lower gas fees, complex financial products like derivatives become achievable onchain. Derivatives, moreover, drive price discovery, which today happens largely on centralized exchanges. As chains become faster and cheaper, derivatives protocols will become the go-to venue for price discovery.
This transition necessitates infrastructure capable of delivering data in less than 150 milliseconds. This isn’t an arbitrary figure; it’s the threshold at which humans perceive delay. Anything slower severely limits what can be achieved in decentralized finance.
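To see why that budget rules out today's architectures, here is a back-of-the-envelope comparison. Every component number is an illustrative assumption rather than a measurement, but it shows why a polling loop measured in seconds cannot fit inside 150 milliseconds while a push-based stream can.

```typescript
// Rough latency budget against the 150 ms perception threshold.
// All component numbers are illustrative assumptions, not measurements.
const BUDGET_MS = 150;

// Polling architecture: on average you wait half the poll interval just to
// notice a change, then pay query and processing costs on top.
const polling = {
  avgPollWait: 1_000 / 2, // 1 s polling interval -> ~500 ms average detection delay
  dbQuery: 30,
  processing: 20,
};

// Streaming architecture: the change is pushed as it happens.
const streaming = {
  propagation: 50,
  processing: 20,
};

const sum = (o: Record<string, number>) =>
  Object.values(o).reduce((a, b) => a + b, 0);

console.log(`polling total:   ${sum(polling)} ms (budget ${BUDGET_MS} ms)`);   // ~550 ms
console.log(`streaming total: ${sum(streaming)} ms (budget ${BUDGET_MS} ms)`); // ~70 ms
```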
The Looming Convergence of Market Forces
The current model of excessive node polling and inconsistent latency profiles won’t scale for serious financial applications. Major trading firms are already building complex custom solutions, a clear signal that the existing infrastructure is failing to meet market needs.
As faster blockchains with lower gas fees enable sophisticated financial instruments, the ability to stream state changes in real time becomes crucial for market efficiency. The current practice of data aggregation with multi-second delays significantly limits what can be achieved in decentralized finance.
Emerging blockchains are pushing data throughput to never-before-seen levels. Without corresponding advances in data infrastructure, we’ll end up with powerful engines connected to bicycle wheels – an abundance of power with no effective way to utilize it.
The Need for Change
The market will compel this change. Those who don’t adapt will find themselves increasingly marginalized in an ecosystem where real-time data access is no longer a luxury but a fundamental prerequisite for participation.
This article is intended for informational purposes only and shouldn’t be interpreted as legal or investment advice. The views, thoughts, and opinions expressed herein are solely those of the author and do not necessarily reflect or represent the views and opinions of Cointelegraph.