Blockchain Interoperability: The Big Picture

Simplicity breeds efficiency: when there are fewer things that can break, fewer things, on average, will. Decentralized systems, at their best, embody this simplicity; they tend to keep complexity and (inter)connectivity at viable levels.


In the context of blockchain platforms, it would then seem paradoxical to question the significance of interoperability and interconnectivity. In practical terms, however, end-users, developers and businesses are even now compelled to navigate multiple layers of gratuitous complexity to address even the simplest scenarios that span multiple blockchains.

Arguably, as an industry, we will eventually get this right. However, it is worth looking at the big picture by asking: “What happened? What has led the industry down these detours? And where can we focus on getting back on track?”


The global economy’s settlement layer and associated distractions

The ambitious vision of the “world computer,” a Turing-complete state machine to end all that came before it, catalyzed a wave of innovation the likes of which can only be compared to the early days of the internet. The mission to build a platform where public economic consensus attempts to provide the fabric for decentralized governance has taken decentralization to the masses — cryptoeconomics is now just “economics.”

However, the flip side of this extraordinary innovation has been an overt commercial attempt to try to establish a single, so-called “global economy’s settlement layer.” The basis for this tactic is that one blockchain must serve as the global settlement layer for all transactions, no matter on which blockchain or chains they execute. The self-serving argument is that this one settlement layer provides an “anchor” for the industry, establishing finality in the event that arbitration is required.

“There can be only one” is the belief and the motto among the immortals in the Highlander saga. In the real world, however, this effort has been an unfortunate maximalist diversion for much of the industry. Decentralized settlement cannot be a zero-sum game; otherwise, we have lost the plot.

The polka and other half-step dances

A correlated set of regrettable detours has been the entry and the continuing exodus of the so-called “cross-chain” projects.

One mantra has been: "Connect your chain to our project and you will get interoperability from day one. We will facilitate an internet where blockchains can connect and exchange information and transactions." The hook, of course, is that you will need to relay through that specific project's infrastructure or chain.

Another refrain goes along these lines: "Prior to our cross-chain project, blockchains were siloed and unable to connect with each other; they were difficult to build and could not scale; and we have solved these problems with a brand-new technical vision." The catch, of course, is that you will need to use their version of a cross-chain protocol and take a dependency on their project and on the vagaries of its life expectancy.

One ring to rule them all; one cross-chain project for interoperability: both are myths. No one project or chain can or will serve the interoperability needs of an entire industry. And, of course, no entity wants to be beholden to any one cross-chain project, bottlenecked technically and/or economically by a third party.


A standards-based approach to interoperability

Historically, standards have enabled varied technologies to work seamlessly together. Standards have served to establish and reinforce compatibility and compliance among diverse entities so that ecosystems may operate efficiently and effectively. They serve as the substrate behind the "Lego" blocks of products and services by instituting coherent protocols (as well as metadata, schemas, ontologies, etc.) that can be widely understood, tested, analyzed, applied and validated.

Standards define and constrain the requirements, specifications and attributes that are typically used to ensure that products and services meet their pre-established purpose and deliver outcomes as expected. Importantly, standards provide a "neutral" layer of abstraction, mitigating the risk that one or more entities may attempt to control and/or subvert the longer-term goals.

Without standards, the internet as we know it could not exist; without standards, our ability to connect, communicate and collaborate using tools such as email and messaging could not exist; and without standards, the plumbing that powers blockchain protocols would be so primitive that much of the cryptographic engineering we take for granted would be well-nigh impossible.

Toward a model for blockchain interoperability

When we look at the history of interoperability and the internet, it is apparent that while it took multiple decades for much of the core infrastructure to be built out, innovation and large-scale usage exploded in the late '90s with what is now referred to as the web. Subsequently, with the so-called Web 2.0 and the cloud wave of technologies, there was once again an uptick circa 2006–2010 with respect to innovation and standards in the consumer and the commercial domains, respectively.

What can we surmise? First, there is a core layer of interoperable standards that is required in order to establish the underlying foundation. Interestingly enough, this is where the blockchain industry has focused: protocols (infrastructure), including layer two; smart contracts (processing); and oracles (data).

Second, while connectivity is the bedrock, standards for e-commerce are what it took to transform the staid internet into the explosive web as we know it. This is where we have work to do — wrapped tokens are a tactic, not a strategy.

The industry needs to focus on digital assets: asset definitions and templates, asset swaps, ledger and inter-ledger transactions, and more — built on a foundation of standards-based interoperability.
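One well-known, chain-neutral building block for the asset swaps mentioned above is the hashed timelock: both chains lock assets against the same hash, and revealing the preimage on one chain lets the counterparty claim on the other. Below is a minimal Python sketch of the hash-lock logic only; the class and names are hypothetical, and real swaps implement this as on-chain contracts or scripts rather than application code:

```python
import hashlib
import time

class HashedTimelock:
    """Minimal sketch of the hash-lock logic behind a cross-chain atomic swap."""

    def __init__(self, secret_hash: bytes, recipient: str, refund_to: str,
                 timeout_seconds: float):
        self.secret_hash = secret_hash  # SHA-256 of a secret chosen by the initiator
        self.recipient = recipient      # can claim by revealing the secret
        self.refund_to = refund_to      # can reclaim after the timeout
        self.deadline = time.time() + timeout_seconds
        self.claimed_by = None

    def claim(self, secret: bytes, caller: str) -> bool:
        """Recipient claims the locked asset by revealing the hash preimage."""
        if (caller == self.recipient
                and self.claimed_by is None
                and time.time() < self.deadline
                and hashlib.sha256(secret).digest() == self.secret_hash):
            self.claimed_by = caller
            return True
        return False

    def refund(self, caller: str) -> bool:
        """Initiator reclaims the asset once the timeout has passed, if unclaimed."""
        if (caller == self.refund_to
                and self.claimed_by is None
                and time.time() >= self.deadline):
            self.claimed_by = caller
            return True
        return False

# Example: Alice locks funds for Bob; Bob claims by revealing the secret.
secret = b"alice's secret preimage"
lock = HashedTimelock(hashlib.sha256(secret).digest(), "bob", "alice",
                      timeout_seconds=3600)
assert lock.claim(b"wrong secret", "bob") is False
assert lock.claim(secret, "bob") is True
```

The design point is that the lock's conditions are verifiable by any chain that can hash and check a clock, which is exactly why this pattern composes across heterogeneous ledgers.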

Leading with a lexicon for digital assets

It has been said that language is the prerequisite for "extended trains of thought." Language is a genetic capability common to humans and distinguished by the characteristic of "discrete infinity" — i.e., the capacity for essentially unbounded composition of simpler objects into complex structures. Without debating the validity of this hypothesis, it can safely be said that, at a minimum, language is the underpinning for shared communication and collaboration.

This summer, under the leadership of Microsoft and other major firms in the technology and financial sectors, the InterWork Alliance was launched. Its key focus is the development and evangelization of the Token Taxonomy Framework, an early attempt to create a lexicon and a language for digital assets.

The Token Taxonomy Framework was designed with the objective of bridging the gap between developers, business analysts and managers, and policymakers and public regulators, enabling them to work together to model, architect, design, validate, create and deploy new business models and networks based on digital assets.
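To make the idea of a shared lexicon concrete: the framework's core move is to compose a token base (fungible or non-fungible) with reusable behaviors into a compact classification that all of these audiences can read. The Python sketch below illustrates that composition; the behavior names and formula syntax are illustrative assumptions, not the official TTF grammar:

```python
from dataclasses import dataclass, field

@dataclass
class TokenDefinition:
    """Illustrative token classification: a base type plus composable behaviors."""
    base: str                                   # "fungible" or "non-fungible"
    behaviors: list = field(default_factory=list)

    def formula(self) -> str:
        """Render a compact, human-readable classification string."""
        tau = "tF" if self.base == "fungible" else "tN"
        return f"{tau}{{{','.join(self.behaviors)}}}"

# A divisible, transferable, mintable fungible token:
loyalty_points = TokenDefinition("fungible",
                                 ["divisible", "transferable", "mintable"])
print(loyalty_points.formula())  # tF{divisible,transferable,mintable}
```

The value of such a notation is not the syntax itself but that a business analyst, a developer and a regulator can all point at the same short formula and agree on what the asset does.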

A common lexicon for digital assets provides a shared basis and a starting point for mutual understanding and enables the development of tools to support communication, collaboration and commerce.

The views, thoughts and opinions expressed here are the author’s alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.

John DeVadoss is a founding director of the InterWork Alliance and co-chairs the Token Taxonomy Framework Working Group. He leads development for Neo Blockchain, based in Seattle, Washington. Previously, he built and successfully exited two machine learning start-ups. Earlier in his career at Microsoft, John incubated and built Microsoft Digital from zero to $0.5B in revenue; he led the architecture, product, and developer experience for the .NET platform v1 and v2; and he was instrumental in creating Microsoft’s Enterprise Strategy.
