How do Distributed Ledger Technology Systems Work?

A look at the inner workings of Distributed Ledger Technology Systems


Much has been said about Distributed Ledger Technology (DLT) systems, and many claims and counter-claims have been made about how this unique piece of technology works. The truth is that the ideas behind Distributed Ledger Technology have been a work in progress for centuries.

DLT is the Mother and Father of all sorts of peer-to-peer systems. The concept of a distributed ledger is not a new one; only the idea of a cryptographic ledger for storing that information is recent. Even then, DLT as a concept has been popular within Computer Science and Mathematical circles for decades.

A DLT is not a Blockchain but a Blockchain is a DLT

Another point to note is that not all DLT systems are blockchains. DLT is the mother of blockchain technology, not the reverse, and many people confuse the two. Every blockchain is a DLT, but not every DLT is a blockchain.

This fundamental difference has enabled the development of different kinds of algorithms for storing data, and this is how the evolution of the crypto-space shall occur. Algorithms that are logically sound, easy to deploy, and secure will take over the DLT space and solve the major problems that bedevil it today.

Shared Information is the Foundation

Before DLT became a popular concept, humanity had the concept of shared information. Since communal times, human beings have shared information that was validated by witnesses, and always for a purpose: some of it for the defense of the tribes, some for the allocation of food.

Another case in point was the mate-selection process among the youngest and strongest members of the tribes. Information was shared among the elders to enable the rites of mate selection to occur.

As such, we have had information sharing as a core foundation of all human interaction from time immemorial. That information has always been needed to solve problems.

Information Validation Ensures Accuracy

Validation of information was also a monumental issue for human beings. Witnesses were needed to ensure the accuracy of this information. As such, these witnesses needed an incentive to witness information accurately. Such incentives came in the form of power, respect, exemptions from certain duties, monetary compensation, chieftaincy titles, rulership, and so on. This was the beginning of another concept behind DLTs today: consensus. All witnesses had to agree before something was considered to be true and accurate.

Over time, information sharing became codified by certain principles, procedures, and practices. This codification process gave rise to given sets of rules that information had to follow. One example of this in practice is the customs of the British Royal Family. The rites, traditions, and customs of the British Royal Family have been observed for a very long time; some go back about 1,000 years!

In Africa, for example, this codification process became what we can refer to today as culture and tradition. There are certain practices and belief systems that have been passed down from generation to generation for millennia.

Centralization Upstages Decentralization

Along the way, a BIG problem occurred. Too few people became witnesses in the process of confirming information. As such, there was an excess concentration of incentives. This was how centralization in the world began. A few witnesses to the confirmation process now took hold of more incentives than necessary, giving this set of individuals an undue advantage. Centralization then became the norm, and everyone forgot how consensus worked for a few centuries.

Warfare still needed consensus for battles to be won. This is the essence of what is known today as the Byzantine Generals problem. Simply put, all the Generals (witnesses) of a war had to agree on the information available for any battle to be won. We all know that those who win the most battles win the war.

The Roman Empire was famous for this. The Empire had so many different army units that coordinating information was quite a feat. The Roman Generals had to find a way to win wars without allowing their ranks to break. Leslie Lamport, Robert Shostak, and Marshall Pease comprehensively described how the Byzantine Generals problem works. Their 1982 paper, "The Byzantine Generals Problem," gives a bird's-eye view of what happens during the information sharing process: information sharing during wartime must occur accurately, or defeat is a certain outcome.
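The heart of the problem can be sketched in a few lines of Python. This is a toy illustration only (the names and numbers are made up, and it shows a single round of message exchange, not the full recursive algorithm from the 1982 paper): each loyal general decides by taking the majority of the orders relayed to it, while traitors relay conflicting orders to sow disagreement.

```python
from collections import Counter

def decide(orders_received):
    """A loyal general decides by majority over the orders relayed to it
    (one exchange round; the full oral-messages algorithm recurses)."""
    return Counter(orders_received).most_common(1)[0][0]

# 7 generals: 5 loyal ones faithfully relay the commander's "attack",
# while 2 traitors relay "retreat" instead.
view = ["attack"] * 5 + ["retreat"] * 2
print(decide(view))  # the loyal majority drowns out the traitors: "attack"
```

Once a third or more of the generals are traitors, no amount of voting can guarantee agreement, which is why the paper's famous bound (more than two-thirds of participants must be loyal for oral messages) matters so much for DLT consensus today.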

Technology Rises and Changes Humanity

The Industrial Revolution came along and brought with it many wonderful innovations. Then the Twentieth Century arrived and Mankind had many breakthroughs. Computing was one of the gifts of the early Twentieth Century to Mankind. 

The Enigma Machine was one of those breakthroughs. The ability of human beings to encode hidden messages during wartime isn't new; it's just that the Third Reich had taken this to a whole new level at the time. Hitler and his minions dedicated extensive resources to creating machines that encoded wartime instructions, and Winston Churchill dedicated equally extensive resources to ensuring that the Enigma codes were broken. This was the birthplace of modern cryptography as we know it.

The Transistor Ushered in a New World

The transistor was born, and Mankind took a leap from paper-based living to living electronically. Centralization remained the order of the day, as it gave birth to new forms of interaction. Everything that people did had to pass through central authorities acting as witnesses. These witnesses are what we now know as databases.

Whoever owns the most accurate information holds the most power. This is how database legacy systems and their owners became the Masters of the Universe as we know it.

Information Became Power

The richest people in the world today aren't those who own physical resources. Mining leases and oil wells can be owned by anybody. It is those who have the most dynamic databases and act as central authorities who are the new Rulers of Mankind.

Centralization created many problems. The manipulation of information is the first: inaccurate information could be presented as accurate, and too many lives have been destroyed as a result. Fraud and corruption are also big problems within centralized systems. These twin evils have caused enormous damage to ordinary people, and lives have been lost in the process.

Too few people having too much power has also created many gaps: an unequal community in which many people have no sense of destiny and, as such, no roles to fulfill within the human tribe.

Central Monetary Systems Ruled the World

Our monetary systems became centralized as well. The concept of central institutions holding people's wealth became a fixture. Double-entry book-keeping, with its reliance on trusted central record-keepers, compounded the problem of having too much wealth in too few hands.

Professor Yuji Ijiri then propounded the concept of triple-entry book-keeping. In his monograph “Momentum Accounting & Triple-Entry Bookkeeping”, he described the conditions under which a triple-entry ledger can work.

The Professor codified what humans had known for thousands of years: having third-party witnesses enables information sharing and confirmation by consensus to work. This concept is a foundational principle of Distributed Ledger Technologies.

Computing Created the Structure for DLTs to Exist

Then there were problems in implementing this concept: security, transparency of information, centralization of power, the incentives given to witnesses, and so on. Computing provided answers that nothing else could. W. Scott Stornetta and Stuart Haber published a paper in 1991 that created a foundation for solving many of these problems. The first thing the paper solved was the accuracy of information in a public ledger, via what is now referred to as a timestamp. Their paper, “How to Time-Stamp a Digital Document,” fundamentally created a basis for DLTs to function, as timestamps allow anyone to create algorithms that work if properly codified and implemented.
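The hash-linking idea behind their scheme can be sketched as follows. This is a simplified illustration, not the paper's actual construction (SHA-256 and the field names are my own choices): each certificate commits to the digest of the previous one, so back-dating or altering any document would break every digest that follows it.

```python
import hashlib
import json
import time

def timestamp(document: bytes, prev_digest: str) -> dict:
    """Certify a document by hash-linking it to the previous certificate:
    rewriting any entry invalidates every later digest in the chain."""
    record = {
        "doc_hash": hashlib.sha256(document).hexdigest(),
        "prev": prev_digest,
        "time": time.time(),
    }
    record["digest"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

# A two-entry chain: each record commits to the one before it.
chain = [timestamp(b"genesis document", prev_digest="0" * 64)]
chain.append(timestamp(b"second document", prev_digest=chain[-1]["digest"]))
```

Anyone holding the latest digest can audit the whole history, which is exactly the property later blockchains rely on.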

They also described in the paper how to certify digital documents, something that had not previously been done in the way they described it. This gave several people an entrée into the exciting world of cryptography. The ability of individuals to hide secrets digitally created, within a decade, a new breed of professionals in the cryptography and cybersecurity industry.

The First DLT Prototype was Born

“Building secure file systems out of Byzantine storage” was a paper presented in 2002 by David Mazières and Dennis Shasha. They showed how to build trustworthy storage on top of untrusted servers; previously, people had focused on keeping data out of untrusted systems altogether. Data security was (and still is) a BIG issue when it comes to centralized systems. What makes their work interesting is that “fork consistency” (hard forks, anyone?) was aptly described.

Things took on a new dimension when they went further in their paper to describe how data storage can occur in cryptographic blocks (blockchain, anyone?). The paper also created a new paradigm: a file system that runs on a network, called SUNDR (Secure Untrusted Data Repository). This is the forefather of today's blockchain.

The paper further described the use of cryptographic keys to sign digital records within the SUNDR system. This also resonated with many and was an early example of information confirmation without trust.
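The sign-and-verify pattern can be sketched in a few lines. One caveat: SUNDR uses public-key signatures, while this sketch substitutes an HMAC (a shared-key authenticator) because Python's standard library has no asymmetric primitives; the shape of the interface — sign a record, let anyone with the right key material verify it — is the same.

```python
import hashlib
import hmac

def sign_record(key: bytes, record: bytes) -> str:
    """Attach an authenticator to a record. An HMAC stands in here for
    the public-key signatures a real SUNDR-style system would use."""
    return hmac.new(key, record, hashlib.sha256).hexdigest()

def verify_record(key: bytes, record: bytes, tag: str) -> bool:
    """Accept the record only if the authenticator checks out;
    compare_digest avoids timing side channels."""
    return hmac.compare_digest(sign_record(key, record), tag)

tag = sign_record(b"alice-key", b"write /docs/report v3")
print(verify_record(b"alice-key", b"write /docs/report v3", tag))  # True
print(verify_record(b"alice-key", b"write /docs/report v4", tag))  # False
```

The point of the design is that the servers storing the records never need to be trusted: a tampered record simply fails verification on the client.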

What makes their work interesting is that it amalgamated the concept of distributed computing with untrusted server systems, which were called “nodes”.

Remember, it was trust that led human beings to rely on witnesses for information confirmation. That trust was abused, and human beings went on to create centralized systems that are subject to manipulation.

Bitgold Comes on the Scene

The blockchain concept was then a field that very few people understood. It took a bit of time before Nick Szabo proposed a new currency called Bitgold. In his 2005 paper, Nick proposed a “computational puzzle” as a form of information confirmation. This is what we know today as “proof-of-work”.
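The essence of such a computational puzzle is easy to sketch. This is an illustrative toy, not Szabo's actual construction (the data, the hex-zero target, and SHA-256 are my choices): finding a nonce whose hash meets the target is costly, but checking a claimed solution takes a single hash.

```python
import hashlib

def solve_puzzle(data: bytes, difficulty: int = 4) -> int:
    """Brute-force a nonce whose SHA-256 digest starts with
    `difficulty` hex zeros: expensive to find, trivial to verify."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = solve_puzzle(b"bit gold", difficulty=4)
digest = hashlib.sha256(b"bit gold" + str(nonce).encode()).hexdigest()
print(digest[:4])  # "0000" — anyone can verify the work in one hash
```

That asymmetry (hard to produce, cheap to check) is what lets a puzzle solution stand in for a trusted witness.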

What he couldn't solve, however, was the issue of “double-spending” in his ledger. Double-spending refers to the same entry being spent, or entered, twice; without a way to order entries, no one can prove who truly owns the information entered into the ledger.


Satoshi Nakamoto gives the World Bitcoin

Satoshi came around, and he (or she) came up with the concept of a distributed timestamp server in his or her paper. This allows blocks to be independently verified before they are entered by consensus into the public ledger.

Satoshi also solved the problem of incentives. Cryptographic tokens were then issued to the nodes within the distributed filesystem as a reward for solving the proof-of-work problem. 
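That verification loop can be sketched in miniature. This is a simplified illustration, not Bitcoin's actual format (the block fields, the 50-token reward constant, and SHA-256-over-JSON hashing are made-up stand-ins): each node re-checks every block's proof-of-work and its hash link to the previous block before accepting it, and the block's reward goes to whichever node solved the puzzle.

```python
import hashlib
import json

REWARD = 50  # illustrative: tokens issued to the node that solves the puzzle

def block_digest(block):
    """Hash every field of the block except the stored digest itself."""
    body = {k: v for k, v in block.items() if k != "digest"}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def mine(prev, data, difficulty=3):
    """Proof-of-work: try nonces until the block hash meets the target."""
    nonce = 0
    while True:
        block = {"prev": prev, "data": data, "nonce": nonce, "reward": REWARD}
        digest = block_digest(block)
        if digest.startswith("0" * difficulty):
            block["digest"] = digest
            return block
        nonce += 1

def valid_chain(chain, difficulty=3):
    """A node independently re-checks each block: the stored digest,
    the proof-of-work target, and the link to the previous block."""
    for i, block in enumerate(chain):
        digest = block_digest(block)
        if digest != block["digest"] or not digest.startswith("0" * difficulty):
            return False
        if i > 0 and block["prev"] != chain[i - 1]["digest"]:
            return False
    return True

genesis = mine("0" * 64, "genesis")
nxt = mine(genesis["digest"], "some transaction")
print(valid_chain([genesis, nxt]))  # True
```

Tampering with any block's data changes its digest, which breaks both its proof-of-work and the link stored in the next block, so every honest node rejects the forged chain.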

This was how the entire cryptocurrency space was created. Proof-of-work served as the first effective use case for distributed consensus at scale.

It also gave birth to a new industry worth several hundreds of billions of dollars a decade later. This is just an attestation to how rapidly the world is going to change. It is just that many people do not realize it yet.

The Genesis block was then mined on 3rd January 2009, bringing blockchain technology to the world as the firstborn of DLTs. It took a couple of years for the world to take notice. However, those who did are now the superstars of what will become the new normal in a few years to come.


DLTs are Going to Take over

DLTs are going to take over the world. It is not a question of how; rather, it is a question of when. Legacy centralized systems come with so many problems that most people are getting tired of having to interact with them.

DLT itself has its origins in distributed computing. Peer-to-peer filesystems have always existed as a part of distributed computing systems, from content distribution networks such as BitTorrent and Kazaa to the infamous Grokster, Blubster, and iMesh. Now, we have cryptocurrencies as a first-generation example of what DLTs can produce.

 Everything will return to the beginning of humanity where information was shared and confirmed accurately. It’s just that DLTs have to evolve first and solve problems along the way before we can see the next generation of superstars. 
