Deep learning is the technology driving many of the recent breakthroughs in artificial intelligence (AI). In most cases, such an AI is an application that has learned skills like feature detection from an annotated dataset during extensive training cycles, a process called machine learning. The hope is that it will then apply the acquired knowledge correctly to new data. Of course, you can build such an AI in many different ways, but the most successful approach is artificial neural networks: structures inspired by the human brain, consisting of many layers of programmatic neurons that can be connected in different ways.

Different neural network layouts

The way these layers are connected defines how the network processes input data. Learning, in simplified terms, means finding the network configuration that solves a particular problem best. The adjective "deep" indicates that these networks are large and contain many layers. Deep learning frameworks act as an operating system for AI upon which specific applications, also called models, are developed and trained.

Everybody familiar with AI and deep learning knows TensorFlow and PyTorch, two remarkable frameworks for machine learning developed by the US tech giants Google and Facebook.

However, a lesser-known third player from the Far East called PaddlePaddle has quietly started to shake up the field. The Chinese search-engine operator Baidu released the toolkit under an open-source license in 2016. The name is an acronym for PArallel Distributed Deep LEarning, and this is precisely what PaddlePaddle offers:

The framework sets new standards for ultra-large-scale deep learning and facilitates parallel training of models with hundreds of billions of parameters across hundreds of nodes. In the past, models were mainly trained on a single node or just a few. As a result, the framework became the most scalable machine learning platform on the market, although others have been catching up in the meantime.

Graph Learning and Quantum Neural Networks

But this is by far not the only innovation: graph learning, a cutting-edge technology in machine learning, plays a vital role in PaddlePaddle. Network graphs are simply diagrams that show the interconnections between a set of entities in a network. The goal of graph learning is to extract relevant features from such graphs with machine learning algorithms. The graphs can represent social networks, knowledge graphs, or transport networks, among others. It is not surprising that neural networks, so-called Graph Neural Networks (GNNs), play a dominant role in this domain as well. A typical application of GNNs is link prediction, where you try to predict the probability that two network nodes will establish a link.
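A very simple baseline for link prediction, long predating GNNs, scores a candidate link by how many neighbors the two nodes already share. The following self-contained sketch (the toy graph and the names are invented for illustration) conveys the idea:

```python
# Toy link prediction via the common-neighbors heuristic.
# A GNN learns far richer features, but the scoring idea is the same:
# node pairs with more shared structure get a higher link probability.

friends = {
    "ann": {"bob", "eve", "dan"},
    "bob": {"ann", "eve"},
    "eve": {"ann", "bob", "dan"},
    "dan": {"ann", "eve"},
    "zoe": {"bob"},
}

def common_neighbors(u, v):
    """Score a potential link (u, v) by the number of shared neighbors."""
    return len(friends[u] & friends[v])

# "bob" and "dan" share ann and eve, so a link between them is likely.
print(common_neighbors("bob", "dan"))  # 2
print(common_neighbors("zoe", "dan"))  # 0
```

A recommendation system would then propose the highest-scoring unlinked pairs as new friendships.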

Predicting links can be used to build recommendation systems proposing products or even new digital friendships in a social network. But of course, there are many other application fields: text classification, traffic forecasting, or modeling the spread of diseases like COVID-19 are only some of them. Since this type of network is relatively new, much research is going on in this field. The Paddle Graph Learning (PGL) toolkit provides all the features to explore this technology out of the box.

The next one sounds even more like science fiction: at Baidu's "Wave Summit 2020" deep learning developer conference, the brand-new quantum machine learning development kit Paddle Quantum entered the stage. Duan Runyao, head of the Quantum Computing Institute of the Baidu Research Institute, proudly announced it as the first deep learning framework in China to support quantum machine learning. He is convinced that there is an entangled and inseparable relationship between artificial intelligence and quantum computing and intends to build a bridge between the two fields.

Paddle Quantum provides a framework to construct and train quantum neural networks (QNNs), including easy-to-use quantum machine learning (QML) development kits supporting combinatorial optimization, quantum chemistry, and other cutting-edge quantum applications on standard hardware.

The package also provides a quantum encoder to transform classical information into quantum states. Thus, it plays a crucial role in applying quantum algorithms to classical problems, especially in quantum machine learning tasks. Currently, there aren't many real-world applications of this technology, which is barely two years old. In the future, however, it may well open stunning perspectives and revolutionize the way AI and computing work.

Does this all sound like science fiction? Apart from these ambitious lighthouse projects, the framework is already shaping the present with a solid toolkit of well-established key AI technologies for all industrial needs. The developer and user community of PaddlePaddle has been growing exponentially during the last years and already includes almost 85,000 companies and nearly 2 million active developers. This has led to more than 200,000 models being trained with the framework. Global tech companies like Intel, Nvidia, and Huawei have also started supporting the framework recently.

PaddlePaddle in Action

The adoption of the framework in China is advancing quite rapidly, and AI applications already play an important role in everyday life. PaddlePaddle-based AI even played a role in China's fight against the COVID-19 pandemic at three different front lines:

In diagnostics, CT scans are an essential tool for detecting signs of pneumonia, which can lead to severe COVID-19 cases. Detecting these signs soon after infection increases the chances of successful treatment. Furthermore, it helps predict who will need one of the scarce intensive care units and who won't.

As early as the beginning of 2020, the company LinkinMed released an AI-powered pneumonia screening platform to support diagnostics at Xiangnan University. The platform reached a detection accuracy of 92%.

Tracking the way the coronavirus spreads and predicting the future path of the pandemic is another field where AI can play an essential role in defeating this or any future pandemic. For example, just by monitoring search queries and social network conversations, you might be able to detect a new outbreak in its early stages and therefore contain it more efficiently. When people in some region start searching for medication against fever, this could indicate an outbreak there. Combining this information with travel data can give real-time insight into the dynamics of a pandemic and may make it possible to predict future virus outbreaks, similar to a weather forecast.
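As a toy illustration of this idea (the query counts and the threshold rule are invented for this example), a region can be flagged when its daily fever-related search volume jumps far above its recent average:

```python
# Flag a possible outbreak when today's query count exceeds the
# mean of the previous days by more than a fixed factor.

def outbreak_alert(history, today, factor=3.0):
    """history: past daily counts of fever-related searches in one region."""
    baseline = sum(history) / len(history)
    return today > factor * baseline

normal_week = [120, 95, 110, 105, 130, 98, 112]
print(outbreak_alert(normal_week, 125))  # False: within the normal range
print(outbreak_alert(normal_week, 900))  # True: suspicious spike
```

Real systems are of course far more sophisticated, combining many signals and correcting for seasonality, but the principle of anomaly detection over behavioral data is the same.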

The next topic is a heavily debated use case of AI: China established a strict mask mandate for its citizens in coronavirus hot spots. It did not take long until AI could automatically supervise this mask mandate and spot people without masks in the street. The open-source model provided by Baidu reached a classification accuracy of 97.27% with robust performance in long-tail scenarios and can detect whether individuals in a crowd are wearing their masks.

Big Brother

The use of AI and deep learning in surveillance is not limited to masks. The Chinese authorities have been ramping up the so-called "Skynet" system, the most extensive surveillance infrastructure in the world. According to Chinese media, this system heavily utilizes facial recognition technology and big data analysis. And it is not too hard to guess which deep learning framework is one of the leading players in this game. The mega-cities Chongqing, Shenzhen, and Shanghai are considered the world's top three most-surveilled cities. Apparently, in Shanghai, it is possible to identify a person in the streets within seconds.

Officially, the surveillance system is only used to fight crime. Still, it is not hard to imagine that authorities could easily use this powerful tool to monitor and suppress political opponents. George Orwell's prediction from his visionary book 1984 is currently becoming a reality in China at high speed, with the heavy support of deep learning frameworks like PaddlePaddle. Big Brother is watching you, equipped with the power of highly parallel AI. Quite scary. Whether this state of affairs is desirable is a question each society has to answer for itself.

It is the old dilemma of humanity in times of technological breakthroughs, showing up again and again: the same applications that change our daily lives for the better, more than we could have imagined in our wildest dreams, can also be used to build the most sophisticated surveillance system humanity has ever seen. The only possible answer here is that technology is neither good nor evil; it is just what we make of it.

Got interested and want to start making the world a better place with AI immediately, or just build your own super-smart surveillance system right now? Here is what ready-to-use key technologies the PaddlePaddle framework offers in more detail. To get started with training and deploying your first AI apps, all you need to know is some Python. However, if you want to dig deeper into the core framework, some C++ is certainly useful, since the core is implemented in C++ with a strong focus on performance.

The good news is that many of the models are already pre-trained and just waiting to be explored: the framework offers a powerful suite of more than 130 official models from the four main categories Computer Vision (CV), Natural Language Processing (NLP), speech recognition, and recommendation systems.

Computer Vision with PaddleCV

Computer vision and natural language processing are among its most elaborate and widely used features. Computer vision, for example, plays a crucial role in autonomous vehicles and drone navigation. PaddleCV can be considered a complete framework that offers toolkits for eight types of pattern recognition and computer vision problems:

Image classification: Image classification refers to the problem of assigning an image to one of several possible categories, for example, dogs and cats or cars and bicycles. Image classification is one of the best-solved problems in the AI world, and there are many well-understood models on the market. PaddlePaddle offers a pretty large palette of pre-trained models for this task, such as AlexNet, VGG, GoogLeNet, ResNet, Inception-v4, MobileNet, SE-ResNeXt, and ShuffleNet, which are ready for download. Don't be scared by the cryptic names: all of these model names are industry standards and can easily be found in the literature.

Target detection: In image classification, the whole image is always taken into account. In target detection, on the other hand, the task is to find a target of a given category within an image, assign the correct category, and obtain its coordinates. Some available models here are SSD, PyramidBox, Faster R-CNN, and Mask R-CNN.

Face recognition

Image segmentation: In contrast to target detection, where you are interested in the coordinates of target objects, in image segmentation you try to partition the image into sub-groups of pixels representing objects, transforming the image into an easier-to-analyze representation. Image segmentation plays an important role in autonomous driving, where street scenes must be segmented and understood to avoid accidents involving pedestrians and cars. It can also significantly improve medical diagnosis through AI-supported image analysis.

Video classification: The goal of video classification is to understand the contextual information of a video. It is not sufficient to understand the individual frames of a video; the classifier has to analyze them within a shared context.

Image generation: Image generation is where AI starts getting creative. The goal here is to generate new images from user input, which can be random data or another image. This technique allows adding features to an image or changing the faces of persons in pictures, for example.

Metric learning: The goal of metric learning, also called similarity learning, is to learn a similarity function describing how similar or related two objects are. There are many applications for this in recommendation systems, visual identity tracking, face verification, and speaker verification.
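At the heart of all these applications is a similarity function over feature vectors (embeddings). A common choice is cosine similarity; the vectors below are invented for illustration, while in metric learning the embedding that feeds such a function is what actually gets learned:

```python
import math

def cosine_similarity(a, b):
    """Return the cosine of the angle between two feature vectors:
    1.0 means identical direction, values near 0 mean unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical face embeddings: two photos of the same person
# should end up closer to each other than to a different person.
alice_1 = [0.9, 0.1, 0.3]
alice_2 = [0.8, 0.2, 0.25]
bob = [0.1, 0.9, 0.7]

print(cosine_similarity(alice_1, alice_2))  # close to 1
print(cosine_similarity(alice_1, bob))      # much smaller
```

Face verification then reduces to thresholding this score: above the threshold, the two photos are declared to show the same person.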

Keypoint detection: In keypoint detection, the human body is abstracted as a set of key points. By tracking and extrapolating these key points, it is possible to predict the next movements of humans. The technology already plays a crucial role in motion classification, abnormal behavior detection, and autonomous driving.

Paddle OCR

A typical application of computer vision is Optical Character Recognition (OCR). PaddleOCR is a universal OCR system that currently supports more than 80 languages and is extensible. In addition, it is incredibly lightweight: the model size of a PP-OCR model is only 3.5 MB for 6622 Chinese characters. Another model for recognizing 63 alphanumeric symbols requires just 2.8 MB. The system can also handle complex layouts and distorted fonts, like those on traffic signs.


Compared to Tesseract, which is currently considered the leading open-source OCR system, this is a significant improvement if you want to use OCR in mobile applications. The standard Chinese training data for Tesseract is 43 MB, while even the English data is 23 MB. Since PaddleOCR is Python-based, it can easily be installed using the pip package manager.
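For the curious, getting a first OCR result only takes two commands (package and flag names as published at the time of writing; check the PaddleOCR repository for the current versions):

```shell
# Install the core framework and the OCR toolkit from PyPI.
pip install paddlepaddle paddleocr

# Recognize English text in an image from the command line.
paddleocr --image_dir sign.jpg --lang en
```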

Speech Recognition and Natural Language Processing

There is a clear difference between speech recognition and natural language processing. The first refers to the challenge of converting acoustic signals into a machine-readable format; a typical example is speech-to-text conversion. Natural language processing (NLP), on the other hand, goes a step further: here, the goal is to make the computer understand the user input and react helpfully. For example, if you ask the computer a question, it should ideally give you a meaningful answer.

Natural Language Processing: As an allusion to Google's BERT NLP algorithm, the PaddlePaddle community developed its own NLP algorithm called ERNIE. It uses a knowledge-based approach combining pre-trained models with a pool of multi-source knowledge. This knowledge base can simply be an extensive collection of texts, like Wikipedia, Reddit, and other popular forums on the web. The system continuously enhances its results with vocabulary, structure, semantics, and other aspects drawn from the data pool. While BERT only learns from the original language signals, ERNIE additionally enhances the model's semantic representation capabilities.

ERNIE and BERT

Baidu compared the performance of ERNIE 2.0 with BERT, and the result was that it outperforms both BERT and XLNet on 16 tasks, including English challenges on the GLUE benchmark. The General Language Understanding Evaluation (GLUE) benchmark is a state-of-the-art collection of resources for training and evaluating language understanding systems. In numbers, ERNIE outperforms BERT on English tasks by 3.1%. For Chinese tasks, however, the lead was less clear.

Recommendation Systems

Life would be pretty boring without all these shiny movie recommendations popping up whenever we log into our Netflix account, coming up with great ideas for what to watch next. Whoever comes up with these proposals seems to know us quite well, since they usually fit our interests. The magic behind these suggestions comes from recommendation systems. Usually, they use data from past purchases and social networks to predict what you might want to watch next. Technically, there are several approaches to implementing a recommendation system.

A brand-new approach in this field uses recurrent neural networks (RNNs). Most neural networks work in one direction by definition: you feed them some data and receive an output. The main innovation of RNNs is that they allow previous outputs to be used as inputs again, which means the data is processed several times by the network in internal loops.

Feedforward versus recurrent neural networks

Therefore, RNNs can develop a "memory" that retains information about the data processed before. Since the information looping inside the network cannot be accessed from outside, it is often called a "hidden state."

In large RNNs, there can be multiple of these hidden-state layers in different regions of the network. Thus, RNNs are probably the first type of AI with a short-term memory.
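The hidden state can be illustrated with a minimal, hand-wired recurrent step in plain Python (the weights are arbitrary numbers chosen for illustration, not trained): each input is combined with the state left behind by the previous inputs, so the output depends on the whole sequence seen so far.

```python
import math

def rnn_step(x, h, w_in=0.5, w_rec=0.8):
    """One recurrent step: mix the new input x with the hidden state h."""
    return math.tanh(w_in * x + w_rec * h)

def run_sequence(xs):
    h = 0.0                  # the hidden state starts empty
    for x in xs:
        h = rnn_step(x, h)   # the state loops back in as an input
    return h

# The same final input (1.0) yields different outputs depending on
# what came before it: the network "remembers" the sequence.
print(run_sequence([0.0, 0.0, 1.0]))
print(run_sequence([1.0, 1.0, 1.0]))
```

A trained RNN learns the weights so that this memory encodes exactly the information useful for the task at hand.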

That is why RNNs found their role in session-based or conversation-based recommendation systems. A session-based recommendation system uses the user's short-term activity history to predict which content may be of interest, and likely to be clicked, at the next moment.
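A much simpler baseline conveys the flavor of session-based recommendation (the click sessions below are invented): count which item typically follows which, then recommend the most frequent successor of the user's last click. An RNN replaces these raw counts with a learned hidden state that covers the whole session.

```python
from collections import Counter, defaultdict

# Invented click sessions from anonymous users.
sessions = [
    ["shoes", "socks", "shirt"],
    ["shoes", "socks", "belt"],
    ["hat", "shoes", "socks"],
]

# Count item -> next-item transitions across all sessions.
followers = defaultdict(Counter)
for session in sessions:
    for current, nxt in zip(session, session[1:]):
        followers[current][nxt] += 1

def recommend(last_click):
    """Recommend the most frequent successor of the last clicked item."""
    return followers[last_click].most_common(1)[0][0]

print(recommend("shoes"))  # "socks" follows "shoes" in every session
```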

However, this approach also has shortcomings. For example, in systems where sessions are usually anonymous, the recorded user behavior during a single session is typically limited. Therefore, it is not easy to generate effective proposals.

PaddlePaddle in Benchmarks

Because PaddlePaddle is not very widespread in the Western world, little benchmarking data is available comparing it directly to the US deep learning frameworks. The numbers in the following benchmark comparing PaddlePaddle with PyTorch and TensorFlow originate from a Chinese publication about PaddlePaddle.

PaddlePaddle benchmark

On average, PaddlePaddle outperforms its competitors by about 42% during training for computer vision tasks. For NLP, however, the advantage is only about 9% during training.

PaddleLite

PaddleLite is a version optimized for mobile devices. It allows the deployment of trained models on ARM processors and Mali GPUs, which are graphics processing units optimized for smartphones. Adreno GPUs by Qualcomm are supported as well, as are Raspberry Pi systems, and PaddleLite is cross-compatible with Apple and Android devices.

Getting started

The PaddlePaddle GitHub repository is a great starting point for diving into this framework. For each sub-package, there is a section containing tutorials in English and Chinese. In most cases, the Chinese version is more detailed, but in times of AI and automated translation, it should not be too hard to access this information.

As you might have noticed, something called cryptocurrency was recently added to our currency converter app, and maybe you have also heard a lot of noise about this kind of thing during the last months. The following gives you a detailed overview of what cryptocurrencies are and introduces the two most prominent players in that space: Bitcoin and Ethereum.

While Bitcoin is quite well known, the second-largest cryptocurrency, Ethereum, still stands in its shadow. After reading this article, you will have learned that it is much more than just another cryptocurrency and offers some surprising possibilities.

How it all started: The 2008 Financial Crisis and the Idea of a Peer-to-Peer Cash System

What do Byzantine generals have to do with cryptocurrencies? Quite a lot, as you will see! The whole story starts in 2008 amid the now almost forgotten financial crisis. At that time, a group of people was alarmed by the possibility of a bank run leading to the collapse of the entire banking system and asked whether it was possible to create a secure digital monetary system that could operate without any banks or central authorities.

A bank run is a scenario in which all clients want to withdraw their funds at the same time. Since banks only keep a few percent of their clients' funds as a cash reserve and invest the rest, such a scenario would bring any bank down within hours. If more than a single bank is affected at a time, this can lead to a toxic chain reaction endangering the entire economic system.

The world was pretty close to such a scenario after the breakdown of Lehman Brothers. The bank lost considerable parts of its capital due to speculation in the housing bubble in 2008. Only a billion-dollar bailout program on the shoulders of the taxpayers and the (in fact unfulfillable) promise by some governments to guarantee 100% of all funds in bank accounts could prevent the worst-case scenario.

Financial Crisis

Under these circumstances, the question arose whether it was possible to build a new, democratic monetary system providing a safe store of value without the risk of being deprived of one's savings by the inherent risks of the classical banking system and its speculative excesses. And indeed, someone had an answer to this question.

Satoshi Nakamoto

A person calling himself Satoshi Nakamoto released a whitepaper (a kind of master plan) outlining the principles of a peer-to-peer cash system that was to become the Bitcoin network. The name was a pseudonym, and even today it is not known who actually invented Bitcoin. Possibly it was a group of people working under this pseudonym. This is what the original whitepaper looked like:

The original Bitcoin whitepaper as published by Satoshi Nakamoto

The decentralized Ledger

What Nakamoto proposed was a transaction system that is entirely transparent, uncontrollable, and unstoppable. The core idea is that the account balances and the transaction history are stored redundantly in many different places in a synchronized way. This means that if parts of the network go down, the account balances do not get lost, since intact copies still exist in other places.

It is like having your account not just at one bank, but at hundreds or thousands of banks at the same time. If one of them disappears, this does not affect your balance.

In simple words, this is the definition of so-called distributed ledger technology, which is the foundation of the blockchain. In traditional finance terminology, a ledger is simply a book or another collection of financial accounts.
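A deliberately naive sketch of this redundancy (ignoring all the hard synchronization and trust problems addressed later in the article): every node keeps a full copy of the same ledger, so losing nodes does not lose balances.

```python
# Each node holds a complete, synchronized copy of the ledger.
ledger = {"alice": 50, "bob": 20}
nodes = [dict(ledger) for _ in range(5)]  # five redundant copies

def apply_transaction(sender, receiver, amount):
    """Broadcast a transaction so every copy stays in sync."""
    for node in nodes:
        node[sender] -= amount
        node[receiver] += amount

apply_transaction("alice", "bob", 10)

nodes.pop()   # a node goes offline...
nodes.pop()   # ...and another one

# ...but every surviving copy still agrees on the balances.
print(nodes[0])  # {'alice': 40, 'bob': 30}
```

The real difficulty, of course, is keeping the copies consistent when nobody trusts anybody, which is exactly where the Byzantine generals come in.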

An astonishing fact is that nature invented the principle of such a decentralized database billions of years ago: DNA is the most precious piece of information of any living organism, and an exact copy of it resides redundantly in the nucleus of each of its cells, billions of times over. All living organisms even work hard at keeping these DNA copies consistent and intact, since mutations may lead to cancer and endanger the survival of the whole organism.

But back to banking: how can it be ensured that nobody manipulates their local copy of the transaction history to cheat the system and get some extra money for free? This is where our Byzantine generals enter the stage:

Consensus Finding Problems in the Byzantine Empire

In computer science, the Byzantine generals' problem stands for the dilemma of making a crucial decision among network participants who are locally separated and have to rely on messages as their communication medium. These messages are subject to communication failures and potential manipulation.

The story goes like this: several Byzantine generals are located in different cities of the mighty Byzantine Empire, which is at war. To win this war, the generals have to agree on a joint military strategy. There are two options that win the battle for sure:

  1. All Generals attack the enemy at the same time
  2. All Generals retreat at the same time and do not attack

Either option will make them win the war. However, a scenario where only a fraction of the generals attack gives the enemy a chance to win. The decisive point is that they have to reach consensus over a distance.

Furthermore, it is also decisive that their action is performed synchronously after the decision. The generals may not leave their positions and therefore have to communicate via messengers, and this is where the problems start. There are two main points of failure: communication errors and double agents:

  1. Messages can get lost on the way between two generals
  2. One or more generals might act as double agents for the enemy and try to make the operation fail by sending manipulated messages.

The critical point here is that the decision process is decentralized without any central authority.

Defining a Protocol for asynchronous Decisions over Distance

To cope with these obstacles, the generals have to agree, at their last physical meeting, on a well-defined military protocol to be executed exactly after an agreed point in time. The decision must then be made purely based on the messages received from the other generals.

No general can influence the operation after all votes have been collected and the voting deadline has been reached. For messages that do not arrive before the deadline, a default value must be defined in the protocol.

It can be shown that the outcome will be a success if more than 2/3 of the generals play fair; in other words, the scheme fails as soon as 1/3 or more switch to the dark side and become double agents.
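The exact threshold is a classical result by Lamport, Shostak, and Pease: with unsigned messages, agreement can be guaranteed only if strictly fewer than one third of the generals are traitors. A one-line check makes the bound explicit:

```python
def byzantine_consensus_possible(n_generals, n_traitors):
    """Classical bound (Lamport, Shostak, Pease): agreement among
    n generals is guaranteed only if n > 3 * f, where f is the
    number of traitors."""
    return n_generals > 3 * n_traitors

print(byzantine_consensus_possible(4, 1))   # True: 4 generals tolerate 1 traitor
print(byzantine_consensus_possible(3, 1))   # False: 3 generals cannot
print(byzantine_consensus_possible(10, 3))  # True
print(byzantine_consensus_possible(9, 3))   # False
```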

From the decentralized Ledger to the Blockchain

Now back to Bitcoin: the war Bitcoin has to fight is to agree on the balances of all accounts in the decentralized ledger at a certain point in time. Do you see the analogy? Adding a transaction to our decentralized ledger in the Bitcoin network corresponds to making the strategic decision to attack or not in this hypothetical war.

Consensus finding

If the network participants cannot agree on the value of the transactions and their execution time, contradicting balances will show up at different places in the network, and the system will ultimately fall apart in chaos, which is not what we expect from a secure cash system.

However, there are substantial differences: while in Byzantine times the number of generals was well known beforehand, this is not the case in the Bitcoin network, since it is an open peer-to-peer network anybody can connect to. In our analogy, this would mean that we do not limit the number of generals, and just anybody can sign up as a general and influence the decision process. The enemy would surely be a fool not to exploit this by sending in many fake generals.

Bitcoin Mining, the Blockchain, and Proof of Work (PoW)

How can such a system with random participants agree on account balances in a secure way? The solution to this problem is the true revolution of Bitcoin: it is known as Proof of Work (PoW) and is the ultimate cause of Bitcoin mining, which has recently been discussed in the media a lot.

If you could influence the Bitcoin network's decisions on valid transactions by creating many fake accounts, the system would be doomed to fail. Therefore, the inventor(s) of Bitcoin went a step further and abandoned a central principle of the classical Byzantine generals' problem. In the original version, all participants had exactly the same voting power. Bitcoin uses a different approach: it links voting power to CPUs, following the principle of one CPU, one vote. Of course, this principle only holds in a world where all CPUs have the same computational power; in reality, the voting power of a particular participant depends on the contributed CPU power. Today, dedicated hardware called ASICs does this work, but for simplicity, let us stick to CPU power in this article.

The deeper reason behind this is that CPU power is expensive. To participate in the voting process, you have to prove that you control this resource. Consequently, you cannot simply take over the network by creating fake identities. You have to buy hardware and pay your electricity bills, which makes attacks on the Bitcoin network pretty expensive. On the other hand, there is a strong incentive to play fair: the Bitcoin transaction history consists of so-called blocks. Everybody can submit a transaction request to the Bitcoin network at any time, and these requests are the components of the blocks. As soon as such a block is signed and completed, all transactions within it are confirmed. Whenever you are the first player to close a block, you receive a reward consisting of the transaction fees of all transactions in this block plus a certain amount of newly created coins.

To complete a block, you need a specific key that fits all pending transactions plus their timestamps, and due to the properties of the underlying hash function, the only way to obtain that key is by guessing. You can think of it as a kind of black-box apparatus: if you feed in all transactions, you get the key, but there is no way of reverse-engineering the block content from the key. That is why there is only guessing. Finding these magic signatures is called mining. It is like recovering a forgotten password: to obtain it the first time, you need a lot of computational power, but once you have found it, you can easily share it with your colleagues, and they can use it without extra effort.
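A toy version of this guessing game fits in a few lines (the block content and the difficulty are invented; real Bitcoin uses double SHA-256 and a vastly higher difficulty): keep trying nonces until the hash of block-plus-nonce starts with enough zeros. Finding the nonce takes many attempts, while verifying it takes a single hash.

```python
import hashlib

DIFFICULTY = "0000"  # required hash prefix; Bitcoin demands far more zeros

def proof_of_work(block_data):
    """Guess nonces until the block's hash meets the difficulty target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(DIFFICULTY):
            return nonce, digest
        nonce += 1

def verify(block_data, nonce):
    """Anyone can check a claimed solution with a single hash."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith(DIFFICULTY)

nonce, digest = proof_of_work("alice pays bob 10 BTC")
print(nonce, digest)                           # many guesses were needed
print(verify("alice pays bob 10 BTC", nonce))  # True, with one single hash
```

This asymmetry, expensive to find and cheap to verify, is exactly what makes the "password" analogy work.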

So, in summary, Bitcoin mining is nothing else than trying to solve an admittedly stupid riddle by trial and error. Once one miner has found the key, all others can verify it instantly and confirm the block without extra effort. But it is precisely this mechanism that keeps the network secure. So why cheat when you can also earn honest money? The blockchain is nothing else than a consistent sequence of such signed transaction blocks. So there are two main differences from the classical Byzantine generals' problem: first, the Bitcoin approach uses signatures, and second, the ability to sign blocks depends on CPU power. The network is secure as long as the honest miners control more than 50% of the CPU power contributing to the network, so that they can always outvote any cooperating group of attackers.

In practice, miners primarily participate in mining pools where thousands of miners work together, generating more constant and stable earnings. Mining a block on your own can take quite some time and depends heavily on luck. The system is even flexible enough to adapt the difficulty of the riddle to the total computational power available in the network, so that the time between two blocks stays roughly constant. The more clients connect, the more difficult it becomes to find the key that closes the next block.

The 51% Attack and Double-Spending

The concept of such a sequence of signed blocks guarantees a high level of security. Attackers who want to change a transaction in a past block would have to redo all the PoW from this block onward and overrule the CPU power of all fair-playing nodes. The farther back in history a block was mined, the more difficult it is to redo all the historical PoW. So just playing fair and getting rewarded for it is the better deal. Attacks on the network history are extremely unlikely due to the immense CPU resources needed. However, a coordinated group of attackers controlling more than 50% of the network's CPU power can try to double-spend Bitcoins, which means using the same coins for multiple transactions, generating money out of the blue. Preventing double-spending was one of the significant breakthroughs of the Bitcoin network.

A double-spending attack has never happened to Bitcoin. However, smaller cryptocurrency networks secured by less computing power have a higher risk of suffering such attacks. For example, Bitcoin Gold, a Bitcoin offshoot, was hit by a double-spending attack in 2018.

The Properties of Bitcoin

Now that we have all the tools available, it is time to define what Bitcoin actually is: Bitcoin is a collection of financial accounts stored in a decentralized but synchronized way on different computing nodes connected in a peer-to-peer network.

Bitcoin

The block time of Bitcoin is 10 minutes on average. If you want to buy a coffee with Bitcoin, you have to wait until the payment is processed, and the coffee will be cold by the time it is paid. The Bitcoin network can process about 250,000 transactions per day.

Is Bitcoin really digital Gold?

In 2009, the emission rate of Bitcoin was 50 BTC per block, and it has decreased continually to 6.25 BTC per block at the time of this writing. The last Bitcoin will probably be mined around 2140, and until then the emission rate will steadily decrease. From then on, the supply of Bitcoin is constant or even deflationary:
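
The emission schedule is easy to reconstruct: the reward started at 50 BTC and halves every 210,000 blocks, roughly every four years. Summing all epochs also recovers the famous cap of about 21 million BTC:

```python
INITIAL_REWARD = 50.0        # BTC per block in 2009
HALVING_INTERVAL = 210_000   # blocks (~4 years) between halvings

def block_reward(height: int) -> float:
    """The reward halves every 210,000 blocks until it reaches zero."""
    return INITIAL_REWARD / (2 ** (height // HALVING_INTERVAL))

# Summing over all 33 halving epochs yields the ~21 million BTC supply cap
total_supply = sum(HALVING_INTERVAL * INITIAL_REWARD / 2**i for i in range(33))
```

At block height 680,000 (reached in 2021) the formula gives 6.25 BTC, matching the figure above.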

If someone loses the key to his account, there is no way to recover the balance. How many addresses in the Bitcoin network are already locked irrecoverably is currently unknown.

This absolute limit on the supply earned Bitcoin the nickname of digital gold, since the world's gold reserves are also quite limited. However, gold is a physical store of value, and Bitcoin is not: if all Bitcoin mining farms were switched off one day, all Bitcoin would be gone.

Bitcoin set out to provide a decentralized, irreversible way to transfer money from A to B. However, due to the long block confirmation time, it is instead used as a store of value and does not really fulfill its job as a transaction system. Hardly anybody would use Bitcoin to buy a coffee today.

From Bitcoin to Ethereum

Did we already reach the end of the story? Of course not. In 2015, another player entered the market, announced by the then only 19-year-old Vitalik Buterin in an internet forum with the shiny words "Welcome to the New Beginning". Today we know that Vitalik was announcing the beginning of something big. It is called Ethereum and has become the second-largest player in the world of cryptocurrencies.

Ethereum

But what is this all about? If Bitcoin set out to re-invent money, Ethereum's mission is nothing less than rebuilding the financial industry and administration in a decentralized fashion. That sounds quite ambitious, but here is what makes the difference:

While the Bitcoin network mainly secures a single transaction database, Ethereum extends this idea with a kind of operating system for decentralized apps (dApps). The Ethereum Virtual Machine (EVM) allows running entire applications in a decentralized, synchronized fashion.

Like Bitcoin, Ethereum provides a token, called Ether (ETH), that can be exchanged and is Ethereum's equivalent of Bitcoin. The term Ethereum refers to the whole smart-contract platform, while Ether can be seen as the base currency of the Ethereum economy and has to be used to pay transaction fees.

Solidity - a new Programming Language

For the implementation of smart contracts, Ethereum provides its own programming language called Solidity, invented by Gavin Wood. Solidity is a programming language similar to Java or C++ and is Turing-complete, which means that you can program basically anything with it.

By this extension, the Ethereum blockchain became a platform providing an app store for unstoppable decentralized applications. They are also known under the term smart contracts. 

In fact, Bitcoin also provides something called Bitcoin Script, which allows defining some simple contracts. However, it has hardly been used, and the game-changer that makes the difference for Ethereum is Turing-completeness.

In analogy to the decentralized ledger, which is stored on all network nodes, a dApp runs on all nodes simultaneously, reaching the same result. But why on earth should an app be executed in parallel on thousands of nodes, consistently achieving the same outcome? Isn't this a tremendous waste of resources?

The concept of a smart contract generalizes the principle of the decentralized ledger to an ample palette of financial products: insurance contracts, escrow services, or even stocks are possible. The idea is to have secure and unbreakable financial products.

A smart contract implementing an insurance policy is, in principle, a simple program backed by the collateral of the insurance provider. When buying such insurance, you pay the premium into the insurance contract. Then, when the insured event occurs, the funds are released automatically by the contract.
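
The logic can be sketched in a few lines of plain Python (not Solidity; the class name, amounts, and event flag are made up for illustration):

```python
class FlightInsurance:
    """Toy model of the insurance contract described above."""

    def __init__(self, premium: int, payout: int, collateral: int):
        self.premium, self.payout = premium, payout
        self.pool = collateral            # backing provided by the insurer
        self.policyholders = set()

    def buy_policy(self, holder: str) -> None:
        self.pool += self.premium         # the premium flows into the contract
        self.policyholders.add(holder)

    def claim(self, holder: str, event_occurred: bool) -> int:
        # In a real contract, event_occurred would be reported by an oracle
        if holder in self.policyholders and event_occurred \
                and self.pool >= self.payout:
            self.pool -= self.payout
            self.policyholders.discard(holder)
            return self.payout            # funds are released automatically
        return 0
```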

Ethereum Oracles

But let's assume you bought an insurance policy against a plane delay at the hypothetical brand new ESurance insurance group. The policy says you get a 10 000 $ compensation if your flight from Bangkok to Nairobi arrives four hours late. The flight insurance is implemented as a smart contract.

How does this smart contract know that the plane was late and that it has to pay you the money? It has to rely on oracles. These are trusted third-party authorities which provide information from the outside world.
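
A common pattern is that the oracle signs its data and the contract verifies the signature before trusting it. A toy sketch of that idea — real oracles use public-key signatures such as ECDSA; the HMAC here is just a self-contained stand-in, and the key is hypothetical:

```python
import hashlib
import hmac

ORACLE_KEY = b"hypothetical-oracle-secret"   # stand-in for the oracle's key

def oracle_report(flight: str, delay_minutes: int):
    """The oracle signs its observation before publishing it."""
    payload = f"{flight}:{delay_minutes}".encode()
    signature = hmac.new(ORACLE_KEY, payload, hashlib.sha256).hexdigest()
    return payload, signature

def contract_accepts(payload: bytes, signature: str) -> bool:
    """The contract only trusts data carrying a valid oracle signature."""
    expected = hmac.new(ORACLE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

Only data the oracle actually signed gets through; forged reports are rejected.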

So, in fact, all applications which need data input from the outside world are not entirely autonomous since they need to rely on these oracles. The term oracle was coined in reference to ancient Greece, where oracles such as the one at Delphi were believed to have clairvoyant powers.

Oracle

It would be an interesting question whether it is possible to implement a dApp so that it crawls the web itself to obtain such information. However, in the Ethereum ecosystem, this is forbidden since it could lead to contradictions, ultimately breaking the blockchain.

The reason is that the internet may look different at different geographical locations. As a consequence, dApps would reach different results when running at various locations, which is against the very definition of a dApp, since it should reach the same conclusion on all nodes.

There are some approaches to implementing oracles in a decentralized way, like Chainlink, which is developing a multi-chain oracle service that can also be connected to Ethereum smart contracts. However, from the point of view of the blockchain, these decentralized oracle services still remain authorities.

In the far future, it is imaginable that there will be super-smart dApps using heavy AI, able to cope with such obstacles and build a truly autonomous system. Another way to get closer to that goal would be to run the entire world wide web on top of the blockchain: if the web is part of the system, all apps will reach the same conclusion. In some cases it might be the wrong one, but that would not really matter, since the blockchain would stay consistent.

In fact, there have been many disruptions of this kind during the last decades. Just think of the internet. About thirty years ago, data packages were transmitted as acoustic signals over telephone lines. Now telephone calls are data packages transmitted over the internet protocol. Maybe one day blockchain and internet will swap places in the same way. But every project existing at the moment is still far from this.

The Problem of Energy Consumption and Proof of Stake (PoS)

When Elon Musk recently stopped accepting cryptocurrency payments at Tesla because of Bitcoin's high energy consumption, a shock wave went through the cryptocurrency space, and the world suddenly started talking about the massive ecological footprint of the mining process.

Currently, the Bitcoin network is estimated to consume up to 120 TWh of energy per year. That is about the electricity consumption of the Netherlands in 2019 (114 TWh), but still only half of the energy consumption of the classic banking system, which is about 250 TWh per year.

Energy consumption of Bitcoin versus the classical banking system

For Bitcoin, all this energy is used to solve a pointless riddle that secures the network. On the other hand, the global banking system is estimated to process about 1 trillion (1,000 billion) transactions per day, while Bitcoin is at the level of a quarter million.

The numbers above clearly show that a PoW blockchain is a painfully inefficient way to transfer money. But are there alternatives? 

For sure, there are! Probably the most promising approach is called Proof of Stake (PoS), and it is pretty interesting since it means that the network secures itself. A self-securing network? How is that possible?

Remember: to prevent an intruder from taking over control of a blockchain network, he needs to prove that he owns a costly resource. In the case of Bitcoin, this resource is energy and hardware.

PoW versus PoS

But since cryptocurrencies have gained considerably in value over the last years, another approach appeared on the horizon: it is possible to use the currency itself to secure the network. The idea is that you prove that you own a certain amount of the currency.

To manipulate the network, you would need to possess 51 percent of all coins, which is practically impossible since their number is limited. Furthermore, trying to accumulate this amount would automatically drive up the coin's market price, which gives you a hard time trying to buy the majority of all tokens of a certain cryptocurrency.

Although the current version of Ethereum (1.0) still uses a PoW approach, the community decided to migrate the Ethereum network to PoS as early as 2021.

PoS in Ethereum 2.0

The implementation works as follows: if you want to stake Ether, you must deposit exactly 32 Ether into a given smart contract and run a validator node. Then, the network chooses a random candidate who may finalize the next block and receives the block reward for closing it.
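
Since every validator stakes exactly the same 32 ETH, proposer selection is, in essence, a uniform random draw over the validator set. A minimal sketch — the validator names are made up, and the seed stands in for the network's shared randomness beacon:

```python
import random

DEPOSIT = 32  # every validator locks exactly 32 ETH, so all stakes are equal

def propose_next_block(validators: list, seed: int) -> str:
    """With equal deposits, the proposer is drawn uniformly at random.
    Using a shared seed, every node arrives at the same choice."""
    return random.Random(seed).choice(validators)
```

Because the draw is seeded by shared randomness, all honest nodes agree on who proposes the next block.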

But why should you play fair and not create some extra money by signing two competing block candidates simultaneously? This would split the chain into two forks and effectively double all balances. In simplified words, this is what is called the "nothing at stake" problem.

The Ethereum developers implemented a mechanism to prevent this scenario: if you try to sign two different blocks in the same time slot, the other fair players in the network will detect this, and your deposit gets slashed by a majority decision. Slashing means that the staker loses a part or, in the worst case, even all of his 32 ETH deposit.
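
Detecting the offence is mechanical: two signatures by the same validator on different blocks in the same slot are proof of cheating. A sketch of such a detector (simplified; the real protocol works with signed attestations submitted as slashing evidence):

```python
def detect_double_signers(signatures):
    """signatures: iterable of (validator, slot, block_hash) seen on the
    network. A validator signing two *different* blocks in one slot is
    reported for slashing."""
    seen, slashed = {}, set()
    for validator, slot, block_hash in signatures:
        key = (validator, slot)
        if key in seen and seen[key] != block_hash:
            slashed.add(validator)    # evidence of equivocation -> slash
        seen[key] = block_hash
    return slashed
```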

This mechanism produces a strong incentive to play fair. To get penalized that way, you would have to hack your Ethereum client software. As long as you use the official version, it will agree with the other stakers, and you will be fine.

The annual reward you can get for Ethereum staking will be somewhere between 5% and 10%, which appears quite impressive compared to current market interest rates, which are close to zero or even negative.
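
For a single validator, the effect of such a reward is simple compound interest. A sketch under the assumption of a fixed 5% yearly rate (in reality, rewards vary with the total amount of ETH staked):

```python
def staking_balance(principal: float = 32.0, rate: float = 0.05,
                    years: int = 5) -> float:
    """Yearly compounding of the staking reward (simplified model)."""
    return principal * (1 + rate) ** years
```

At 5%, the 32 ETH deposit grows to roughly 40.8 ETH after five years.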

Downtimes of validator nodes also lead to gradual penalties, which are not as severe as those for cheating. The incentive here is to keep the network stable, which requires reliable players. Like mining, staking can be organized in pools, since 32 ETH is already quite a significant investment at current Ether prices. The introduction of PoS will reduce the energy consumption of a cryptocurrency network by 99.95% compared to its PoW equivalent, which is quite a deal.

In fact, you can already stake your ETH at the so-called beacon chain, which has been set up to test and monitor this new consensus algorithm. However, the main chain is still using PoW. 

After switching to PoS, the money used for buying mining hardware and paying electricity bills will likely flow directly into Ether. For Bitcoin mining, currently almost $30 billion is spent on electricity, assuming a price of $0.25 per kWh. And of course, this does not include the investments in mining hardware.
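
The $30 billion figure follows directly from the consumption estimate above and the assumed electricity price:

```python
energy_twh = 120          # estimated yearly Bitcoin consumption from above
usd_per_kwh = 0.25        # the electricity price assumed in the text
kwh = energy_twh * 1e9    # 1 TWh = 1,000,000,000 kWh
cost_usd = kwh * usd_per_kwh
print(cost_usd / 1e9)     # -> 30.0 (billion USD per year)
```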

Proof of Stake means that the system secures itself, which is quite interesting from various perspectives. If you are looking for a really great book about self-referential systems, have a look at Gödel, Escher, Bach: an Eternal Golden Braid by Douglas Hofstadter. If blockchain technology had already existed when the book was written, it would certainly have had its own chapter there.

Use Cases of Ethereum

But let's talk about the use cases of these new Ethereum smart contracts now. Imagine you just got some Ether and are wondering what to do with it. Here are some wonderful ideas for things you can already do, or will be able to do, in the Ethereum ecosystem.

Create your own Money using ERC-20

Surely everybody has already dreamed of getting rich by printing his own money. The good news is that with Ethereum, nothing is easier than that! The Ethereum platform provides a predefined toolkit for making your own money. It is called the ERC-20 standard and allows you to release your own digital currency secured by the Ethereum network. The process of bringing a new token into the world is called an initial coin offering (ICO).
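
At its heart, an ERC-20 token is little more than a table of balances plus a transfer rule. A minimal sketch in plain Python (not Solidity, and covering only a small fraction of the real ERC-20 interface):

```python
class Token:
    """Minimal ledger mimicking the core of an ERC-20 token:
    per-address balances plus a transfer rule."""

    def __init__(self, name: str, total_supply: int, creator: str):
        self.name = name
        self.balances = {creator: total_supply}  # creator starts with all

    def balance_of(self, account: str) -> int:
        return self.balances.get(account, 0)

    def transfer(self, sender: str, recipient: str, amount: int) -> bool:
        if self.balance_of(sender) < amount:
            return False                         # a real ERC-20 call reverts
        self.balances[sender] -= amount
        self.balances[recipient] = self.balance_of(recipient) + amount
        return True
```

The real standard adds events, allowances (`approve`/`transferFrom`), and metadata, but the balances-plus-transfer core is exactly this simple.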

In 2018, there was a massive wave of such ICOs, many of which were just scams, but people were buying anything dumped on the market. The whole bubble culminated in the Useless Token (yes, it was really called that), which proudly advertised that you would just be giving your money to some random person on the internet, who would buy electronics or even a big-screen television with it. You can be sure that the guy got his new TV.

This seems similar to past stock market bubbles, where people bought any stock without looking at the company. Another token called Shiba Inu Coin recently popped up with the shiny promise of being the "Dogecoin killer." Wow!

But not all of them were bubbles, and some of these tokens were used to finance serious projects and are still alive. 

Stablecoins

An exciting and valuable variant of these Ethereum-based tokens are stablecoins. These are smart contracts defining a token in a way that closely mirrors the price of a reference currency like the US Dollar.

They are not subject to the impressive price fluctuations we know from cryptocurrencies and therefore reproduce a vital property of traditional currencies: their stable worth. One of the most important examples is the DAI token, a USD stablecoin with a market capitalization of about $380,000,000. Market capitalization means the total value of all circulating units of a token.

Decentralized finance and Liquidity Mining 

Of course, the possibility that currencies can, in principle, run on the Ethereum platform raises another need: currency exchanges. And, surprisingly enough, these too can be implemented as dApps. Thus, exchanging money can be done entirely on the blockchain without any third-party operator. A prominent example of such a decentralized exchange (dEx) is Uniswap.

Such a platform needs a lot of liquidity, of course, to keep running. If you provide your capital to the liquidity pool of such a dEx, you can generate a passive income, since you get a share of the transaction fees. That process is called liquidity mining or yield farming and is rapidly gaining popularity.

The next level are robo-adviser smart contracts, which delegate your funds to the pools with the highest yields and thus optimize your returns. This entirely new field is called "decentralized finance" (DeFi).

Stocks 

Similar to ERC-20 tokens, companies could also decide to release their stocks directly on the blockchain when going public in the future, since stocks are nothing else than tokens certifying someone's ownership of a company share. The blockchain is exactly made for that purpose.

Insurance Contracts 

Why not plan your retirement with a smart contract? For example, you could pay your ETH into a retirement contract which returns the money in monthly intervals after your 60th birthday. Of course, this is just an example; considering the high volatility of cryptocurrencies at the moment, it is probably not a good idea to rely on them for your retirement.

Gaming Apps like CryptoKitties 

It is also already possible to play simple games implemented as dApps. One of the weirdest of those was CryptoKitties, launched in 2017, where you could buy your own digital comic cat. As soon as you owned two of these cats, you were allowed to breed new ones.

CryptoKitties

The game went viral within days and congested the Ethereum blockchain. And honestly: who needs Dogecoin if you can just own and even breed these cute cats?

Intellectual Property Management and NFTs

The real innovation behind CryptoKitties was that it was probably the first time the copyright for something was sold over the blockchain. And in fact, the blockchain is really made for intellectual property management. The magic word here is NFT, which stands for non-fungible token.

NFTs are certificates on the blockchain which certify that a digital asset is unique. If you own an NFT, you can prove that you are the legal owner of this piece of data. This technology can be used to represent photos and images, pieces of music, videos, and books - in short, all kinds of data.

Internet Domains

Also, the ownership of internet domains can be organized like that. Like a piece of art, each domain is unique; therefore, if you buy it, you acquire all rights to use it. Thus, the blockchain is a perfect tool for domain registration and transfers, too.

Administration and Logistics

The list of possible use cases in administration is long:

  • Health Certificates (In fact, this has been discussed for Corona vaccination certificates)
  • The registration of cars
  • Marriage certificates or even passports.

In logistics, the production chain of a product could be tracked transparently from the initial resources until it is sold to the end customer. Thus, supply chain tracking seems to be another field where the blockchain could play a role.

Is Ethereum inflationary?

The current implementation of Ethereum, version 1.0, has had an average inflation rate of about 5% per year over the last years. However, this is going to change with the upcoming updates towards Ethereum 2.0. Finally, the switch to PoS will decrease inflation to less than 1% per year. Considering the problem of lost keys, the supply of Ether is then expected to stay more or less constant.

A considerable amount of Ether is already locked away in DeFi contracts. Considering that, in addition, up to 30% of all Ether could be staked, there could even be liquidity shortages in the future. Therefore, the store-of-value properties of Ether will be similar to those of Bitcoin, maybe even better.

In fact, if the Ethereum economy keeps growing, one could even attribute an intrinsic value to Ether, namely the value of all applications running on top of the Ethereum blockchain.

A Comparison of Bitcoin and Ethereum in a Nutshell

Bitcoin was born to become decentralized money. However, the equation Bitcoin = Store of Value better describes its current usage, since Bitcoin cannot completely fulfill its job as an efficient cash system due to network limitations.

In contrast, Ethereum was started to become a decentralized financial and economic ecosystem potentially (!) providing a platform for decentralized:

  • Money and stablecoins mirroring a reference currency like the USD
  • Store of Value: Ethereum, with PoS implementation, will be a perfect store of value and even generate a passive income.
  • Administration: Identity Management, Vaccination Certificates 
  • Financial Industry: Decentralized exchanges and yield farming
  • Stock exchanges
  • Supply chain Tracking
  • Copyright: NFTs and domains

For sure, there are also things to come that nobody has thought of so far.

Open Issues

Although all of this sounds very promising, we are still at the very beginning. There are many open issues that would require another article to discuss in detail. Here are some examples:

One problem is certainly the fact that account addresses are not private. If you keep your BTC or ETH at an address and pay for something from there, everybody can see your balance.

Of course, this can be handled by storing your funds at crypto banks like Coinbase, which hold large deposits. But, on the other hand, this is against the idea of cryptocurrency, which was to give you complete control over your funds, and is a rather centralist approach.

Scalability

The main issue of both Bitcoin and Ethereum is scalability. As already mentioned, the current implementations of Bitcoin and Ethereum are pretty inefficient. Bitcoin can process roughly 250,000 transactions a day, while Ethereum is in the order of 1,000,000 (1 million) transactions a day. The number of money transfers worldwide is estimated at about 1 trillion (1,000 billion) per day.
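
Putting these three estimates side by side makes the gap concrete:

```python
bitcoin_tx_per_day = 250_000
ethereum_tx_per_day = 1_000_000
global_tx_per_day = 1_000_000_000_000   # the ~1 trillion estimate above

# how many times larger the global volume is than both chains combined
shortfall = global_tx_per_day // (bitcoin_tx_per_day + ethereum_tx_per_day)
print(shortfall)                        # -> 800000
```

Even taken together, both chains fall short of the estimated global transaction volume by a factor of about 800,000.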

Therefore, the current versions of Bitcoin and Ethereum are rather an old Nokia and an early smartphone compared to the powerful supercomputer of the global financial system. However, there are many different solutions to this problem, like second-layer solutions that allow off-chain transactions.

Furthermore, Ethereum is working on a concept with the geeky name sharding. In brief, it means storing only fragments of the blockchain on particular nodes, which takes a lot of overhead off the network and makes it more scalable in the end.

Beware, the Blockchain never forgets

Last but not least, the blockchain never forgets. All information put on the blockchain will stay there forever, or at least as long as the network persists. It is completely unclear whether we want to keep the news that someone bought a lollipop on record for the next 10,000 years. Although the blockchain would become a Mecca for future archaeologists, I doubt anybody really wants that. So, in the long run, the system needs a solution for forgetting data.

Bubbles and Party Time

If the entire story has an irony, it is the following: the blockchain was invented as a response to the speculative orgies of Wall Street that culminated in the financial crisis of 2008/2009. During the past years, at least four cryptocurrency bubbles have emerged, and the speculation performed with them is in no way inferior to the speculation during the financial crisis. And probably at this very moment, we are already in bubble number five.

The figure shows the typical pattern of a cryptocurrency bubble in 2017 and 2018, where the prices of all of them showed a parabolic increase followed by a dramatic crash.

The 2017/2018 cryptocurrency bubble

All these bubbles in the past followed a similar pattern and had a similar duration, which indicates that the price corresponds more to a psychological phenomenon than to an accurate valuation of these assets.

The blockchain is still a young technology in its teenage days, somewhere between party and hangover, but one thing is sure: it is here to stay, and this story is to be continued.