A Mathematical Proof for ‘Proof of Work’

Author: CryptoMedication

Providing a ‘Proof’ for Proof of Work

Source: Proof of Stake versus Proof of Work Whitepaper. (2015). Bitfury.com. Retrieved from http://bitfury.com/content/5-white-papers-research/pos-vs-pow-1.0.2.pdf

A dissection of the benefits of Proof-of-Work was laid out in full in a scholarly work by the Bitfury Group titled 'Proof of Stake versus Proof of Work', published in September 2015.[44] Within the paper, the authors note that, "A user who discovered a block should be encouraged to broadcast it over the network immediately and not hold it for himself."

Above, a potential attack scenario is presented by the authors of the study.

In the above example, two users have discovered a block at the same time (which often happens on cryptocurrency protocols):

Then, an attacker attempts to reverse completed transactions by forking the blockchain:

Here’s where the issue comes in, however:

There is a way to solve this dilemma, which, if it did occur, could potentially paralyze/compromise a blockchain.

As noted in the research paper, "With the situation depicted [above], only three blocks can be used as a base for extending the blockchain according to Conditions 1–2: B_{i+2}, B″_{i+1}, and B′_{i+2}."

Those blocks are identified below:

The peril in this situation is immediately obvious, because block B_{i+2} is the only one of the three that extends the legitimate chain in this scenario (i.e., the chain that is not under attack).

The above graphic is an example of the desired situation in any blockchain if the security and legitimacy of the chain's immutable ledger are to be preserved. However, the current rules that nodes follow are not, by themselves, adequate to ensure that the correct blockchain is chosen in this situation.

More than likely, block B″_{i+1} is the only one that would be eliminated, because it sits at a lower block height (i+1) than the other two options. However, at block height i+2 there are still two competing candidates that could be selected in this scenario.

Therefore, as the study notes, “Consensus rules should be constructed in a way that results in resolving blockchain forks, i.e. one of the competing branches should take over all other branches in a reasonable amount of time.”
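To make this fork-resolution requirement concrete, here is a minimal, hypothetical sketch of a heaviest-chain fork-choice rule (it is not taken from the Bitfury paper, and the block structure and difficulty values are illustrative assumptions): the competing branch carrying the most accumulated work is the one nodes extend.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Block:
    height: int        # block height (i, i+1, i+2, ...)
    difficulty: float  # amount of work this block contributes

def total_work(branch: List[Block]) -> float:
    """Cumulative work of a branch: the sum of each block's difficulty."""
    return sum(block.difficulty for block in branch)

def choose_branch(branches: List[List[Block]]) -> List[Block]:
    """Heaviest-chain rule: extend the competing branch with the most total work."""
    return max(branches, key=total_work)

# Illustrative fork: an honest branch and an attacker's branch both reach
# height i+2, but the honest branch carries more accumulated work.
honest_branch = [Block(1, 1.0), Block(2, 1.0), Block(3, 1.2)]
attack_branch = [Block(1, 1.0), Block(2, 0.9), Block(3, 0.9)]
print(choose_branch([honest_branch, attack_branch]) is honest_branch)  # True
```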

The beauty of Proof of Work, as captured in the excerpts from the study above, comes down to the simple formula below:

where D, a subset of [1, M], denotes the target difficulty

Given the fact that D is a subset of [1, M], the following excerpt from 'Differential Geometry of Curves and Surfaces' by Manfredo do Carmo applies here:

Source: Differential Geometry. (2013). Math.stackexchange.com. Retrieved from https://math.stackexchange.com/questions/286252/is-this-set-a-regular-surface

Although the above figure is a geometry reference, it holds relevance to the process of discovering a new block in a Proof-of-Work protocol modeled after Bitcoin's design (i.e., Bitcore), a process that is best modeled by a geometric distribution.

Below is a definition/explanation of 'geometric distribution':

Source: Pitman, Jim. Probability (1993 edition). Springer. p. 372.
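For illustration (this sketch is not drawn from the Pitman text, and the per-attempt success probability p is a purely hypothetical value), the geometric probability mass function below gives the chance that the first valid block is found on exactly the k-th hash attempt, and its mean 1/p gives the expected number of attempts.

```python
def geometric_pmf(k: int, p: float) -> float:
    """P(first success occurs on attempt k) = (1 - p)^(k - 1) * p, for k = 1, 2, ..."""
    return (1.0 - p) ** (k - 1) * p

# Hypothetical per-attempt success probability (real mining values are far smaller).
p = 1e-6
for k in (1, 10, 100):
    print(f"P(first valid hash on attempt {k}) = {geometric_pmf(k, p):.3e}")

# The expected number of attempts until the first success is 1 / p.
print("expected attempts:", 1 / p)
```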

More importantly, it must be noted that within the definition of the geometric distribution is the concept of a 'discrete probability distribution', which is defined as "a probability distribution characterized by a probability mass function."[47]

Thus, the function:

where

makes complete sense.[48]

What ties everything together in the Proof-of-Work model is the fact that the discrete probability distribution underlying the geometric distribution model (as described above) is closely tied to the Poisson distribution.

The Poisson distribution is a "…tool that helps to predict the probability of an event occurring when you know how often the event has occurred." So, the Poisson distribution "…gives us the probability of a given number of events happening in a fixed interval of time."[49]

Source: Ugarte, MD; Militino, AF; Arnholt, AT (2016), Probability and Statistics with R (Second ed.), CRC Press, ISBN 978–1–4665–0439–4

The relevance of the Poisson distribution is that it relates to the estimated number of times that a block will be produced on the protocol. Therefore, the 'expected event' within this probability distribution would be the production of a block every 2.5 minutes on the Bitcore protocol.
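To make that concrete, here is a brief sketch that evaluates the Poisson probability mass function for the number of blocks produced in a one-hour window, using the 2.5-minute target mentioned above; the one-hour window and the block counts checked are illustrative assumptions.

```python
import math

def poisson_pmf(k: int, mu: float) -> float:
    """P(x = k; mu) = e^(-mu) * mu^k / k!"""
    return math.exp(-mu) * mu ** k / math.factorial(k)

# With a 2.5-minute block target, the expected number of blocks in 60 minutes
# is mu = 60 / 2.5 = 24.
mu = 60 / 2.5
for k in (20, 24, 28):
    print(f"P(exactly {k} blocks in one hour) = {poisson_pmf(k, mu):.4f}")
```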

Typically, in Bitcoin, the probability that any single hashing attempt discovers a nonce of acceptable value is:

Target / (total number of possible hash outputs)

i.e., the target value divided by the size of the cryptographic hash function's output space (roughly 2^256 for SHA-256).

Each such hashing attempt can be modeled as a Bernoulli trial.

Bernoulli trials are defined as "An experiment in which a single action, such as flipping a coin, is repeated identically over and over. The possible results of the action are classified as a 'success' or a 'failure'."
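As a rough numerical illustration of hashing as a sequence of independent Bernoulli trials (a sketch only; the success probability p below is set artificially high so the simulation finishes quickly), the code repeats attempts until one succeeds and confirms that the average number of attempts sits near 1/p, as the geometric distribution predicts.

```python
import random

def attempts_until_success(p: float) -> int:
    """Repeat independent Bernoulli trials until the first 'success' (a valid hash)."""
    attempts = 0
    while True:
        attempts += 1
        if random.random() < p:   # each attempt succeeds with probability p
            return attempts

p = 1e-3  # hypothetical success probability, far higher than any real mining target
trials = [attempts_until_success(p) for _ in range(2000)]
print("mean attempts:", sum(trials) / len(trials))  # should land near 1 / p = 1,000
```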

Therefore, in the Proof-of-Work algorithm, the Poisson distribution formula is used to derive the necessary target difficulty in order to keep the average time per block roughly constant on the chain. It is represented by the formula:

P(x; μ) = (e^(−μ) · μ^x) / x!

Specifically, in the case of Bitcoin, a rate-over-time form of the formula can be adopted instead in order to ensure that the target algorithm remains accurate:

Source: Poisson Distribution / Poisson Curve: Simple Definition. (2018). Statistics How To. Retrieved 29 April 2018, from http://www.statisticshowto.com/poisson-distribution/

This value is then used to increase or decrease the odds, for every miner on the network, of discovering a valid nonce value. Each attempt to 'brute force'[51] a valid nonce has the same probability of success or failure.

Source: Simmons, B. (2017). Bernoulli Trials. Mathwords. Retrieved 29 April 2018, from http://www.mathwords.com/b/bernoulli_trials.htm

Thus, each attempt is referred to as a 'Bernoulli trial', and since the attempts are independent of each other, the target difficulty determined by the Poisson distribution is upheld.
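A simplified sketch of how such retargeting can work, written in the spirit of Bitcoin-style difficulty adjustment (the window length, clamp factor, and target value below are illustrative assumptions rather than exact Bitcore parameters): the target is scaled by the ratio of the time a window of blocks actually took to the time it was supposed to take, pulling the average block interval back toward the 2.5-minute goal.

```python
def retarget(old_target: int, actual_timespan: float, expected_timespan: float,
             max_adjust: float = 4.0) -> int:
    """Scale the target by actual/expected time, clamped to limit extreme swings."""
    ratio = actual_timespan / expected_timespan
    ratio = max(1.0 / max_adjust, min(max_adjust, ratio))
    return int(old_target * ratio)

# Illustrative numbers: a 2016-block window intended to take 2016 * 2.5 minutes.
expected = 2016 * 2.5 * 60      # seconds the window should have taken
actual = expected * 0.8         # blocks arrived 20% too fast
old_target = 2 ** 220           # hypothetical target value
new_target = retarget(old_target, actual, expected)
print(new_target < old_target)  # True: a smaller target means a higher difficulty
```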

This brings us back to the original equation we evaluated from the Bitfury Group's paper 'Proof of Stake versus Proof of Work', reproduced below for convenience:

Where

Source: Proof of Stake versus Proof of Work Whitepaper. (2015). Bitfury.com. Retrieved from http://bitfury.com/content/5-white-papers-research/pos-vs-pow-1.0.2.pdf

The variable B in the above equation refers to the candidate block that extends the longest blockchain available on the protocol (its header commits to the entire chain back to the genesis block); per the paper, "the expected number of operations is exactly D," while M represents the total number of possible hash outputs, i.e., the range of the hash function.

Thus, you can see how more miners on the network make the protocol inherently more secure, as the added hashing power increases the value of D, which sits in the denominator of the rate in the equation above. Therefore, since M remains an astronomically large space of possibilities while D increases as the network grows, the probability distribution models described above remain valid.

Therefore, the assertion within the paper holds: "The time period T(r) for a miner with hardware capable of performing r operations per second to find a valid block is distributed exponentially with the rate r/D."

So, the formula reads as follows:

As the paper notes, "There is no known way to find B satisfying (1) other than iterating through all possible variables in the block header repeatedly."
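The sketch below illustrates that brute-force picture numerically, under assumed values for the hash rate r and the difficulty D (neither is a protocol constant here): block-finding times drawn from an exponential distribution with rate r/D average out to roughly D/r seconds, exactly as the paper's assertion implies.

```python
import random

r = 1e12        # assumed miner hash rate, in hash operations per second
D = 1.5e14      # assumed difficulty, in expected hash operations per block
rate = r / D    # blocks found per second by this miner

# T(r) ~ Exponential(rate); its mean should be D / r seconds.
samples = [random.expovariate(rate) for _ in range(100_000)]
print("empirical mean time to a block:", sum(samples) / len(samples))
print("theoretical mean D / r        :", D / r)
```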

Thus, if the time to find a block (governed by the Poisson distribution) remains consistent, and is exponentially distributed for every miner on the network, then the final formula (shown below) can be used as a proof of the 'fairness' of mining via Proof-of-Work, provided one considers the amount of hashing power that an individual/entity wields on the network to be a viable form of 'currency' for solving a specific block.

However, the model above is an exponential distribution. So, we must show that the geometric distributions outlined above (built from the Poisson and Bernoulli models, with each miner and each attempt representing its own distribution and sub-distribution) converge to an exponential distribution model.

The below excerpt is meant to serve as this proof:

And

Source: Shankar, R. (2011). How to prove that geometric distributions converge to an exponential distribution?. Math.stackexchange.com. Retrieved from https://math.stackexchange.com/questions/93098/how-to-prove-that-geometric-distributions-converge-to-an-exponential-distributio/93102

The latter formula above refers to a process known as coupling, which is defined as a "powerful method in probability theory through which random variables can be compared with each other. Coupling has been applied in a broad variety of contexts, e.g. to prove limit theorems, to derive inequalities, or to obtain approximations."[56]

And the example pasted below provides a concrete proof of how and why a sequence of geometric distributions converges to a single exponential distribution, verifying the same result described in the Bitfury research paper:

Source: Dubhashi, Devdatt; Panconesi, Alessandro (June 15, 2009). Concentration of Measure for the Analysis of Randomized Algorithms (1st ed.). Cambridge University Press. p. 91. ISBN 978–0–521–88427–3.
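As a complementary numerical check of the convergence cited above (a sketch, not the formal coupling argument), the code below compares the survival probability of a geometric distribution, where attempts occur every 1/n time units and each succeeds with probability λ/n, against the exponential tail e^(−λt) as n grows; the values of λ, t, and n are illustrative.

```python
import math

def geometric_tail(t: float, lam: float, n: int) -> float:
    """P(no success by time t) when attempts happen every 1/n units, each with probability lam/n."""
    k = int(t * n)                    # attempts completed by time t
    return (1.0 - lam / n) ** k

lam, t = 2.0, 1.5
for n in (10, 100, 10_000):
    print(f"n = {n:>6}: geometric tail = {geometric_tail(t, lam, n):.4f}")
print(f"exponential tail e^(-lam*t)  = {math.exp(-lam * t):.4f}")  # the limit as n grows
```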

Taken together, the above proofs show that the Proof-of-Work model facilitates extension of the chain backed by the most hashing power in most cases. This guarantee would be nullified by an ineffective targeting algorithm; however, that topic will be discussed later in the whitepaper.

In terms of the original dilemma presented at the beginning of this section, this is how it would be solved:

Since block B_{i+2} is backed by a higher hash rate than block B′_{i+2}, the Poisson distribution model dictates that the legitimate chain will, with high likelihood, eventually outpace the 'attack' chain, rendering it 'orphaned'.

This is due to the fact that the ‘honest nodes’ on the network will know that the attacker is trying to ‘double-spend’ or include invalid transactions into their chain.

Miners, who invariably also run nodes on the network, will be aware of this as well and will avoid mining on that chain out of fear that it will be rendered invalid and orphaned in the future, making the Proof of Work performed on it 'wasted' effort.

This potential scenario is covered in Satoshi Nakamoto’s whitepaper when he states, “We consider the scenario of an attacker trying to generate an alternate chain faster than the honest chain. Even if this is accomplished, it does not throw the system open to arbitrary changes, such as creating value out of thin air or taking money that never belonged to the attacker. Nodes are not going to accept an invalid transaction as payment, and honest nodes will never accept a block containing them. An attacker can only try to change one of his own transactions to take back money he recently spent.”

Satoshi also noted that, “The race between the honest chain and an attacker chain can be characterized as a Binomial Random Walk.”

This is a crucial item to note, because the hashing process for the Bitcoin protocol (and likewise for Bitcore) is memoryless. So, the Binomial Random Walk that Satoshi Nakamoto mentioned means that an attacker would have a 100% chance of eventually taking over the chain if the attacker possessed a greater chance of creating the next block than the honest nodes. This is displayed in the formula below, which has been extracted from the Bitcoin whitepaper:

Thus, according to the principle of the Binomial Random Walk: if p (the probability that the honest nodes create the next block) is greater than q (the probability that the attacking node creates the next block), then the attacker will most likely never be able to catch up to the longer chain, and the chance of doing so decreases exponentially with each block created by the honest nodes.
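A short numerical illustration of that exponential decay, using the catch-up probability (q/p)^z for a biased random walk (the attacker share q = 0.3 below is an illustrative assumption):

```python
# Probability that an attacker who is z blocks behind ever catches up, when the
# honest network finds each next block with probability p and the attacker with
# probability q. With p > q, this probability shrinks exponentially in z.
p, q = 0.7, 0.3
for z in range(0, 11, 2):
    print(f"z = {z:>2}: catch-up probability = {(q / p) ** z:.6f}")
```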

This also doesn’t consider the massive resources that the attacker will have to expend in order to catch up to the longest chain in the network, which is continually being crafted by the ‘honest’ nodes on the network.

There are a few additional mathematical proofs that were presented within the whitepaper to validate the assertion that the attacker’s chances of successfully overtaking the chain being crafted by ‘honest’ nodes would decrease exponentially per block:
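For reference, here is a Python rendering of the attacker-success-probability calculation that the Bitcoin whitepaper walks through (the whitepaper presents it in C); the attacker share q = 0.1 and the depths z used below are illustrative.

```python
import math

def attacker_success_probability(q: float, z: int) -> float:
    """Probability that an attacker with hash-power share q ever overtakes a chain z blocks ahead."""
    p = 1.0 - q
    lam = z * (q / p)  # expected attacker progress while the honest chain adds z blocks
    total = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        total -= poisson * (1.0 - (q / p) ** (z - k))
    return total

for z in (0, 2, 4, 6, 8, 10):
    print(f"q = 0.1, z = {z:>2}: P = {attacker_success_probability(0.1, z):.7f}")
```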

Works Cited

44.) Proof of Stake versus Proof of Work Whitepaper. (2015). Bitfury.com. Retrieved from http://bitfury.com/content/5-white-papers-research/pos-vs-pow-1.0.2.pdf

45.) Differential Geometry. (2013). Math.stackexchange.com. Retrieved from https://math.stackexchange.com/questions/286252/is-this-set-a-regular-surface

46.) Pitman, Jim. Probability (1993 edition). Springer. p. 372.

47.) Hazewinkel, Michiel, ed. (2001) [1994], “Probability distribution”, Encyclopedia of Mathematics, Springer Science+Business Media B.V. / Kluwer Academic Publishers, ISBN 978–1–55608–010–4

48.) Proof of Stake versus Proof of Work Whitepaper. (2015). Bitfury.com. Retrieved from http://bitfury.com/content/5-white-papers-research/pos-vs-pow-1.0.2.pdf

49.) Poisson Distribution / Poisson Curve: Simple Definition. (2018). Statistics How To. Retrieved 29 April 2018, from http://www.statisticshowto.com/poisson-distribution/

50.) Ugarte, MD; Militino, AF; Arnholt, AT (2016), Probability and Statistics with R (Second ed.), CRC Press, ISBN 978–1–4665–0439–4

51.) Back, A. (2002). Hashcash — A Denial of Service Counter-Measure. Retrieved 29 April 2018, from http://encryptopedia.org/hashcash/

52.) Poisson Distribution / Poisson Curve: Simple Definition. (2018). Statistics How To. Retrieved 29 April 2018, from http://www.statisticshowto.com/poisson-distribution/

53.) Simmons, B. (2017). Bernoulli Trials. Mathwords. Retrieved 29 April 2018, from http://www.mathwords.com/b/bernoulli_trials.htm

54.) Proof of Stake versus Proof of Work Whitepaper. (2015). Bitfury.com. Retrieved from http://bitfury.com/content/5-white-papers-research/pos-vs-pow-1.0.2.pdf

55.) Shankar, R. (2011). How to prove that geometric distributions converge to an exponential distribution?. Math.stackexchange.com. Retrieved 29 April 2018, from https://math.stackexchange.com/questions/93098/how-to-prove-that-geometric-distributions-converge-to-an-exponential-distributio/93102

56.) Probability Theory: The Coupling Method. (2012). RA Leiden, Netherlands. https://pdfs.semanticscholar.org/1205/e0ac5ea6b79e3d005e069739eea00ed551d4.pdf

57.) Dubhashi, Devdatt; Panconesi, Alessandro (June 15, 2009). Concentration of Measure for the Analysis of Randomized Algorithms (1st ed.). Cambridge University Press. p. 91. ISBN 978–0–521–88427–3.
