It is time to usher in a new phase of Bitcoin development - based not on crypto & hashing & networking (that stuff's already done), but on clever refactorings of data structures in pursuit of massive, perhaps unlimited, new forms of scaling

Debates among devs are normal and important.
Debates between programmers are the epitome of decentralized development, and as such they are arguably the most important mechanism ensuring the ongoing success of Bitcoin (or of any cryptocurrency project).
Therefore, we would be wise to encourage such debates, rather than trying to make them go away by calling them "personal attacks".
In the real world, there aren't a whole lot of different ways to hammer a nail into a board or pour cement into a hole - but in the abstract world of mathematics and programming, there are many, many different ways to represent and manipulate a data structure, limited only by our imaginations, so it is actually appropriate to expect and even demand lots of jostling and critiquing from our programmers as they "try to invent a better mousetrap."
In fact, this is the kind of informal jockeying and shop talk that always has gone on and always will go on among mathematicians and programmers - and quite rightly so, because it is precisely the mechanism whereby they maintain order among their ranks, by making subtle and cogent observations about who knows what.
A famous example of this typical sort of jockeying and shop talk can be seen elsewhere in the ongoing debates between programmers of the "procedural" / "object-oriented" school (C/C++, Java) versus the "functional" school (Haskell, ML). It's always quite an eye-opener for a procedural programmer who's been writing "loops" all their life when they finally discover higher-order iteration constructs such as "map" and "fold" in functional programming. Both approaches "accomplish" the same thing, of course - but in radically and subtly different ways, since in a functional language the iteration pattern itself is built out of functions that are "first-class citizens", which can be passed around as arguments parameterizing other functions, etc. - allowing much more compact and expressive (and sometimes even more efficient) code.
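To make the contrast concrete, here is a minimal sketch - in Python, purely for familiarity, since the debate above concerns languages like C++ and Haskell - of the same summation written both ways:

```python
from functools import reduce

values = [1, 2, 3, 4, 5]

# Procedural style: an explicit loop mutating an accumulator.
total = 0
for v in values:
    total += v

# Functional style: the iteration pattern (a fold) takes the combining
# function as a first-class argument.
total_functional = reduce(lambda acc, v: acc + v, values, 0)

assert total == total_functional == 15
```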
Different Bitcoin dev skill sets are required for different stages of Bitcoin's life cycle
An example of the debate between various devs can be seen here:
It is "clear that Greg Maxwell actually has a fairly superficial understanding of large swaths of computer science, information theory, physics and mathematics."- Dr. Peter Rizun (managing editor of the journal Ledger)
https://np.reddit.com/btc/comments/3xok2o/it_is_clear_that_greg_maxwell_unullc_actually_has/
What Peter R is saying here is simply that a different skill set is needed to usefully contribute to Bitcoin development now that it has moved well beyond its "proof-of-concept and initial rollout" stages (hey, this thing actually works) and is now trying to move into its "massive scaling" stages (let's try to roll this thing out to millions or billions of people).
Bitcoin's "proof-of-concept and initial rollout" stages
Initially, during the "proof-of-concept and initial rollout" stages, the skill set required to be a "Bitcoin dev" merely involved knowing enough cryptography, hashing, networking, "game theory", rudimentary economics, and C/C++ programming to understand Satoshi's original vision and implementation. The work consisted of simple and obvious refactorings, cleanups and optimizations that respected the overall design decisions captured in the original C/C++ code, while maintaining the brilliant "game theory" incentives baked therein. The most notable of these is the thing some mathematicians have taken to calling "Nakamoto Consensus" (a useful emerging mathematical-historical term, along the lines of "Nash Equilibrium"): Satoshi's brilliant cobbling-together of several existing concepts from crypto, hashing, game theory and rudimentary economics into a good-enough solution to the long-standing Byzantine Generals Problem - a problem which, in an open network where anyone can join and forge identities, mathematicians and programmers had for decades considered practically unsolvable.
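To make the mechanism at the heart of Nakamoto Consensus concrete, here is a toy proof-of-work loop - a sketch only, with a simplified header and difficulty encoding rather than Bitcoin's actual 80-byte block header and compact target format:

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Find a nonce so that the double-SHA-256 of (header || nonce) falls
    below a target with `difficulty_bits` leading zero bits. Real Bitcoin
    compares against a 256-bit target encoded compactly in the header."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        data = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

print("found nonce:", mine(b"toy block header", difficulty_bits=16))
```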
In particular, during the "proof-of-concept and initial rollout" stages, the crypto and hashing stuff is all pretty much done: the elliptic-curve cryptography has been decided upon (and, by the way, Satoshi very carefully picked secp256k1 - one of the few standardized curves whose parameters are "rigid" enough that it is widely believed to be free of hidden NSA influence), and the various hashing algorithms (SHA-256, RIPEMD-160) are actually quite old, coming from previous work. The recipe for combining them all together has been battle-tested, and it should work fine for the next few decades or so (assuming that practical quantum computing probably isn't going to come along on that time scale).
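For example, the digest behind legacy Bitcoin addresses ("HASH160") simply composes those two old, well-studied primitives. A minimal sketch using Python's hashlib (note that RIPEMD-160 comes via OpenSSL, so its availability can vary by platform; the sample key is just the secp256k1 generator point in compressed form):

```python
import hashlib

# HASH160: RIPEMD-160 over SHA-256, as used for legacy Bitcoin addresses.
pubkey = bytes.fromhex(
    "0279be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798"
)
sha256 = hashlib.sha256(pubkey).digest()
hash160 = hashlib.new("ripemd160", sha256).digest()  # requires OpenSSL support
print(hash160.hex())
```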
Similarly, during the "proof-of-concept and initial rollout" stages, the networking and incentives and game theory are all pretty much done: the way the mempool gets relayed, the way miners race to solve blocks while trying to minimize orphaning, and the incentives - provided currently mainly by the coinbase subsidy, and to be provided much later (after more halvings and/or more increases in volume and price) mainly by transaction fees. This stuff has also been decided upon, and is working well enough within the parameters of our existing imperfect regulatory and economic landscape and networking topology - where things such as ASIC chips, cheap electricity and cooling in China, and the Great Firewall of China have come to the fore as major factors driving decisions about who mines where.
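Incidentally, the coinbase-subsidy schedule just mentioned is simple enough to sketch in a few lines. The following mirrors the halving logic of Bitcoin Core's GetBlockSubsidy() (the constants are the real consensus values; the loop at the end is just an illustrative printout):

```python
COIN = 100_000_000            # satoshis per BTC
HALVING_INTERVAL = 210_000    # blocks between subsidy halvings

def block_subsidy(height: int) -> int:
    """Coinbase subsidy in satoshis at a given block height."""
    halvings = height // HALVING_INTERVAL
    if halvings >= 64:        # shifting by >= 64 would overflow in the C++ original
        return 0
    return (50 * COIN) >> halvings

for height in (0, 210_000, 420_000, 630_000):
    print(f"height {height}: {block_subsidy(height) / COIN} BTC")
```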
Bitcoin's "massive scaling" stages
Now, as we attempt to enter the "massive scaling" stage, a different skill set is required. As I've outlined above, the crypto and the hashing and the incentives are all pretty much done now - mining has become concentrated where it's most profitable, and we have actually started to hit the "capacity ceiling" a few times (up till now just some spam attacks and stress tests - but soon, more worryingly, possibly even within the next few months, we may really hit the capacity ceiling with "real" transactions).
Early scaling debates centered around blocksize
And so, for the past year, we've gone through the never-ending debates on scaling - most of them focusing up till now (perhaps rather naïvely, some have argued) on the notion of "maximum blocksize", which was set at 1 MB by Satoshi as a temporary anti-spam kludge.
The smallblock proponents have been claiming that pretty much all "scaling solutions" based on simply increasing the maximum blocksize could have bad effects such as decreasing the number of nodes (decreasing this important type of decentralization) or increasing the number of orphans (decreasing profits for certain miners) - so they have been quite adamant in resisting any such proposals.
Meanwhile the bigblock proponents have been claiming that increased adoption (higher price and volume) should be more than enough to eventually offset / counteract any supposed decrease in node count and miner profits that might happen immediately after bigblocks would be rolled out.
For the most part, both sides appear to be arguing in good faith (with the possible exception of private companies hoping to be able to peddle future, for-profit "solutions" to the "problem" of artificially scarce level-one on-chain block space - eg, Blockstream's Lightning Network) - so the battles have raged on, the community has become divided, and investors are becoming hesitant.
New approaches transcending the blocksize debates
In this mathematical-historical context, it is important to understand the fundamental difference in approach taken by Peter__R. He is neither arguing for smallblocks nor for bigblocks nor for a level-2 solution. He is instead (with his recently released groundbreaking paper on Subchains - not to be confused with sidechains or treechains =) sidestepping and transcending those approaches to focus on an entirely different, heretofore largely unexplored approach to the problem - the novel concept of "nested subchains":
By nesting subchains, weak block confirmation times approaching the theoretical limits imposed by speed-of-light constraints would become possible with future technology improvements.
Now, this is a new paper, and it will still undergo a lot of peer review before we can be sure that it can deliver on what it promises. But at first glance, it is very promising - not least of all because it is attacking the whole problem of "scaling" from a new and possibly highly productive angle: not involving bigblocks or smallblocks or bolt-ons (LN) but instead examining the novel possibility of decomposing the monolithic "blocks" being appended to the "chain" into some sort of "substructures" ("subchains"), in the hopes that this may permit some sort of efficiencies and economies at the network relay level.
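To give a rough feel for what "decomposing a block into substructures" might even mean - and this is only my own toy sketch, not the construction from Peter R's paper - imagine a block being assembled incrementally as a chain of cheap "weak" sub-blocks, each committing to its predecessor, so that by the time the final "strong" block is found, most of its contents have already been relayed and validated:

```python
import hashlib
from dataclasses import dataclass, field
from typing import List

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

@dataclass
class ToySubchain:
    """A block assembled incrementally as a chain of cheap 'weak' sub-blocks,
    each committing to its predecessor. NOT the paper's construction - just a
    minimal illustration of decomposing a block into substructures."""
    commitments: List[str] = field(default_factory=list)

    def add_weak_block(self, txids: List[str]) -> str:
        prev = self.commitments[-1] if self.commitments else "genesis"
        commitment = sha256_hex((prev + "".join(txids)).encode())
        self.commitments.append(commitment)
        return commitment

    def strong_block(self) -> str:
        # The final 'strong' block need only commit to the tip of the
        # subchain: everything beneath it has already been relayed.
        return sha256_hex(self.commitments[-1].encode())

chain = ToySubchain()
chain.add_weak_block(["tx1", "tx2"])
chain.add_weak_block(["tx3"])
print("strong block commitment:", chain.strong_block())
```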
"Substructural refactoring"-based approaches
So what we are seeing here is essentially a different mathematical technique being applied, for the first time, to a different part of the problem in an attempt to provide a "massive scaling" solution for Bitcoin. (I'm not sure what to call this technique - but the name "substructural refactoring" is the first thing that comes to mind.)
While there had indeed been some sporadic discussions among existing devs along the lines of "weak blocks" and "subchains", this paper from Peter R is apparently the first time that anyone has made a comprehensive attempt to tie all the ideas together in a serious presentation including, in particular, detailed analysis of how subchains would dovetail with infrastructure (bandwidth and processing) constraints and miner incentives in order for this to actually work in practice.
Graphs reminiscent of elasticity and equilibrium graphs from economics
For example, if you skim through the PDF you'll see the kinds of graphs you often see in economics papers involving concepts such as elasticity and equilibrium and optimization (eg, a graph where there's a "gap" between two curves which we're hoping will decrease in size, or another graph where there's a descending curve and an ascending curve which intersect at some presumably optimum point).
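For readers who want a concrete (if entirely invented) version of that second kind of graph, here is a tiny numeric sketch: one descending curve and one ascending curve, with their crossing point - the presumed optimum - located by bisection. The functions and numbers are made up purely for illustration:

```python
# Two invented curves: one descending (say, some per-transaction cost that
# falls with scale) and one ascending (say, orphaning risk that grows with
# block size). Bisection locates the crossing - the presumed optimum.
def descending(x: float) -> float:
    return 100.0 / (1.0 + x)

def ascending(x: float) -> float:
    return 2.0 * x

lo, hi = 0.0, 100.0   # descending > ascending at lo; reversed at hi
for _ in range(60):
    mid = (lo + hi) / 2.0
    if descending(mid) > ascending(mid):
        lo = mid
    else:
        hi = mid
print(f"curves cross near x = {lo:.4f}")
```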
Now, you can see from the vagueness of some of my arguments and illustrations above that I am by no means an expert in the mathematics and economics involved here - I am instead merely a curious bystander with only a hobbyist's understanding of these complex subjects (although a rather mature one at that, having worked most of my long and chequered career in math and programming and finance).
But I am fairly confident that what we are seeing here is the emergence of a new sort of "skill set" which will be needed from the kind of Bitcoin developers who can lead us to a successful future where millions or billions of people (and perhaps also machines) are able to transact routinely and directly on the blockchain.
And if a developer like Peter R wants to direct some criticism at another developer who has failed to have these insights, I think that is a natural manifestation of human ego and competitiveness - and a healthy one, keeping these guys on their toes.
A new era of Bitcoin development
The time for tweaking the crypto and hashing is long past - which means that the skills of guys like nullc and petertodd may no longer be as important as they were in the past. (In fact, entirely separate objections can be raised against Peter Todd, given his proclivity for proving that he can, at the mathematical level, break systems which actually do work "good enough" by relying on constraints imposed at the "social level" - a level which PTodd evidently does not much believe in. For the most egregious example of this, see his decision to force through his Opt-In (soon to become On-By-Default) Full RBF - which breaks the existing "good-enough" risk-mitigation practices many businesses had up till now relied on in order to profitably accept zero-conf payments for retail.)
Likewise the skills of adam3us may also not be as important as they were in the past: he is, after all, the guy who invented hashcash (the proof-of-work scheme cited in Satoshi's whitepaper), so he is clearly a brilliant cryptographer and pioneering cypherpunk who laid the groundwork for what Bitcoin has become today - but it is unclear whether he now has (or ever had) the vision to appreciate how big (and fast) Bitcoin can become (at "level 1" - ie, directly on the blockchain itself).
In this regard, it is important to point out the serious lack of vision and optimism on the part of nullc and petertodd and adam3us.
TL;DR: Times are a-changin'. The old dev skill sets for Bitcoin's early years (crypto, hashing, networking) are becoming less important, while new dev skill sets are becoming more important (such as something one might call "substructural refactoring"). We should encourage competition as new devs emerge who have these new skill sets, because they may be the way out of the "dead end" of the blocksize-based approaches to scaling, opening up massive and perhaps unlimited new forms of "fractal-like" scaling instead.
submitted by ydtm to /r/btc
