“Scaling” is one of those blockchain-world buzzwords that sounds both cool and urgent (Don’t believe me? Next time you want to ditch a family function, tell them you can’t go because you’re at a blockchain scaling conference all weekend. Trust me, it’ll work). But more importantly, it’s actually, legitimately one of the most important issues facing distributed ledger technology: how can we make this stuff work just as well when everyone’s trying to use it? The scaling debate has raged for years now, but it has finally drawn to a close. Why? Because it’s time to start actually trying the stuff we’ve been talking about.
Bitcoin’s debate has revealed few real answers
A long time ago on a blockchain far away, Bitcoin adopted a 1 MB block size limit as an anti-spam feature to prevent the nascent network from being rendered unusable by haters. Fast forward to the exciting, futuristic world of 2017, where Bitcoin’s block size limit is… still 1 MB. With its network hopelessly congested and fees flying higher than ever, the issue of scaling the network has been fought over to exhaustion, causing all manner of solutions to be discussed… and never tried. Because of Bitcoin’s governance impasse, no one knows whether any of these scaling solutions actually works. With political divisions deeper than ever, it looks like Bitcoin’s warring factions may finally force a resolution: one side will more or less win, and the other side will either more or less lose, or split the coin in two (or three? Why not get crazy), thereby sort of winning. Whichever outcome, a scaling solution will be attempted. Then all the arguing will be put to rest by cold, hard reality. Whatever that may be.
Ethereum is also in between solutions
Easily the next-largest cryptocurrency and touted as the go-to for people tired of Bitcoin’s continuing failure to solve its problems, Ethereum has nonetheless run into its own scaling issues. High transaction volume, especially in periods of heavy stress and usage such as ICO mania, has at times caused the network to grind to a halt, leaving further transactions unable to be sent and spiking fees. What’s the solution? Unclear at this point, but a switch to a proof-of-stake model called Casper is expected to help. How much? Not sure. When will Casper be out? Also unclear. One thing’s for sure: Ethereum will remain a clogged network until its planned solutions are actually released, at which point we’ll see if they actually work.
Litecoin to test Lightning… if it ever gets enough users
With a block interval four times shorter and nowhere near the same number of users, Litecoin was never in danger of running into the same issues as Bitcoin in the short term. Now, with the debate over whether or not to activate SegWit still raging on in the original coin, Litecoin took it upon itself to test this purported scaling solution on its non-congested network. Nothing has exploded yet, so that’s good, but we really don’t know what effect the upgrade has had. The upgrade du jour is now the Lightning Network, enabling off-chain scaling to much higher levels. Do these solutions work? Will Litecoin ever become widely used enough to even test them? Stay tuned.
Dash’s multi-year scaling plan to thousands of transactions per second
In sharp contrast with the rest of the field, Dash, the practical-but-shunned digital cash, has a fairly unified scaling answer, and a very detailed, multi-year plan for exactly how that will work. With other major coins, it’s either “We’re clogged, what do we do? We don’t know, try this!” or “Oh, I bet if we ever had that many users this one thing we’ve got would handle the load!” With Dash, it’s “We’re chasing as many users as possible and know exactly what we’ll do to handle that network load.” Will it work? That remains to be seen, but we’ll find out in due time. Dash has one scaling solution, and it’s going all in.
The arguments have been made, the technology researched, the disagreements had. The only way to really know at this point is to ramp up usage and stress-test each network. Enough talk. May the best scaling solution win!
Of all the solutions, the one Dash is aiming for is the most likely to work; Bitcoin has already successfully created a 36 MB block. The other ones are far more experimental and come with heavy caveats. If those techniques do pan out, great news: Dash can implement them as well, making it possible to serve even more users. But not being able to handle large blocks makes damn sure that certain cryptos will never be able to scale to reasonable, usable levels, even if all the tricks in the book are implemented and pan out.
Certain techniques such as pruning are a no-go, because you need to keep track of your records for various reasons, including legal ones; banks, for instance, need to keep records for many years. A crypto not willing to keep in line with those requirements instantly removes itself from the possibility of becoming a viable substitute for those same purposes.
I always wondered why we don’t have an IPFS-driven central repository (or repositories) of the Dash blockchain ledger. It would still be decentralized, but not copied over and over again to many systems. Then, on top of that, break the ledger up into a time-based directory structure of sorts so you can easily grab only a particular period of time. For example, a directory for the millennium digit (so our current directory would be 2, for the year 2000s), then one for the century digit (0-9), one for the decade (0-9), one for the year (0-9), and then break it down into half years, quarter years, months, weeks, days. Days may be the smallest practical unit. So a (year, month, day) path would look something like 02/0/1/7/… with the rest depending on how you break down the year.
The idea being that you’d have a decentralized blockchain ledger on IPFS that never has to be downloaded in full, but can be read to verify it, and new blocks can be added to it. How the data is stored can make it easier to “query” out and download copies of smaller chunks of time if you want or need to, in order to verify and/or build blocks (grabbing chunks to work with can mean less back-and-forth with the central repository while continuing to build the ledger). New blocks are contributed, and can be verified by everyone utilizing the repository.
This is a very general idea, and even now I can see a few holes in it, but it could be improved upon. The desire? Not having to download and verify the entire blockchain every time in order to start a new node. You sort of have a centralized blockchain ledger while at the same time it’s not centralized (per IPFS). The system makes it easy to query out chunks of data based on the amount of time you want (days, months, years).
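To make the directory idea a bit more concrete, here’s a minimal sketch in Python of how a block’s timestamp could be mapped to that kind of year-digit path, and how a node might grab just one day’s chunk of the ledger. Everything here is hypothetical: the path scheme, the `add_block` and `fetch_chunk` helpers, and the dict standing in for the shared repository are just an illustration of the idea, not anything Dash or IPFS actually implements.

```python
from datetime import datetime, timezone

def time_path(ts: int) -> str:
    """Map a block timestamp to a year-digit directory path.

    Hypothetical scheme from the comment above: split the year into its
    four digits (millennium/century/decade/year), then add month and day,
    e.g. 2017-06-15 -> "2/0/1/7/06/15".
    """
    dt = datetime.fromtimestamp(ts, tz=timezone.utc)
    y = f"{dt.year:04d}"
    return f"{y[0]}/{y[1]}/{y[2]}/{y[3]}/{dt.month:02d}/{dt.day:02d}"

# Toy stand-in for the shared repository: path -> list of serialized blocks.
# A real version would resolve these paths against IPFS instead of a dict.
repository: dict[str, list[bytes]] = {}

def add_block(ts: int, raw_block: bytes) -> None:
    """File a new block under the directory for its day."""
    repository.setdefault(time_path(ts), []).append(raw_block)

def fetch_chunk(ts: int) -> list[bytes]:
    """Grab only the blocks for one day instead of the whole chain."""
    return repository.get(time_path(ts), [])

if __name__ == "__main__":
    add_block(1497535200, b"block-data...")  # a block from 2017-06-15 (UTC)
    print(time_path(1497535200))             # 2/0/1/7/06/15
    print(len(fetch_chunk(1497535200)))      # 1
```

The point of the sketch is just that a node wanting to verify a single day (or month, by trimming the path) only has to fetch that one directory’s worth of blocks rather than the whole ledger.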
Anyways… I will stop rambling, but the idea has been playing in my mind for a while, and I thought I would let it out (be it good, bad, or indifferent).
Thanks.
Thanks, I can appreciate a ramble when there is some good thought behind it.
I haven’t looked into IPFS that much yet, and I need to look quite a bit more into sharding as well. That being said, I am a big fan of RAID systems, and they offer great benefits. But I also know from personal experience that RAID systems can get corrupted, and over the internet there are probably a multitude more problems and attack vectors. I’d rather go with a better-safe-than-sorry approach for now. I am certain these types of techniques will be implemented in due time.
Personally, all I really want is a system that at the very least can compete with PayPal (both in ease of use and network/transaction capabilities) and gives people financial freedom, for those that need it and seek it.
I personally don’t mind that other coins try that stuff out first; I am not going to use them for anything other than short- to medium-term investment.
That being said, if Dash were to reach number 1 in market cap and adoption, I would like Dash to try those kinds of techniques at a large scale, and spend a million or more on all those fancy techniques.
A friend of mine is working on Dash Drive, which will be largely based on IPFS from what I know, so apparently great minds think alike!
Is the “main” blockchain stored on it, or would it be, in order to lessen or remove the requirement to download the entire blockchain to process blocks?